News
The featured post in The People’s Internet hub for the next month is on The People’s Bid for TikTok.
The People’s Bid is a once-in-a-generation opportunity for Americans to reclaim a voice, choice, and stake in the future of the internet. In April 2024, Congress passed legislation forcing a ban or sale of TikTok in the U.S.
Project Liberty is building a broad consortium of technologists, investors, community leaders, and creators to purchase TikTok and migrate the platform to new infrastructure that allows people to control their own data.
- Throughout the month, we will add articles, livestreams, and videos to this post about the latest DSNP-related projects, organizations, and events.
- You can also participate in discussions in all of these posts and share your top news items and posts (for onAir members; it’s free to join).
When fake news sounds like you
It’s never been easier to clone one’s voice. An online search will yield dozens of companies offering the service. What once required sophisticated equipment and expertise can now be done on a smartphone with a free app.
While there are harmless use cases of voice clones (e.g., a voice-over for an audiobook or the creation of a podcast), voice cloning technology has also been harnessed for nefarious purposes.
The rise of voice-based deepfakes
Imposter scams are not new, but AI technology makes them more believable.
Making it harder to create audio deepfakes
Addressing audio deepfakes requires a multifaceted approach at all levels, from individual awareness to technical solutions to policy changes.
Gen AI-based digital ads will make the internet almost unrecognizable in the next nine months
The future of advertising is going to change significantly in 2026. I’m not even sure the internet is ready for it.
Fully automated AI ads are on the roadmap for 2026, and we are already seeing signs of this in what the likes of ByteDance and Meta are testing. ByteDance’s virtual influencers are already here. Google, of course, is also bringing ads to its AI Mode.
Ads may appear “where relevant” below and “integrated into” AI Mode responses, Google has said.
As text-to-video improves and emotive audio promotion refines itself, small businesses and brands will have more options for automated ads, which could drastically cut the need for ad managers and salespeople in advertising. I believe that 2026 is the big year for this, literally just months away. Generative AI will change how ads are made, where they are seen, how personalized they are to us, and even their delivery timing in social commerce journeys.
The Conversation, July 21, 2025
Champions of the almost entirely party-line vote in the U.S. Senate to erase US$1.1 billion in already approved funds for the Corporation for Public Broadcasting called their action a refusal to subsidize liberal media.
“Public broadcasting has long been overtaken by partisan activists,” said U.S. Sen. Ted Cruz of Texas, insisting there is no need for government to fund what he regards as biased media. “If you want to watch the left-wing propaganda, turn on MSNBC,” Cruz said.
Accusing the media of liberal bias has been a consistent conservative complaint since the civil rights era, when white Southerners insisted news outlets were slanting their stories against segregation. During his presidential campaign in 1964, U.S. Sen. Barry Goldwater of Arizona complained that the media was against him, an accusation that has been repeated by every Republican presidential candidate since.
But those charges of bias rarely survive empirical scrutiny.
As chair of a public policy institute devoted to strengthening deliberative democracy, I have written two books about the media and the presidency, and another about media ethics. My research traces how news institutions shape civic life and why healthy democracies rely on journalism that is independent of both market pressure and partisan talking points.
Trusting independence
Ad Fontes Media, a self-described “public benefit company” whose mission is to rate media for credibility and bias, has placed the reporting of “PBS NewsHour” under 10 points left of the ideological center, labeling it both “reliable” and based in “analysis/fact.” “Fox and Friends,” the popular morning show on Fox News, is by contrast nearly 20 points to the right. The scale starts at zero and runs 42 points to the left to measure progressive bias and 42 points to the right to measure conservative bias. Ratings are provided by three-person panels comprising left-, right- and center-leaning reviewers.
A 2020 peer-reviewed study in Science Advances that tracked more than 6,000 political reporters likewise found “no evidence of liberal media bias” in the stories they chose to cover, even though most journalists are more left-leaning than the rest of the population.
A similar 2016 study published in Public Opinion Quarterly said that media are more similar than dissimilar and, excepting political scandals, “major news organizations present topics in a largely nonpartisan manner, casting neither Democrats nor Republicans in a particularly favorable or unfavorable light.”
Surveys show public media’s audiences do not see it as biased. A national poll of likely voters released July 14, 2025, found that 53% of respondents trust public media to report news “fully, accurately and fairly,” while only 35% extend that trust to “the media in general.” A majority also opposed eliminating federal support.
Contrast these numbers with attitudes about public broadcasters such as MTVA in Hungary or TVP in Poland, where the state controls most content. Protests in Budapest in October 2024 drew thousands demanding an end to “propaganda.” Oxford’s Reuters Institute for the Study of Journalism reports that TVP is the least trusted news outlet in the country.
While critics sometimes conflate American public broadcasting with state-run outlets, the structures are very different.
Safeguards for editorial freedom
In state-run media systems, a government agency hires editors, dictates coverage and provides full funding from the treasury. Public officials determine – or make up – what is newsworthy. Individual media operations survive only so long as the party in power is happy.
Public broadcasting in the U.S. works in almost exactly the opposite way: The Corporation for Public Broadcasting is a private nonprofit with a statutory “firewall” that forbids political interference.
More than 70% of the Corporation for Public Broadcasting’s US$1.1 billion federal appropriation for 2025 flows through to roughly 1,500 independently governed local stations, most of which are NPR or PBS affiliates but some of which are unaffiliated community broadcasters. CPB headquarters retains only about 5% of that federal funding.
Stations survive by combining this modest federal grant money with listener donations, underwriting and foundation support. That creates a diversified revenue mix that further safeguards their editorial freedom.
And while stations share content, each also has latitude when it comes to programming and news coverage, especially at the local level.
The public broadcasting system is a public-private partnership largely owned by the communities it serves. Congress allocates funds, while community nonprofits, university boards, state authorities or other local license holders actually own and run the stations. Individual monthly donors are often called “members” and sometimes have voting rights in station-governance matters. Membership contributions make up the largest share of revenue for most stations, providing another safeguard for editorial independence.

Broadly shared civic commons
And then there are public media’s critical benefits to democracy itself.
A 2021 report from the European Broadcasting Union links public broadcasting with higher voter turnout, better factual knowledge and lower susceptibility to extremist rhetoric.
Experts warn that even small cuts will exacerbate an already pernicious problem with political disinformation in the U.S., as citizens lose access to free information that fosters media literacy and encourages trust across demographics.
In many ways, public media remains the last broadly shared civic commons. It is both commercial-free and independently edited.
Another study, by the University of Pennsylvania’s Annenberg School in 2022, affirmed that “countries with independent and well-funded public broadcasting systems also consistently have stronger democracies.”
The study highlighted how public media works to bridge divides and foster understanding across polarized groups. Unlike commercial media, where the profit motive often creates incentives to emphasize conflict and sensationalism, public media generally seeks to provide balanced perspectives that encourage dialogue and mutual respect. Reports are often longer and more in-depth than those by other news outlets.
Such attention to nuance provides a critical counterweight to the fragmented, often hyperpartisan news bubbles that pervade cable news and social media. And this skillful, more balanced treatment helps to ameliorate political polarization and misinformation.
In all, public media’s unique structure and mission make democracy healthier in the U.S. and across the world. Public media prioritizes education and civic enlightenment. It gives citizens important tools for navigating complex issues to make informed decisions – whether those decisions are about whom to vote for or about public policy itself. Maintaining and strengthening public broadcasting preserves media diversity and advances important principles of self-government.
Congress’ cuts to public broadcasting will diminish the range and volume of the free press and the independent reporting it provides. Ronald Reagan once described a free press as vital for the United States to succeed in its “noble experiment in self-government.” From that perspective, more independent reporting – not less – will prove the best remedy for any worry about partisan spin.
That independence in the United States – enshrined in the press freedom clause of the First Amendment – gives journalists the ability to hold government accountable, expose abuses of power and thereby support democracy.
The Conversation, July 7, 2025
Every design choice that social media platforms make nudges users toward certain actions, values and emotional states.
It is a design choice to offer a news feed that combines verified news sources with conspiracy blogs – interspersed with photos of a family picnic – with no distinction between these very different types of information. It is a design choice to use algorithms that find the most emotional or outrageous content to show users, hoping it keeps them online. And it is a design choice to send bright red notifications, keeping people in a state of expectation for the next photo or juicy piece of gossip.
Platform design is a silent pilot steering human behavior.
Social media platforms are bringing massive changes to how people get their news and how they communicate and behave. For example, the “endless scroll” is a design feature that aims to keep users scrolling and never reaching the bottom of a page where they might decide to pause.
I’m a political scientist who researches aspects of technology that support democracy and social cohesion, and I’ve observed how the design of social media platforms affects them.
Democracy is in crisis globally, and technology is playing a role. Most large platforms optimize their designs for profit, not community or democracy. Increasingly, Big Tech is siding with autocrats, and the platforms’ designs help keep society under control.
There are alternatives, however. Some companies design online platforms to defend democratic values.
Optimized for profit
A handful of tech billionaires dominate the global information ecosystem. Without public accountability or oversight, they determine what news shows up on your feed and what data they collect and share.
Social media companies say they are in the business of connecting people, but they make most of their money as data brokers and advertising firms. Time spent on platforms translates to profit. The more time you spend online, the more ads you see and the more data they can collect from you.
This ad-based business model demands designs that encourage endless scrolling, social comparison and emotional engagement. Platforms routinely claim they merely reflect user behavior, yet internal documents and whistleblower accounts have shown that toxic content often gets a boost because it captures people’s attention.
Tech companies design platforms based on extensive psychological research. Examples include flashing notifications that make your phone jump and squeak, colorful rewards when others like your posts, and algorithms that push out the most emotional content to stimulate your basest emotions of anger, shame or glee.
Optimizing designs for user engagement undermines mental health and society. Social media sites favor hype and scandal over factual accuracy, and public manipulation over designing for safety, privacy and user agency. The resulting prevalence of polarizing false and deceptive information is corrosive to democracy.
Many analysts identified these problems nearly a decade ago. But now there is a new threat: Some tech executives are looking to capture political power to advance a new era of techno-autocracy.
Optimized for political power
A techno-autocracy is a political system where an authoritarian government uses technology to control its population. Techno-autocrats spread disinformation and propaganda, using fear tactics to demonize others and distract from corruption. They leverage massive amounts of data, artificial intelligence and surveillance to censor opponents.
For example, China uses technology to monitor and surveil its population with public cameras. Chinese platforms like WeChat and Weibo automatically scan, block or delete messages and posts for sensitive words like “freedom of speech.” Russia promotes domestic platforms like VK, which are closely monitored and partly owned by state-linked entities that use them to spread political propaganda.
Over a decade ago, tech billionaires like Elon Musk and Peter Thiel, and now-Vice President JD Vance, began aligning with far-right political philosophers like Curtis Yarvin. They argue that democracy impedes innovation, favoring concentrated decision-making in corporate-controlled mini-states governed through surveillance. Embracing this philosophy of techno-autocracy, they moved from funding and designing the internet to reshaping government.
Techno-autocrats weaponize social media platforms as part of their plan to dismantle democratic institutions.
The political capture of both X and Meta also has consequences for global security. At Meta, Mark Zuckerberg removed barriers to right-wing propaganda and openly endorsed President Donald Trump’s agenda. Musk changed X’s algorithm to highlight right-wing content, including Russian propaganda.
Designing tech for democracy
Recognizing the power that platform design has over society, some companies are designing new civic participation platforms that support rather than undermine society’s access to verified information and places for public deliberation. These platforms offer design features that big tech companies could adopt to improve democratic engagement and help counter techno-autocracy.
In 2014, a group of technologists founded Pol.is, an open-source technology for hosting public deliberation that leverages data science. Pol.is enables participants to propose and vote on policy ideas using what they call “computational democracy.” The Pol.is design avoids personal attacks by having no “reply” button. It offers no flashy newsfeed, and it uses algorithms that identify areas of agreement and disagreement to help people make sense of a diversity of opinions. A prompt question asks people to offer ideas and vote up or down on others’ ideas. People participate anonymously, helping to keep the focus on the issues and not the people.
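To make the “computational democracy” idea more concrete, here is a minimal, hypothetical sketch of the general approach such tools take: each participant’s agree/disagree/pass votes form a matrix, which is reduced and clustered to surface opinion groups and the statements that bridge them. The data, dimensions, and clustering choices below are illustrative assumptions, not Pol.is’s actual implementation.

```python
# Illustrative sketch of Pol.is-style opinion clustering (not the real implementation).
# Rows are participants, columns are statements; votes are +1 (agree), -1 (disagree), 0 (pass).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
votes = rng.choice([-1, 0, 1], size=(200, 12))  # 200 participants x 12 statements (fake data)

# Project participants into 2D so that people who voted similarly land near each other.
coords = PCA(n_components=2).fit_transform(votes)

# Partition participants into opinion groups (the number of groups is a modeling choice).
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coords)

# Flag statements that a majority of every opinion group agrees with: candidate common ground.
for statement in range(votes.shape[1]):
    agreement_per_group = [
        (votes[groups == g, statement] == 1).mean() for g in np.unique(groups)
    ]
    if min(agreement_per_group) > 0.5:
        print(f"Statement {statement} is agreed with by a majority of every group")
```

The point of the sketch is the shape of the computation rather than the specific methods: instead of ranking content for engagement, the system maps where participants cluster and highlights statements that cut across those clusters.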
Taiwan used the Pol.is platform to enable mass civic engagement in the 2014 democracy movement. The U.K. government’s Collective Intelligence Lab used the platform to foster public discussion and generate new policy proposals on climate and health care. In Finland, a public foundation called Sitra uses Pol.is in its “What do you think, Finland?” public dialogues.
Barcelona, Spain, designed a new participatory democracy platform called Decidim in 2017. Now used throughout Spain and Europe, Decidim enables citizens to collaboratively propose, debate and decide on public policies and budgets through transparent digital processes.
Nobel Peace Laureate Maria Ressa founded Rappler Communities in 2023, a social network in the Philippines that combines journalism, community and technology. It aims to restore trust in institutions by providing safe spaces for exchanging ideas and connecting with neighbors, journalists and civil society groups. Rappler Communities offers the public data privacy and portability, meaning you can take your information – like photos, contacts or messages – from one app or platform and transfer it to another. These design features are not available on the major social media platforms.
Medium CEO Tony Stubblebine reined in spending and figured out how to actually reward high-quality writing.
In August 2023, Stubblebine announced that, moving forward, anyone who wanted to receive payouts from the Medium partnership program needed to be a paying member themselves. The thinking behind this decision was that it jettisoned all the “growth hackers” who didn’t have skin in the game. “If you aren’t part of the Medium community, you tend not to understand what does well,” he explained. “You don’t engage with people. I think part of what makes Medium valuable is getting to talk to the writers themselves [in the comments section]. Everything we were reading on Medium felt more high quality, more valuable if the person was actually active on Medium.”
Next, Stubblebine needed a way to identify the highest quality content published by Medium’s writers and get it in front of the platform’s users, and the only way to do that, in his view, was to insert more humans into the curation process. “I think a lot of people think an algorithm can judge quality,” he said. “And I just think that they’re wrong.”
Of course, relying entirely on volunteers would still make the partnership program vulnerable to gaming, so Medium also employs a staff of about 30 curators whose job it is to comb through these publications and “Boost” the best writing. A Boosted article is not only weighted higher by the recommendation algorithm, but also receives a higher payout. “It could be like a 4X multiplier on the payment,” Stubblebine said. “So the biggest payments are going to stories that are boosted, and that means we’re much more likely to be paying for something that two levels of human editors looked at and loved.”
(1) Congratulations to Substack on its Series C funding round. The company now has over a $1 billion valuation. Casey Newton has the details on what this likely means for the company and its product line. TL;DR, for unmonetized newsletter writers like myself:
The underlying finances of Substack’s core product do not scale well at all. As Casey puts it:
That business has never entirely made sense to me. The company takes a 10 percent cut of subscription revenue, meaning that the more your business grows, the worse a bargain it is for you. (The level of service you get paying the company $1,000 a year is more or less the same as it is if you pay it $50,000 a year.)
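To illustrate why a flat cut scales this way, here is a quick, hypothetical back-of-the-envelope calculation (the revenue figures are made up; only the 10 percent share comes from the quote above):

```python
# Illustrative arithmetic: a flat 10% platform cut grows linearly with revenue,
# while the service a writer receives stays roughly the same.
PLATFORM_CUT = 0.10  # Substack's stated share of subscription revenue

for annual_revenue in (10_000, 100_000, 500_000):  # hypothetical newsletter revenues
    fee = annual_revenue * PLATFORM_CUT
    print(f"${annual_revenue:,} in subscriptions -> ${fee:,.0f} kept by the platform")
```

A newsletter earning $10,000 a year pays the platform $1,000; one earning $500,000 pays $50,000, the two figures Newton contrasts, for roughly the same level of service.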
The higher the valuation, the greater the investor pressure to juice those numbers somehow. Substack is already warming up to adding programmatic advertising to newsletters. We’ll see how it goes, but I suspect newsletters like this one are going to face serious headwinds. I’m a Substack free-rider. One way or another, they’re probably going to try to make money off of my writing. And once they start trying to squeeze it for revenues, they’ll probably just keep squeezing.
The Society of Problem Solvers, June 29, 2025
Communities Act Best When They Perceive Clearly
To meet this challenge, we need more than new media models. We need new ways of being together. This essay proposes a reorientation: from journalism as product to signal as relationship, from centralized broadcasting to decentralized coordination, and from consumer media logic to Power With information infrastructures. Drawing from the theory of Coordination as Power, and using human swarm intelligence as a generative model, we explore the concept of Civic Signal Hubs: modular, community-driven systems that support visibility, coherence, and mutual accountability without reproducing hierarchies. These hubs are not technological solutions, though technology may assist them. They are cultural practices made visible.
The Civic Signal Hub: A Pattern for Emergence
If Power With is the missing ingredient in our information systems, then the Civic Signal Hub is a pattern for reintroducing it. A Civic Signal Hub is not a singular technology or a physical space, but a living, adaptive structure. It exists wherever people come together to share, interpret, and act upon local signals in a way that sustains mutual visibility and collective agency. Unlike legacy media organizations or content platforms, a Civic Signal Hub is not designed to broadcast from the center. It is designed to weave from the margins, allowing coordination to emerge through a distributed network of relationships.
To bring the idea of a Civic Signal Hub into practice, we must move from metaphor to structure. Although each hub will necessarily reflect the distinct needs, culture, and rhythms of its locality, there are core components that make the model function. These elements are not rigid requirements. They are dynamic functions that, when present in some form, allow the flow of Power With coordination to emerge and sustain itself. Think of them less as a checklist and more as a constellation: loosely held, deeply interconnected, and adaptable to change.
A study published last month by researchers at MIT’s Media Lab found that using ChatGPT to write essays negatively influenced the learning process.
The study divided 54 subjects, aged 18 to 39, into three groups, each assigned to write an SAT essay:
- The first group was assigned to use OpenAI’s ChatGPT to write the essay.
- The second group was assigned to use Google’s search engine to write the essay.
- The third group was given no tools at all.
Every semester in New York City, a quiet experiment unfolds: 19-year-olds gather in a classroom at NYU to explore what it means to live a good life. The course is called “Flourishing.”
The premise of the course is simple: Your personal and professional flourishing is directly related to your ability to control your attention.
The course is taught by Professor Jonathan Haidt, author of The Anxious Generation. When his students begin to reclaim their focus, Haidt sees transformational results: They excel academically, experience fewer distractions, and form deeper, more meaningful connections with their peers.
The Flourishing course taps into an idea that social media—and the constant stimuli of algorithmically engineered digital spaces—has fractured our capacity for sustained focus and presence:
- Haidt told Ezra Klein on a podcast earlier this year that TikTok is “the greatest demolisher of attention in human history.”
- A recent article in The Atlantic cited widespread lamentations by professors that today’s college students don’t have the attention span to read books, let alone a brief sonnet.
- A 2023 study by Common Sense Media found that a typical adolescent now receives 237 notifications a day, or about 15 for every waking hour.
Group chats: Today’s private internet
We live in the age of the group chat. Consider WhatsApp as an example.
- Between 2012 and 2023, WhatsApp gained 2.5 billion monthly active users (or 30% of the global population). Today, the most popular messaging app in the world has over 3 billion monthly active users, and is growing at about 8% per year.
- One study found that fewer than 2% of WhatsApp users use the app exclusively for one-on-one messaging. “The group chat feature is used frequently by nearly every WhatsApp user,” the study concluded. The app is more of a group chat messaging app than a one-to-one messaging app.
- WhatsApp is dominant in places like India and Brazil, with WeChat as the messaging app of choice in China (it has grown from 50 million users in 2012 to over 1.4 billion today).
- The end-to-end encrypted Signal messaging app has been gaining traction and users from WhatsApp, as well. It has 70 million monthly active users. Meanwhile, iMessage has over 1 billion monthly active users.
The content shared in private one-on-one or group messages is considered “dark social” content. Unlike the “dark web,” dark social content is considered “dark” because it is out of sight on public platforms and difficult to track. According to analysts, 95% of all content shared online is dark social content. What we see on feeds on social media platforms like TikTok, Instagram, or X accounts for only 5%.
Let’s stop pretending this is just drama between two oversized egos. It’s not just about Trump calling Musk a lunatic or Musk firing back with Epstein-coded slime. This week’s meltdown between the former president and the world’s richest man is a symptom of something much bigger — and much more dangerous.
These aren’t just two men with platforms. They own them.
Let’s talk scale. X has about 650 million monthly active users worldwide — with around 60% under 35 — and it dominates news and cultural conversation in ways that no newspaper or network ever could. Truth Social is much smaller, hovering around 6 million monthly users, nearly all of them in the U.S., but what it lacks in reach, it makes up for in ideological purity. These aren’t just “apps.” They are fully functioning media ecosystems, operating without editors, without fact-checkers, without rules.
X and Truth Social don’t compete with traditional media — they drown it out. They out-shout Instagram, Reddit, or even YouTube in political influence, especially in election cycles. But they don’t just broadcast content. They algorithmically amplify it — injecting bias, bile, and personal agendas directly into the bloodstream of public discourse. No newsroom. No standards. No accountability. Just the unfiltered whims of two egomaniacs with vendettas and loyal followings.
This isn’t a fight between two guys online. It’s a battle for the infrastructure of truth itself.
Every headline covering their feud danced around the real story. Reporters gamed out how Trump might unleash government agencies to punish Musk. Pundits speculated Musk might push anti-Trump content down your feed. No one seemed shocked.
Few asked the real questions:
Why do two individuals have this kind of power over information in the first place? How did we allow truth itself to become a privately-owned asset?
New Initiative to Advance Responsible Data and AI Investment in Venture Capital Launches at SuperVenture 2025 in Berlin
Project Liberty Institute announced a strategic partnership with VentureESG, a leading network of over 550 venture capital firms and 100+ limited partners committed to integrating environmental, social, and governance factors into investment, and ImpactVC, the world’s largest community of over 700 VCs investing for both financial returns and positive societal outcomes.
Launched at SuperVenture 2025, the leading global gathering of VCs and LPs, this collaboration comes at a moment when the industry is under increasing pressure to define responsible investment in the data and AI space. The initiative aims to help establish shared frameworks that guide responsible governance practices before regulation and market shifts make them imperative. The first step: a sector-wide survey to benchmark current approaches and identify actionable pathways toward more accountable, resilient, and forward-looking investment models.
“We’re seeing the early signs of a shift in venture,” said Paul Fehlinger, Director of Policy, Governance Innovation & Impact at Project Liberty Institute, who leads engagement with investors. “Most VCs are just starting to grapple with responsible data and AI governance, but some forward-looking LPs are already asking tougher questions. As the sector races ahead, this is a rare window to jointly develop the standards before regulation and market dynamics force everyone’s hand. Investors who move early won’t just mitigate risk—they’ll be better positioned to win over institutional capital and attract founders who see responsible AI as a competitive edge. Ultimately, it’s how we build more resilient companies and ensure the next wave of tech creates real value for users, entrepreneurs, and investors alike.”
Two initiatives to create a more open web, where users are in control of their own digital identities and data, may be coming together. At SXSW 2025, entrepreneur Frank McCourt, whose Project Liberty is developing open internet infrastructure (and is throwing its hat in the ring as a potential buyer for TikTok), announced that his organization has been in discussions with internet pioneer Tim Berners-Lee about an integration with Solid, his open source project aimed at giving people control over their own data.
In a panel at SXSW, McCourt shared that his team had “talked to Tim Berners-Lee about Solid,” adding that “Project Liberty is compatible with Solid.”
Though he didn’t announce an official partnership, McCourt suggested that discussions were underway on a future collaboration.
“We’re debating, or talking, right now about how to incorporate that — him and Solid, his Solid Pods — into the project,” McCourt teased.
Project Liberty founder Frank McCourt sat down with Semafor’s Max Tani at the World Economy Summit on April 23, 2025.
Project Liberty Institute Contributes to Responsible Data and AI Dialogue at the EU-UN-OECD Conference
On May 12, 2025, Project Liberty Institute’s Director of Policy, Governance Innovation & Impact, Paul Fehlinger, joined international leaders at a special conference at the Organisation for Economic Co-operation and Development (OECD) headquarters in Paris on the occasion of EU Day. Speaking on the opening panel with ambassadors from the EU and African Union as well as the OECD and the Office of the UN High Commissioner for Human Rights (OHCHR), Fehlinger participated in important discussions about responsible approaches to data and AI investment. The event brought together government officials, investors, and innovation experts to explore how investment strategies can support responsible tech development.
Who was at the table?
This timely dialogue was a collaborative effort between three key organizations. The OECD, which sets global economic standards across its 38 member countries, partnered with the EU—known for groundbreaking tech regulations like the General Data Protection Regulation (GDPR), the Digital Markets Act (DMA), and the AI Act—and the UN B-Tech Project, an OHCHR initiative focused on human rights in technology and business. The OHCHR contributed crucial insight into rights-based governance approaches. The event also focused on EU-Africa collaboration to scale responsible data and AI practices.
Spearheading Project Liberty Institute’s work at the intersection of governance, entrepreneurship, and capital, Fehlinger highlighted the critical role of both public actors and private market investors in building a sustainable and high-performing data and AI economy. To unlock a fair data economy, he argued, public-private collaboration should focus on infrastructure that supports democracy, improves market dynamics, and enables long-term value creation in the digital era.
A blueprint for citizen advocacy
Schmill’s story provides a blueprint for making change on the biggest stages. Here are key takeaways for anyone seeking to champion policy change.
- Find the leverage point. Schmill focused on what was feasible. The timing, the awareness, the momentum: it was all leading toward phone-free schools. So she concentrated there because legislation was possible. She cautioned that you have to be careful not to just spin your wheels on a bill that has no chance of passing. Not every issue is ready for legislative action.
- Focus on learning and seek collaboration. It sounds like a poster you’d see in an office hallway. But Schmill was relentless. She studied legislative processes, pored over bills, and obsessed over the details. And she didn’t go it alone. “Learn from other people who’ve done it and let them show you the way,” she said.
- Embrace compromise. “When I was working on KOSA, some of my favorite people to work with were on the other side of the aisle. That was a surprise and provided valuable perspective,” she said. “Despite the times we live in, there is opportunity to work together when both sides are willing to listen and are open to reasonable compromises.”
- Start with passion, not experience. “Experience is always helpful, but it’s your passion for an issue that is most important. The issue is what motivates me every day to go outside of my introvert comfort zone, to meet with many different people, to be open to other points of view, and to find solutions that will get us to the end goal of protecting children. You don’t need to be an expert—you need to be committed.”
May 13, 2025 Project Liberty Newsletter:
Fifteen years ago, Public Benefit Corporations (PBCs) didn’t exist. Today, they have become a popular legal structure for some of the biggest tech companies in the world.
In the 200+ year history of U.S. corporate law, PBCs are a recent legal invention. The first state to pass PBC legislation was Maryland in 2010. Today, 41 states (and the District of Columbia) have laws that enable PBCs.
Unlike traditional corporate structures like C-Corps and S-Corps, which are designed to maximize shareholder value, PBCs promise an alignment between profit and a defined public benefit to society.
PBCs have been making news recently with OpenAI’s decision to convert its for-profit business to a PBC controlled by a nonprofit parent entity.
Becoming a PBC has many benefits:
- Mission alignment: By legally embedding its social mission into the company’s DNA, a PBC structure can help tech firms stay focused on long-term societal impact.
- Public goodwill: A PBC structure can lead to enhanced consumer, employee, and investor trust in the brand. For AI companies responsible for the development of disruptive technologies, becoming a PBC is a step (though a small one) in assuaging the public that those in power have broader societal concerns in mind.
- Greater transparency: PBCs are required to adhere to regular and transparent reporting requirements. However, these requirements do not require AI companies to reveal how their AI algorithms work (a complaint that many have raised). It’s unclear if a shift in legal structure will lead to the type of transparency critics seek.
// Apply now: McCourt TPP Visiting Fellows Program
Deadline: June 6
The McCourt School’s Tech & Public Policy program is now accepting applications for its Fall 2025 Visiting Fellows cohort. This semester-long opportunity invites policy professionals to share their expertise with Georgetown students through lectures, discussions, and events in Washington, DC.
May 6, 2025 Project Liberty Newsletter
For & Against
The current regulation-innovation debate is not new. It’s a dynamic that has shown up in various industries and markets worldwide. Here are a few examples of the relationship between regulation and innovation:
- Regulation can hinder: In most countries, regulations vary based on firm size. The bigger the company, the greater the regulation. Researchers found that companies in France that were nearing a 50-employee size (a headcount threshold that leads to greater labor regulations) innovated less.
- Regulation can enable: Section 230 in the U.S. gave online platforms legal immunity for content posted by their users and protection for their moderation decisions. This protection allowed countless online business models to flourish and is credited with helping create today’s internet (for better and worse).
- Regulation can hinder: America’s Nuclear Regulatory Commission has recently been criticized for regulatory overreach. Those who see nuclear energy as necessary to build a clean-energy economy believe “crushing regulation” has made it virtually impossible for new nuclear reactors to be built in the U.S. (Only three new reactors have been completed in nearly three decades, a record critics attribute to that regulatory burden.)
- Regulation can enable: Antitrust regulation to break up monopolies has created a level playing field upon which new technologies could emerge. Two separate antitrust lawsuits against IBM in 1969 and AT&T in 1974 helped create the conditions for Microsoft and Apple to launch the personal computer revolution.
There are countless other examples highlighting the complex relationship between regulation and innovation, from net neutrality laws in the U.S. to environmental rules that incentivized the formation of new industries like carbon capture, to government regulation that contributed to a boom in domestic manufacturing (in the case of China’s electric vehicle sector).
On Jan 15, 2025, at Stiftung Mercator in Berlin, RadicalxChange Foundation, along with partners Global Solutions Initiative and Sciences Po Technology and Global Affairs Innovation Hub, co-hosted a side event to the Paris AI Action Summit. We focused on the future of collective bargaining in the context of the AI revolution. The discussions helped to advance our thinking in several important ways. Here are some quick initial reflections.
History suggests that following significant technological breakthroughs, individuals and communities often endure temporary but harmful losses of economic bargaining power. (For example, real living standards declined in industrializing countries between the mid-18th and the early-to-mid 19th centuries, in part because individuals’ contributions to vital productive processes became more interchangeable and therefore lacked bargaining power.) On a longer arc of history, new technology’s benefits usually accrue to whole societies, but such short-term social disruptions partly offset those benefits and frequently destabilize societies. It is therefore important to strategize toward achieving social equilibrium quickly, robustly, and without undermining the processes of technological development.
Power rebalancing after technological breakthroughs occurs through at least three pathways: technological, political, and social. Technological rebalancing occurs when the dissemination or cheapening of the relevant technology undermines the advantage of the technology’s owners (as in the personal computer and software revolutions). Political rebalancing occurs when direct state interventions check the rights of businesses to exploit the new technology (as in the 18th century, when speech controls and intellectual property statutes limited the power of printing press owners). Social rebalancing occurs when social or labor organizations form a collective counterpower, achieving an economic foothold vis-a-vis the technology’s owners (as in the late part of the industrial revolution). These pathways are not mutually exclusive, possess unique benefits and drawbacks, and are more or less suitable in different societal and technological situations.
What might these modes of rebalancing look like in the nascent AI revolution? Which are likeliest to mitigate losses of bargaining power and/or uphold the integrity of individuals and communities? We will first define, then critique and evaluate, these three pathways.
In this episode, Matt Prewitt sits down with Audrey Tang, Taiwan’s Cyber Ambassador-at-large and first Digital Minister, as well as the star of the new short documentary Good Enough Ancestor. It is a fascinating conversation exploring the profound intersections of technology, spirituality, and democracy.
In today’s episode, renowned academic and legal scholar Professor Joseph H.H. Weiler speaks with Matt about The Trial of Jesus, using the historical event as a lens for understanding justice, religious pluralism, and democracy. The examination leads us through the limits of state neutrality in matters of faith, the balance between freedom of and from religion, and the evolving role of digital platforms. Professor Weiler shares perspectives from his extensive legal scholarship while reflecting on the intersection of theology, democracy, and technological change in our modern world. An incredibly poignant episode that is a must-listen.
Note: This episode was recorded in Dec 2024.
This hub focuses on The People’s Internet, which has been catalyzed and supported by Project Liberty and Frank McCourt.