Fall 2025 News

Who pays for the future of the web?
Project Liberty | September 3, 2025

We’re at the start of the next rebundling. New business models, like those outlined in Project Liberty Institute’s report on the Fair Data Economy, are emerging that balance data rights with innovation and growth.

Consider the following models:

License & syndication. Media companies license content directly to AI firms. In 2024, News Corp signed a $250 million deal with OpenAI to use its content for training and queries. The New York Times struck a deal with Amazon while continuing its lawsuits against Microsoft and OpenAI. The Associated Press, Financial Times, and Dotdash Meredith have inked similar agreements.

Pay-per-crawl & API access. Cloudflare’s pilot program lets publishers decide whether to allow, block, or charge AI crawlers each time they request content (signaled via the standard HTTP 402 Payment Required response code, rather than the usual 200 OK).
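
To make the mechanics concrete, here is a minimal sketch of such a gate in Python. The header names and price are hypothetical illustrations, not Cloudflare’s actual protocol; the essential move is answering an unpaid AI crawler with HTTP 402 Payment Required instead of the content.

    # A minimal pay-per-crawl gate (sketch). Header names are hypothetical.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    AI_CRAWLERS = {"GPTBot", "ClaudeBot", "PerplexityBot"}  # well-known AI user agents
    PRICE_USD = "0.01"  # hypothetical per-request price set by the publisher

    class PayPerCrawlHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            agent = self.headers.get("User-Agent", "")
            payment = self.headers.get("X-Crawler-Payment")  # hypothetical payment token
            if any(bot in agent for bot in AI_CRAWLERS) and not payment:
                # Quote a price instead of serving the article
                self.send_response(402)  # HTTP 402 Payment Required
                self.send_header("X-Crawler-Price", PRICE_USD)
                self.end_headers()
                return
            # Human readers (and crawlers that paid) get the content as usual
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>Licensed article text</body></html>")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), PayPerCrawlHandler).serve_forever()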

Attribution-based revenue sharing. Perplexity shares ad revenue with publishers when its chatbot uses their content, with partners like the Los Angeles Times and Adweek. Zendy compensates academic publishers based on citation frequency in AI responses.
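
The accounting behind such deals can be imagined as a simple pro-rata split. The sketch below is a toy model with made-up numbers, not Perplexity’s or Zendy’s actual terms: a revenue pool is divided across publishers in proportion to how often each one’s content is cited.

    # Toy citation-weighted revenue split (illustrative numbers only)
    def share_revenue(pool: float, citations: dict[str, int]) -> dict[str, float]:
        """Divide a revenue pool across publishers in proportion to
        how often each publisher's content was cited in AI answers."""
        total = sum(citations.values())
        if total == 0:
            return {pub: 0.0 for pub in citations}
        return {pub: pool * n / total for pub, n in citations.items()}

    # A $10,000 pool split across three hypothetical partners
    print(share_revenue(10_000.0, {"LA Times": 420, "Adweek": 180, "Other": 400}))
    # {'LA Times': 4200.0, 'Adweek': 1800.0, 'Other': 4000.0}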

Digital asset business models. On the individual level, people are beginning to sell their personal data, and a range of digital ownership models are emerging, including those built on the Frequency blockchain.

Closed content ecosystems. Paywalls and subscriptions protect content from scraping while generating direct revenue. The New York Times has doubled digital subscribers since 2020 to nearly 12 million. Platforms like Substack offer smaller creators similar protection.

Community-supported content. Wikipedia’s and Signal’s donation models and Patreon’s creator memberships show the enduring power of direct audience support. Patreon has 290,000 creators, who collectively earn $25 million every month from their fans.

When chatbots fuel delusions
Project Liberty | August 19, 2025

Three elements make AI chatbots especially insidious in how they nudge people toward unreasonable conclusions:

  1. They are personalized. Chatbots engage in highly personal, one-on-one dialogue. They tailor replies to what has been shared in the conversation, and newer models can even remember selected details across sessions. This sense of personalization has led some people to become emotionally overreliant on chatbots, treating them as mentors, confidants, or even arbiters in their lives.
  2. They are also sycophantic. AI chatbots are trained to optimize for user satisfaction, which often means mirroring rather than challenging ideas—a design feature researchers call sycophancy. Instead of probing assumptions or offering critical pushback, chatbots tend to validate, agree with, and even praise a person’s contributions. The result is a conversational partner that feels affirming but can quietly reinforce biases, encourage overconfidence, and create self-reinforcing loops.
  3. They are “improv machines.” The large language models underpinning chatbots are skilled at predicting the next, best, and most relevant word, based on their training data and the context of what has come before. Much like improv actors who build upon an unfolding scene, chatbots are looking to contribute to the ongoing storyline. For this reason, Helen Toner, the director of strategy and foundational research grants at Georgetown University’s Center for Security and Emerging Technology (CSET), calls them “improv machines.”
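
The dynamic is easy to demonstrate at a toy scale. The sketch below uses made-up words and scores rather than learned weights; it shows only the core loop of next-word prediction: score the candidate continuations, convert scores to probabilities, and sample, so the reply always extends the scene the conversation has set up.

    # Toy next-word prediction: scores -> probabilities -> sample.
    # Real LLMs do this over ~100,000 tokens with learned weights.
    import math, random

    def softmax(scores):
        exps = [math.exp(s) for s in scores]
        total = sum(exps)
        return [e / total for e in exps]

    # Hypothetical model scores after the prompt "You are definitely onto"
    candidates = ["something", "nothing", "trouble", "bananas"]
    scores = [4.0, 1.5, 2.0, -3.0]

    probs = softmax(scores)
    next_word = random.choices(candidates, weights=probs, k=1)[0]
    print({w: round(p, 3) for w, p in zip(candidates, probs)}, "->", next_word)

Run repeatedly, the sketch almost always continues with “something”: the most statistically agreeable next word, which is the sycophancy and improv behavior described above in miniature.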
Welcome to the vibe coding revolution
Project Liberty | August 12, 2025

The vibe coding revolution

Vibe coding has the potential to unlock new forms of creativity and democratize access to software development.

Democratizing technology
The barrier to creating apps, websites, and even entire businesses has been significantly reduced as conversational AI chatbots replace the need for deep technical expertise in coding languages. Kids as young as eight years old can now vibe code.

Increased speed
Vibe coding increases the speed of development and prototyping. What used to take days now takes hours. What used to take hours now takes minutes. The time between idea and workable prototype has shrunk, and the experience has improved.

  • In educational settings, students can rapidly prototype ideas and receive immediate visual feedback, making learning more engaging and motivating.
  • In professional settings, 84% of developers are using AI coding tools in their workflows, according to Stack Overflow’s 2025 Developer Survey.

Greater creativity
Instead of spending time mastering the precise rules, structures, and syntax of programming languages (such as debugging semicolon placement and memorizing function signatures), people can now focus on computational thinking—the ability to break down complex problems, recognize patterns, and design logical solutions using technology. Builders can outsource the burdensome cognitive load of coding to software, allowing them to stay focused on the bigger picture.

The data privacy risks
The rise of vibe coding could lead to substantial data privacy risks. We might be at the dawn of an explosion of software created by individuals without proper security protocols or data privacy settings. As we observed with 23andMe (by no means a small or vibe-coded company), a company’s bankruptcy can cost users control of their data or put that data up for sale.

Building a robust data privacy infrastructure is more complicated than vibe coding a website. As the number of solopreneur vibe-coded tools grows exponentially, so too could the gaps and vulnerabilities around data privacy and security.
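
A small example illustrates the kind of gap that slips through. In the sketch below (hypothetical table and payload), the first query, a pattern an assistant can produce if never prompted about security, pastes user input straight into SQL and leaks every row; the parameterized version treats the input as data.

    # One common vibe-coding gap: SQL injection (illustrative example)
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

    user_input = "' OR '1'='1"  # a classic injection payload

    # Unsafe: user input is formatted straight into the query string
    unsafe = f"SELECT email FROM users WHERE name = '{user_input}'"
    print(conn.execute(unsafe).fetchall())  # leaks every row

    # Safe: a parameterized query treats the input as data, not SQL
    safe = "SELECT email FROM users WHERE name = ?"
    print(conn.execute(safe, (user_input,)).fetchall())  # [] -- no match, no leak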

The risks of cognitive offloading

Tools that democratize access and accelerate development can also encourage us to hand over too much of our thinking to machines. In a July newsletter, we explored the implications of “cognitive offloading”: leaning on AI to do too much of our thinking for us. A similar disengagement occurs when AI tools handle the heavy lifting in the coding process.

Lisa Barceló, a staff data scientist at Gusto, a payroll software company, is one of the top users of the AI coding assistant Cursor on the company’s data team.

“It’s a difficult balance to strike between what to offload and what to hold tightly,” she said. “There’s a temptation to outsource too much work to AI tools. But when we do, we abdicate our role as strategists and true data scientists.”

The human role in building technology

With tools that let us outsource the technical work to AI, how should the education of technologists like software engineers and data scientists change?

At the University of Washington, the curriculum is already evolving, says Magdalena Balazinska, head of the Paul G. Allen School of Computer Science & Engineering.
