AI is being built on the largest unregulated extraction of knowledge in history. The datasets powering today’s AI systems are drawn largely from publicly available information: research papers, educational resources, cultural works, images, and data shared by millions of people and institutions around the world. A significant portion of this knowledge exists because of CC licenses and the global movement for open sharing.
For a quarter century, the commons has grown through shared norms, legal tools, and collective stewardship. But the governance systems that made open sharing possible have not kept pace with the speed and scale of AI development. There are few mechanisms for attribution, few protections for community knowledge, and little alignment between the intentions of creators and the ways their work is used in training data.
This misalignment poses a profound risk. If the commons (and the internet at large) becomes merely a reservoir of raw material for private AI systems, creators will stop sharing, institutions will close access, and communities will withdraw. When sharing declines, the commons shrinks and access to knowledge is put at risk.
This is not the future we want. To meet this moment, we must rebuild the rules of sharing for the AI era.
Creative Commons as Critical Infrastructure for a Fair AI Ecosystem
The commons did not emerge spontaneously. It was built through careful design: legal frameworks that enable reuse, technical standards that enable discovery, and community norms that sustain trust. The AI era now requires the next generation of that infrastructure.
CC is working to sustain continued access to human knowledge, and CC licenses remain a critical piece of infrastructure in this effort. But to preserve the incentives to share in the age of AI, we need to supplement those tools with additional, mutually beneficial ways for communities to share human knowledge. That’s where CC signals come in.
The CC Signals Framework
Without a healthy commons, knowledge becomes more closed, power becomes more concentrated, and both society and technology suffer. The result is a world where knowledge is controlled by a few instead of shared by everyone.
The alternative is a future where the global knowledge commons continues to grow and remains accessible so people, institutions, and AI systems can learn from shared knowledge in ways that benefit everyone, not just a few companies.
To get there, a few things are required:
- Clear rules for the AI era: New norms and governance so shared knowledge isn’t simply extracted without credit, consent, or responsibility.
- Tools that give creators agency: Ways for people and institutions to communicate how their work can be used in AI systems.
- Transparency and attribution: Systems that recognize where knowledge comes from and who contributed it.
- Strong public infrastructure for sharing: Legal, technical, and community systems that keep knowledge accessible while protecting trust.
CC signals offer a flexible, values-driven framework for communicating expectations around AI use of content or data. Just as our licenses offer a spectrum of choices between all rights reserved and no rights reserved, our hope is that CC signals will offer creators a spectrum of preferences for how their work is used in AI. True openness means participation with fairness and respect, not unchecked extraction.
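To make the idea concrete, a preference signal could travel alongside content in machine-readable form. The sketch below is purely illustrative and hypothetical; it is not the actual CC signals syntax, and the field names (`signal`, `preferences`, `attribution`) are invented here to show the general shape of a declaration a steward might publish next to a dataset or collection.

```json
{
  "signal": "cc-signal-example",
  "steward": "Example Open Archive",
  "preferences": {
    "ai_training": "allowed-with-conditions",
    "attribution": "required",
    "contribution": "support-the-commons"
  },
  "applies_to": "https://example.org/collection/"
}
```

The point of such a declaration is not enforcement by itself, but a shared, legible vocabulary: AI developers can read and honor it, and communities can audit whether their stated expectations were respected.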
Engage With Us
This is a defining moment for the future of knowledge. The decisions made now will shape whether AI reinforces open access or accelerates its decline.
Creative Commons invites creators, institutions, policymakers, and technologists to take part in building a fair and sustainable AI ecosystem.
Join us to stay up to date on events and conversations around CC signals! Fill out our intake form to join the CC community, access our Zulip chat platform, and receive our community newsletter.