
Can AI Companies and Wikipedia Collaborate? Democratization as Common Ground [18 Nov 2025]


Author: MikeTurkey, in conversation with ChatGPT
Date: 18 Nov 2025

Introduction

In November 2025, the Wikimedia Foundation issued a statement protesting the unauthorized use of data by AI companies. According to reporting by Asahi Shimbun, access to Wikipedia by AI companies has surged, increasing server load, while human readership has decreased by 8% compared to the previous year. This decline in readership could also lead to decreased donations, threatening Wikipedia's operational foundation.
I personally use both Wikipedia and AI on a daily basis. Both are indispensable as modern knowledge infrastructure. That's precisely why I want to avoid a situation where they clash and both suffer.
This article explores pathways for Wikipedia and AI companies to collaborate.

Actually Allies? Finding Common Ground


At first glance, Wikipedia and AI companies appear to have conflicting interests.
However, upon closer examination, they share important commonalities.

Financial Constraints

As is well known, the Wikimedia Foundation is a nonprofit organization with chronic funding shortages. It is therefore exploring paid dataset access, but given the realities AI companies face, this solution will be difficult to realize. Companies such as Anthropic and OpenAI already require enormous funding amid massive computational costs, research and development expenses, and intense competitive pressure; many cannot afford additional data licensing fees.

However, if we leave this problem unresolved, we'll face a situation where "bad money drives out good." If high-quality information sources like Wikipedia deteriorate, AI companies will lose the high-quality data needed for training, ultimately degrading AI output quality.
This is a classic "tragedy of the commons."
Everyone uses the shared resource for their own benefit, no one bears the maintenance costs, and eventually the resource is depleted.

The Shared Ideal of "Democratization"

Interestingly, both Wikipedia and AI companies share the ideal of "democratization." Wikipedia advocates for "democratization of access to knowledge," while companies like Anthropic, OpenAI, Amazon, and Microsoft advocate for "AI democratization." Both are based on the philosophy that "everyone, not just specific power holders or the wealthy, should benefit."

This shared ideal could be the key to a solution. If we can shift from the current "AI companies vs. Wikipedia" conflict framework to one where "the common enemy is information monopolization and enclosure, and both parties are on the same side," a path to collaboration opens up.

Can AI Become a Member of Wikipedia? The Possibility of Editorial Support


A more proactive form of collaboration would be for AI companies to support Wikipedia editing. However, this requires careful consideration.

The Wikipedia editor community consists of people who have spent years writing articles and engaging in discussion and consensus-building. For them, AI writing articles might feel like a denial of the value of that work. Moreover, because they can detect AI's "plausible but inaccurate" writing, skepticism toward AI runs deep.

Therefore, I propose an approach where AI only "offers opinions," while all editing and decisions are made by humans.

Specifically, AI would analyze Wikipedia articles and offer "opinions" such as:

  • "This statement appears to contradict source A"

  • "This paragraph may have neutrality issues"

  • "The following perspectives may be missing from this topic"

  • "Compared to similar articles, this structure has room for improvement"

These are merely information to assist editors' judgment, not mandates. Final editing, decisions, and consensus-building are all done by humans. AI merely offers opinions as "one participant," and editors' sovereignty is completely respected.

This approach aligns with Wikipedia's culture. Wikipedia has always welcomed "multiple perspectives" and emphasized discussion on Talk pages. If AI is just "one participant offering opinions," it can naturally blend into this culture.

What's important is making it opt-in.
AI opinions and suggestions would be off by default, enabled only by editors who want to use them.
If functions can be selected gradually (translation only, review only, etc.), editors' autonomy can be fully respected.
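As a rough illustration of this opt-in, advisory-only model, the sketch below shows what an AI "opinion" and a per-editor preference record might look like. All names here (`AIOpinion`, `EditorPrefs`, the feature flags) are hypothetical, invented for this article; they are not part of any existing MediaWiki or vendor API.

```python
from dataclasses import dataclass

@dataclass
class AIOpinion:
    """An advisory note an AI might post for human editors.
    It carries no authority: editors decide what, if anything, to do."""
    article: str
    kind: str       # e.g. "source-check", "neutrality", "structure"
    message: str

@dataclass
class EditorPrefs:
    """Per-editor opt-in flags. Every AI feature defaults to off."""
    source_check: bool = False
    neutrality: bool = False
    structure: bool = False

    def wants(self, opinion: AIOpinion) -> bool:
        # Show an opinion only if the editor opted in to that kind.
        return {"source-check": self.source_check,
                "neutrality": self.neutrality,
                "structure": self.structure}.get(opinion.kind, False)

# An editor who opts in to neutrality review only.
prefs = EditorPrefs(neutrality=True)
note = AIOpinion("Example article", "neutrality",
                 "This paragraph may have neutrality issues")
print(prefs.wants(note))  # True: the editor asked for this kind of note
```

The point of the sketch is the default: unless an editor explicitly switches a feature on, no AI suggestion ever reaches them, which keeps editorial sovereignty intact.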

Leave Minor Languages to AI: The Possibility of AI Article Generation

Separate from editorial support, there's another possibility: AI generating articles in language versions with very few editors.

There are many articles that exist in English Wikipedia but not in other language versions.
Especially in minor language versions with fewer than 10 editors, human resources are severely limited. In such situations, the choice is between "AI writes" or "nothing exists." Even if imperfect, there is value in having some information.

Several conditions would make AI article generation acceptable:

  • Clear labeling: Clearly indicate "This article was generated by AI"

  • Human priority: Editors have the authority to modify or delete

  • Gradual introduction: Start with language versions with the fewest editors

  • Limit article types: Geography, biology, astronomy, etc.; avoid controversial topics like politics, history, religion

  • Community dialogue: Obtain consensus from each language version

The combination of "language versions with few editors" × "articles that don't exist" is the sweet spot with the least resistance. Starting here and gradually expanding if successful—this is the realistic path.
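The "sweet spot" criteria above can be phrased as a simple decision rule. This is only a sketch: the threshold of 10 editors and the topic areas come from the conditions listed in this section, while the function and variable names are invented for illustration.

```python
# Topic areas this section suggests are low-risk for AI generation;
# controversial areas (politics, history, religion) are excluded.
SAFE_TOPICS = {"geography", "biology", "astronomy"}

def eligible_for_ai_generation(active_editors: int,
                               article_exists: bool,
                               topic: str) -> bool:
    """Apply the criteria from this section: a small editor community,
    a missing article, and an uncontroversial topic."""
    return (active_editors < 10
            and not article_exists
            and topic in SAFE_TOPICS)

print(eligible_for_ai_generation(4, False, "astronomy"))  # True
print(eligible_for_ai_generation(4, False, "politics"))   # False
```

In practice each language community would set its own thresholds and topic lists through consensus; the rule only captures the starting point proposed here.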

Conclusion


The conflict between Wikipedia and AI companies is essentially a "tragedy of the commons." However, if we recognize that both share the ideal of "democratization," a path of collaboration rather than conflict becomes visible.

I propose the following concrete solutions:
  • Link display: When AI uses Wikipedia information, always display links to return traffic

  • Infrastructure support: Companies like Amazon provide CDN services to reduce server load

  • Editorial support AI: AI only "offers opinions," while all editing and decisions are made by humans (opt-in approach)

  • AI article generation: Limited to language versions with very few editors and articles that don't exist

All of these are achievable without financial burden and respect the autonomy of the editor community.

I believe both Wikipedia and AI are indispensable to modern society. Neither benefits from conflict and mutual destruction.

As someone who also runs a small website, Wikipedia's challenges are not someone else's problem. The difficulty of continuing to create quality content and the question of how to deal with AI are themes that all content creators, regardless of scale, should consider. I hope this article becomes a catalyst for constructive dialogue.

License

2023-2025 Copyright Mike Turkey All rights reserved.
Scope: This license applies to all non-code text content on miketurkey.com
- Unauthorized copying of this document is prohibited.
- Direct linking to this URL is permitted.