Courtesy Matters: Returning Traffic Through Link Display
Even if financial support is difficult, there is a minimum courtesy AI companies can offer: clearly indicating when Wikipedia information is used and displaying links to it.
When an AI response to a user's question draws on Wikipedia, the service should display a link to the relevant Wikipedia page. This is technically straightforward and has already been implemented in some AI services, such as Perplexity AI and Microsoft Copilot.
This measure has multiple benefits:
Ensuring transparency: Making it clear which information sources AI is using
Returning traffic: Recovering declining human readership
Addressing the criticism of "stealing knowledge": resolving the ethical concern by citing sources
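To make the proposal concrete, here is a minimal sketch of what attaching source links to an AI answer could look like. The `Answer` and `Source` types and the rendering format are illustrative assumptions, not any real service's API:

```python
# Hypothetical sketch: append Wikipedia source links to an AI-generated answer.
# The Answer/Source data model is illustrative, not a real service's API.
from dataclasses import dataclass, field

@dataclass
class Source:
    title: str
    url: str

@dataclass
class Answer:
    text: str
    sources: list = field(default_factory=list)

def render_with_citations(answer: Answer) -> str:
    """Return the answer text followed by a 'Sources' list of pages used."""
    if not answer.sources:
        return answer.text
    lines = [answer.text, "", "Sources:"]
    for s in answer.sources:
        lines.append(f"- {s.title}: {s.url}")
    return "\n".join(lines)

answer = Answer(
    text="The Eiffel Tower was completed in 1889.",
    sources=[Source("Eiffel Tower",
                    "https://en.wikipedia.org/wiki/Eiffel_Tower")],
)
rendered = render_with_citations(answer)
```

The point of the sketch is how little machinery is required: the citation data already exists at retrieval time, so surfacing it is a rendering decision, not a research problem.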
Wikipedia's problem is that automated access by AI companies increases server load while human readership decreases.
Link display directly addresses the readership half of this problem. We should distinguish mass scraping during training from natural access by users following links: the latter means the "human readers" Wikipedia has always counted on are returning.
Furthermore, if cloud leaders like Amazon provided infrastructure support such as a CDN (Content Delivery Network), the server load problem could be greatly reduced. Amazon already has support programs for nonprofit organizations, and the Wikimedia Foundation could well be eligible.
Can AI Become a Member of Wikipedia? The Possibility of Editorial Support
A more proactive form of collaboration would be for AI companies to support Wikipedia editing. However, this requires careful consideration.
The Wikipedia editor community consists of people who have spent years writing, discussing, and building consensus. For them, AI beginning to write articles might feel like a denial of their value. Moreover, because they can detect AI's "plausible but inaccurate" writing, skepticism toward AI runs deep.
Therefore, I propose an approach where AI only "offers opinions," while all editing and decisions are made by humans.
Specifically, AI would analyze Wikipedia articles and offer "opinions" such as:
"This statement appears to contradict source A"
"This paragraph may have neutrality issues"
"The following perspectives may be missing from this topic"
"Compared to similar articles, this structure has room for improvement"
These are merely inputs to assist editors' judgment, not mandates. Final editing, decisions, and consensus-building are all done by humans. AI simply offers opinions as "one participant," and editors' sovereignty is completely respected.
This approach aligns with Wikipedia's culture. Wikipedia has always welcomed "multiple perspectives" and emphasized discussion on Talk pages. If AI is just "one participant offering opinions," it can naturally blend into this culture.
What's important is making it opt-in.
AI opinions and suggestions would be off by default, enabled only by editors who want to use them.
If functions can be enabled individually (translation only, review only, and so on), editors' autonomy is fully respected.
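The advisory-only, opt-in design described above could be sketched as follows. The preference keys, the `Suggestion` type, and the example notes are hypothetical; the essential properties are that suggestions are off by default, enabled per feature, and never modify an article:

```python
# Hypothetical sketch of opt-in, advisory-only AI suggestions for editors.
# Preference keys and the Suggestion type are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Suggestion:
    article: str
    kind: str   # e.g. "neutrality", "source_conflict"
    note: str   # advisory text only; the AI never edits the article itself

def suggestions_for(article: str, prefs: dict) -> list:
    """Return advisory notes, but only for features the editor enabled."""
    if not prefs.get("ai_suggestions_enabled", False):  # off by default
        return []
    out = []
    if prefs.get("review", False):
        out.append(Suggestion(article, "neutrality",
                              "This paragraph may have neutrality issues."))
    if prefs.get("source_check", False):
        out.append(Suggestion(article, "source_conflict",
                              "This statement appears to contradict source A."))
    return out
```

An editor who never opted in sees nothing at all, and enabling one feature (say, review only) yields only that feature's notes; human sovereignty is enforced by the design, not by policy alone.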
Leave Minor Languages to AI: The Possibility of AI Article Generation
Separate from editorial support, there's another possibility: AI generating articles in language versions with very few editors.
There are many articles that exist in English Wikipedia but not in other language versions.
Especially in minor language versions with fewer than 10 editors, understaffing is a serious problem. In such situations, the choice is between "AI writes it" and "nothing exists." Even imperfect information has value over no information at all.
Several conditions would make AI article generation acceptable:
Clear labeling: Clearly indicate "This article was generated by AI"
Human priority: Editors have the authority to modify or delete
Gradual introduction: Start with language versions with the fewest editors
Limit article types: Geography, biology, astronomy, etc.; avoid controversial topics like politics, history, religion
Community dialogue: Obtain consensus from each language version
The combination of "language versions with few editors" and "articles that don't exist" is the sweet spot with the least resistance. Starting here and expanding gradually if successful is the realistic path.
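The conditions listed above amount to an eligibility gate. A minimal sketch, with thresholds and category lists as illustrative assumptions rather than actual policy:

```python
# Hypothetical sketch of the eligibility conditions for AI-generated articles.
# The threshold and category sets are illustrative assumptions, not policy.
SAFE_CATEGORIES = {"geography", "biology", "astronomy"}
AVOID_CATEGORIES = {"politics", "history", "religion"}
MAX_EDITORS = 10  # "very few editors," per the proposal above

def may_generate(active_editors: int, article_exists: bool,
                 category: str, community_consented: bool) -> bool:
    """AI may draft an article only where human editors are scarce,
    no article exists yet, the topic is uncontroversial, and the
    language community has consented."""
    return (active_editors < MAX_EDITORS
            and not article_exists
            and category in SAFE_CATEGORIES
            and category not in AVOID_CATEGORIES
            and community_consented)
```

Under these assumptions, an astronomy stub for a 3-editor language version passes, while a politics article, or any article in a well-staffed language version, is rejected. Clear labeling ("This article was generated by AI") and human authority to modify or delete would then apply to everything the gate admits.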
Conclusion
The conflict between Wikipedia and AI companies is essentially a "tragedy of the commons." However, if we recognize that both share the ideal of "democratization," a path of collaboration rather than conflict becomes visible.
I propose the following concrete solutions:
Link display: When AI uses Wikipedia information, always display links to return traffic
Infrastructure support: Companies like Amazon provide CDN to reduce server load
Editorial support AI: AI only "offers opinions," while all editing and decisions are made by humans (opt-in approach)
AI article generation: Limited to language versions with very few editors and articles that don't exist
All of these are achievable without financial burden and respect the autonomy of the editor community.
I believe both Wikipedia and AI are indispensable to modern society. Neither benefits from conflict and mutual destruction.
As someone who runs a small website myself, I don't see Wikipedia's challenges as someone else's problem. The difficulty of continuing to create quality content, and the question of how to deal with AI, are themes all content creators should consider, regardless of scale. I hope this article becomes a catalyst for constructive dialogue.