News Technologies 06-12-2025 at 21:09

Wikipedia suspends AI article summarization project — community of editors protests

Andrii Rusanov

News writer

Wikipedia has suspended its experiment with AI-generated summaries of article content after a largely negative reaction from its community of editors.

Despite the so-called edit wars that break out over high-profile topics, Wikipedia has become a universally recognized source of information for millions of people. Its importance has only grown during the last few years of the generative AI boom, as it is one of the few online platforms that has not been significantly affected by the flood of AI-generated garbage and disinformation.

«This would cause immediate and irreversible damage to our readers and to our reputation as a fully reliable and serious source. In a sense, Wikipedia has become synonymous with sober boredom, which is great. Let’s not insult the intelligence of our readers by joining the mass publication of flashy, artificially created summaries. That’s exactly what they are, although here we use the word «machine-generated» instead», writes one of the editors.

The detailed description of the «Simple article summaries» project states that it was proposed after a discussion at the Wikimania 2024 conference, where «people discussed ways to use artificial intelligence/machine remixing of already created content to make Wikipedia more accessible and easier to learn from». Some speakers believed these summaries could improve the experience of getting information from Wikipedia; in particular, AI could simplify complex jargon or terminology in certain topics.

Wikimedia announced that it would launch the experiment with generated summaries on June 2, and immediately received dozens of comments from editors calling it «a very bad idea», declaring themselves «most strongly opposed», answering «Absolutely not», and so on.

«By using simple, short article descriptions, you propose to give a single editor of known «reliability» a platform at the very beginning of every article, while giving no editorial control to others. This reinforces the idea that Wikipedia cannot be relied upon, undoing decades of policy work. It reinforces the belief that someone can add biased content without sourcing because it’s their platform. I don’t think I would feel comfortable contributing to such an encyclopedia. No other community has mastered collaboration to such an amazing degree, and it would be ruined», says another editor.

A day later, Wikimedia announced the suspension of the experiment. However, the foundation is still interested in AI-generated summaries.

«The Wikimedia Foundation is exploring ways to make Wikipedia and other projects more accessible to readers around the world. This two-week, participant-consented experiment focused on making complex Wikipedia articles more accessible to people with different reading levels. For the purposes of this experiment, the summaries were generated using Cohere’s open-weight Aya model. This was to gauge interest in such a feature and to help us think about the right community moderation systems to ensure that people are central to deciding what information appears on Wikipedia», a spokesperson for the foundation said in an email to 404 Media.

So, for now, the AI project on Wikipedia appears to be in a «suspended» state. Indeed, some articles, particularly in the English version, are extremely long and sometimes not very clear. But can anyone guarantee that the AI will not «hallucinate» something in a short summary?
