If you’ve noticed fewer people linking to Wikipedia in the wild, you’re not imagining it. The Wikimedia Foundation — the nonprofit that helps run Wikipedia — says its latest traffic numbers show a meaningful drop in human visits, and it points the finger at two connected forces: smarter bot crawlers and the rise of AI-powered answers that hand people Wikipedia’s content without sending them to the site.
Around mid-2025, the Foundation reworked the way it separates bots from people. After that cleanup, it found that human pageviews for the March–August window were about 8% lower than in the same months of 2024. That number, the Foundation’s senior director of product Marshall Miller writes, came into focus after investigators noticed an odd spike of apparently “human” traffic originating mainly from Brazil — traffic that, on inspection, turned out to be increasingly hard-to-detect bots.
What’s happening
There are two related dynamics at work:
- Generative AI as a middleman. Search engines and chatbots increasingly show quick, AI-written answers at the top of results pages or in chat windows. Those summaries often compile facts from multiple sources — and Wikipedia is a huge one. But when an AI displays the answer directly, fewer people click through to the original article. That “no-click” behaviour reduces visits even if Wikipedia’s information is still being used under the hood. Tech reporters say the Foundation sees this pattern across search engines and chatbots. Google, for its part, has disputed the idea that its AI features necessarily reduce click traffic.
- Bots that pretend to be people. Wikimedia’s updated detection logic found a wave of crawlers sophisticated enough to look human. Those bots inflate raw traffic numbers and obscure the real trend; once Wikimedia updated its filters, the decline in human visits emerged clearly. Miller and colleagues worry this combination — scraping plus no-click answers — means Wikipedia’s enormous public good is getting used more than it’s being visited.
Why the drop matters beyond the headline numbers
Wikipedia doesn’t sell subscriptions or premium access; it runs on volunteers and donations. People visit, read, and maybe get inspired to edit, or to donate a small sum that keeps the servers humming. Wikimedia’s argument is that sustained declines in visits could undermine that ecosystem: fewer visitors mean fewer recruitable volunteers, less community energy, and smaller donation totals — and the outcome could be weaker coverage, slower corrections, and, eventually, less reliable articles. Miller framed it as an existential risk: a world that consumes Wikipedia’s outputs without visiting or crediting the site risks starving the thing that made the content trustworthy in the first place.
The community pushback and what Wikimedia tried
This isn’t news to the volunteer editors who spend hours defending the site from low-quality, AI-generated entries. Over the past year, Wikipedia’s community has launched cleanup projects to spot and remove “AI slop” (poorly sourced, AI-crafted articles), tightened deletion rules, and flagged suspicious new pages. The Foundation has experimented with tools to help editors, and even floated putting AI-generated summaries on Wikipedia pages to capture readers’ attention — a plan it dropped after volunteers pushed back hard. The tension is obvious: the Foundation wants tech that helps people find and use Wikipedia, while editors want safeguards so the site doesn’t become a machine-churned catalog of errors.
Who benefits from Wikipedia being used but not visited?
Ironically, the very systems that rely on Wikipedia to spit out quick answers — large language models and search-engine AI snippets — may be extracting value without feeding the platform that supplies so much of their factual base. That’s a structural problem: an ecosystem where training data and source material are consumed at scale but the original creators aren’t credited, visited, or supported financially. Wikimedia’s plea is straightforward: if platforms want to use Wikipedia’s content to answer questions, they should be explicit about where the information came from and make it easy for users to visit the source. That transparency would give readers the option to verify and participate — and might help Wikipedia survive the economics of the AI age.
What might change (or what people are asking for)
- More visible sourcing and click-through opportunities in AI answers (links, “read the original” buttons).
- Industry agreements around how training data is credited and how source visits are encouraged.
- New Wikimedia tools that nudge users from an AI answer to the live article (something the community debated when the Foundation looked at internal summaries).
- Policy and research work to measure precisely how AI summaries affect referrals across many publishers — not just Wikipedia. Some studies and newsroom reports already hint that this is a systemic shift affecting news and niche sites, too.
Wikipedia remains one of the internet’s most important repositories of curated knowledge. The Foundation’s data suggests human readership is slipping — at least in the short run — and it traces that slump to both opaque bot activity and the rise of AI that answers questions without sending readers to the source. That’s not merely a traffic problem. It’s a threat to the model that keeps Wikipedia accurate and verifiable: a massive volunteer project funded by the people who value it. If platforms and model builders want to use Wikipedia’s work, Wikimedia argues, they should do so in ways that keep the site visible, credited and supported.