Wikipedia is one of the world’s most influential knowledge platforms. Ranking among the top ten most-visited websites globally – just after Google and YouTube – it attracts over six billion monthly visits, and offers content in nearly 300 languages.
Often perceived as a democratic space where anyone can edit and contribute, Wikipedia remains a battleground for ideological debates. While right-wing critics accuse it of pushing a “woke” agenda, a persistent gender gap has shaped its content and participation for more than a decade.
While its mission is to be the “sum of all human knowledge”, my research shows that only 19% of its biographies feature women, and just 10-15% of its editors are female. This disparity distorts the historical record and reinforces the invisibility of women’s contributions across fields such as science, politics, literature, and activism.
Now more than ever, it is crucial for Wikipedia to be unbiased. Beyond providing information to billions of people, it is also one of the most widely used sources for training AI systems like ChatGPT. Any biases in its content risk being amplified and perpetuated, further entrenching systemic inequalities as these technologies develop.
As we mark International Women’s Day, it’s high time we examined the barriers that keep Wikipedia from achieving true equity, and the efforts being made to close this digital divide.
A systemic issue
Recent research, including a scoping review conducted by the Women&Wikipedia project at the University of Barcelona, highlights three interconnected explanations for Wikipedia’s gender gap:
- The “Women’s Problem” Hypothesis: This theory suggests that women are less likely to contribute because they lack time, often due to family caregiving responsibilities, or because they lack confidence or interest in digital collaboration. Taken alone, however, this perspective places the blame solely on women and ignores structural barriers.
- The “Mirror Effect” Hypothesis: Wikipedia reflects the inequalities present in society at large. The underrepresentation of women in mainstream media and academia means fewer notable women are written about and cited. However, Wikipedia’s own community decision-making processes also amplify gender bias.
- The “Systemic Problem” Hypothesis: Wikipedia’s culture, policies, and power dynamics favour established editors (predominantly men) and create an unwelcoming environment for newcomers, particularly women.
These factors contribute to the persistence of gender bias on the platform. The study found that women’s biographies are more frequently nominated for deletion, often with claims that they lack “notability”: a requirement that is more difficult for women to meet due to their historical exclusion from traditional sources of recognition.
Moreover, women face greater hurdles to appearing in the media, which is the primary source of information for Wikipedia articles. New Wikipedia entries cannot be created from scratch – they have to be based on what is already reported in external sources.
Wikipedia’s main page: who gets featured?
The Cover Women project – another ongoing study by the same University of Barcelona research group, funded by the Wikimedia Foundation’s Research Grant – analyses the representation of women and other marginalised groups on the main page of Wikipedia across seven language editions.
The research analysed 22,924 biographies featured on the English and Spanish Wikipedia front pages over a ten-year period. It found alarming disparities, not only in gender but also in ethnicity, race, religion, native language, and profession. The project’s key findings in each of these areas were:
- Gender: Women accounted for only 29% of featured biographies on the English Wikipedia, and an even lower 18% on the Spanish Wikipedia. Non-binary individuals were virtually absent.
- Ethnicity and race: Most individuals featured on Wikipedia’s front page were white. Racial labels were inconsistently applied – white individuals were rarely categorised by race, while black individuals were explicitly identified as such. This reveals an underlying bias in how race is perceived and labelled by Wikipedia editors.
- Religion: Christian figures dominated the front page, with significantly fewer representations of Muslim, Hindu, or Buddhist individuals.
- Native language: English-speaking individuals were overwhelmingly featured, further emphasising a Western-centric bias in content selection.
- Profession: Politicians, scientists, and writers were the most commonly featured professions, while fields traditionally associated with women, such as nursing or caregiving, were nearly absent. Interestingly, while one of the most common professions for women with Wikipedia articles is actress – including many from the adult film industry – this profession does not appear on the main page.
Complex guidelines
Wikipedia’s main page is curated by a small team of volunteer editors who follow community-driven guidelines to ensure quality and relevance. However, these guidelines can be difficult for new editors to navigate, as they are filled with acronyms and specialised terminology.
The selection process is largely shaped by the experience level of contributors, and only those with specific roles and sufficient expertise can actively participate in decision-making. In some cases, structured voting systems allow community involvement, but these too are often restricted to seasoned editors who meet certain criteria.
While certain sections actively attempt to counteract bias by promoting diversity and underrepresented topics, the final content ultimately reflects the interests and priorities of the most active editors, whose contributions shape the visibility of information on the platform. The biases and discrimination found by the Cover Women project show that current efforts to make content selection neutral are falling short overall.
A more equitable Wikipedia
Wikipedia’s gender gap is not just a reflection of existing inequalities – it is a site where these inequalities are either reinforced or challenged. If Wikipedia aspires to be the sum of all human knowledge, then it must address the systemic barriers that exclude women’s voices.
Achieving gender balance will require:
- More female editors and greater diversity among contributors in general.
- Better documentation of women’s achievements in mainstream media.
- Structural reforms in Wikipedia’s editing culture and policies. This would mean making its editorial guidelines more accessible, welcoming and encouraging new editors, and enabling greater community participation in choosing which articles to spotlight on the main page.
Despite its faults, Wikipedia is a huge achievement. It is the world’s largest and most widely accessed knowledge platform, it is free to use, and it belongs to nobody and to everybody. It is therefore within everybody’s power to make it a place that reflects the diversity of human experience, where all voices are heard and valued.

Núria Ferran-Ferrer is the lead researcher of HerStory-NeSyAI (PID2023-147673OB-I00), the Cover Women project (G-RS-2402-15223), and the Women&Wikipedia research team. Her research has received funding from the Wikimedia Foundation Research Grant. She also serves as the Director of the University of Barcelona's PhD Program in Information and Communication, and as the Rector's Delegate for the Direction of the Equality Unit.
This article was originally published in The Conversation. Read the original.