    Opinion: It’s time to fix AI — affirmative action for algorithms

    Although there is more awareness of the biases at the core of AI, the political will to act remains weak. Here’s how to help promote an AI harnessed to deliver equitable outcomes.

    By Caitlin Kraft-Buchman // 13 September 2024
    Artificial intelligence may shape future rules based on outdated norms. Photo by: Alamy

    Artificial intelligence has an inclusion problem. Machines learn from data, and today’s data is implicitly biased against women and other marginalized groups. It defines yesterday, not tomorrow. That means the “rules” we will live by for years to come are being written to reinforce how things were, not how they could be in the future.

    Back in 2019, the A+ Alliance called for affirmative action for algorithms. Our goal was to ensure AI and emerging technologies are aligned with 21st century values and fundamental human rights. The A+ Declaration highlighted the urgent need for course correction: to create new tools that acknowledge new social norms. We called this “feminist AI.”

    We talked about systemic gender, racial, and intersectional biases at the core of AI and algorithmic decision-making. These processes, born in the tech hubs of North America and Europe, are being replicated throughout the global south. We must combat and correct this before old ideas are wired into new digital ecosystems. So, we proposed, pushed for, and promoted an AI harnessed to deliver equitable outcomes. Although there is awareness of the issue now where there previously was none, the political will to act remains weak.

    Five years on, we continue to talk. The good news is that conversations around inclusive algorithms are finally being picked up by a wider audience on a larger global stage. The bad news is the situation is more urgent than ever as AI deployment barrels forward at unimagined speed and scale. If we don’t act soon, those who need it most will be left even further behind.

    We have three calls to action for government, business, and civil society organizations.

    1. Establish accountability and transparency

    Generation Equality

    Generation Equality is the world’s leading effort to unlock political will and accelerate investment and implementation on gender equality. Launched at the Generation Equality Forum in 2021, the action coalitions are innovative, multistakeholder partnerships mobilizing governments, civil society, international organizations, and the private sector around the most critical areas in gender equality to achieve concrete change for women and girls worldwide. Ctrl Shift Equality is a partnership with two of the coalitions: technology and innovation for gender equality, and gender-based violence.

    Global guidelines, which do not yet exist, are urgently needed to ensure accountability and transparency in AI and algorithmic decision-making. That means strong legal frameworks to promote accountability that include rigorous testing across the lifecycle of AI systems with integrated human rights and algorithmic impact assessments.

    We need public institutions to step up and show us what prosocial tech can look like. We need to create AI with 21st century social protections, incentives, subsidies, and scholarships, and focus on where women and girls have traditionally been left behind.

    We also need gender-responsive procurement guidelines with hard targets. An entire industry of innovation could be jump-started by opening up capital to invent at scale for women, girls, and historically marginalized groups.

    For example, the Feminist AI Research Network, which focuses on incubating AI, helped develop Manila’s SafeHer, which uses AI to make public transport safer for women; Buenos Aires’ E.D.I.A., which democratizes how datasets can be debiased; and Cairo’s AI-based tutoring system for community schools in Upper Egypt. These provide vital proof of concept for needs-based, community-based solutions to great problems.

    We need improved datasets with controls to oversee collection processes, human-in-the-loop verification, and a focus on quality over quantity. Open, gender-disaggregated data will let us better understand the sources of AI bias and improve the performance of machine learning systems.
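
    To illustrate why disaggregation matters, here is a minimal sketch, not drawn from the article and using hypothetical toy data, of how evaluating a model’s accuracy separately for each gender group can surface a performance gap that a single aggregate score hides:

        # Minimal sketch with hypothetical toy data: disaggregate a model's
        # accuracy by gender to expose gaps that the aggregate score hides.
        from collections import defaultdict

        # Each record: (gender, true_label, predicted_label) -- illustrative only.
        records = [
            ("female", 1, 0), ("female", 1, 1), ("female", 0, 0), ("female", 1, 0),
            ("male", 1, 1), ("male", 0, 0), ("male", 1, 1), ("male", 0, 0),
        ]

        correct, total = defaultdict(int), defaultdict(int)
        for gender, truth, prediction in records:
            total[gender] += 1
            correct[gender] += int(truth == prediction)

        overall = sum(correct.values()) / sum(total.values())
        print(f"overall accuracy: {overall:.2f}")
        for gender in sorted(total):
            rate = correct[gender] / total[gender]
            print(f"{gender}: accuracy = {rate:.2f} ({total[gender]} samples)")
        # A wide gap between the per-group figures signals a bias that the
        # overall number alone would mask.

    The same disaggregated view applies to any metric, such as false-negative rates or error costs, and it is only possible when the evaluation data records the relevant group attributes in the first place, which is precisely what open, gender-disaggregated data provides.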

    2. Bring gender balance to AI design and coding

    Gender balance must be placed on the agenda of everyone involved in the funding, design, adoption, and evaluation of AI decision-making. To start, a research fund would help explore the impacts of gender bias through a multidisciplinary approach that goes beyond computer science and economics.

    On design teams, diversity drives creativity as well as the detection and mitigation of bias. Companies should be required to report on gender balance in research and design teams, including upstream when applying for grants. They could do this in their corporate social responsibility reports or, in Europe, under the CSR directive. Governments and science foundations could and should allocate more points to grant applications from teams with genuine gender balance. Or do as the Swiss National Science Foundation has done — introduce gender quotas in its evaluation bodies and the National Research Council.

    The challenge we face can only be met with innovative and inclusive thinking. The imagination and skill required can be provided by the largest untapped intellectual resource on the planet: women and girls.

    3. Ensure international cooperation grounded in human rights

    Mass-scale correction of skewed data will require multilateral and international cooperation to ensure no one is left behind.

    This can start with a review across United Nations agencies of how existing human rights laws and standards apply to algorithmic decision-making, machine learning, and gender. Ultimately, it should lead to a new set of global metrics for digital inclusivity that are fit for purpose in the fast-changing digital age.

    Imagine a world where AI champions social justice so every algorithm enriches lives and promotes human dignity, rather than an AI that perpetuates biases and enshrines the status quo.

    The time to stand up for inclusive, feminist AI — AI harnessed to deliver equitable outcomes and designed with inclusion at the core, in other words, an AI that creates new opportunities and innovatively corrects inequities — is now.

    The author is one of the leaders of the Action Coalition. Learn more at aplusalliance.org or womenatthetable.net. 

    Visit Ctrl Shift Equality — a series produced by Devex in partnership with UN Women and the Generation Equality Action Coalitions on Technology and Innovation for Gender Equality, and Gender-Based Violence.

    The views in this opinion piece do not necessarily reflect Devex's editorial views.

    About the author

      Caitlin Kraft-Buchman is the CEO and founder of Women at the Table and co-founder of the Alliance for Inclusive Algorithms.
