Driverless cars worse at detecting children and darker-skinned pedestrians, say scientists::Researchers call for tighter regulations following major age- and race-based discrepancies in autonomous AI systems.

  • dangblingus@lemmy.dbzer0.com · 1 year ago

    I’m sick of the implication that computer programmers are intentionally or unintentionally adding racial bias to AI systems. As if a massive percentage of software developers in NA aren’t people of color. When can we have the discussion about how photosensitive technology and contrast ratios work?
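
    To make the contrast point concrete, here’s a rough sketch of Michelson contrast between a dark facial feature and the surrounding skin. The reflectance numbers are made up purely for illustration, not measurements:

    ```python
    # Illustrative sketch only: reflectance values below are assumptions,
    # chosen to show the mechanism, not measured data.

    def michelson_contrast(l_max: float, l_min: float) -> float:
        """Michelson contrast between the brightest and darkest luminance."""
        return (l_max - l_min) / (l_max + l_min)

    # Assume a dark facial feature (e.g., a shadowed eye socket) reflects
    # ~5% of incident light, while surrounding skin reflects ~55% (lighter
    # skin) or ~15% (darker skin) -- hypothetical values for illustration.
    FEATURE = 0.05
    for label, skin in [("lighter skin", 0.55), ("darker skin", 0.15)]:
        print(f"{label}: contrast = {michelson_contrast(skin, FEATURE):.2f}")
    # lighter skin: contrast = 0.83
    # darker skin: contrast = 0.50
    ```

    Less in-face contrast means less signal for a vision system to lock onto, before any question of training data even comes up.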

    • pageflight@lemmy.world · 1 year ago

      There’s still a huge racial disparity in tech workforces. For one example, according to Google’s diversity report (page 66), their tech workforce is 4% Black versus 43% White and 50% Asian, and over the past 9 years (since 2014) Black representation in Google’s tech workforce has only risen from 1.5% to 4%.

      There’s also plenty of news and research illuminating bias in trained models, from commercial facial recognition systems trained on datasets of >80% White faces, to Timnit Gebru being fired from Google’s AI Ethics group for insisting on acknowledging bias, among many other examples.
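
      The dataset-skew effect is easy to demonstrate. Here’s a small, entirely synthetic simulation (my own toy sketch, not any real benchmark or commercial system) where a model trained on an 80/20 group split ends up markedly less accurate on the under-represented group:

      ```python
      # Toy simulation of representation bias: all data is synthetic and the
      # setup is invented; it only shows the mechanism, not any real system.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)

      def make_group(n: int, axis: int):
          """Labels depend on one feature axis; the relevant axis differs
          by group (a stand-in for groups whose images need different
          features to classify well)."""
          X = rng.normal(size=(n, 2))
          y = (X[:, axis] > 0).astype(int)
          return X, y

      Xa, ya = make_group(8000, axis=0)  # majority group: 80% of training data
      Xb, yb = make_group(2000, axis=1)  # minority group: 20% of training data

      model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

      for name, X, y in [("majority group", Xa, ya), ("minority group", Xb, yb)]:
          acc = (model.predict(X) == y).mean()
          print(f"{name}: accuracy = {acc:.2f}")
      # Typical result: the majority group scores well above the minority
      # group, because the model fits the group it saw most of.
      ```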

      I also think saying “it’s hard” overlooks serious aspects of racial bias. Certainly, a photographic representation of a Black face is going to provide less contrast within the face than for lighter skin. But that’s where ingrained bias comes in. People (including software engineers) solve tough problems constantly; we have to choose which details to focus on, we rely on our experiences, and our experience is centered around ourselves. Of course racially biased outcomes and stereotypes arise naturally from that, but we can identify the likely harmful outcomes and work to counter them.
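
      And “work to counter them” can be boringly concrete. For instance, a per-group accuracy gate in a test suite that fails while any group trails the others (the group names and threshold below are invented for illustration):

      ```python
      # Hypothetical audit gate: group names and threshold are made up.
      # The idea is to make per-group gaps fail loudly, e.g. in CI.

      def audit_gap(per_group_accuracy: dict[str, float],
                    max_gap: float = 0.05) -> None:
          """Raise if any group's accuracy trails the best group by more
          than max_gap."""
          best = max(per_group_accuracy.values())
          for group, acc in per_group_accuracy.items():
              if best - acc > max_gap:
                  raise AssertionError(
                      f"{group}: accuracy {acc:.2f} trails best ({best:.2f}) "
                      f"by more than {max_gap:.2f} -- investigate before shipping."
                  )

      # Made-up numbers shaped like the detection gaps the article describes:
      audit_gap({"lighter-skinned adults": 0.95,
                 "darker-skinned adults": 0.88,
                 "children": 0.85})  # raises AssertionError
      ```

      None of that requires heroics; it just requires deciding that the per-group numbers are a shipping criterion.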