14
irene
63d

Has anyone heard about Amazon's AI that consistently learned to discriminate?

http://fortune.com/2018/10/...

Comments
  • 5
    It's almost like a child learning from its parent, eh?
  • 4
Lmao. It's actually the exact opposite of what they say in that article. Filthy liars. Just take a look at [this](https://youtu.be/MECcIJW67-M)
  • 4
    Fortune is pretty much fake news
  • 3
A system is only as good as the data you feed it.
Reading the article, it also states that the bias came from how the resumes were written. And it's true: studies have shown that men are more assertive/confident about their abilities (even when it's not warranted) than women.
Honestly, I wouldn't ditch the AI at all. I would build an intermediate platform instead, where people just fill in their capabilities and let the AI do its work. No personal data other than contact details (so you take out the gender bias), no free wording (so you reduce the personal-assertion bias). And of course it wouldn't be self-taught in the beginning, to allow better calibration of the weights.
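The intermediate platform proposed in this comment could be sketched roughly like this. The field names and the whitelist are hypothetical, just to illustrate stripping personal data and free text before anything reaches the model:

```python
# Hypothetical structured intake form: applicants fill in fixed fields,
# so no name, gender, or free-text cover letter ever reaches the ranking model.
ALLOWED_FIELDS = {"skills", "years_experience", "certifications"}

def sanitize(application: dict) -> dict:
    """Keep only whitelisted structured fields; drop everything else
    (contact details would be stored separately, outside the model)."""
    return {k: v for k, v in application.items() if k in ALLOWED_FIELDS}

raw = {
    "name": "Jane Doe",
    "gender": "female",
    "cover_letter": "I am extremely confident that ...",
    "skills": ["python", "sql"],
    "years_experience": 4,
}
clean = sanitize(raw)  # only skills and years_experience survive
```

A whitelist (rather than a blacklist of known-sensitive fields) is the safer design choice here, since any field not explicitly allowed is dropped by default.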
  • 4
@Qaldim But it’s still an invalid argument to call it ”discriminating against women”
  • 2
@HampusMa Yes, because it has no true sentience yet.
On the same note, there was an issue not long ago where an AI couldn't recognise people from East Asia because of their facial features.
Another example of feeding the system very specific data.
  • 2
From what I have learned about ML, this makes total sense. If most of the applications and hires are male, it will start to make the correlation that males are preferred. The only way to correct it would be to start by making the training set even, which means you would have to find an equal number of submissions, both good and bad, from both men and women. That could be a tall order, considering how much data ML actually needs for training and the lack of women in the profession.
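The rebalancing step this comment describes can be sketched as simple downsampling of the over-represented group. The toy data below is entirely made up (field names and proportions are assumptions for illustration):

```python
import random

# Toy dataset: each record is (features, gender, hired_label).
# The 900/100 male/female split mirrors the imbalance described above.
rng_data = random.Random(42)
applications = (
    [({"years_exp": rng_data.randint(0, 15)}, "male", rng_data.random() < 0.5)
     for _ in range(900)]
    + [({"years_exp": rng_data.randint(0, 15)}, "female", rng_data.random() < 0.5)
       for _ in range(100)]
)

def balance_by_group(records, group_index=1, seed=0):
    """Downsample every group to the size of the smallest one, so the
    model can't learn 'group X is preferred' from sheer volume alone."""
    rng = random.Random(seed)
    groups = {}
    for rec in records:
        groups.setdefault(rec[group_index], []).append(rec)
    n = min(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(rng.sample(members, n))
    rng.shuffle(balanced)
    return balanced

balanced = balance_by_group(applications)  # 100 male + 100 female records
```

Note the trade-off the comment hints at: downsampling throws away most of the majority-group data, which is exactly why it "could be a tall order" given how data-hungry ML training is.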