Jilano: It's almost like a child learning from its parent, eh?
Fortune is pretty much fake news
Qaldim: A system is only as good as the data you feed it.

Reading the article, it also states that the bias came from how the resumes were written. And it's true: studies have shown that men are more assertive/confident about their abilities (even when that confidence isn't warranted) than women.

Honestly, I wouldn't ditch the AI at all. I would build an intermediate platform instead, where people just fill in their capabilities and let the AI do its work. No personal data other than contact details (so you remove the gender bias), no free-form text (so you diminish the personal-assertion bias). And of course it wouldn't be self-taught in the beginning, to allow better calibration of the weights.
pain0486: From what I have learned about ML, this makes total sense. If most of the applications and hires are male, the model will start to learn the correlation that males are preferred. The only way to correct it would be to start by making the training set even, which means you would have to find an equal number of submissions, both good and bad, from both men and women. That could be a tall order, considering how much data ML actually needs for training and the lack of women in the profession.
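The balancing idea described above can be sketched as a simple downsampling step before training. This is a toy illustration, not anything from Amazon's actual pipeline; the function name, data shape, and field names are all made up for the example:

```python
import random

def balance_by_group(samples, group_key):
    """Downsample so every group contributes the same number of samples.

    Crude mitigation for the imbalance described above: if 90% of the
    training resumes come from men, a model can learn 'male' as a proxy
    for 'good hire'. Equalizing group counts removes that shortcut
    (though not biases hidden inside the features themselves).
    """
    groups = {}
    for s in samples:
        groups.setdefault(group_key(s), []).append(s)
    n = min(len(g) for g in groups.values())  # size of the smallest group
    rng = random.Random(0)  # fixed seed so the downsampling is reproducible
    balanced = []
    for g in groups.values():
        balanced.extend(rng.sample(g, n))
    return balanced

# Hypothetical skewed applicant pool: 90 male vs 10 female submissions.
data = [{"gender": "m", "hired": i % 2 == 0} for i in range(90)]
data += [{"gender": "f", "hired": i % 2 == 0} for i in range(10)]

balanced = balance_by_group(data, lambda s: s["gender"])
# Each gender now contributes exactly 10 samples (20 total).
```

As the comment notes, the catch is that downsampling to the smallest group throws away most of the majority-group data, which is painful when the model is data-hungry to begin with.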