Drudge Retort: The Other Side of the News
Sunday, October 28, 2018

Amazon's machine-learning specialists uncovered a big problem: their new recruiting engine did not like women. The team had been building computer programs since 2014 to review job applicants' resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters. Automation has been key to Amazon's e-commerce dominance, be it inside warehouses or driving pricing decisions. The company's experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars - much like shoppers rate products on Amazon, some of the people said. But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.


Comments

Admin's note: Participants in this discussion must follow the site's moderation policy. Profanity will be filtered. Abusive conduct is not allowed.

Top talent needed to stuff boxes, tape them shut and stick a label on it.

#1 | Posted by bayviking at 2018-10-28 07:49 PM | Reply

Oops, they programmed it to find the best person for the job vs. being PC.

#2 | Posted by sawdust at 2018-10-28 07:56 PM | Reply | Newsworthy 1

In order to hire in a gender-neutral manner for a STEM job you have to put your thumb on the scale for the females. The AI didn't do that.

#3 | Posted by visitor_ at 2018-10-28 08:24 PM | Reply

Alternate headline:

AI Continues to Improve

#4 | Posted by nullifidian at 2018-10-28 08:34 PM | Reply

Nulli Continues His Mental Decline

#5 | Posted by JOE at 2018-10-28 08:39 PM | Reply | Newsworthy 1

Here's a nickle, snowflake. Go buy yourself a sense of humor.

nadoi.org

#6 | Posted by nullifidian at 2018-10-28 08:47 PM | Reply

*nickel

#7 | Posted by JOE at 2018-10-28 08:47 PM | Reply | Funny: 1

TFF

#8 | Posted by nullifidian at 2018-10-28 08:49 PM | Reply

"*nickel" - #7 | Posted by JOE at 2018-10-28 08:47 PM

nulliquisling wasn't taught spelling at CNU*

*CliffNotes University.

#9 | Posted by Hans at 2018-10-28 08:50 PM | Reply

#2 | Posted by sawdust at 2018-10-28 07:56 PM | Reply | Flagged newsworthy by nullifidian
If you can't be with the one you love, love the one you're with.

#10 | Posted by Hans at 2018-10-28 08:53 PM | Reply

Top talent needed to stuff boxes, tape them shut and stick a label on it.

#1 | POSTED BY BAYVIKING AT 2018-10-28 07:49 PM

Jeez NutBay, we know you never read the article, but at least try to read the excerpt:

"the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way."

Your level of "try" is descending to Corkylike depths.

#11 | Posted by Rightocenter at 2018-10-28 08:53 PM | Reply | Newsworthy 2

- Corkylike depths.

GIGO... sorta like the AI... Artificial Lawyer, rofl.

#12 | Posted by Corky at 2018-10-28 10:12 PM | Reply

#12

Case in point.

#13 | Posted by Rightocenter at 2018-10-28 11:14 PM | Reply

#11 & 13, So what? Let's not pretend what 90% of the work and payroll at Amazon consists of. When you're devoid of humor and wrong about everything, there is nowhere else to go but character assassination.

#14 | Posted by bayviking at 2018-10-29 12:46 AM | Reply

Let's not pretend what 90% of the work and payroll at Amazon consists of.

I am not a fan of Amazon and Dr. Evil either, but that isn't what the story is about; it is about IT hires.

When you're devoid of humor and wrong about everything, there is nowhere else to go but character assassination.

Go easy on Corky, he rarely tries anyway.

#15 | Posted by Rightocenter at 2018-10-29 07:35 PM | Reply

Was it actually misogynist, or did it just not select enough women, per some people?

#16 | Posted by Tor at 2018-10-29 07:45 PM | Reply

If you name it "Watson" and call it "Big Blue" what do you expect?

#17 | Posted by donnerboy at 2018-10-29 07:57 PM | Reply

This is a common problem with ML: not gender inequality specifically, but bias insertion in general.
ML relies on labelled input data that it 'learns' from. If you rate the training-set resumes in a non-gender-neutral way, then the ML will rate all the other resumes the same way.

A joke as an example.
HR: What is 7 + 2?
ML: 0.
HR: No that is wrong, 7+2 is 9. What is 18+33?
ML: 9.

ML just pattern matches new input to training set labels.
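A toy sketch of the point above (this is not Amazon's actual system; the resumes, words, and scoring rule are all made up for illustration): a model that learns word weights from historically biased labels will reproduce that bias on new resumes with identical skills.

```python
from collections import defaultdict

# Hypothetical training data: past hiring outcomes (1 = hired, 0 = not).
# The labels happen to correlate with gendered phrases, not with skill.
training = [
    ("java backend chess club", 1),
    ("c++ distributed systems", 1),
    ("python women's chess club captain", 0),
    ("java women's coding society", 0),
]

# "Training": learn the average label seen for each word.
totals, counts = defaultdict(float), defaultdict(int)
for text, label in training:
    for word in text.split():
        totals[word] += label
        counts[word] += 1

def score(resume):
    """Score a resume as the mean learned weight of its known words."""
    words = [w for w in resume.split() if w in counts]
    if not words:
        return 0.5  # no signal; neutral score
    return sum(totals[w] / counts[w] for w in words) / len(words)

# Same skills, one gendered phrase added: the learned model downranks it.
print(score("java chess club"))          # 0.5
print(score("java women's chess club"))  # 0.375
```

The model never sees the word "gender"; it just pattern-matches new input against whatever correlations the labels contained, which is exactly how the bias gets in.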

#18 | Posted by bored at 2018-10-29 08:09 PM | Reply

Or maybe it is because the ML trainers said Amazon is looking for experts at stuffing boxes.

#19 | Posted by bored at 2018-10-29 08:09 PM | Reply

This AI was rating people for software dev and other technical jobs. Those fields are all sausage fests. The results are unsurprising to say the least.

#20 | Posted by sitzkrieg at 2018-10-29 08:11 PM | Reply | Newsworthy 2

For some strange reason, I've been getting persistent invitations to interview at Amazon, with a promise of an offer on the spot. I have no idea why, since I've never earned a nickel writing code and never applied for work at Amazon.

You don't have to read the article to know it's about gender-biased AI for a job in software development.

#21 | Posted by bayviking at 2018-10-30 12:08 AM | Reply

Comments are closed for this entry.

Home | Breaking News | Comments | User Blogs | Stats | Back Page | RSS Feed | RSS Spec | DMCA Compliance | Privacy | Copyright 2018 World Readable
