Now this makes me very fucking angry.

For one, because they did it at all, but especially for targeting people who would have a harder time saying no and, of course, for deliberately not telling them what the data was being collected for, plus not informing them which company it was for.

And for the people who will go "mah privacy reeeeeee": everyone deserves to be able to make a well-informed decision, and the people in this case didn't have that chance at all.

Google, go fuck yourself.

  • 12
    They should absolutely be sued for that.
  • 9
As if they have no money to pay people properly for shit like that
  • 9
    "Don't be evil"... Who said that again?
  • 5
    @Condor Well, they removed that from their policy. ^^
  • 1
    I like that I'm included in your rant :D
  • 3
    @Condor Funny how morals go out the window when more money can be made by ignoring them.
  • 2
    @Minion Hey, they got a five dollar gift card!
    That's not nothing!

  • 1
This just feeds into the idea that Google is going to be an agent of the government by doing covert shit like this. Trust is at an all-time low with every institution. This just falls into the "evil shit" category.
  • 0
    I totally approve this rant! Fuck Google.
  • 2
Ok, they have stepped far into evil territory. I'd rather have homeless people actually be helped than be targeted just so some facial recognition doesn't call me a gorilla or something.
  • 1
    wow, they're really pushing it to the next level...
  • 2
@gudishvibes they could still use homeless people, but play nice by explaining it to them and paying accordingly. Done; it would be a nice move: training their data AND helping others.
  • 2
@brunofontes I don't entirely agree, as for those people saying no is very difficult when money is involved.
  • 1
@linuxxx it's hard. On one side, yes, it is hard for them to say no. On the other side, you are giving them an opportunity, which basically no one else does (well, at least here).
  • 2
I'm sure many stock photo websites don't specifically disallow training your A.I. on their images, so why didn't they just buy all the available sets?
That way you get much more data, much cheaper, already categorised and tagged, yielding a much higher-quality training set, and you could even extract control sets to minimise false negatives and positives...
  • 1
    @Flygger they probably already own those sets.

The selfie game mentioned in the article is probably about making a map of the face from different perspectives under the same lighting, to improve facial recognition / face-mesh machine learning algorithms.
  • 1
@Flygger one problem with those datasets is that they are all shot in perfect lighting by professional photographers. So the AI is not trained for a single light source at a bad angle, or on people of color, since advertising to the poor doesn't make much sense in a capitalistic society.