19

https://banfacialrecognition.com/fe...

What? Is this an actual thing people believe? Racially biased?? It's a fucking computer; it couldn't give less of a shit about what colour you're let alone what you do/don't believe. Am I missing something or have people completely gone fucked?

I understand the whole problem with Google not having enough darker-skinned face samples, which might make it a little worse at recognising them, but wtf?

PS - Sorry if this shouldn't be a rant, wasn't sure if it's random or not

Comments
  • 14
    It's how (allegedly) the information is being used. Remember, it's not an isolated environment. The computer might not care, but who's doing what with the data?
  • 7
    The algorithms are biased because their training data are biased because people are biased, and the problem is that algorithms are hard to change. Although I think this specifically is unlikely to be a real problem.
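    A toy illustration of that (made-up numbers in Python, not any real FR system): train on data where one group is underrepresented, and the model quietly does worse on that group, even though the algorithm itself has no concept of "group".

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_group(n, shift):
        # Binary labels; each group's two classes sit around a different centre.
        y = rng.integers(0, 2, n)
        X = rng.normal(loc=shift + y[:, None] * 1.5, scale=1.0, size=(n, 2))
        return X, y

    # Group A dominates the training set, group B barely appears.
    Xa, ya = make_group(1000, shift=0.0)
    Xb, yb = make_group(50, shift=3.0)
    model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

    # Fresh test samples: accuracy is high for group A, near chance for group B.
    Xt_a, yt_a = make_group(500, shift=0.0)
    Xt_b, yt_b = make_group(500, shift=3.0)
    print("group A:", model.score(Xt_a, yt_a))
    print("group B:", model.score(Xt_b, yt_b))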
  • 5
    After scrolling to the bottom I also found this.
  • 2
    the only thing I agree with is matching the home of the buyer and that
  • 2
    @pk76 this is a good point, but the article is not addressing that; it's saying that the computers are racist and that all face recognition is a bad, terrible thing that will ruin everything
  • 8
    @hamolicious and then arguing semantics makes sense, but that's about it. While I disagree that it's the "computer that is racist" (obviously it's not; it's just doing what it's told), the simple fact is that bias can be introduced during training, and afterwards through the data that is collected.

    This isn't a peer reviewed journal article, it's a simple piece of media. Some leeway has to be given for how non-technical people are going to speak about the matter.
  • 3
    @hamolicious A combination of worldview (pervasive racial bias, in this case) and bad information (computers are racist, in this case) can lead to something like this quite easily. Whether the people producing it actually believe what they're saying or if they are just producing something to scare people who do is another question entirely. After all, there are serious movements in the US right now to restructure large parts of the curriculum to remove "elements of racial oppression," the latest victim of which is mathematics. (That such ideas will actually lead to an uneducated and oppressed racially segregated underclass has either escaped such people or is the whole point of their campaign, although I haven't looked closely enough to try making distinctions for myself.)
  • 20
    Why do you want to be tracked?

    Also, people are corrupt and horrible creatures; if someone has access to this data, something bad will eventually come of it. The more widely available the data, the worse the result.
  • 2
    it's 2019, why r u surprised lol
  • 3
    @Root yes, but with constant checks and upgrades, updates, proper procedures... we can minimise the risk of the data being put out in the wild and take all the benefits of such technology and use it for the better. Sounds like a fairy tale now that I say it 😂
  • 0
    @Wisecrack thank u my dude.
  • 3
    The privacy issues apply to everyone and have nothing racist about them. Unless, of course, the article concedes that people of colour statistically commit disproportionately more crimes, so that they would also be targeted by law enforcement more often.
  • 1
    @hamolicious People have been saying that for decades and it has never been true. Besides, exactly how is this data useful? Marketing? Security? Making sure everyone is enjoying themselves? 🙄
  • 0
    @hamolicious @Root *waves hands*
    Big data!
    Machine learning!
    AI!

    You must be against progress if you're against this!
  • 3
    @12bitfloat I don't have Facebook or anything, and it's very simple: it's none of their goddamn business to know where I am or where I go at any given moment.

    Luckily, in the Netherlands, this won't be introduced as fast since the biometric data processing and storing laws are veeery strict.
  • 1
    @Root well, there is one, and it's sort of an ehhh. Imagine there is a terrorist with quite a big record of bad stuff... and the "authority" has multiple pictures but no leads. You can, in theory, train a model to ping said authorities when that face turns up on face recognition cameras and stop whatever could've happened.
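    Very rough sketch of that pipeline (toy Python; embed() stands in for whatever face-embedding model you'd use, it's hypothetical, not a real library call):

    import numpy as np

    def cosine(a, b):
        # Similarity between two embedding vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def check_frame(face_crops, watchlist, embed, threshold=0.7):
        # watchlist maps a name to a reference embedding of that person's face.
        alerts = []
        for crop in face_crops:
            e = embed(crop)
            for name, ref in watchlist.items():
                if cosine(e, ref) >= threshold:
                    alerts.append(name)  # this is where you'd ping said authorities
        return alerts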
  • 8
    @hamolicious The problem is that history has shown that states with massive surveillance are not safer for their citizens, because the state itself goes rogue with that much power. Besides, each and every really huge crime with millions of victims has always had the support of a state behind it.

    It's just that many people and especially politicians ignore the facts.
  • 2
    @Fast-Nop and many people don't appreciate the study of history. Even outside of politics, I've found a study of programming history to be beneficial. Plenty of things have changed, sure. But there's still useful stuff we've forgotten.
  • 5
    @pk76 Yeah, history repeats. Sure, people say "this time it's different". They have always said this; that's why history repeats.
  • 6
    @hamolicious The argument in favor of surveillance and control is always "for safety!" (or "think of the children!") And yet no oppressive regime has ever been safer for its citizens than more open and free societies. Compare China and England, North and South Korea, etc.

    Also, your example is pretty contrived and far-fetched. A well-known terrorist would not show up at a concert, and if they did, it would likely be with a suicide belt. Open carry would help significantly more than waiting on the police. Even if the surveillance would help in such a scenario, that scenario is extremely rare and unlikely, and the tracking would be actively harming people's privacy and liberty on a daily basis.

    Those who give up freedom for safety deserve neither, and often lose both.

    The point is, the more power people have over others, the more corruption and intolerance there will be. You can basically always trust people to serve their own interests and improve their own lot using anything at their disposal, so give them the ability to protect and better themselves, rather than control others. The world will be a better and safer place.
  • 0
    >what color you're let alone
    The true terror of this post: technically, this is grammatically okay
  • 0
    @hamolicious normally in this context one would write it out as "you are". Seeing "you're" in that precise place is startling... but technically correct.

    The best kind of correct.
  • 1
    @Parzi huh i never knew that :/
  • 2
    The stupidest thing about that is that all the ppl wanting this will for sure post photos of the live show the next day.
  • 0
    Bad training sets for ML = biased computer. Just google 'racist machine learning' and you should get plenty of results from past years' fuckups.
    Assuming that something is objective just because it comes out of an algorithm is dangerous. (Similar to a saw blade: it must be operated by a skilled operator to limit the danger.)
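    One basic sanity check is to break the error rate down per group. Minimal sketch (made-up data, just to show how one aggregate score can hide a per-group failure):

    import numpy as np

    def per_group_error(y_true, y_pred, groups):
        # Error rate disaggregated by group, instead of one overall number.
        y_true, y_pred, groups = map(np.asarray, (y_true, y_pred, groups))
        return {str(g): float((y_pred[groups == g] != y_true[groups == g]).mean())
                for g in np.unique(groups)}

    # Overall error here is 25%, but group "b" fails half the time:
    print(per_group_error([0, 1, 1, 0], [0, 1, 0, 0], ["a", "a", "b", "b"]))
    # -> {'a': 0.0, 'b': 0.5}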
  • 0
    > no oppressive regime has ever been safer for its citizens than more open and free societies

    Actually..

    I'm reminded of when I was in college and every one of us was from a different country; as such, we were able to hear a wide range of experiences.

    Our tutor assumed that all those nasty oppressive countries would be awful places to live, but instead heard they were much better than the open and free ones!

    After all, look at how Iraq and Libya were before they were turned into open and free countries by the liberating West..
  • 0
    One solution is to never go out without your makeup..

    https://youtube.com/watch/...

    > Grandpa Bodybuilders Pranks

    > ( Kakek Tua Sangat Kuat PRANK, Indonesian for "Very Strong Old Grandpa PRANK" )
  • 0
    Don’t go to live shows, problem solved
  • 1
    Actually, I have been working in this field for many years, and these people who think FR is somehow racist are just ignorant and uneducated.

    I promise I will make a solo rant about this subject very soon, as I am just coming out of a severe depression caused by this very subject of people reacting poorly to our technology.
  • 1
    The same ppl who use mobile phones and are obsessed with likes and tagging their friends care about this.

    Think of the movie Minority Report, but instead of retina scans, it uses face recog...
  • 3
    I still feel that privacy is important.
  • 1
    @zcoder
    Remember, the more you differ from the average the more interesting you are to track. What about the concept of staying private by blending in with dull data?
  • 3
    "Deportation of immigrant fans." Immigrants won't get deported. That's absurd. Unless that is, they crossed the border illegally instead of going through the proper channels, hurting legal immigrants in the process. And even then, deportation is uncertain.
  • 0
    @nitnip FR used for deportation without Congress approval.
    https://deseret.com/2019/7/...
    Give the government (or its agencies) a power and it'll get abused. They can start testing it at concert venues. Once you've got the PoC (proof of concept) right, you can deploy it on any public camera system and take another step toward a surveillance dystopia.
  • 0
    @qwwerty
    I don't remember cities having good cameras in the first place...
  • 2
    @Gregozor2121 sure, and it's not like they can be upgraded in the future. Because it's too hard to start another terrorism hysteria which would free up some funds for such technology.
    btw a sample of what is currently offered on the market: https://business.panasonic.co.uk/se...
  • 0
    @Gregozor2121 I am very unique... shh. But based on the OP's message, there really isn't any blending in when they have video of your face.
  • 0
    @zcoder nope, there isn't any mention of that. By the looks of it, they are suggesting that the cameras are on the stage, but then they won't reach that far?! I am not sure.
  • 1
    I mean, if it gets companies to stop being creepy bastards then sure. Probably the only way they'd actually listen, if they were told it was racist.
  • 1
    @Ellis it's a stretch, but I guess that could work
  • 1
    @hamolicious Another bit of lore for you to make it credible: computers have historically found it difficult to see people of colour, e.g. the Kinect.
  • 0
    @Ellis oh, I see, that's what you meant... yes, that's true; many other facial recognition systems had problems too, due to not having enough samples to train on
  • 1
    @qwwerty

    I would have thought those cameras would have had:

    > Video Compression: H.265+

    Rather than just H.264?

    Otherwise, aren't they going to swamp one's network?

    How much are they anyhow? I couldn't easily find a price, other than perhaps $1,000+?

    3K cameras are much cheaper at around $200 each and use H.265+; the only issue is finding software that makes use of H.265+...

    Good for spotting aliens landing at night on your front porch. :-)
  • 2
    @Nanos
    I'm curious how much bandwidth they require.
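    Back-of-the-envelope in Python, with assumed bitrates rather than anything from a datasheet (4K/30fps H.264 streams are commonly quoted around 8-16 Mbit/s, and H.265 manages roughly half that at similar quality):

    cameras = 10
    h264_mbit = 12             # assumed per-camera bitrate for H.264
    h265_mbit = h264_mbit / 2  # rough rule of thumb for H.265

    print(f"H.264: {cameras * h264_mbit} Mbit/s total")
    print(f"H.265: {cameras * h265_mbit:.0f} Mbit/s total")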