5
Bor3i
14d

Data scientists unite!
I am working on a face recognition solution to identify and tag different faces in real time, and I'm currently using the Chinese Whispers algorithm for clustering. The more I think about it, the more inadequate it sounds (the number of encoded faces grows over time, and the complexity with it). I've heard about YOLO but can't really decide. Please enlighten me!
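For anyone unfamiliar with it, here is a minimal, self-contained sketch of the Chinese Whispers clustering step, using toy 2-D "encodings" instead of real face descriptors (the function name, points, and distance threshold below are illustrative assumptions, not taken from any particular library):

```python
import random
from itertools import combinations

def chinese_whispers(points, threshold, iterations=20, seed=0):
    """Cluster points with the Chinese Whispers graph algorithm.

    An edge connects two points whose Euclidean distance is below
    `threshold`; each node then repeatedly adopts the label most
    common among its neighbours until the labels stabilise.
    """
    rng = random.Random(seed)
    n = len(points)

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # Build the adjacency lists. This pairwise pass is O(n^2) in the
    # number of stored encodings -- the growth problem the post
    # describes as faces accumulate over time.
    neighbours = {i: [] for i in range(n)}
    for i, j in combinations(range(n), 2):
        if dist(points[i], points[j]) < threshold:
            neighbours[i].append(j)
            neighbours[j].append(i)

    labels = list(range(n))  # every node starts in its own cluster
    for _ in range(iterations):
        order = list(range(n))
        rng.shuffle(order)
        for i in order:
            if not neighbours[i]:
                continue  # isolated node keeps its own label
            counts = {}
            for j in neighbours[i]:
                counts[labels[j]] = counts.get(labels[j], 0) + 1
            labels[i] = max(counts, key=counts.get)
    return labels

# Two tight groups of 2-D "encodings" plus one outlier.
points = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
          (5.0, 5.0), (5.1, 5.0), (9.0, 9.0)]
labels = chinese_whispers(points, threshold=1.0)
```

One thing worth noting for the comparison: YOLO is an object-detection model, so it would replace the face-*detection* stage of a pipeline rather than the clustering stage; the re-identification/clustering problem above remains regardless of which detector feeds it.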

Comments
  • 6
    Please stop working on weapons of mass terror for a living. Think about what you’ll have to tell your kids some day.
  • 0
    @FrodoSwaggins I understand your worries, although I'd count them as prejudice. I will share some details about what I'm working on. Our solution targets shopping malls and fairs; we aim to provide business owners with behavioral information about their visitors and clients within their establishment's premises. We do not store visuals of any kind, and we do not link real identities to facial features. The main focus here is the detection of persons and their reappearance, not their identity nor their shopping record.
  • 6
    @Bor3i sorry, I’m still not ok with it. There are a few things wrong with what you’re saying.

    1. The business cannot guarantee to each customer that this will remain true, or in fact was ever true, without letting the customer audit their system, and that’s obviously not happening. I’m talking about guarantees, not promises. Not having cameras feeding into a face recognition algorithm is a better guarantee.

    2. It is unlikely that the store will post a warning outside explaining to entering customers that the technology inside recognizes faces. I feel I have a right to know and choose whether to subject myself or my children to that, and it is unethical/should be illegal to forgo mentioning it. Moreover, it is only a matter of time until nearly everywhere employs a system like this, and at that point do I have a say in the matter any more?

    3. Understanding customers’ behaviors is dangerous information, because the only logical presumption is that it will be used for social engineering; there are many known vulnerabilities/exploits in the human mind, and it is highly unethical to exploit those unbeknownst to a customer.

    4. This technology can be used to track down and kill political enemies in countries with tyrannous regimes, and we should not be proliferating weapons of mass terrorism in developed and “free” society, because it’s a spit in the face of the people this will be used to kill. It makes me absolutely fucking sick that people can develop this and never face the consequences incurred by their discovery. The fellows who developed the atomic bomb in the Manhattan Project lived with that for the rest of their lives. And because this is a nifty software program, suddenly we don’t have to care? This is ultimately what makes it a weapon of terrorism.
  • 3
    @Bor3i

    5. I don’t think this is prejudice, in the same way that feeling the Holocaust was one of the worst acts of genocidal terrorism is not being prejudiced against Nazis. Like it or not, there is right and wrong in the world, and lots of room to argue it. It is my civic duty to challenge people who may not have considered the greater implications of dangerous technology.
  • 0
    @FrodoSwaggins Thank you for taking the time to respond in such a detailed manner. I will try my best to comment on each point you've made, not as a justification or a means to convince you, but in order to develop the discussion and learn.
    1. You are absolutely right, there is no way to take a customer's preference into consideration before they enter the establishment (still not a justification). Yet there is a solution, and I'll explain it in (2).
  • 0
    @FrodoSwaggins

    2. Where we intend to operate, all individuals/businesses are required by law to display disclaimers informing visitors of the use of indoor cameras, even for security purposes. Use of outdoor cameras is restricted to the government, and we have no control over that. How does this relate to (1)? If a business tells me, as it is obliged to, that it will run my face through some sort of algorithm, and I disagree with that, I am free to strike that business off my to-go list. One thing does worry me, as you said: what if all businesses do it? Will we have a choice? Are we enabling them? Have patience and read my next comments.
  • 0
    @FrodoSwaggins
    3. I can't stress enough that we don't provide the behavior of A specific client. We sum it up by time frame over the total of visitors, by gender (biological, not personal classification), by season, by group (family, couple, ...) and by age group. We do have a conscience and an obligation to protect those who are vulnerable, for we are them. Technically, any data used to analyze customer retention has a short life. We have no aim or intention of mining individuals' data.
  • 0
    @FrodoSwaggins
    4. I strongly agree with you on this; this kind of tech can reveal the location of a target if it falls into the wrong hands. We are aware of the extent to which our solution may harm individuals. From the start, we built the solution on the basis that it is impossible to run recognition remotely: the remote terminals, a.k.a. the cameras, are immune to external interference (all data within them is volatile; any access attempt notifies us and everything is purged). I could go on endlessly about how much more we thought about preventing misuse than about the features themselves. Who can guarantee this? We, the founders of the company, and the agencies we are required to provide with details/facts to ensure that no subject's privacy is invaded.
  • 1
    @FrodoSwaggins
    5. I wish more people thought as critically as you do. I would have appreciated it if you had mentioned a more recent act of hate/war on humanity; to this day, genocides are still being committed, and I hate the idea of making the Holocaust the sole reference for human evil just because it recorded such a high number of executions. The act of killing itself is terrible and unthinkable. These days we grow ever more accustomed and comfortable hearing about a few deaths and terrorist attacks here and there. We should act on every act of hate (this is my personal opinion). Now that you've detailed your conviction, I do respect you for it. It would have been better to be precise from the start, because your comment triggered a defense mechanism in me. I apologize if I misused the word prejudice, but I can only know so much from a single sentence.
  • 0
    @FrodoSwaggins
    Now that you have challenged me, what is your end? Do you mean to convince me to drop what I am working on, or to help me direct it as best a human can? I am intrigued to hear more from you, if you are interested of course.
  • 0
    @Bor3i thank you also for taking the time to respond. I think I've said all I need to say and won't make a further attempt to convince you, but I guess my closing statement is that I believe machine learning and other technologies in this class are weapons of terror that have been allowed to proliferate freely without much thought to the deeper implications. I trust nobody to design these systems correctly, mostly because I’ve seen that line of code in every OS kernel that says “// does this work lol”. That’s not a statement about you so much as a statement about general software responsibility in the industry. And it only takes one in a hundred with a bad motive for it all to come crashing down. That’s all I have to say.
  • 2
    @Bor3i I guess there is one fine point I’d clarify: I do see signs that notify of video surveillance, but I do not see signs saying facial recognition is in use here. I’m sort of kind of ok with just being on a CCTV camera if it’s just being recorded to a VCR, as a dumb example.