20

https://banfacialrecognition.com/fe...

What? Is this an actual thing people believe? Racially biased?? It's a fucking computer, it couldn't give less of a shit about what colour you're let alone what you do/don't believe. Am I missing something, or have people completely lost the plot?

I understand the whole problem with Google not having enough darker-skinned face samples, which might make it a little worse at recognising them, but wtf?

PS - Sorry if this shouldn't be a rant, wasn't sure if it's random or not

Comments
  • 13
    It's how (allegedly) the information is being used. Remember, it's not an isolated environment. The computer might not care but who's doing what with the data?
  • 7
    The algorithms are biased because their training data are biased because people are biased, and the problem is that algorithms are hard to change. Although I think this specifically is unlikely to be a real problem.
  • 6
    After scrolling to the bottom I also found this.
  • 2
    the only thing I agree with is matching the home of the buyer, and that's it
  • 1
    @pk76 this is a good point, but the article is not addressing that; it's saying that the computers are racist and that all face recognition is a terrible thing that will ruin everything
  • 6
    @hamolicious and then arguing semantics makes sense, but that's about it. While I disagree that it's the "computer that is racist", because obviously it's not, it's just doing what it's told, the simple fact is that bias can be introduced during training, and afterwards through the data that is collected.

    This isn't a peer reviewed journal article, it's a simple piece of media. Some leeway has to be given for how non-technical people are going to speak about the matter.
  • 2
    @hamolicious A combination of worldview (pervasive racial bias, in this case) and bad information (computers are racist, in this case) can lead to something like this quite easily. Whether the people producing it actually believe what they're saying or if they are just producing something to scare people who do is another question entirely. After all, there are serious movements in the US right now to restructure large parts of the curriculum to remove "elements of racial oppression," the latest victim of which is mathematics. (That such ideas will actually lead to an uneducated and oppressed racially segregated underclass has either escaped such people or is the whole point of their campaign, although I haven't looked closely enough to try making distinctions for myself.)
  • 17
    Why do you want to be tracked?

    Also, people are corrupt and horrible creatures; if someone has access to this data, something bad will eventually come of it. The more widely available the data, the worse the result.
  • 2
    @angularvictim97

    cool name bro.
  • 2
    @Root yes, but with constant checks, upgrades, updates, proper procedures... we can minimise the risk of the data being put out into the wild and take all the benefits of such technology and use it for the better. Sounds like a fairy tale now that I say it 😂
  • 4
    The privacy issues apply to everyone and have nothing racist about them. Unless of course the article concedes that, statistically speaking, people of colour commit disproportionately more crimes, so that they would also be targeted by law enforcement more often.
  • 2
    @hamolicious People have been saying that for decades and it has never been true. Besides, exactly how is this data useful? Marketing? Security? Making sure everyone is enjoying themselves? 🙄
  • 1
    @hamolicious @Root *waves hands*
    Big data!
    Machine learning!
    AI!

    You must be against progress if you're against this!
  • 2
    @12bitfloat I don't have Facebook or anything, and it's very simple: it's none of their goddamn business where I am or where I go at any moment.

    Luckily, in the Netherlands, this won't be introduced as fast, since the laws on processing and storing biometric data are veeery strict.
  • 1
    @Root well there is one, and it's sort of an ehhh... imagine there's a terrorist with quite a big record of bad stuff, and the "authority" has multiple pictures but no leads. You can, in theory, train a model to ping said authorities when he's been spotted on face-recognition cameras, and stop whatever could've happened.
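    To picture how such a watchlist would work: matching usually boils down to comparing a face embedding from a camera frame against stored reference embeddings. A minimal sketch in Python, where the ids, the tiny 3-number "embeddings" and the threshold are all invented for illustration (real systems use high-dimensional embeddings from a trained network):

```python
import math

# Hypothetical watchlist of face embeddings. The ids, the tiny
# 3-number vectors, and the threshold are all made up; real systems
# use 128-512 dimensional embeddings produced by a trained network.
WATCHLIST = {
    "suspect-001": [0.9, 0.1, 0.3],
    "suspect-002": [0.2, 0.8, 0.5],
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_frame(embedding, threshold=0.95):
    """Return the watchlist id best matching this frame's embedding,
    or None if no similarity clears the threshold."""
    best_id, best_sim = None, threshold
    for person_id, ref in WATCHLIST.items():
        sim = cosine(embedding, ref)
        if sim >= best_sim:
            best_id, best_sim = person_id, sim
    return best_id

print(check_frame([0.88, 0.12, 0.31]))  # close to suspect-001
print(check_frame([0.10, 0.20, 0.90]))  # random concert-goer: no match
```

    Anything clearing the similarity threshold raises the ping; everyone below it stays anonymous, at least in theory.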
  • 6
    @hamolicious The problem is that history has shown that states with massive surveillance are not safer for their citizens, because the state itself goes rogue with that much power. Besides, every really huge crime with millions of victims has always had the support of a state behind it.

    It's just that many people and especially politicians ignore the facts.
  • 3
    @Fast-Nop and many people don't appreciate the study of history. Even outside of politics, I've found a study of programming history to be beneficial. Plenty of things have changed, sure. But there's still useful stuff we've forgotten.
  • 4
    @pk76 Yeah history repeats. Sure, people say "this time it's different". They have always said this, that's why history repeats.
  • 5
    @hamolicious The argument in favor of surveillance and control is always "for safety!" (or "think of the children!") And yet no oppressive regime has ever been safer for its citizens than more open and free societies. Compare China and England, North and South Korea, etc.

    Also, your example is pretty contrived and far-fetched. A well-known terrorist would not show up at a concert, and if they did, it would likely be with a suicide belt. Open carry would help significantly more than waiting on the police. Even if the surveillance would help in such a scenario, that scenario is extremely rare and unlikely, while the tracking would be actively harming people's privacy and liberty on a daily basis.

    Those who give up freedom for safety deserve neither, and often lose both.

    The point is, the more power people have over others, the more corruption and intolerance there will be. You can basically always trust people to serve their own interests and improve their own lot using anything at their disposal, so give them the ability to protect and better themselves, rather than control others. The world will be a better and safer place.
  • 0
    >what color you're let alone
    The true terror of this post: technically, this is grammatically okay
  • 0
    @hamolicious normally in this context one would write it out as "you are". Seeing "you're" in that precise place is startling... but technically correct.

    The best kind of correct.
  • 1
    @Parzi huh, I never knew that :/
  • 2
    The stupidest thing about that is that all the ppl wanting this will, for sure, post photos of the live show the next day.
  • 0
    Bad training sets for ML = biased computer. Just google 'racist machine learning' and you should get plenty of results from past years' fuckups.
    Assuming that something is objective just because it comes out of an algorithm is dangerous. (Similar to a sawblade: it must be handled by a skilled operator to limit the danger.)
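    The "bad training set = biased computer" point can be shown with a toy experiment: a one-number "embedding" per face, two made-up groups A and B, and a "model" that is just a single threshold fit to minimise total training error. With 50× more training samples from group A, errors on A dominate the objective, so the fitted threshold drifts toward group B's centre and B's test error ends up far higher, even though the code never "cares" about groups. All numbers here are invented for illustration:

```python
import random

random.seed(0)

# Toy one-dimensional "face embedding" per person. The two groups and
# their centres are invented purely for illustration.
def sample(centre, n, sd=0.4):
    return [random.gauss(centre, sd) for _ in range(n)]

# Imbalanced training data: lots of group A, very little of group B.
train_a = sample(0.0, 1000)
train_b = sample(1.0, 20)

def fit_threshold(a, b):
    """The 'model': the single cut-off that minimises *total* training
    error (predict A below the threshold, B at or above it)."""
    best_t, best_err = None, float("inf")
    for t in a + b:
        err = sum(x >= t for x in a) + sum(x < t for x in b)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

t = fit_threshold(train_a, train_b)

# Evaluate per-group error rates on a *balanced* test set.
test_a = sample(0.0, 5000)
test_b = sample(1.0, 5000)
err_a = sum(x >= t for x in test_a) / len(test_a)
err_b = sum(x < t for x in test_b) / len(test_b)

# Because errors on A outnumbered errors on B 50:1 during fitting, the
# threshold was pushed toward group B, so err_b comes out far higher.
print(f"threshold={t:.2f}  err_A={err_a:.1%}  err_B={err_b:.1%}")
```

    Same algorithm, same "objectivity", wildly different error rates per group, purely because of who was in the training data.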
  • 1
    Actually, I have been working in this field for many years, and these people who think FR is somehow racist are just ignorant and uneducated.

    I promise I will make a solo rant about this subject very soon, as I am just coming out of a severe depression caused by this very subject of people reacting poorly to our technology.
  • 1
    The same ppl who use mobile phones and are obsessed with likes and tagging their friends care about this.

    Think of the movie Minority Report, but instead of retina scans, it uses face recog...
  • 3
    I still feel that privacy is important.
  • 2
    @zcoder
    Remember, the more you differ from the average the more interesting you are to track. What about the concept of staying private by blending in with dull data?
  • 3
    "Deportation of immigrant fans." Immigrants won't get deported. That's absurd. Unless, that is, they crossed the border illegally instead of going through the proper channels, hurting legal immigrants in the process. And even then, deportation is uncertain.
  • 0
    @nitnip FR used for deportation without Congress approval.
    https://deseret.com/2019/7/...
    Give government (or its agencies) a power and it'll get abused. They can start testing it at concert venues. Once you've got the PoC right, you can deploy it on any public camera system and take another step toward a surveillance dystopia.
  • 0
    @qwwerty
    I don't remember cities having good cameras in the first place...
  • 1
    @Gregozor2121 sure, and it's not like they can be upgraded in the future. Because it's so hard to start another terrorism hysteria, which would free up some funds for such technology.
    btw sample of what is currently offered on the market https://business.panasonic.co.uk/se...
  • 0
    @Gregozor2121 I am very unique... shh. But based on the OP's message, there really isn't any blending in when they have video of your face.
  • 0
    @zcoder nope, there isn't any mention of that; by the looks of it, they are suggesting that the cameras are on the stage, but then they won't reach that far?! I am not sure
  • 1
    I mean, if it gets companies to stop being creepy bastards then sure. Probably the only way they'd actually listen, if they were told it was racist.
  • 1
    @Ellis it's a stretch, but I guess that could work
  • 1
    @hamolicious Another bit of lore for you to make it credible, computers have historically found it difficult to see people of colour, e.g. the Kinect.
  • 0
    @Ellis oh, I see, that's what you meant... yes, that's true; a lot of other facial recognition software had problems too, due to not having enough samples to train on
  • 1
    @Nanos
    I'm curious how much bandwidth they require.
  • 0
    @Nanos

    Non-western nations tend to have fewer laws, and less enforcement.

    I've known people who felt more free in the slums of Ukraine than in the U.S. Go figure.

    A lot of it comes down to fear, coercion, and social control. A lot of 'third world' countries have plenty of laws but lax enforcement and regulation; in places like Mexico, for example, if you really don't like the law you can break it and simply pay a fine (a bribe) to do so. In America there's a good chance you go to prison.

    Western society in general could do with a whole lot more 'live and let live' and a whole lot less police (though I'm not against policing in general).
Add Comment