

WITH NEW PATENTS, FACEBOOK BRINGS US ONE STEP CLOSER TO A DYSTOPIAN FUTURE

This year Facebook filed two very interesting patents in the US. One was for emotion recognition technology, which recognises human emotions through facial expressions and can therefore assess what mood we are in at any given time: happy or anxious, for example. This can be done either through a webcam or a phone camera. The technology is relatively straightforward. AI-driven algorithms analyse and decipher facial expressions, then match the duration and intensity of each expression with a corresponding emotion. Take contempt, for example. Measured on a scale from 0 to 100, an expression of contempt might register as a smirking smile, a furrowed brow and a wrinkled nose. An emotion can then be extrapolated from the data and linked to your dominant personality traits: openness, introversion or neuroticism, say.
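
To make the scoring idea above concrete, here is a rough sketch of how detected facial cues might be weighted into per-emotion scores on a 0-100 scale. The cue names, weights and values are illustrative assumptions for this article, not details taken from Facebook's patents.

```python
# A minimal sketch (not Facebook's actual method) of scoring facial cues
# on a 0-100 intensity scale and combining them into per-emotion scores.
# All cue names, weights and values below are made up for illustration.

# Intensity of each detected facial cue in one frame, 0 (absent) to 100 (strongest).
frame_features = {
    "smirk": 72,
    "furrowed_brow": 55,
    "wrinkled_nose": 40,
    "raised_cheeks": 5,
    "open_mouth": 10,
}

# Illustrative weightings linking cues to candidate emotions.
emotion_weights = {
    "contempt": {"smirk": 0.5, "furrowed_brow": 0.3, "wrinkled_nose": 0.2},
    "happiness": {"raised_cheeks": 0.6, "open_mouth": 0.4},
}


def score_emotions(features):
    """Return a 0-100 score for each emotion as a weighted sum of cue intensities."""
    return {
        emotion: sum(weights[cue] * features.get(cue, 0.0) for cue in weights)
        for emotion, weights in emotion_weights.items()
    }


if __name__ == "__main__":
    scores = score_emotions(frame_features)
    dominant = max(scores, key=scores.get)
    print(scores)                  # {'contempt': 60.5, 'happiness': 7.0}
    print("dominant:", dominant)   # contempt
```

In a real system the weights would be learned from labelled examples rather than set by hand, which is exactly the kind of self-improving feedback loop described next.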

A common misconception about algorithms is that they can be easily controlled; in fact they can learn, change and run themselves, a process known as deep learning with neural networks. In other words, they run on self-improving feedback loops. Much of this is positive, of course: solutions to collective problems like climate change that no human has yet thought of become more possible in the future, and the social payoffs could be huge too. But what about the use of AI for more nefarious ends? What if, as Yuval Noah Harari asks, AI becomes just another tool for elites to consolidate their power even further in the 21st century? History teaches us that it isn't Luddite to ask this question, nor is it merely indulging in catastrophic thinking about the future. Rapidly evolving technology ending up in the hands of just a few mega-companies, unregulated and uncontrolled, should seriously concern us all.

Algorithms, as Jamie Bartlett, author of The People Vs Tech, puts it, are "the keys to the magic kingdom" of understanding deep-seated human psychology: they filter, predict, correlate, target and learn. They also manipulate.

Webmaster's Commentary: 

But what happens when these algorithms get something horrifically wrong?!?
Where is the evaluating human "common sense" factor in working with these predictors?!?

Are we now looking at "trial by expression" as we try to get on our way on every form of public transport: trains, buses, and planes?!?

Mike and I live here on Oahu, so we must use planes to get anywhere on the mainland.

My late Dad was an aircraft instrument mechanic, and I used to love to fly, because I naively thought that every other instrument mechanic paid as much attention as my Dad did to ensuring that an instrument calibrated out perfectly before he signed off on it.

Now I hate flying, and the arrival of the "Grope-Nazis" (allegedly for our "protection") has made travel off-island a singularly unpleasant experience.

And remember: the people who invented the whole "terror, terror, terror" mentality in the Bush era, like good old Michael Chertoff, co-author of the US Patriot Act and the second Secretary of US Homeland Security, rarely have to take public transportation, and now make a killing selling the equipment one has to pass through in order to get on the aircraft, bus, or train.
