Algorithms are fast becoming the deciders in how our lives turn out. What are we doing to make sure we still get a fair shake?
After taking a break from his studies, Kyle Behm applied for seven jobs, only to be turned down each time. Because he had suffered from bipolar disorder, the same computerized personality test flagged him again and again as an undesirable applicant.
A woman in Fresno, California landed on the radar of local police after risk-assessment software raised her risk score. Why? One of her tweets included the word “rage”. She was referring to a popular card game.
In 2010, schools in Washington, DC fired over 200 teachers based on an algorithmic evaluation of their performance.
Algorithms now decide how much you pay for your insurance and the terms of the loans you take. Algorithms also decide what information you see online, and who will be prompted to join your social circle.
Predictive analytics, the technology behind profiling and rating systems, is mature and already marketed as off-the-shelf products, says technology regulation expert Dr. Nimrod Kozlovski. Profiling and rating systems will play a growing part in our lives, he adds.
“Your government, Uber, Airbnb, eBay, Tinder. Gradually each entity, private or public, will start making risk models and rate the people who come in contact with it – [an] algorithmic looking glass wherever you go,” Kozlovski said in a TEDx talk.
Over 30 local governments in China are currently experimenting with score-based social-credit systems. The system currently tested in Hangzhou, a city of over 9 million in eastern China, can impact loan rates, job offers, and school admissions. A person’s score can change due to infractions of municipal rules, like paying the wrong subway fare. The system piloted in Shanghai factors in traffic violations and late bill payments.
The Chinese Communist Party has said it plans to roll out a nationwide social scoring system by 2020. The system is designed to draw on diverse data sources, including a person’s online activity, and will affect the ability to get loans and access luxury hotels. Chinese citizens with a high score would also be eligible for faster government services.
The purpose of the system is to “allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step,” according to The Wall Street Journal.
In addition to data created by our actions online and held in governmental and private databases, algorithms increasingly rely on analysis of our physical responses in the real world.
According to Kozlovski, video-analysis algorithms can now identify a person trying to avoid eye contact. Sensors can detect a higher-than-usual breathing rate, pupil dilation, sweating, and muscle stiffness. “The techniques are getting better and better,” he explains.
Based on her experience working for companies that create such algorithms, data scientist and social activist Cathy O’Neil concluded that their growing use in such applications would result in a dystopian world.
“Algorithms would make sure that those deemed losers would remain that way,” O’Neil writes in her bestselling 2016 book Weapons of Math Destruction. “A lucky minority would gain ever more control over the data economy, taking in outrageous fortunes and convincing themselves that they deserved it.”
“If the algorithm is the new policing power, we believe it needs to answer to the rules,” posits Kozlovski. “The same way we limit the real police and their power, the same way we forbid private businesses from arbitrarily discriminating against their customers. We believe it’s our right to police the policing algorithms.”