Summary: The impact of algorithms is starting to scale up to a dizzying degree, and literally billions of people are feeling the ripple effects.
Original author and publication date: Chris Taylor – August 31, 2020
Futurizonte Editor’s Note: No more human kings or presidents. Now, algorithms rule our lives. One question: who, or what, rules the algorithms?
From the article:
The world in 2020 has been given plenty of reasons to be wary of algorithms. Depending on the result of the U.S. presidential election, it may give us one more. Either way, it’s high time we questioned the impact of these high-tech data-driven calculations, which increasingly determine who or what we see (and what we don’t) online.
The impact of algorithms is starting to scale up to a dizzying degree, and literally billions of people are feeling the ripple effects. This is the year the Social Credit System, an ominous Black Mirror-like “behavior score” run by the Chinese government, is set to officially launch. It may not be quite as bad as you’ve heard, but it will boost or tighten financial credit and other incentives for the entire population. That’s another set of unexamined, unimpeachable algorithms hanging over a billion human lives.
In the UK, few will forget this year’s A-level algorithm. A-levels are key exams for 18-year-olds; they make or break college offers. COVID-19 canceled them. Teachers were asked what each pupil would have scored. But the government fed these numbers into an algorithm alongside the school’s past performance. Result: 40 percent of all teacher estimates were downgraded, which nixed college for high-achieving kids in disadvantaged areas. Boris Johnson backed down, eventually, blaming a “mutant algorithm.” Still, even a former colleague of the prime minister thinks the A-level fiasco may torpedo his reelection chances.
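The mechanism described above, individual estimates capped by a school's historical record, can be sketched in a few lines. This is a deliberately simplified toy model, not the actual Ofqual algorithm; the grade scale, the quota rule, and the `standardize` function are all illustrative assumptions.

```python
# Toy illustration (NOT the actual Ofqual model): a standardization step
# that limits how many pupils at a school can keep a "top grade,"
# based on the school's past results rather than individual merit.

def standardize(teacher_estimates, school_history):
    """Downgrade teacher estimates so the school's share of top grades
    roughly matches its historical share.

    teacher_estimates: predicted grades, higher is better (toy 1-9 scale)
    school_history: fraction of pupils historically awarded top grades
    """
    n = len(teacher_estimates)
    # Rank pupils by teacher estimate, best first.
    ranked = sorted(range(n), key=lambda i: -teacher_estimates[i])
    top_slots = round(school_history * n)  # slots the school "earned"
    adjusted = list(teacher_estimates)
    # Pupils outside the historical quota lose a grade if they were
    # predicted a top grade (7+), regardless of individual merit.
    for i in ranked[top_slots:]:
        if adjusted[i] >= 7:
            adjusted[i] -= 1
    return adjusted

# A strong cohort at a school with a weak track record:
estimates = [9, 8, 8, 7, 6]
print(standardize(estimates, school_history=0.2))  # -> [9, 7, 7, 6, 6]
```

Even in this crude form, the failure mode is visible: four of five strong estimates are pulled down because the cap is applied to the school, not the pupil, which is exactly how high achievers at historically low-scoring schools lost their offers.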
In the U.S., we don’t tend to think about shadowy government algorithms running or ruining our lives. Well, not unless you’re a defendant in one of the states where algorithms predict your likelihood of committing more crime (eat your heart out, Minority Report) and advise judges on sentencing. U.S. criminal justice algorithms, it probably won’t surprise you to learn, are operated by for-profit companies and stand accused of perpetuating racism. Take COMPAS, used in Florida and Wisconsin, which ProPublica found was twice as likely to label Black defendants “high risk” as white defendants, and whose predictions were wrong about 40 percent of the time.
The flaws in such “mutant algorithms,” of course, reflect their all-too-human designers. Math itself isn’t racist, or classist, or authoritarian. An algorithm is just a set of instructions. Technically, the recipe book in your kitchen is full of them. As with any recipe, the quality of an algorithm depends on its ingredients — and those of us who have to eat the result really don’t think enough about what went on in the kitchen.