Algorithms and Machine Learning / AI
Generally, the problem with machine learning/AI applications is that an algorithm processes a slew of statistics and comes up with a probability that a certain person might be a bad insurance risk, a risky borrower, a terrorist, or a mediocre employee.
That probability is distilled into a score that is invariably inaccurate and can turn someone's life upside down. And if a victim of this discrimination fights back, "suggestive" countervailing evidence simply won't suffice; the case must be ironclad.
Terrifyingly, the human victims of flawed algorithms and AI are held to a far higher standard of evidence than the algorithms themselves.
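To make that scoring step concrete, here is a rough Python sketch. Every feature name, weight and cutoff below is invented for illustration; no real system's details are being described, only the general shape of "statistics in, probability out, verdict attached".

```python
# Hypothetical sketch of the scoring step described above: a handful of
# statistics about a person are combined into a probability, which is then
# collapsed into a single pass/fail decision. All names and numbers are made up.
import math

def risk_score(features, weights, bias=0.0):
    """Logistic model: a weighted sum of features squashed into a 0-1 probability."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Invented example: a "risky borrower" model built from proxy statistics.
weights = {"late_payments": 0.9, "zip_code_default_rate": 1.4, "job_changes": 0.3}
applicant = {"late_payments": 1, "zip_code_default_rate": 0.2, "job_changes": 4}

p = risk_score(applicant, weights, bias=-1.5)
decision = "deny" if p > 0.5 else "approve"  # the probability is distilled into a verdict
print(f"probability of default: {p:.2f} -> {decision}")
```

Note how little of the person survives that pipeline: whatever nuance the underlying facts had, the output is one number and one verdict.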
There are also different payoffs for the various organisations and companies using these processes. For some, the payoff is a kind of political currency and a sense that problems are being fixed; for businesses it's just the standard currency, and that's money.
For many of the businesses running these rogue algorithms, the money pouring in seems to prove that their models are working. Look at it through their eyes and it makes sense. When they’re building statistical systems to find customers or manipulate desperate borrowers, growing revenue appears to show that they’re on the right track. The software is doing its job.
The trouble is that profits end up serving as a stand-in, or proxy, for truth - a dangerous confusion. This happens because data scientists all too often lose sight of the people on the receiving end of the transaction.
They certainly understand that a data-crunching program is bound to misinterpret people a certain percentage of the time, putting them in the wrong groups and perhaps denying them a job or a mortgage for their dream house.
But the real problem is that the people running these flawed systems don’t dwell on those errors. Their feedback is money, which is also their incentive. Their systems are engineered to grab ever more data and fine-tune their analytics so that more money will pour in.
Investors, of course, feast on these returns and shower these culprit companies with more money.
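To illustrate that "profit as a proxy for truth" point, here is a toy simulation. All payoffs, population figures and the scoring rule are invented; the only thing it is meant to show is that when the feedback signal is revenue, the capable people the model wrongly rejects never appear anywhere in that signal.

```python
# Toy sketch of profit standing in for truth: the loop below tunes the cutoff
# purely to maximise profit and never computes an error rate. All numbers are
# invented for illustration.
import random

random.seed(0)

# Invented population: each person has a true ability and a noisy score the
# model assigns them. The score is only a proxy, so it misranks some people.
people = []
for _ in range(10_000):
    ability = random.random()
    score = 0.6 * ability + 0.4 * random.random()
    people.append((ability, score))

def profit(cutoff):
    """Only accepted customers feed back into the books (hypothetical payoffs)."""
    total = 0
    for ability, score in people:
        if score >= cutoff:
            total += 100 if ability >= 0.5 else -150  # good customer vs. default
    return total

def wrongly_rejected(cutoff):
    """Capable people shut out by the score; they never enter the revenue data."""
    return sum(1 for ability, score in people if ability >= 0.5 and score < cutoff)

best = max((c / 100 for c in range(101)), key=profit)
print(f"cutoff that maximises profit: {best:.2f} (profit ${profit(best):,})")
print(f"capable people it rejects, invisible to that metric: {wrongly_rejected(best)}")
```

The second printed number is exactly the kind of error the text describes: it exists, it harms real people, and nothing in the revenue-driven feedback loop ever asks about it.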
http://www.theunexplained.tv/media/theun...ed-306.mp3