Read aguanomics http://www.aguanomics.com/ for the world’s best analysis of the politics and economics of water
I was excited to read this book as soon as I heard Cathy O’Neil, the author, interviewed on EconTalk.
O’Neil’s hypothesis is that algorithms and machine learning can be useful, but they can also be destructive if they are opaque, scalable and damaging. Put differently, an algorithm that determines whether you should be hired or fired, given a loan, or able to retire on your savings is a WMD if it is opaque to users, “beneficiaries” and the public; has an impact on a large group of people at once; and “makes decisions” that have large social, financial or legal impacts on you. WMDs can leave thousands in jail or bankrupt the pensions of a company’s employees — without warning or remorse.
As examples of non-WMDs, consider bitcoin/blockchain (the code and transactions are published), algorithms developed by a teacher (small scale), and Amazon’s “recommended” lists, which are not damaging (because customers can decide to buy or not).
As examples of WMDs (many of which are explained in the book), consider Facebook’s “newsfeed” algorithm, which is opaque (based on their internal advertising model), scaled (1.9 billion zombies) and damaging (echo chamber, anyone?).
I took numerous notes while reading this book, which I think anyone interested in the rising power of “big data” (or Big Brother) or bureaucratic processes should read, but I will only highlight a few here.
- Models are imperfect — and dangerous if they are given too much “authority” (as I’ve said)
- Good systems use feedback to improve in transparent ways (anti-WMDs)
- WMDs punish the poor b/c the rich can afford “custom” systems that are additionally mediated by professionals (lawyers, accountants, teachers)
- Models are more dangerous the more removed their data are from the outcome of interest, e.g., models of “teacher effectiveness” based on “student grades” (or worse, alumni salaries)
- “Models are opinions embedded in mathematics” (what I said) which means that those weak in math will suffer more. That matters when “American adults… are literally the worst [at solving digital problems] in the developed world.”
- It is easy for a “neutral” variable (e.g., postal code) to reproduce a biased variable (e.g., race)
- Wall Street is excellent at scaling up a bad idea, leading to huge financial losses (and taxpayer bailouts), and those losses were not accidental: Wall Street knew that profits were private but losses were socialized.
- O’Neil has an interesting exploration of how for-profit colleges use online advertisements to attract (and rip off) the most vulnerable — leaving them in debt and/or taxpayers with the bill. Sad.
- A good program (for education or crime prevention) also relies on qualitative factors that are hard to code into algorithms. Ignore those and you’re likely to get a WMD that is biased. I just saw a documentary on housing for the poor that asked “what do the poor want — hot water or a bathtub?” They wanted a bathtub because they had never had one and could not afford to heat water. #checkyourbias
- At some points in this book, I disagreed with O’Neil’s appeal to justice in lieu of efficiency. She does not want to allow employers to look at job applicants’ credit histories because “plenty of hardworking people lose jobs,” etc. Yes, that’s true, but I can see why employers are willing to take the chance of losing a few good people to avoid a lot of bad people, especially if they have lots of remaining (good credit) applicants. Should this happen at the government level? Perhaps not, but I don’t see why a hotel chain cannot do this: the scale is too small to be a WMD.
- I did, OTOH, notice that peer-to-peer lending might be biased against lenders like me (I use LendingClub) who rely on its “public credit models,” as it seems (in my increasingly regrettable experience) that these models are badly calibrated, leaving retail suckers like me to lose money while institutional investors are given preferential access. (I’m hoping to look into the retail/institutional performance gap.)
- O’Neil’s worries about injustice go a little too far in her counterexamples of the “safe driver who needs to drive through a dangerous neighborhood at 2am” as not deserving higher insurance rates, etc. I agree that this person may deserve a break, but the solution to this “unfair pricing” is not a ban on such price discrimination but an increase in competition, which has a way of separating safe and unsafe drivers (it’s even called a “separating equilibrium” in economics). Her fear of injustice makes me think that she’s perhaps missing the point sometimes. High driving insurance rates are not a blow against human rights, even if they capture an imperfect measure of risk, because driving itself is not a human right. Yes, I know it’s tough to live without a car in many parts of the US, but people suffering in those circumstances need to think bigger about maybe moving to a better place.
- Worried about biased advertisements? Just ban all of them.
- O’Neil occasionally makes some false claims, e.g., that US employers offered health insurance as a perk to attract scarce workers during WWII. That was mainly because the government imposed a wage freeze, and insurance was the only way to offer “more money” to workers. In any case, it would be good to look at how other countries run their health systems (I love the Dutch system) before blaming all US failures on WMDs.
- I’m sympathetic to worries about the lies and distortions that Facebook and other social media spread (with the help of WMDs), but I’ve gotta give Trump credit for blowing up all the careful attempts to corral, control and manipulate what people see or think. Trump has shown that people are willing to ignore facts to the point where it might take a real WMD blowing up in their neighborhood to take them off autopilot.
- When it comes to political manipulation, I worry less about WMDs than about the total lack of competition due to gerrymandering. In the 2016 election, 97 percent of House incumbents were re-elected.
- Yes, I agree that humans are better at finding and using nuances, but those will be overshadowed as long as there’s a profit (or election) to win. Can we push back on those problems? Yes, if we realize how our phones are tracking us, how GPA is not your career, or how “the old boys network” actually produced a useful mix of perspectives.
- Businesses will be especially quick to temper their enthusiasm when they notice that WMDs are not nearly so clever. What worries me more are politicians or bureaucrats who commit to using WMDs based on a sales pitch that saves them time but shifts risk onto citizens. That’s how we got dumb no-fly lists and other assorted government failures.
- Although I do not put as much faith in “government regulation” as a solution to this problem as I put in competition, I agree with O’Neil that consumers should own their data and that companies should get access to it only on an opt-in basis. That model will be broken for as long as the EULA requires you to give up lots of data in exchange for access to the “free” platform. Yes, Facebook is handy, but do you want Facebook listening to your phone all the time?
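The point above about a “neutral” variable reproducing a biased one can be illustrated with a small synthetic sketch. Everything here is invented (the zip codes, the groups, the base rates): a scoring rule that never sees the protected attribute still produces very different approval rates across groups, because zip code acts as a proxy for group membership.

```python
import random

random.seed(0)

def make_applicant():
    """Generate one synthetic applicant.

    Group membership (a stand-in for a protected attribute) is strongly
    correlated with zip code, but creditworthiness has the SAME base
    rate in both groups -- any disparity in outcomes comes from the proxy.
    """
    group = random.choice(["A", "B"])
    if group == "A":
        zipcode = "10001" if random.random() < 0.9 else "20002"
    else:
        zipcode = "20002" if random.random() < 0.9 else "10001"
    creditworthy = random.random() < 0.5  # identical across groups
    return group, zipcode, creditworthy

applicants = [make_applicant() for _ in range(10_000)]

def approve(zipcode):
    """A "group-blind" rule that uses only zip code (say, because past
    defaults happened to cluster in zip 20002)."""
    return zipcode == "10001"

def approval_rate(group):
    members = [a for a in applicants if a[0] == group]
    return sum(approve(z) for _, z, _ in members) / len(members)

print(f"approval rate, group A: {approval_rate('A'):.2f}")
print(f"approval rate, group B: {approval_rate('B'):.2f}")
```

Running this, group A is approved far more often than group B even though neither the data-generating process nor the rule treats the groups differently on merit. Dropping the sensitive column from a model, in other words, does not remove the bias when a correlated proxy remains.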
Bottom Line: I give this book FOUR STARS for its well-written, enlightening exposé of WMDs. I would have preferred less emphasis on bureaucratic solutions and more on market, competition, and property-rights solutions, but that’s another topic for debate.