Discrimination by Math


Seattle   Having spent a week in Juneau, Alaska, working with men and women who deal daily with the stigma and discrimination that come with mental health challenges and disabilities, I should have been prepared for Cathy O'Neil's Weapons of Math Destruction and its warnings about the pervasive, powerful, and often destructive and discriminatory role that Big Data and the algorithms it fuels play in all of our lives. I wasn't. But I also wasn't surprised.

One of the issues I heard about from the members of MCAN was being fired from jobs in violation of the Americans with Disabilities Act (ADA). They didn't know the half of it! O'Neil details the way huge employers, including lower-wage service establishments like McDonald's and others, are using personality tests with data-driven questions that sort out people with any kind of mental health issue. A lawyer in Tennessee watched his son, a super student with two years at Vanderbilt University who had dropped out for a couple of semesters to deal successfully with depression, somehow fail to land any minimum-wage job as a janitor, burger flipper, and so forth at a number of companies using the same blunt instrument of a personality test. He filed an ADA class action suit that is still pending. Even that may be only the tip of the iceberg, since data-driven, resume-reading machines are also discarding applications over a few misspellings, bad typos, and other trivialities.

These WMDs, as O'Neil cleverly calls them, are perhaps most destructive in the way too many of them, from police and crime statistics to loan applications to even efforts to get insurance or an apartment from a landlord, discriminate, often invisibly, based on the zip codes identifying where someone lives. The question may never say race or risk, but the zip code identifying the neighborhood plots the Big Data odds, and they do not stack up in your favor. Stop-and-frisk programs, common under New York mayors Giuliani and Bloomberg and now touted by Trump, revealed under analysis huge racial profiling and targeting of African-Americans and Latinos because of misapplied and misunderstood algorithms.

It was also disconcerting, given our long experience in the United States and Canada providing service at citizen wealth centers for low-and-moderate income families, to find that payday lenders, diploma mills, and other shyster, predatory operations are using algorithms to datamine names and contact information from people who go online asking for information and access to programs that might give them advice or assistance. I shouldn't have been surprised. I can remember complaining to our tech people years ago, when we used Google Ads, about the fact that I could be writing a Chief Organizer Report on our fights against payday lenders and find, embarrassingly, ads running alongside my blog for some of the same bloodsucking scammers I was calling to account in the paragraphs next to their ads. Duh!

It goes on and on. O'Neil cautions that there are dangers here, and they need to be regulated not just for privacy along the lines of the European opt-in system, but for transparency. If you ever thought, even for a second, that some of the "value-added" tests for teacher evaluations many states have employed were valid, or ever questioned the meaning of things like body-mass indexes and wellness scores, your application to McDonald's would probably be rejected too.

She does argue that it is not the math's fault so much as the way the math is being used. With a different objective, some of the same algorithms could be pointing people in the right direction, connecting them with resources, getting them out of prison rather than in, and into a job rather than out on the street.

There seems to be no mathematical formula on when that miracle might happen.
