Mathematics for oppression: An invitation to read “Weapons of Math Destruction” by Cathy O’Neil

J. David Beltrán
Mar 7, 2021

On December 17–18, 2020, within the framework of the “XIV JORNADA DE MATEMÁTICAS” of the Universidad Distrital Francisco José de Caldas, Professor Federico Ardila gave a delightful and thought-provoking talk titled “Geometría, Robots y Sociedad” (“Geometry, Robots and Society”). Besides presenting a handful of really interesting results from his group’s research (a collaborative project between San Francisco State University and Universidad de los Andes), he discussed some very important ethical aspects of the projects he is heading and the impact they may have on our society. At the end of his talk, he recommended that the attendees read a book called “Weapons of Math Destruction” by Cathy O’Neil, which examines in detail some controversial issues of the “Big Data” era.

“Geometría, Robots y Sociedad” Federico Ardila. December 2020.

Federico Ardila has always been deeply committed to the ethical responsibilities of mathematicians and to building inclusive spaces in mathematics in the USA and Colombia. He is an extraordinary mathematician and a person I admire a lot. I welcomed his suggestion and read the book (a couple of months after his talk). It was shocking and extremely interesting, and I couldn’t help but share it with my friends.

Taken from https://es.wikipedia.org/wiki/Cathy_O%27Neil

The Author. Cathy O’Neil is an American mathematician and data scientist. She earned her PhD at Harvard University and held positions in the mathematics departments of MIT and Barnard College. In 2007, she moved to the private sector and worked for the hedge fund D. E. Shaw, among other companies in the financial industry. She has built predictive models for several start-ups, and she started the Lede Program in Data Journalism at Columbia. She is the author or co-author of three books and writes the blog mathbabe.org.

Taken from https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815.

The Book. “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy” was published by Crown Publishing Group in 2016. It is an exhaustive examination of the “dark side” of the Big Data era and its nefarious implications for modern society. The book is divided into 10 chapters, each focusing on a different aspect of life that algorithms and models threaten. From online advertising and job searches to the justice system and democracy, Cathy O’Neil offers an in-depth analysis of the consequences of the current full-scale deployment of algorithms and automated models in scenarios close to us. Each point is thoroughly presented through real-life events (carefully referenced and developed) that illustrate the far-reaching consequences of misused machine learning in our society. The book is a page-turning invitation to raise awareness about the power of algorithms and their misuse today; it is an urgent call to mathematicians and data scientists to take part in the fight against the “Weapons of Math Destruction” that endanger the modern world.

Weapons of Math Destruction.

One Sunday, you are in front of your computer thinking about a destination for your next longed-for vacation. You search on Google for hotel prices in San Andrés, the “Eje Cafetero” (the coffee region), or even Europe. Once you have finished your search, you close all the web pages you have been browsing and turn off your computer. Over the next two or three weeks, you are bombarded with advertising for flight tickets, hotels, and tour discounts on Facebook, YouTube, and almost every single web page you open. But how does Google know that you want to travel soon? This is a very simple scenario in which models are used to boost advertising. Each time you click on anything on the internet or use a search engine like Google, you are feeding an enormous database about your interests, your likes and dislikes, and basically your entire life. All the “raw” information you have sent through your Google searches is analyzed by machines, and a series of predictions is generated according to what the computer thinks you may be interested in. This is how Google knows you want to travel soon: it has received a bunch of information suggesting that you are looking for an opportunity to travel, it correlates this information with data from many similar people who showed the same pattern and actually traveled, and its computers conclude that you will behave the same way, so they target you with the mass of advertisements related to tourism. Models gather data, analyze it according to some rules, and generate predictions about the future according to their specific purposes.
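To make this concrete, here is a minimal sketch of that pattern in Python. It is my own toy illustration, not Google’s actual system: the feature names, the tiny dataset, and the choice of logistic regression are all assumptions made for the example.

```python
# Toy illustration (NOT Google's real system): learn from people whose
# searches were followed by actual travel, then predict whether a new
# user is likely to travel, and target ads accordingly.
from sklearn.linear_model import LogisticRegression

# Hypothetical features per user: [hotel searches, flight searches,
# destination-page visits] over the last week.
X_train = [
    [5, 3, 8],   # searched a lot about trips...
    [4, 2, 6],
    [0, 0, 1],   # ...or barely searched at all
    [1, 0, 0],
]
y_train = [1, 1, 0, 0]  # 1 = actually traveled afterwards, 0 = did not

model = LogisticRegression().fit(X_train, y_train)

# A new user who just spent a Sunday comparing hotels in San Andrés:
new_user = [[6, 2, 7]]
print(model.predict_proba(new_user)[0][1])  # estimated P(will travel)
# If this probability is high, the user is flooded with travel ads.
```

The point is not the specific model but the loop: other people’s past behavior becomes training data, and the resulting prediction decides who gets targeted.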

Our life is rife with models. Some of them are fairly harmless, but others can be an authentic nightmare. Cathy O’Neil has managed to identify a class of pernicious models that she has baptized “Weapons of Math Destruction” (WMDs from now on), and for which she gives a precise taxonomy:

  1. They are invisible to us, opaque; in some cases we don’t know they are there, let alone how they work. They stay in the gloom, wielding mathematical techniques and algorithms to deduce information about us.
  2. They are immense; they deal with millions of data each day and progressively, they are broadening their horizons to new and unexpected fields.
  3. Finally, they are unfair and harmful, and they oppress the poor and the vulnerable. They contain pernicious feedback loops that wind up having catastrophic consequences for the underprivileged people being modeled.

Although the social conditions of a country like the USA are clearly very different from those of Colombia, WMDs are scaling rapidly toward more and more countries. I would venture to say that they are already present in some areas of our life, and that they will undoubtedly cover much more ground in our country soon. We are not aware of the severity of the implementation of WMDs in our society, and Cathy O’Neil’s book illuminates the ways in which WMDs are destroying the lives of thousands of people and widening the huge social gap in our countries. I would like to illustrate (based on Cathy O’Neil’s examples) the power of WMDs in our daily life, and I would like to do it through a topic that is especially discouraging for people in Colombia: the search for a job.

Finding a Job.

Getting a job in Colombia is a task that is becoming more difficult every day. The current social and economic situation has worsened this serious problem (not to mention the appalling consequences of the pandemic), and in general, finding a job is a constant worry for many young people in the country.

In the United States, many companies use algorithms to optimize their hiring processes. Instead of hiring a large human-resources staff (which would translate into paying many more salaries), they have machines that read hundreds of CVs, résumés, and job applications in seconds. These algorithms are designed to cut administrative costs and to reduce the risk of “bad hires” (people who may require more training or show low performance on the job). A company called Evolv, Inc. developed and sold this kind of software to many companies in the USA. One of the things Evolv’s model tried to do was estimate the likelihood that a new employee would stay in the job for an acceptable amount of time (workforce turnover is one of the biggest expenses for companies, so they try to find people who will stay as long as possible). The algorithm took into account some metrics we might expect: for example, it evaluated how long the applicant had stayed in previous jobs and assigned each person a score accordingly. However, the algorithm also found an unexpected correlation between geography and willingness to stay in the job for a long time.

The algorithm found that people who lived farther from the workplace were more likely to leave the job sooner, which is quite reasonable. I can imagine myself taking a Transmilenio bus from the “20 de Julio” neighborhood to a workplace in the very north of the city. It would mean spending several hours each day waiting for those horrible buses! Now, imagine that this kind of algorithm is used in a city like Bogotá. The transportation system is simply disastrous; living in a neighborhood far from the center of the city automatically means spending a lot of time commuting to work. Moreover, low-income populations are usually concentrated on the periphery of the city, because housing is cheaper there. In a few words, if such an algorithm were implemented in our city, applicants living on the outskirts would be severely penalized. It would take little time for the algorithm to become classist; the sketch below shows how.
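Here is a toy sketch of that mechanism. The scoring formula, the weights, and the applicant numbers are my assumptions, not Evolv’s real model; what it shows is how a commute term silently becomes a proxy for social class.

```python
# Toy sketch (assumed, not Evolv's actual model) of a tenure-prediction
# score that penalizes applicants by where they live.

def retention_score(avg_months_in_previous_jobs: float,
                    commute_km: float) -> float:
    """Higher score = predicted to stay longer. The commute penalty is
    the kind of correlation such a model 'finds' in historical data."""
    return avg_months_in_previous_jobs * 2.0 - commute_km * 1.5

# Two applicants with identical work histories, different neighborhoods:
downtown = retention_score(avg_months_in_previous_jobs=18, commute_km=5)
periphery = retention_score(avg_months_in_previous_jobs=18, commute_km=25)

print(downtown)   # 28.5
print(periphery)  # -1.5 -> filtered out before any human reads the CV
```

In a city like Bogotá, where poorer neighborhoods sit on the periphery, nothing in the code mentions class, yet the commute term encodes it anyway.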

A (typical) traffic jam in Bogotá. Taken from https://www.uniminutoradio.com.co/bogota-tercera-ciudad-con-mas-trancones-viales-en-el-mundo/
Bogotá from Ciudad Bolivar. Taken from https://www.eltiempo.com/bogota/coronavirus-en-bogota-como-enfrenta-ciudad-bolivar-el-covid-19-y-la-delincuencia-537829

But the matter doesn’t stop there. It is also common for applicants to be required to fill out “personality tests” during the hiring process. The algorithms described above also score applicants according to their results, and, as you may expect, these algorithms have sometimes erroneously linked mental disorders with poor job performance. The result: a huge number of people blackballed simply because they suffer from those conditions. This is totally unacceptable. I know several people who live with bipolar disorder or hyperactivity and are nevertheless excellent teachers and students at prestigious universities.

It gets worse. As Cathy O’Neil explains (2016):

“ In 2001 and 2002, before the expansion of automatic résumé readers, researchers from the University of Chicago and MIT sent out five thousand phony résumés for job openings advertised in the Boston Globe and the Chicago Tribune. The jobs ranged from clerical work to customer service and sales. Each of the résumés was modeled for race. Half featured typically white names like Emily Walsh and Brendan Baker, while the others with similar qualifications carried names like Lakisha Washington and Jamaal Jones, which would sound African American. The researchers found that the white names got 50 percent more callbacks than the black ones. But a secondary finding was perhaps even more striking. The white applicants with strong résumés got much more attention than whites with weaker ones; when it came to white applicants, it seemed, the hiring managers were paying attention. But among blacks, the stronger résumés barely made a difference. The hiring market, clearly, was still poisoned by prejudice.”

The above arguments show that implementing an ill-constructed algorithm can end up producing an unfair, classist, and even racist model. But the problem goes further: these algorithms constitute authentic Weapons of Math Destruction. They never receive feedback on their performance. They never learn whether an applicant they rejected would have been an excellent candidate. Since nothing tells them they got it wrong, they behave as if the whole process had been successful, and a pernicious loop begins, ingraining the prejudices deeper and deeper into the algorithms (a minimal sketch of this loop follows the list below). And they share the full taxonomy of WMDs:

They are opaque. When we apply for a job and are not called back, we rarely know why we were swept out of the process. We simply receive no information, and we continue looking for other opportunities. In addition, some algorithms are considered vital trade secrets, and companies refuse to share how they work, so it is literally impossible to know what went wrong with our applications.

They are unappealable. Since we never receive a reason for being blackballed, we cannot contest the decision. We just don’t know anything about our process. Moreover, algorithms and mathematical models are surrounded by a scientific air that makes us think they are fair and impartial. They are camouflaged by the evenhandedness of science, and we trust them. It is hard to imagine that a machine is rejecting you because of your race or social status, but as Cathy O’Neil points out (2016):

“An algorithm is just an opinion formalized into code”.

They may become immense. In the United States, more and more companies are employing these algorithms every day. In terms of profits they are extremely efficient, and as we know, money is all that matters these days.
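As promised above, here is a minimal sketch of the pernicious feedback loop. Everything in it is an assumption for illustration (the scoring rule, the threshold, the synthetic applicants); what it shows is structural: the only outcomes the system ever observes come from the people it accepted, so its rejections are never tested and its prejudices are never corrected.

```python
# Toy sketch (my assumption, no vendor's real code) of the feedback loop:
# outcomes are observed only for accepted applicants, so the model can
# never discover that a rejected applicant would have been excellent.
import random

def biased_score(applicant: dict) -> float:
    # Hypothetical learned rule: reward tenure, penalize long commutes.
    return applicant["tenure"] - 0.5 * applicant["commute_km"]

applicants = [{"tenure": random.randint(6, 36),
               "commute_km": random.randint(2, 30)}
              for _ in range(1000)]

hired = [a for a in applicants if biased_score(a) > 10]
rejected = [a for a in applicants if biased_score(a) <= 10]

# Only `hired` ever produces job-performance data, so any retraining
# happens on a pool the biased rule pre-filtered. The bias feeds itself.
print(f"hired: {len(hired)}, rejected (outcomes never observed): {len(rejected)}")
```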

Conclusion.

The implementation of such algorithms in countries like Colombia would be simply catastrophic. This is just one small example showing how a “little” WMD works, but the truth is that WMDs are ubiquitous, and many of them are constantly poisoning different aspects of our society. Cathy O’Neil extends a marvelous invitation to take part in the change. The examples in her book, far better presented and explained than mine, are incredibly stimulating and will change the way you look at the very attractive era of Big Data.

Finally, I highly encourage everyone to read Cathy O’Neil’s book and become a little more aware of the power and consequences of the misuse of mathematics in our daily life. Like Professor Federico Ardila and Cathy O’Neil, every mathematician and data scientist, and indeed every one of us, should bring stronger moral values to our profession and to the way we act each day.


J. David Beltrán

I’m a PhD student in Mathematics at the University of Iowa. I’m originally from Bogotá, Colombia, and I love the stimulating atmosphere that Medium offers.