
Like millions of people around the globe, I have been helplessly watching the explosion of the pandemic and the resultant economic hardship, and trying to work out my part in fighting the systemic racism and inequity that they have laid bare. Having spent 30 years in the world of digital technology, the only place I know to start is here, in my own field, by first trying to understand the role that digital platforms, products, and technology play in inadvertently enabling and propagating damaging bias.

Digital technology is one of the most potent forces of change on earth and still holds promise to level the playing field. The disintermediation revolution that is rearranging every industry, social media, technology that brings education to the masses: these are all powerful positive forces leaning heavily on the arc that bends toward justice and equality. Social media brought the tragedy of Mr. George Floyd to millions of people across the globe, from San Francisco to Sydney, in a very visceral way, sparking what the New York Times is calling the largest movement in U.S. history.

The same social media, our free-for-all digital public square, has also been used very effectively to drive division and misinformation and, paradoxically, to narrow our aperture on the world and even influence elections. Technology providers are quick to point out that the fault is not in the technology but in us. Facebook claims that it is not its place to interfere with public discourse and that its platform is fundamentally neutral. This, however, ignores the conflict inherent in holding public discourse on a for-profit platform running some of the most opaque algorithms the world has ever seen, algorithms that influence user behavior in ways no one fully comprehends.

Racial biases built into facial recognition AI technologies are well documented, and yet that has not slowed their rapid adoption by law enforcement for citizen surveillance across the globe. A 2016 study by the Georgetown Law Center on Privacy and Technology found that one in four U.S. state or local police departments had access to facial recognition technology, and that half of all American adults are in a police facial recognition database. That was four years ago, and we are still trying to understand the implications of such a system. Candidate-screening systems, Airbnb, Netflix, Uber, and almost every other technology operating at scale have one or more of the following biases built into their platforms and products:

  • Design bias, where the deliberate or subconscious bias of a designer is baked right into the product
  • Usage-based bias, where digital products pigeonhole us into information bubbles based on our prior searches, movies watched, and ads viewed, walling us off from possibilities beyond (a toy sketch of this feedback loop follows the list)
  • Data bias, the most innocuous-seeming of them all, where training data, generally viewed as objective, neutral, and cold, skews entire systems, as it does in facial recognition
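
To make the usage-based mechanism concrete, here is a toy sketch in Python. The catalogue, tags, and ranking rule are invented purely for illustration; real recommender systems are far more sophisticated, but the self-reinforcing loop is the same: the more you view of one thing, the more of it you are shown.

```python
# A toy illustration of usage-based bias: a recommender that only surfaces items
# most similar to what the user has already viewed, steadily narrowing exposure.
# The catalogue, tags, and scoring rule below are invented for this example.
from collections import Counter

catalogue = {
    "doc_a": {"politics", "opinion"},
    "doc_b": {"politics", "analysis"},
    "doc_c": {"science", "climate"},
    "doc_d": {"arts", "film"},
}

def recommend(history, k=2):
    """Rank unseen items by tag overlap with everything already viewed."""
    seen_tags = Counter(tag for item in history for tag in catalogue[item])
    scores = {
        item: sum(seen_tags[tag] for tag in tags)
        for item, tags in catalogue.items()
        if item not in history
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

# After viewing one politics piece, more politics is ranked first and the
# science and arts items are pushed down: the bubble reinforces itself.
print(recommend(["doc_a"]))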

While there should be substantive and meaningful debate on the role that all technologies and platforms play in sustaining and propagating entrenched societal biases, one area that does not get much focus is the nascent but growing field of Automated Essay Scoring (AES). Assessment through written essays is commonly used to demonstrate writing, comprehension, and critical-thinking skills, but grading even a small number of essays is an arduous and time-consuming task. Already pervasive enough to go by an acronym, AES sits right in the crosshairs of the burning need to get quality education equitably to the masses.

In August of 2019, Vice’s research into AES showed that despite being “fooled by gibberish and highly susceptible to human bias, automated essay-scoring systems are being increasingly adopted”. The article goes on to describe how essay-grading engines widely used on standardized tests “tended to underscore African Americans and, at various points, Arabic, Spanish, and Hindi speakers—even after attempts to reconfigure the system to fix the problem.” This is extremely problematic, considering the diversity in our schools and colleges. The fate of millions of students cannot be subject to biased black boxes.

AES engines are now either the primary or secondary grader on standardized tests in at least 21 states, according to a survey conducted by Motherboard. AI algorithms typically model the behavior of human graders, using patterns and proxies to do so. Trained on hundreds or thousands of human-scored essays, the engine assesses each new essay for the patterns that most closely match those in its training set. Any bias in the human-scored essays used to train the machine is faithfully propagated into every machine-generated grade, and potentially amplified.
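
For readers who want to see how that propagation happens, below is a minimal, purely illustrative sketch in Python using scikit-learn. The essays, scores, and model choice are invented for the example and do not reflect any particular vendor's system; the point is simply that the model's only objective is to reproduce whatever signal, including bias, is present in the human scores it is trained on.

```python
# A minimal, purely illustrative sketch of how an AES-style model imitates
# human graders. Essays, scores, and the model choice are invented here;
# real engines use richer features, but the training pattern is the same.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

# Hypothetical training set: essays already scored by human graders.
train_essays = [
    "The author develops a clear argument supported by specific evidence.",
    "Renewable energy is good because it is clean and it is cheap.",
    "Expanding public transit reduces congestion and lowers emissions citywide.",
]
human_scores = [5.0, 2.0, 4.0]  # any bias in these scores becomes the target

# Represent essays as surface-level word and phrase frequencies.
vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X_train = vectorizer.fit_transform(train_essays)

# Fit a model whose only objective is to predict the human grader's score.
model = Ridge()
model.fit(X_train, human_scores)

# A new essay is scored by how closely its surface patterns resemble essays
# the human graders rewarded -- not by the quality of its reasoning.
new_essay = ["Solar and wind power lower long-term costs and create jobs."]
print(model.predict(vectorizer.transform(new_essay)))
```

If the human-assigned scores systematically under-rate a group of writers, a model fitted this way will reproduce, and at scale amplify, that pattern across every essay it grades.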

The COVID-19 crisis could exacerbate the issue, with school districts and higher-education institutions compelled to adopt more digital learning tools as more courses and classrooms go online. If the AI technologies they deploy do not explicitly guard against human bias in the design and training process, then instead of leveling the playing field we may inadvertently institutionalize these biases across the system.

In future posts, I will be exploring ideas on how we can embrace bias-free AI with a focus on language processing applications. This is a critical piece in the larger puzzle of creating equality in a digital world.

– Joy Dasgupta, CEO of Intelligent Machines Lab

Latest Comments

Jisha Mathew
July 20, 2020, 5:25 pm
Very Insightful
Valerie
July 20, 2020, 5:30 pm
Everyone deserves equality in education. I look forward to hearing more on the subject!
Jenna Mulrenan
July 20, 2020, 6:18 pm
Really enjoyed reading your thoughts on the biases that can be embedded into AI and the emphasis on the need to address them in our current education system. Great content!
Sumit Roy
July 20, 2020, 6:37 pm
I truly believe digital technology will help the global economy remain connected and keep going amid this pandemic. The above blog post resonates with my thinking.
Vijayalakshmi
July 22, 2020, 7:04 am
Great post! It provides much insight on the crucial role of AI in the education system and plenty of food for thought.
Trishna
August 4, 2020, 6:21 pm
Very informative!
