What drives Nithya Sambasivan’s fight for fairness

This content originally appeared on The Keyword and was authored by MJ Pham

When Nithya Sambasivan was finishing her undergraduate degree in engineering, she felt slightly unsatisfied. “I wanted to know, ‘how will the technology I build impact people?’” she says. Luckily, she would soon discover the field of Human-Computer Interaction (HCI) and pursue her graduate degrees. 

She completed her master’s and PhD in HCI, focusing on technology design for low-income communities in India. “I worked with sex workers, slum communities, microentrepreneurs, fruit and vegetable sellers on the streetside...” she says. “I wanted to understand what their values, aspirations and struggles are, and how we can build with them in mind.” 

Today, Nithya is the founder of the HCI group at the Google Research India lab and an HCI researcher at PAIR, a multidisciplinary team at Google that explores the human side of AI by doing fundamental research, building tools, creating design frameworks, and working with diverse communities. She recently sat down to answer some of our questions about her journey to researching responsible AI and fairness, and to championing historically underrepresented technology users.

How would you explain your job to someone who isn't in tech?

I’m a human-computer interaction (HCI) researcher, which means I study people to better understand how to build technology that works for them. There’s been a lot of focus in the research community on building AI systems that could positively impact the lives of billions of people. I focus on human-centered, responsible AI, specifically looking for ways it can empower communities in the Global South, where over 80% of the world’s population lives. Today, my research outlines a road map for fairness research in India, calling for re-contextualizing datasets and models while empowering communities and enabling an entire fairness ecosystem.

What originally inspired your interest in technology? 

I grew up in a middle-class family, the younger of two daughters from the South of India. My parents have very progressive views about gender roles and independence, especially in a conservative society, and this definitely influenced what and how I research: things like gender, caste and poverty. In school, I started off studying engineering, which is a conventional path in India. Then I went on to focus on HCI and designing with my own and other under-represented communities around the world.

[Image: Nithya smiling at a small child while working in the field.]

How do Google’s AI Principles inform your research? And how do you approach your research in general?

Context matters. A general theory of algorithmic fairness cannot be based on “Western” populations alone. My general approach is to research an important long-term, foundational problem. For example, our research on algorithmic fairness reframes the conversation on ethical AI away from focusing mainly on Western, meaning largely European or North American, perspectives. Another project revealed that AI developers have historically focused more on the model (or algorithm) than on the data. Both deeply affect eventual AI performance, so concentrating on only one of them creates downstream problems. For example, a dataset may entirely miss certain sub-populations, so a model trained on it may have much higher error rates for those groups once deployed, or be unusable altogether. It could also make outcomes worse for certain groups, by misidentifying them as suspects for crimes or erroneously denying them bank loans they should receive.
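To make that point concrete, here is a minimal, hypothetical sketch using synthetic data and scikit-learn; the groups, sample sizes and label rules are invented for illustration and are not drawn from Nithya’s research. It shows how a model trained on data that under-represents one group can look accurate overall while failing that group.

```python
# Hypothetical illustration: a sub-population that barely appears in the
# training data gets much higher error rates once the model is deployed.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Same two features for both groups, but the label rule differs by
    # group, standing in for contexts the training data never captured.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

# Training data: 95% group A, 5% group B (the under-represented group).
Xa, ya = make_group(1900, shift=0.0)
Xb, yb = make_group(100, shift=3.0)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Disaggregated evaluation: score each group separately on held-out data.
for name, shift in [("group A", 0.0), ("group B", 3.0)]:
    X_test, y_test = make_group(1000, shift)
    acc = (model.predict(X_test) == y_test).mean()
    print(f"{name} accuracy: {acc:.2f}")
```

Run as-is, this typically prints near-perfect accuracy for group A and roughly coin-flip accuracy for group B; an aggregate accuracy number would hide the failure entirely, which is why evaluating per group matters.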

These insights not only enable AI systems to be better designed for under-represented communities; they also generate new considerations in the field of computing for humane and inclusive data collection, gender and social status representation, and the privacy and safety needs of the most vulnerable. They are then incorporated into Google products that millions of people use, such as Safe Folder on Files Go, Google Go’s incognito mode, Neighbourly’s privacy features, Safer by Google Maps and Women in STEM videos. 

What are some of the questions you’re seeking to answer with your work?

How do we challenge inherent “West”-centric assumptions about algorithmic fairness and tech norms, and make AI work better for people around the world?

For example, there’s an assumption that algorithmic biases can be fixed by adding more data from different groups. But in India, we’ve found that data can’t always represent individuals or events, for many different reasons such as economics and access to devices. The data could come mostly from middle-class Indian men, since they’re more likely to have internet access. This means algorithms will work well for them, yet over half the population, primarily women and rural and tribal communities, lacks access to the internet and is left out. Caste, religion and other factors can also introduce new biases into AI models. The sketch below illustrates why simply collecting more of the same data doesn’t close that gap.
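As a hedged follow-on to the earlier sketch (again entirely synthetic and hypothetical, not a result from the research): scaling a skewed data source up tenfold leaves the under-represented group’s error essentially unchanged, because the training objective is still dominated by the majority group.

```python
# Hypothetical illustration: 10x more data with the same 95:5 skew does not
# help the under-represented group; composition matters more than volume.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def sample(n_majority, n_minority):
    # Two groups with different feature/label patterns, mixed into one set.
    X_maj = rng.normal(0.0, 1.0, size=(n_majority, 2))
    y_maj = (X_maj.sum(axis=1) > 0).astype(int)
    X_min = rng.normal(3.0, 1.0, size=(n_minority, 2))
    y_min = (X_min.sum(axis=1) > 6).astype(int)
    return np.vstack([X_maj, X_min]), np.concatenate([y_maj, y_min])

def minority_accuracy(model):
    # Held-out test set drawn only from the minority group.
    X_test = rng.normal(3.0, 1.0, size=(2000, 2))
    y_test = (X_test.sum(axis=1) > 6).astype(int)
    return (model.predict(X_test) == y_test).mean()

for scale in (1, 10):  # 10x "more data", identical 95:5 skew
    X, y = sample(1900 * scale, 100 * scale)
    model = LogisticRegression().fit(X, y)
    print(f"{scale}x data -> minority-group accuracy: {minority_accuracy(model):.2f}")
```

Both runs typically print a minority-group accuracy near 0.5: in this toy setup, what changes outcomes is representative data from the group itself, not sheer volume.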

How should aspiring AI thinkers and future technologists prepare for a career in this field? 

It’s really important that Brown and Black people enter this field. We not only bring technical skills but also lived experiences and values that are so critical to the field of computing. Our communities are the most vulnerable to AI interventions, so it’s important that we shape and build these systems. To members of this community: Never play small or let someone make you feel small. Involve yourself in the political, social and ecological aspects of the invisible, not in tech innovation alone. We can’t afford not to.

