Saturday, November 16
12:45 – 1:45PM
There is considerable ongoing research into how the discriminatory practices of everyday social life have been transposed into the technological realm. Many of the same biases we witness, and are subject to, in our physical realities are exaggerated and amplified when they are transferred into the digital sphere.
This panel will begin with an exploration of the ways in which technology can produce and reproduce some of the most pernicious inequalities of our societies. Take the example of facial surveillance and detection: Snapchat's facial recognition algorithms, encoded with racial biases, failed to recognize people of non-white ethnicities (particularly individuals of East Asian descent), preventing them from using the "filter" functionality of the app. Relatedly, we see it in the case of the wife-tracking application sold in Saudi Arabia. In this respect, panelists will discuss data amalgamation, algorithmic bias, the gender problem in the tech sector, and related issues. The panel will then turn to a discussion of the ways in which people are challenging these harmful applications of technology. In particular, it will look at the ways in which technology is being harnessed to achieve equity for women and other marginalized groups.
Mónica Cabrera Pellerano
Mónica (she/her/hers) is pursuing a Master of Arts in Law and Diplomacy at the Fletcher School, focusing on International Technology Policy and Human Security. Most recently, Mónica has conducted research into nanotechnologies and the intersections of emerging deep technologies and behavioral sciences for a forthcoming book by Cambrian AI, served in the Office of Security Council Affairs at the Permanent Mission of the Dominican Republic to the UN, and worked as a Research Associate at the Center for Ethics and the Rule of Law at the University of Pennsylvania. Mónica holds a B.A. in International Relations from Syracuse University.
Josephine Wolff (she/her/hers) joined the faculty of the Fletcher School as an assistant professor of cybersecurity policy in 2019. Her research interests include the aftermath of cybersecurity incidents, cyber-insurance, security responsibilities and liability of online intermediaries, the impact of cybersecurity and privacy policies, and government-funded programs for cybersecurity education and workforce development. Her book “You’ll See This Message When It Is Too Late: The Legal and Economic Aftermath of Cybersecurity Breaches” was published by MIT Press in 2018. Her writing on cybersecurity has also appeared in Slate, The New York Times, The Washington Post, The Atlantic, and Wired.
Kendra Albert (they/them/theirs) is a clinical instructor at the Cyberlaw Clinic at Harvard Law School, where they teach students how to practice technology law by working with pro bono clients. They hold an appointment as a Lecturer on Law, teaching a classroom course on the First Amendment and another course on transgender law. Kendra also serves as the Director of the Initiative for a Representative First Amendment, a joint project of the Cyberlaw Clinic and the Berkman Klein Center for Internet and Society.
Sabelo (he/him/his) is a computer scientist and researcher whose work focuses on the ethical implications of technology in the developing world, particularly in Sub-Saharan Africa, along with the creation of tools to make Artificial Intelligence more accessible and inclusive to underrepresented communities.
His research centers on examining the risks and opportunities of AI in the developing world and on the use of indigenous ethical models as a framework for creating a more humane and equitable internet. His current technical projects include the creation of Natural Language Processing models for African languages, alternative designs of web platforms for decentralizing data, and an open-source library for offline networks. His most recent research focuses on Ethics & Technology and extends his work on Artificial Intelligence for the Developing World.