October 28, 2022

Exploring Data Ethics at Google with Torrence Boone ’87

How do technology companies handle ethical issues?
by Ryan Ravanpak

Last week, my 500-level Ethics and Technology course had the pleasure of hosting a visit from Torrence Boone ’87—a Phillips Academy alum, as well as current vice president at Google and co-lead of the New York office.

While speaking about his path from Andover to where he is now, Mr. Boone told us about his time as a dancer before going to business school and his current enrollment in an MFA program where he is working on a novel. When he discussed the ethics of data privacy at Google, he began by posing a question: How might we have a personalized experience online without it feeling creepy?

Everyone has had one of these creepy experiences: you search for something online and then begin to see advertisements for that thing at every turn, on every website. Part of the problem is the use of “cookies”—pieces of data from a website that are stored in your web browser and can be retrieved later. Mr. Boone explained that Google is moving away from its reliance on cookies. To make things feel less creepy, the company is trying to sort accounts into more generalized “buckets” of interests. In this way, you might see advertisements for sports-related material of various kinds instead of advertisements for soccer balls.
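The bucketing idea Mr. Boone described can be illustrated with a small sketch. This is purely a hypothetical example, not Google’s actual system: the bucket names, terms, and the `bucket_for` function are all invented here to show how a specific interest (like “soccer balls”) could be replaced by a broad category (like “sports”) before it is used for advertising.

```python
# Hypothetical illustration of interest "buckets" (not Google's real system):
# specific searches are mapped to broad categories, so advertisers see only
# the general bucket, not the exact item a person looked up.
INTEREST_BUCKETS = {
    "sports": {"soccer balls", "running shoes", "tennis rackets"},
    "cooking": {"cast iron skillet", "sourdough starter"},
}

def bucket_for(search_term):
    """Return the broad bucket a specific search falls into, or None."""
    term = search_term.lower()
    for bucket, terms in INTEREST_BUCKETS.items():
        if term in terms:
            return bucket
    return None

print(bucket_for("Soccer balls"))   # the broad category, not the exact item
print(bucket_for("quantum physics"))  # no bucket matches
```

Under a scheme like this, an ad system would only ever learn that an account belongs to the “sports” bucket, which is far less specific than a log of individual searches.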

The students asked him several questions as well. How do you balance creative pursuits with your role in the business world? How is Google organized? Who handles questions of ethics in technology within the company? How does Google try to have foresight about how technology will impact the population before deploying that technology? How has Google changed from, say, a decade ago?

Mr. Boone handled the questions gracefully. He spoke about Google’s Trust & Safety division and gave an example of a past initiative in which they needed to address problems involving search engine and AI bias. Because AI must be trained on data, the way a system is built can sometimes lead to bias and unsavory consequences. Mr. Boone explained that to address such a problem, they brought together an interdisciplinary team of workers, including computer scientists, linguists, and others.

The students thought the experience was fantastic and asked for more like it in the future. We thank Torrence Boone for taking the time to visit our class.
