I love teaching robotics because it includes all the elements of what I believe to be good teaching: collaboration, real-world problem-solving, student-driven projects, and interdisciplinary connections. So when my school asked me to develop a science elective for a group of students who had already taken the standard high school science offerings, I wanted to make sure this new robotics class included all of those pieces. 

My robotics students are postgraduates, students who spend one year at Phillips Academy after having finished four years of high school at their previous school. In many cases, these students would not initially describe themselves as “science people,” and they do not come to these classes with experience in robotics or design. Many arrive at Andover never having thought about what it means to be a creator of technology, let alone the ethical implications and responsibilities that come with it. As these students shift their thinking from being consumers of technology to being creators of technology, how can we, as educators, include discussions about being ethical designers from the get-go, making them an integral part of the curriculum?

What’s exciting to me ― and, in time, to my students ― is that there’s no shortage of topics to explore. For example, remember when a fitness company released all of its tracking data and gave away the location and structure of a secret US military base? Or how about the time a self-driving car killed a pedestrian during a test drive? Both of these incidents, and many more like them, are situations in which we are determining right and wrong in new and different ways, and they are perfect examples to bring into the classroom from the start.

So, how can we create a framework for thinking about these issues within a science classroom? And how do we teach that framework in authentic, meaningful, thoughtful ways?

As a physicist, I had no previous academic training in formulating a framework that would allow me to address these ethical issues in my classroom. And while I personally have a clear moral compass, my goal as an educator is not for my students to parrot my views. Rather, I want them to develop and articulate their own ethical values so that they can then think about how these principles would apply in specific situations. It was my work with the ethi{CS} project that gave me the confidence and the language to do this work.

At the first ethi{CS} summer project in 2020, I attended a presentation by Blakeley Payne, formerly of the MIT Media Lab, about a teaching approach she developed called the ethical matrix. This tool allows us to identify the various stakeholders in a particular situation and the invariably competing values that come into play. I love Blakeley’s ethical matrix! 
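To show the shape of the tool, here is a minimal sketch in Python of an ethical matrix for the self-driving-car example above: rows for stakeholders, columns for values, cells for how a value plays out for a stakeholder. The stakeholders, values, and cell entries are my own illustrative placeholders, not Blakeley’s materials or anything produced in my class; in the classroom we build these grids on paper and through discussion.

```python
# A minimal, illustrative sketch of an ethical matrix: rows are stakeholders,
# columns are values, and each cell notes how that value is at stake for that
# stakeholder. All names and cell text below are placeholder assumptions.

stakeholders = ["Pedestrians", "Passengers", "Manufacturer", "Regulators"]
values = ["Safety", "Privacy", "Accountability"]

# A few example cells for the self-driving-car scenario; the rest stay blank
# so that students have to fill them in during discussion.
cells = {
    ("Pedestrians", "Safety"): "risk of harm during on-road testing",
    ("Passengers", "Privacy"): "trip and sensor data collected by the car",
    ("Manufacturer", "Accountability"): "liability when the system fails",
}

# Print the matrix as a simple text grid, with "?" marking unfilled cells.
name_w, cell_w = 14, 44
print("".ljust(name_w) + "".join(v.ljust(cell_w) for v in values))
for s in stakeholders:
    print(s.ljust(name_w) + "".join(cells.get((s, v), "?").ljust(cell_w) for v in values))
```

The question marks are deliberate: each empty cell is a prompt for students to argue about how that value plays out for that stakeholder.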

[Photo: Students try out ethical matrices in Artacho’s robotics classroom]

I committed to introducing this in my class last fall, and it quickly became apparent that I would need to scaffold the use of the ethical matrix because of two related concerns. First, the topics we study ― both ethical and technological ― are complex. Second, students often have a personal, emotional connection to the topic, which I call “emotional entanglement.” I kept thinking of one student who said, “I’m not a celebrity, so I have nothing to hide — I don’t care if Alexa listens in on me,” and considered that a perfectly reasonable, final argument on the matter. I want my students to do better than that.

And, after using the ethical matrix, they did! My students’ thinking about technology and related ethical issues shifted throughout the term, significantly so for some:

“As a consumer, I know that none of this is really in my control and I will never really know how AI affects my life because I do not fully understand all of the possible uses for it.”

-Student Reflection on Understanding AI Ethics

“I think it [the geotracking RFID chip] is a huge invasion of privacy. I could never know who exactly has access to my information, and it may be people who I have never met or trust […] if a hacker were to gain control over the chip, they could do anything that might put my safety at risk, from tracking me to tampering with the chip inside of my body.”

-Student Reflection on Microchip Implants

There is still more work to do, of course, but I am proud of my students for all the progress they’ve made. I’m also grateful to Dr. Kiran Bhardwaj, Tang Institute fellow and collaborator on the ethi{CS} project. Our collaboration has enabled me to collect examples at the intersection of technology, design, and ethics, and to establish classroom routines and habits that strengthen student thinking and reflective practices. The possibilities ahead are also exciting: I’d like to have a robust discussion and create a framework for how students can use these matrices to make decisions. Additionally, I have been thinking about the connection between accountability and consequences — notions of fairness and justice for all involved in the creation and use of new technologies. I look forward to sharing my progress with all of you in the months ahead!


Carol Artacho is an instructor in physics at Phillips Academy and a Tang Institute fellow with the ethi{CS} project. Stay tuned for the next post in this series, “the ethi{CS} project in Robotics: Guiding Principles.”
