Published: April 2, 2023
By Athanasios Moulakis

To educate students on the importance of engineering ethics, Lucky Vidmar has launched the Moulakis Lecture Series on Responsible Engineering. The series is named after Athanasios “Thanasi” Moulakis, a former CU professor and mentor to Vidmar. During his time teaching at CU, Thanasi was a strong advocate for a liberal arts education for engineers, particularly in philosophy and classical literature.

Drawing inspiration from Moulakis’ published work, Beyond Utility: Liberal Education for a Technological Age, Vidmar brings CU students an updated look at what it means to be a responsible engineer today.

The inaugural lecture, “Technology Is a Very Human Activity,” focused on one of Kranzberg’s six laws of technology: technology is neither good nor bad, nor is it neutral. In particular, Vidmar described the very real danger of treating technology as neutral, stating, “neutrality is anti-intellectual and moral laziness.” As the head of Intellectual Property litigation for Microsoft, Vidmar provided insight into how we can do better with our designs.

Given controversies in tech such as privacy and algorithmic bias, it is more important than ever for software engineers to consider how their work will impact users. By accepting that technology is inherently not neutral, we can see that the issues that arise are often not a fluke bug, but are ingrained in the software because of how it was developed and who developed it.

Vidmar departed from Moulakis’ ideas to propose a new approach to engineering ethics education. Where Moulakis stressed the importance of exposure to classical literature as a means of teaching an engineer to do the right thing, Vidmar argues that the times now call for us to be more intentional.

“Having this Herbst program for [over] 30 years is a differentiator. Jointly working with the [Herbst] foundation, I came up with the new lecture series to…support responsible engineering at CU. The [lecture] format [allows for] students to hear some voices that sometimes they might not hear just from their academic pursuits,” said Vidmar.

The history of technological advancement has been one of deliberate, incremental change. Worldwide connectivity and access to information have expanded so much, however, that the industry now holds power over consumers everywhere on a scale never before seen.

The stakes are higher, and Vidmar argues that the industry is making harmful decisions more often than it used to. The ease with which large companies can collect and consume consumer data is certainly not neutral.

With the speed at which technology is progressing, the days of clear-cut ethical situations are long gone. So how, as engineers, do we navigate this new moral gray area? Vidmar calls for engineers to return to their humanity, stating, “responsibility is doing what people would do if they could.”

So much of the moral ambiguity we see in tech doesn’t seem intentionally bad on paper, but distrust is growing. From his work in the industry, Vidmar knows well the challenges posed by companies profiting off of consumer data. “It’s this game of attrition, right, where they’ll always be able to play more of the long game than all of us.”

So how do we make this game of attrition easier on the consumer side? Vidmar points to company culture. The lecture explored the power of psychologically safe companies: companies that encourage employees to ask questions, raise concerns, point out mistakes, and speak up when they don’t know something. In short, bringing humanity back to the table.

Artificial intelligence and automation make it more important than ever to be humans before we are engineers. When we design systems that will quite literally take on a life of their own, I suggest we come back to Vidmar’s key point: “Responsibility is doing what people would do if they could.”