How Stanford is inspiring students to think critically about the impacts of technological change
Stanford political scientist Jeremy Weinstein discusses why it is important to cultivate a new generation of tech-savvy students who can anticipate the ethical, legal, policy, and societal implications of technology.
When Stanford professor Jeremy Weinstein was serving as a national security official in the Obama administration, he was struck by how difficult it was for the nation’s foreign policy decision-makers and technologists to communicate with one another.
There was no shared language, no common understanding of technology, and little history of working through how to balance the competing interests at stake on issues as diverse as encryption, artificial intelligence (AI) and cyberwarfare. When Weinstein returned to Stanford in 2015, he was committed to training the next generation of leaders to be equally at ease thinking about technology and policy, he said.
One way Weinstein hopes to advance this goal is through the newly launched Public Interest Technology University Network, a partnership convened by the Ford Foundation, Hewlett Foundation and public policy think tank New America. Stanford and 20 other colleges and universities have joined the network, committing to develop a new generation of civic-minded technologists and to bridge the gap between technology and public service.
Weinstein, who is spearheading Stanford’s involvement in the project, recently talked about the initiative.
Weinstein is a professor of political science and the Fisher Family Director of Stanford Global Studies. He is also a senior fellow at the Freeman Spogli Institute for International Studies and the Stanford Institute for Economic Policy Research. In addition, he is faculty co-director of the Immigration Policy Lab and the Data for Development Initiative.
What is public interest technology?
Public interest technology is an emerging area of interdisciplinary inquiry and practice. It focuses on the application of technology expertise to advance the public good. I think of the relevant technology expertise quite broadly: it could be a set of competencies that people have to create, apply and use new technologies and/or an in-depth understanding of the core ethical, legal, policy and societal dimensions of technological change. But part of what distinguishes this new area is the explicit focus on the role of technology in advancing common societal interests rather than simply commercial or individual goals and interests. By calling this out explicitly, it challenges us to think about how we design technologies and for whom, and the role of society in governing how those technologies are deployed.
Why is there a need today to create a field dedicated to public interest technology?
There is a growing movement among faculty leaders in higher education to find new ways of bridging STEM fields and other disciplines, especially the humanities and social sciences. There are so many exciting issues at this intersection. How should governments think about the role of algorithmic decision-making in policy processes? How should social media platforms be designed to promote civility and reasoned debate? Who is responsible for dealing with the societal dislocation caused by automation? How can advances in AI be harnessed to improve the well-being of marginalized groups? By raising these issues – and by creating educational pathways that enable students to explore them – the goal is to inspire a new generation of tech-savvy students who can think critically about the impacts of technological change and explore opportunities to work in service of the public good.
What inspired you to get involved in this project?
Every day, we confront new challenges that are a function, in part, of the rapid pace of technological change. How we design, deploy and govern new technologies is one of the most important societal challenges of our time. I am excited to be at a university where faculty and students are serious about working across disciplines and schools to unpack these difficult issues and to think together about how we can realize the benefits of new technologies while maintaining (and even advancing) our shared societal commitments to values like justice, equality, and democracy. And I believe there is so much to learn from how other universities are approaching these same issues.
How will this initiative unfold at Stanford?
Stanford is excited to join a network of universities committed to developing innovative curricular and co-curricular opportunities in public interest technology. We’ll be joining faculty from other universities around the country and thinking together about how we teach, train and support students who want to explore issues at the intersection of technology and society.
The good news is that we have a strong foundation to build on at the university. Student-initiated groups such as CS+Social Good and she++ have demonstrated just how much interest there is among undergraduates in thinking critically about technology and its role in society. We also see exciting innovations where faculty and students from across the university are trying to bring the “public interest” to curricula in computer science and engineering and “technology” to courses in the humanities and social sciences. For example: a revamped version of computers, ethics and public policy co-taught across three fields; a course on law, bias and algorithms co-taught between law and engineering; and a course on data and sustainable development jointly designed by Stanford Earth and the CS department.
As to where we will go from here, there are enormous opportunities for curricular change and innovation embedded in three major university initiatives that are coming out of long-range planning: one focused on human-centered artificial intelligence, a second on data science and a third on social problem-solving. I expect you will see new curricular innovations, educational pathways, research opportunities and career trajectories in public interest technology as these three initiatives take off in the coming year.