(This article appeared as a CCC BLOG post on June 28, 2022.)
Last month the Networking and Information Technology Research and Development (NITRD) program commemorated its 30th Anniversary in Washington, D.C. You can read the full event recap here. To highlight the impact federal investments have had on the computing research community, the event featured five panels in which participants discussed key achievements in the field over the past decade and future directions. Each panel focused on an important subarea of computing research: Computing at Scale; Networking and Security; Artificial Intelligence/Machine Learning; Privacy and the Internet of Things; and Socially Responsible Computing.
All the panels featured throughout the day at the NITRD 30th Anniversary Symposium had two common threads: highlighting the astronomical advancements we’ve experienced in computing over the past few decades and mitigating the risks associated with these new advancements. Topics included algorithmic bias, non-inclusive technologies, privacy invasions, security risks – the list goes on. One thing the panelists agreed on was that there is hope for change, and all discussions touched on a path forward towards more ethical and responsible computing. Moderated by Alondra Nelson (Office of Science and Technology Policy), Panel 5, “How Technology can Benefit Society: Broadening Perspectives in Fundamental Research,” featured relevant discussions by panelists Janet Abbate (Virginia Polytechnic Institute and State University), Deborah Estrin (Cornell University), Charles Isbell (Georgia Institute of Technology) and Ramayya Krishnan (Carnegie Mellon University).
Nelson kicked off the session by asking each panelist a different question geared towards their expertise. Abbate and Krishnan considered solutions through a systems lens while Isbell and Estrin focused on pedagogy and training.
Abbate encouraged developers to view computing technology as part of a larger sociotechnical system by incorporating three currently unaccounted-for aspects as active parts:
- Users – build open-ended systems that allow users to adapt and tailor technologies to their needs. The Internet is a successful example of this.
- Social Environment – be realistic and anticipate the negative trends you see in society (racism, bias, sexism, religious discrimination). Expect them to manifest in technological outcomes and bring in affected parties that are knowledgeable about the social environment to design protections.
- Natural Environment – consider the entire lifecycle of technologies. Minimize toxic waste, emphasize reducing the carbon footprint of computing and use recyclable, durable materials.
Building off of Abbate and rethinking the systems framework, Krishnan spoke on the value of deploying technology in support of societally relevant and consequential decision-making tasks. Krishnan urged us to consider the entire pipeline of system building, from identifying a problem to evaluating the system for bias. Each step – collecting the data, developing the AI models, designing the intervention, and evaluating the system – is often siloed, and this must change so that the pipeline is approached holistically.
Changing gears to how this vision can be implemented and how to develop a workforce that incorporates these larger perspectives, the conversation turned to Isbell and Estrin on education and how to integrate societal and public interest perspectives into graduate education for human-centered technology.
Estrin emphasized the power of engaging with applications and real users – not developing technology for the sake of innovation, but doing it to “build tech in service of public interest and public sector needs.” As an example she identified the National Science Foundation’s Center for Embedded Networked Sensing (CENS), which curated public interest and public sector opportunities for graduate students as they went through their education.
“It’s not that we are not trained to care. We are trained to not care, right?” – Charles Isbell
It isn’t all about a lack of programs or opportunities to foster public interest and inspire researchers to work towards the public good. The incentive structures in place encourage designers to worry only about the end design and accomplishing the task the technologies are designed for, without considering how these innovations will be used. Isbell pointed out that the way the curriculum is built and the way people are educated treats ethics as something tacked on at the end, when it should be ingrained at the very beginning and incorporated throughout the rest of a student’s education.
Continuing the conversation, Nelson asked what the government or NITRD should be doing to train students and build pipelines that create a more human-centered approach to technology.
Panelists had a number of recommendations, including:

- encouraging a more interdisciplinary approach to computing and making the field more welcoming to ethicists, sociologists, psychologists, etc.;
- changing incentives to reward people who prioritize privacy, equity, and justice;
- driving the conversion of ethical principles into actual tools;
- creating opportunities for large, diverse, interdisciplinary collaborations; and
- using the government’s power to convene to bring people together and set the agenda towards ethical practices.
Nelson pointed out that bringing people together for interdisciplinary collaborations can be difficult due to key differences between the fields. Krishnan offered four ways to ease these difficulties and establish a cooperative foundation between disciplines. The first is conferences and workshops that focus on leveraging synergies between fields, which provide an opportunity to have discussions that lead to cross-fertilization; as an example, he cited the Computing Community Consortium’s recent workshop on Artificial Intelligence and Operations Research, which brought together two groups that work on similar issues in different ways. Another vehicle for engagement is fellowships, such as Georgia Tech’s Data Science for Public Policy Fellowship, which brought in graduate students from computing and social science to conduct field experiments aimed at solving social and environmental problems in major policy areas. A third example is Georgia Tech’s scopeathon, which works much like a hackathon and teaches people how to scope and work on problems, how to ask the right questions, and how to determine the correct stakeholders. A fourth and final forum for encouraging cross-disciplinary collaboration is federally funded interdisciplinary institutes, such as the National Science Foundation’s AI Institutes and the National Laboratories.
Continuing in the vein of the government’s role in securing a more ethical and responsible future in computing, CIFellow Jasmine Berry from the University of Michigan asked panelists about the move towards more centralized systems, how the government plans to give people more of a voice in how science is advanced, and how it will ensure that their ideas about how AI is being built are actually incorporated.
Estrin assured her that there were already researchers looking to engage and communicate more with impacted communities, but that the solution did not lie in decentralizing systems. She emphasized that in order to achieve a more democratic approach to the way technology is designed, the government will have to work from the top down, from the incentive structures to everything else.
The panelists led a great discussion identifying potential funding focuses and practices that will hopefully lead to a more human-centered approach to technology development. While there is still a lot of work to be done, there is a movement to bring emphasis to ethics and justice in these systems. You can watch the full session here and check out the rest of the panel recordings on the CCC website or the NITRD YouTube channel.