Pathways to Impact is a series of conversations with data for social impact leaders exploring their career journeys. Perry Hewitt, CMO of data.org, spoke with Nate Matias, Assistant Professor in the Communication Department at Cornell University, about his path to academia, and how he came to focus on digital governance and networks shaped by algorithms.
How did you start out doing this work? How did it come onto your radar as a possibility for the kind of work a person could do in the world?
I first started doing things with computers as a teenager. And for me, getting access to a computer was actually a pretty big deal. My parents had moved to the United States from Guatemala before I was born, and my father had a career as a night shift mechanic in a factory. In the mid-90s when my parents got the sense that computers might be important, they scraped together the money for an Intel 386 desktop computer that we could hopefully do something with as kids.
But we didn’t have the money to license software. I ended up with a computer, a manual for the BASIC programming language, and a challenge from the computer shop owner to learn how to do things with it. That was my baptism by fire into computing, perhaps like many people who got into it as hobbyists — but in my case it was born of necessity.
With a fair amount of computer experience as a teen, I actually studied literature as an undergraduate. This was in part because of some great advice from a local college professor, who said that while computing is great, and an important area to study, ultimately, computers are a lens on the world. If I wanted to do interesting and meaningful things with computing, I should first learn about the world. As a result my major was literature with a minor in computer science, and it was this combination that launched my career as an academic and in the startup world.
That’s wonderful. You had an early perspective on technology as a lens on human behavior and experience. But when did you get a sense that technology could be in service of good or bad acts, and that there was a problem that technology or data might improve or worsen?
I started thinking clearly about technology and society while studying post-colonial literature at Cambridge University, on a scholarship for people underrepresented in their fields. At Cambridge I learned about people like Solomon Plaatje in South Africa who used the printing press and created dictionaries as a way to reshape knowledge and power in the colonial era. Back then only certain people had access to printing presses, the communication technologies of the era, and they used this control over information to reinforce the power of colonization and empire. I read stories of people who grew up in the empire and saw these communication technologies as ways to remake power for democracy. That was the moment when I realized that I can — and should — apply a similar lens to the technologies of our time. I saw that we all need to be asking where these technologies are (or are not) serving the common good, where they are serving freedom and democracy, and what we might need to do differently to make sure that happens in the future.
And now you’re at Cornell — what part of this problem are you seeking to solve?
At Cornell, I lead the Citizens and Technology Lab (CAT Lab), which works for a world where digital power is guided by evidence and accountable to the public. We work toward that goal by collaborating with the public in citizen science or community science. We talk to and organize members of the public who face issues like online harassment, or who worry about inclusion in knowledge on sites like Wikipedia, or who are excited about creating a flourishing digital society. We collaboratively design research studies that answer questions exploring the impact of a technology on people’s lives or some idea for creating change, whether that’s social change or change in how technologies and technology firms operate.
How often in your work in citizen science or community science does the word data come up? Is it perceived as an opportunity or as an issue?
I often find that people who are most affected by technology issues have a profound and nuanced understanding of how data impacts their lives. They might not know all the technical details, but they’re the experts in their own lives and situations. And more often than not, when our team has collaborated with communities, we’ve learned things about the data that we didn’t know, that people intimately know because they have to live with it.
One example is a project we did a few years ago to support a group of women who were documenting how tech platforms did or did not handle cases of online harassment. We were trying to support them both to report the issues they were facing, and also to document those issues in a way that could be written up for policy makers and platforms. We learned so much from the level of understanding and detail that these women had about how online harassment happens, how it is or isn’t recorded, the kinds of evidence that law enforcement needed, or didn’t need, or didn’t pay attention to, and the mismatches between the data and interactions the platforms recorded and what survivors needed. It’s fair to say we learned more from them than they learned from us in that process.
What were some unexpected blockers to your career? Your path went from being the child of immigrants pulling together money for that first computer to working on entirely new problems at an Ivy League institution. What fueled your progression, and what were some of the things, expected and unexpected, that posed obstacles?
You’re right that one of the basic challenges was that there was no defined category for what I do today. It also took me time to gain a sense that I could do things that mattered in the world. I remember showing up at Cambridge and being surrounded by people who expected from an early age that they would go on to careers that would have consequence and power. And it shocked me because I had not grown up in similar circumstances, or with the expectation that my views or actions could really change society. I needed the right support to see those pathways and follow them, to learn how to participate in conversations of consequence with the values I brought as someone under-represented in academia.
One thing I’d add: if you want to do something innovative that contributes to the common good, sometimes there’s an existing pathway, and sometimes you have to create that pathway. This is particularly true if your lived experiences and perspectives lead you to do things differently from the norm. I think that in higher education and powerful institutions people who come from traditionally underrepresented groups are often innovators in that way. But forging those pathways often requires even more creativity and support from institutions to become a reality.
One challenge that trailblazers working on technology and social good have faced is the mismatch between the goals of tech company leaders and the public purpose that social good innovators have. Often, there’s a lot of overlap and opportunity for collaboration. And sometimes there’s real conflict. As someone who’s been part of the startup world and the tech industry, I eventually realized that my kind of public interest and citizen science work needs to be independent of the tech industry. There are just certain kinds of public service that are easier to do well in a trustworthy way if there’s no question about the influence of industry on your work, especially where your work holds companies and industries accountable.
Whatever path people choose, everyone has to navigate a tech world that is largely driven by profit motive. There will absolutely be moments where you can align the public interest with those profit goals. There are moments where you find a way for them to travel side-by-side. And then there are moments where the people need to say “stop” — we need to use our power as a democracy to force companies to steer in a different direction. Navigating these tensions and decisions was one of the most challenging aspects of my time as a PhD student and now as faculty.
We still struggle to bring together conversations about values and social impact in the same spaces as technical details, mathematics, and statistics. Scientists often call something “pure” science if it avoids ethics conversations and prioritizes white, Western ideals. Values conversations continue to be split off into new fields, without giving students in what we might call “purely” technical fields as much of an introduction to the important ethical and social issues that they’re going to confront in their work. Those of us who are responsible for higher education need to bring those fields and topics closer together. Students face really difficult choices about which lane to go into, especially because the lanes that are in front of them were created at a time when those social and ethical implications were less recognized.
What community of people or resources bolsters your work? I’m curious where do you go when you need a question answered, or to exchange ideas around the work your lab does today?
I love that question, because I strongly believe no one succeeds without communities supporting and uplifting them — and that’s totally been true for me. In my early career, communities at the Berkman Klein Center at Harvard, the Center for Civic Media at MIT, and also the community of bloggers and activists at this network called Global Voices profoundly shaped my understanding of what was possible, and have provided so many sounding boards for ideas and ethics. As a professor who cares about both our lab and supporting a wider ecosystem of more people doing this kind of work, I find myself more and more involved in creating and sustaining communities that can offer that kind of support.
Some of those communities are within academia, within the Association for Computing Machinery. Some of them are new organizations. For example, I’m a co-convener of something called the Coalition for Independent Tech Research, which just launched this summer. The Coalition is trying to create both a supportive community for people who want to do industry-independent research in the public interest, and also to advocate for and make the case for the rights and freedoms that we need to be able to inform the public and provide evidence to guide democracies on tech and society.
Specifically, there’s a constellation of fellow travelers with whom I find myself sharing queries and ideas. Over the years, this has included Public Lab, the team at the Algorithmic Justice League, the Distributed AI Research Group, and researchers at Consumer Reports. I’m often in conversation with colleagues at Princeton and Columbia.
The computing ecosystem often moves too fast to look back or forward. I have ultimately decided in my career that I want to do work that is valuable both now and in the long term. That’s one of the unique privileges that scholars have in academia: we can be thinking about that long term. It’s one of the reasons I’m so excited to be here at Cornell in the Communication department, where we have technologists and social scientists who care about public engagement while also doing work that stands the test of time.
You have a degree in literature, and you’ve done postgraduate work. Beyond the more technical parts of your job, what are the skills you feel like really help you be successful? What non data science or technical skillset has proven valuable?
There are a few skills that we might not think of as data related that have transformed how I work. One comes from the humanities, and it’s a willingness to research the history of the values and politics of technologies and systems.
In many of my computer science papers, we go deep into that history. In our most recent paper, which re-imagines citizen science tests of technology concerns, we briefly review 1930s debates over the relationship between statistics and eugenics. Many of today’s statistical tests were first imagined by eugenicists. By revisiting the eugenics debates among statisticians at the time, we can re-imagine the assumptions behind data science today.
Reading that history gave us insight into how to rethink the design of research that would benefit people at the margins. It drove us to look beyond the average outcome variables, which privilege the majority, to examine other parts of the distribution where marginalized groups might be facing harms you wouldn’t see otherwise. That’s just one example of the contribution the humanities can make to fundamental advances in computer science. When I get to write about technology and society for a public audience, I often weave poetry, history, and original reporting into articles about our digital future.
I’m also grateful for my graduate school training in community organizing. When I was a PhD student at the MIT Center for Civic Media with Ethan Zuckerman, he insisted that we get trained in organizing and facilitation, because Ethan believed, rightly, that the most powerful outcomes are created when people come together for a common cause, and when you’re genuinely engaging with and listening to the people you’re collaborating with. I would love to see more people receive organizing training. Whether you are negotiating with a client, or collaborating with policy makers, or working with communities, there’s a give and take. This collaborative negotiation of understanding is central to the success of any data project.
What advice do you have to someone new to the field who’s interested in doing this work? If someone wants to get started with a background in CS, or just a strong interest in how data is affecting their lives, what might you recommend?
We’re still creating those pathways, which is one reason I’m so glad we’re having this conversation. My top suggestion at this stage is to follow the people whose work you admire and connect with others who care about the same things. You can reach out to those people to ask for suggestions and introductions, and you can talk with people who share your values and goals about the best pathways for you.
The world of data and social good is so varied, and it encompasses so many fields that it would probably be a mistake to say, “here is the one true pathway”. But what is really important is to find those people whom you trust, and who share your values, and who you feel can help at least light the next few steps. I know when I was a student, you were one of the people I reached out to in that regard, Perry, and I appreciate having that conversation many years ago.
The day I learned I could contact the people whose work I admired was a game changer for me. I remember being a college student in the early days of blogging, and one day I emailed a blogger I admired and asked, “Can I interview you for this project I’m doing?” And they said yes, which blew my mind. And if those people are busy or overwhelmed, show up at events, join local meetups, participate in forums; finding people who share your passions is an important step toward finding your path.
One piece of advice for students right now is to consider the advice that I was given by a professor: if you focus on the technical skills alone, it’s going to be harder for you to chart a course through the values and issues you care about. Take advantage of a course in the humanities or social sciences, or some other field that you’re curious about. Learn about your own identity and history, or maybe something about the world that you’ve never encountered before. Those are the courses that will help you chart the path, even as your technical courses help you build the skills you need to follow the path.
There are, of course, fields and disciplines where we’re trying to assemble the pieces for you. Communication is one where we bring together the social sciences, computer science, humanities, and history. Information science is another great example. But even if you’re at an institution that doesn’t have those interdisciplinary programs, taking those electives, going to public events with speakers, or getting on Zoom to hear live streams with people whose work you admire can make a big difference.
One final piece of advice: it’s important for young people, especially, to overcome the fear of not having enough technical capability. There will be endless paths you can take to further refine and improve your technical skill, and that can be an ongoing journey. But the technical skills take you nowhere unless you have a vision for where to take them.
What’s the next big thing in tech or data for social impact that you see? Do you see people creating and using these technologies becoming more aware of data?
I think and hope that we’re going to move toward the expectation that new and existing technologies be supported by evidence that the risks and harms are being managed effectively. We’ve spent so many years in the digital technology world where the approach is “put it out in the world and see what happens.” Democratic societies and our representatives have gone through enough crises as a result of that approach. We are going to see a growing expectation that benefits be proven and risks provably managed. That will create an amazing opportunity for people working with data to help us navigate those big decisions so that when we are debating topics like mental health and social media, we’ll have the evidence we need, and we’ll be able to see whether companies or public interest organizations are using their power in a way that’s helping and not hurting.
Achieving evidence-based tech policy will be an ongoing journey, and I would love to see it fulfilled within my lifetime. I feel like we’re reaching a turning point in the US, Europe, and elsewhere, where that’s starting to become the default expectation: you have the data to back it up, rather than bringing in data as part of a crisis-management response when things go wrong.
Remember, there was a period of time where we put cars on the road without knowing if they were safe. Large numbers of people died before the United States government finally realized that we might want safety standards and evidence practices. Someone had to invent the crash test dummy. We were driving cars and people were dying before we even had a way to tell if a given car model was safe. Similarly with food safety and drug safety: there was an era where anyone could make and sell anything. We may or may not borrow different parts of regulation from those pasts, and digital technology definitely has its differences. My hope for our lifetime will be that we figure out what that model is and create a robust, ongoing evidence base for making those decisions as a society.
What’s your don’t-miss daily or weekly read? What are the kinds of media you consume to stay informed or encouraged?
Almost every day I listen to PRI’s international news program: I hop on my bike, read the book of nature, and hear stories from all around the world about the issues and challenges that people face, as well as fun and interesting stories about culture and the arts. Even when the news is dire, the program gives me hope as I hear about the challenges, aspirations, and work of so many people in other parts of the world. I also stay heartened by listening to poetry, fiction, and nonfiction in audio form, most recently Robin Wall Kimmerer’s luminous book Braiding Sweetgrass, about science, the environment, and indigenous knowledge.