Social Impact LIVE: Courtney Cogburn and Desmond Patton on Social Work, Media, and Technology
Richard Hara is joined by faculty members Courtney Cogburn and Desmond Patton to discuss why social workers need to be at the table when planning applications of new media and technology and why they designed a new minor in Emerging Technology, Media, and Society (EMS) at the school.
Richard Hara: Hello. I am Richard Hara, and this is Social Impact LIVE, a weekly conversation with members of the Columbia School of Social Work community. This week, I am pleased to welcome to our award-winning program Dr. Courtney Cogburn and Dr. Desmond Patton, both on the faculty at the School of Social Work.
Dr. Cogburn’s work employs a transdisciplinary approach to examining racial inequalities in health. She is on the faculty of the Columbia Population Research Center, and a faculty affiliate of the Center on African-American Politics and Society and the Data Science Institute. She has received support for her work from the National Institutes of Health, the Robert Wood Johnson Foundation, and the Brown Institute for Media Innovation at the Columbia School of Journalism.
Dr. Patton’s research uses qualitative and computational data collection methods to examine the relationship between youth and gang violence and social media. He is the founding director of SAFE Lab, a member of the Data Science Institute, and a faculty affiliate of the Social Intervention Group at the Columbia School of Social Work. His work on so-called Internet Banging has been widely featured in the media, including the New York Times, the Chicago Tribune, USA Today, NPR, ABC News and Nature, and was also cited in a brief submitted to the Supreme Court for a case concerning the interpretation of threats on social media.
Professor Cogburn and Professor Patton, welcome to Social Impact LIVE.
Courtney Cogburn: Thank you for having us.
Desmond Patton: Thank you.
Richard Hara: It’s great to have you here today to talk about how new media and technology are changing the nature of social work research and practice. In order to keep pace with these changes, you’ve developed a new course of study here at CSSW, called the Emerging Technology, Media and Society minor, for students in our master’s program. So, I’m looking forward to hearing more about this minor, and all that entails, but before we do so, if you could share with me a little bit about how you came to connect to these particular issues in a more individual and personal way, if that’s okay.
Courtney Cogburn: Yeah. I think Desmond and I have, on separate paths, been working in technology for quite some time, and my work involves media and narrative as well. Since we’ve been working together here at the Columbia School of Social Work, we’ve had our eyes open for how we might do something together. It started to make sense that, as two of the few social workers working in this space, we wanted to create an infrastructure that would support our different projects, but also help more of our students become substantive players in this space.
Desmond Patton: And I think by virtue of us being in tech spaces as social workers, we’ve realized that there was an absence of us, in terms of our thinking, in terms of our practice, how we conceptualize ethics and frameworks, and we really wanted to fill that gap.
Courtney Cogburn: Yeah, I mean, there were times when I’ve been in meetings and I’m thinking, holy-moly, we need a social worker at this table. Not just me, someone else too.
Desmond Patton: Right.
Courtney Cogburn: Or, the people on the tech side may say, “If only we had someone who could help us understand our client base and what they are going through, and why they are posting these vitriolic messages.” I am like, that’s a social worker, and I’m screaming. So, there are lots of ways in which we’ve seen that gap.
Richard Hara: Yeah. Coming into the program, I was thinking, you know, we’re so intimidated by tech and everything else, and you know, we’re all reacting to it, right? And, it’s interesting that maybe we should flip it around, right?
Courtney Cogburn: Absolutely.
Richard Hara: Maybe it’s not, oh, what is tech going to do for social work, but rather, what can social work contribute to tech and media?
Desmond Patton: One hundred percent. I think one of the things that I have been wrestling with is this idea of there are non-tech people in tech, and the reality is we all consume tech. We are all a part of tech and we should all be contributing to tech. So, we really have to eradicate this thinking that you have to be a data scientist or an engineer in order to contribute.
Courtney Cogburn: And not just contribute but construct, utilize, develop, critique. You know, with the students who are taking the course that we’re teaching with Jacqueline Sawyer this fall, we’re really pushing them to say: you already have the skill set and knowledge base necessary to critique technology and how it’s being used in society. You don’t necessarily need to know how to code, you don’t necessarily need to know all of the technical underpinnings of a particular app. You have something to say because this technology is interfacing with society, with human beings and human behavior, and that has implications a social work perspective is important to apply to.
Desmond Patton: Even yesterday, we watched a student in class struggle with not feeling like they had enough knowledge to be a part of the conversation, and that was a prime example of where we need to be: no, you as a social worker are right for these types of conversations.
Courtney Cogburn: Yeah. You don’t need to know, we’ll keep adding to it, but you already know a lot. You’re already bringing a lot to the conversation.
Richard Hara: Yeah, and I mean, certainly, technology isn’t a new issue, right? I mean, I worked in an agency and we used to do, for example, sort of telephone calls, right, to clients, and so on that hopefully had a therapeutic impact, and I mean, now you’ve got therapists who have their own sort of web programs or things like that, and so on.
So you know, but it’s more than that, right? We’re sort of… it seems like we’re entering a kind of digital era, right? And so, so what does that look like? I mean, where it’s beyond just chatrooms and things. I mean, what does it look like to you as far as this technology piece and social work practice?
Courtney Cogburn: I mean, I think, so you’re making two points, right? So, there is how social workers might use technology in their practice, right? If you’re going to use some kind of virtual portal to meet with a client who’s more rural and you’re in an urban setting, but they can’t afford to come in, and you’re able to still meet with them. That’s sort of an integration of technology into your practice.
Richard Hara: Right.
Courtney Cogburn: But we’re also arguing that when we’re thinking about bias in AI, surveillance of vulnerable communities, how we’re hiring and determining housing based on facial features and proportions, which is actually happening, right?
Social workers play an important role in that discourse as well, so it’s a two-way street. But there are all sorts of technologies that are emerging, like using artificial intelligence to surveil communities, or virtual environments where you can be with hundreds of people from all over the world in a virtual space at the exact same time, where we can also track your movements and behavior, and that data can be saved, right. There are all sorts of technologies that I think are important to consider.
Desmond Patton: And I think what we have to realize as social workers is that there really are no boundaries to the ways in which we can engage these spaces, right? When I’m in these conversations with social workers, we tend to move toward their practice base, or toward their clinical space, and that’s fine; there is a lot to be said around how we protect data when we’re in telehealth spaces. But what Courtney is underscoring, which I think is really important, is that there is so much happening around us that we need to be in conversation with, and so we should really be opening up our minds and hearts to being a part of those conversations in really rigorous ways.
Richard Hara: So, let me get this straight. You’re saying that maybe by focusing too much on creating this clinical space, we’re missing another component of social relations that, for example, has to do with more social justice issues and so on?
Desmond Patton: I think that we have relegated ourselves to a particular space that we’re comfortable with. I think what Courtney and I are pushing, through our new lab, the JET Lab, and our new minor, is that you need to get uncomfortable in this space. You need to be uncomfortable with the things you don’t know and just join the conversation.
Richard Hara: Okay. So, so could you give me some examples either from some of the current work that you’re doing, the projects that you’re involved with and, and maybe the lab that that you’ve created as well?
Courtney Cogburn: Mm-hm.
Desmond Patton: Yeah, so I direct the SAFE Lab, and for the past six years now, we have been trying to understand the role of social media in gun violence prevention, and we use artificial intelligence, in particular machine learning and computer vision to understand pathways to aggression online.
The reality is, the gold-standard techniques in data science just don’t work in this space. They have failed time and time again, and we work with some of the most brilliant minds in data science here at Columbia. But when we went to our social work groups, when we brought in community members, folks who have lived experience around gun violence and who could help us translate and interpret context, our accuracy improved greatly. And again, that was just social work skill. It wasn’t something special; it was something that I learned in my MSW program at Michigan.
Courtney Cogburn: Yeah, honestly, I think some of Desmond’s work offers some of the clearest examples of how social work can fundamentally change tech, and of how emerging technologies are fundamentally a social justice issue. To put it another way, the data science and machine learning approach had an algorithm for analyzing language, and that algorithm did not work in the communities where Desmond was working. It didn’t work until they brought community members into that space to start changing the algorithm, right, and those community members had an expertise that was above and beyond how you create an algorithm. They had an expertise in their own lives, and that’s fundamentally a social work principle.
Richard Hara: No, I think AI sort of got to the next level when it sort of abandoned this top-down approach instead of coming up with these rules. And rather they’re sort of built from the experience of all these different sources of information, right? That would sort of build to a better understanding in creating these algorithms. So—so what is it that, for example, AI is going to do to help social work practice? I hate to sound like that’s why AI is here, but is there a use for AI? I mean are we going to be using AI as a sort of diagnostic tool, and sort of keeping track of what’s going on in communities? I have to say, you know, we’ve got a lot of issues now, right? Right, about big tech and data privacy and so on. So how does that affect what you’re doing?
Desmond Patton: So I really think we should take a step back and think about the problems and the questions that we’re trying to address, and not every problem or question needs a tech solution.
Richard Hara: Right.
Desmond Patton: So I think a social worker is able to say: what is the problem? What are we hearing from communities, and should we involve tech or not? And that should be part of the input when making a decision around when to involve tech. So I don’t think we should jump to the conclusion that artificial intelligence is going to save us. However, there are some spaces and places where AI could be useful. AI is being used in suicide prevention, right? We are learning a lot from the communications, the images, the conversations that are being had on social media, and folks are using artificial intelligence to better understand that language, and to better predict when someone might have suicidal ideations. So that’s an example where you can leverage artificial intelligence for good, and then work with social workers who have expertise in that space to help disseminate some targeted interventions as well.
Richard Hara: Great, and in terms of specific technology like virtual reality? I mean what are the possibilities there that you’re seeing?
Courtney Cogburn: I think there are a lot of possibilities. First, I will say that when I started working in virtual reality, I had never used virtual reality, just to give people a sense of what you don’t need to bring to the table to actually start working in this space. Strategic collaborations may be a wonderful way to get started, which is what I did. I am using virtual reality to help people engage complex social issues that they may be personally removed from. By putting them into those circumstances from a first-person point of view, I am trying to find ways to help them better engage social issues like structural racism. That’s really based on the argument that you can’t effectively solve a problem like structural racism if you don’t actually understand the problem you’re trying to solve. In my work I see a lot of discourse around individual bias, empathy, individual feelings, individual experiences of being mistreated, but not the policies, systems, and structures that are producing these patterns. So I’m using virtual reality to help people understand systems and structures that they may be blind to or not fully understand, by putting them into it and helping them engage in a way that they may not be able to otherwise. But I think there’s so much potential for understanding interpersonal dynamics, for understanding patient-client dynamics. And, again, the contribution of social work is not just, how do we use it? It’s also how do we understand it, critique it, unpack it, hold it accountable as well.
Desmond Patton: And just a really quick point: you know, Courtney is fairly new to virtual reality, but she is one of the most well-known people in virtual reality. So I think that says a lot about the type of work that she’s doing. But I think what also makes Courtney stand out from the rest is how she forms teams, the technical conversations she’s willing to have and engage in, the difficult conversations that she’s willing to embrace through virtual reality. I think it’s very new and exciting. So I think that is something unique, from a social work perspective, that Courtney has brought to virtual reality.
Courtney Cogburn: When we come to the tech, it’s not about the tech. It is about the problem we’re trying to address, the communities we’re trying to reach, the people we’re trying to help. It is not about doing something cool in virtual reality; starting at that point puts you in a completely different place. We’re not trying to use cool tech. We’re trying to think about the implications of the tech for the communities that we’re trying to serve and support.
Richard Hara: And to add to collaboration.
Courtney Cogburn: Absolutely.
Richard Hara: So it’s not the tech, it’s not the policy, I mean, it’s how people work together, right? Social work brings things to the table. So we have some time yet, and I want to bring in the new minor that you have developed here. I just want to remind our audience as well that if you have questions, please type them in and our moderator will relay them to us so we can use them during our question and answer period. Okay, so, yeah, this new minor, what’s it called? How does it work?
Desmond Patton: It is Emerging Technology, Media and Society.
Richard Hara: Okay.
Desmond Patton: And it started from multiple vantage points. So, Courtney and I have been having conversations around the need for social workers to be brought into the 21st century around technology, and how we can impart some of the things that we’ve been learning in our labs in this space. In addition, about a year and a half ago I received a collaboratory grant to partner with the Data Science Institute to create a new course around coding for social workers. And so we decided to marry those two things and to develop this new minor.
Courtney Cogburn: Yeah, and I think what we started to learn is, one: we have students on our team who are already engaged in this work. We’ve even encountered students who have been doing this work outside of social work, who have been working in media and tech, and who are just delighted to be given the opportunity to marry their interests in a more significant way. So the minor will involve a core course, more of a survey course, that takes place in the fall. And then there will be a number of electives that they have to complete over the course of the year. Those electives might be a Python class; it may be a narrative storytelling for social justice course; it may be a design thinking and human rights course. So there’s a range of courses: some within social work that are still being developed and approved, and courses that we’ve identified from around the university that meet this broad framework for the minor.
Desmond Patton: I think one thing that we realized is that, I think I have assumed that our students, 22–23 years old, know a lot about technology, and that’s not necessarily the case. There’s a lot of language, there’s a lot of back story that’s just not part of the conversation. And so this intro course is where there’s this really nice survey of what is AI? What is machine learning? What is algorithmic bias? And these are things that social workers are engaged in day to day in their conversation in the community and with clients but haven’t used that language to talk about it.
Courtney Cogburn: And one thing, just quickly, that we’ve been so excited about in class is that we’ve had so many people who have worked in tech, who have been critiquing tech, come to our class, either in person or by Zoom, people who are experts in their own right in different areas, directly sharing perspectives and experiences with our students. So just yesterday we had Dr. Ruha Benjamin, who wrote Race After Technology, a professor at Princeton, who came to the class and spoke and engaged in some critical discourse. Bruce Lincoln, one of the co-founders of Silicon Harlem, who is really thinking about the integration of tech in Harlem communities. It’s such a unique opportunity for our students to engage with these figures and really get a sense of what’s happening broadly in this space.
Desmond Patton: And that’s really the power of being at Columbia in New York City is that we can have such a pool of people come to our school to come to our classes, but our students can also go to these different spaces as well. So we have a planned meeting with Justice from our labs coming up in November where our students can actually go out to Brooklyn and hang around this new virtual reality space.
Richard Hara: Yeah, so I see two wonderful aspects of this. First, you’ve been involved in transdisciplinary work, right, both across our university and with scholars in the field. And at the same time there’s a community connection, and to be able, as social workers and social work students, to bridge and bring together these different communities, stakeholders and so on, I think, yeah, that’s the promise of a social work degree, right?
Courtney Cogburn: Absolutely, yeah. I mean social work is inherently transdisciplinary. We’re not a discipline. We pull from so many different perspectives and integrate. We take on really difficult problems, and so that’s what we do well. We’re just leveraging what social work already is, what social work already does, and trying to apply it in new and interesting ways.
Desmond Patton: Yeah.
Richard Hara: Okay, well it sounds wonderful. I think we’ve got some questions from the audience to address. Have you seen virtual reality work for exposure therapy to address fears such as fear of flying, fear of water, etc.? Something more clinical?
Courtney Cogburn: Yeah, I have. I have seen, there’s some work happening out of USC, for instance, where they’re working with vets around post-traumatic stress disorder and putting people back into spaces and experiences that trigger them. And helping sort of engage using sort of a cognitive behavioral approach. But I also recently met some vets, some disabled vets who were talking about having problems with that approach and wondering if that’s the way to go. So I think there’s some possibilities and some room to grow.
Desmond Patton: Right.
Richard Hara: Yeah, I mean you could use it to help give people an experience, right? And maybe build empathy to address some issues that maybe have more structural implications, but at the same time we have to be careful, right? At the same time it’s not a fit for everyone as you said, right? It’s not meant to solve every problem.
Courtney Cogburn: It’s not a magic pill, right? And, so this is part of what I push against in virtual reality space is that you had a virtual reality experience. What do you do with people when they come out of a headset? What information do they have? What types of conversation are they having? What types of curriculum are they engaging? I think that’s an important complementary piece. It’s not just the tech itself, it’s what goes with it.
Desmond Patton: And what do you do afterwards? And it’s not just an individual thing. It’s also what do I do out there in my community?
Courtney Cogburn: Yeah, absolutely.
Richard Hara: Okay, wonderful. Next question: this one is for both guests as well. What topics would you love to see social work researchers tackle in the tech space? Can you suggest two or three PhD research topics?
Courtney Cogburn: That’s cheating.
Desmond Patton: I think there is a continued need to think about algorithmic bias and ethics, but to really blow up this idea of what ethics are. What I’ve heard in this space is that it’s really about bias in algorithms. We are omitting the bias that we have inherently in a white supremacy culture. And I think we really have to start addressing the things that we’re bringing to tech in real ways. It’s not tech, it’s not the instrument that’s the problem; it’s us. And so I think we really need to start to attack that in very vigorous ways.
Courtney Cogburn: I would add that I would like to see more work that not only targets the client, but also targets the systems that serve, or fail to serve, our clients. So what do we do? You know, often we might prepare someone coming out of prison to engage these systems, to interview and apply for a job. How do you engage the people who are placed in the position to hire them or not? How do you engage the people who are in charge of housing, funding, and policy decisions on that end? So I think social workers not only need to engage our clients, but also the systems, and the people who control those systems, that create structural barriers for our clients.
Desmond Patton: And I think one of the things that we really need to contend with is accessibility to this space, and the workforce that’s going to be impacted by it. What I’m really concerned about is that there are going to be so many jobs displaced by artificial intelligence, and we’re not prepared in a vast array of spaces. As social workers, we’re going to be the ones responding to it, and yet we have no response. This is coming, and so I think we really need to start doing some research now to prepare for how to train people, so that we can prevent some of these things from happening.
Richard Hara: Hmm, any ideas about what that response would look like?
Desmond Patton: Well first, I think we just need to open the door in terms of accessibility to this space. Like with web pages in 1997: only a few folks could make web pages, and people were paying lots of money for them. Now anyone can just pay $100 and develop a web page. The same thing is happening with artificial intelligence. We need to start to get people who can apply AI to various spaces, places, and systems. And so we need to blow up this idea that you need to be an engineer in order to be a part of the AI workspace.
Courtney Cogburn: We are just barely tapping into what’s possible with these technologies, because we haven’t engaged people who didn’t develop the technology, right? So only the people who have developed the technology have applications for it. Part of what we’re arguing is: what happens when you explain to a social worker very clearly what the technology is? What might they have to say about that? How might they apply it? One of the most interesting things I’ve seen recently is an individual who is blind, who has modified the guiding cane to give haptic feedback and map information, who has integrated technologies into this cane, because it’s coming from the perspective of someone who lives that experience. No one else would have imagined that solution until this person had access to think about how to apply these technologies. So we’re trying to open the floodgates and bring as many people in as possible.
Richard Hara: Yeah, I guess it’s social work’s ability, right, to give voice to people’s experiences that often aren’t heard in different spaces. So I think that’s obviously something we want to continue, but it’s intimidating, right? I mean, I hear AI and I’m thinking that you need supercomputers to do that kind of stuff, and, you know, self-driving cars of the future and all that kind of thing, but AI can have a much more down-to-earth and everyday connection to our lives.
Desmond Patton: I think that’s part of the design, right? To create a space that feels so distant that you think you aren’t a part of it, when in reality we are already a part of it. Think about how this whole video is happening: there’s some AI system powering a lot of what’s going on right now. So I think we have to blow up the language and make it more accessible for everybody.
Richard Hara: Okay, and well—it’s an intense topic, and I would love to pursue it more, but we will see if we have some more questions here too from the audience. If you were hired by Facebook or Instagram at a high level to address user well-being, what is the first thing you would change?
Desmond Patton: Well, I do this now; I’m a consultant at Facebook, and I’ve been there for a year. One of the things I really need to say is that I work with some great colleagues at Facebook who are really passionate about social justice and are trying to make change. It’s when you move up the ladder that there’s immense resistance to change. There’s this culture around breaking things and moving really quickly, and I think that we should actually slow things down and think critically about the application of these tools. I think had we thought about the implications of a lot of these tools, then we wouldn’t be where we are today. And so I think that there needs to be a different culture, where, you know, breaking things to get a bigger buck is not the thing that actually creates a healthier society.
Courtney Cogburn: Right, and just quickly I would add, having been in those roles as well, that it’s not just about protecting the user. It’s also holding the system and organization accountable. Often social workers tend to think about protecting the person over here. We also have to constantly come back to: how do I hold this organization, their systems, their policies, their cultures accountable for the threat that they’re producing?
Desmond Patton: Yeah.
Richard Hara: So if we’re not working for them, how do we hold them accountable? What are some of the things that we might do?
Courtney Cogburn: I think using social media, collective organizing, you know, tried and true advocacy and activism. Those efforts matter, and we need to develop basic fluency in what’s actually happening, what’s going on, and not feel removed from it, like it’s happening over there. As Desmond has been saying, it’s integrated. It’s all around us. It’s already happening. Major decisions are being made on our behalf and for our society that will fundamentally change who we are and how we function, and in some ways will make the most vulnerable even more vulnerable than they already are. Racism, sexism, etc., on steroids with some of these systems. So we really can’t afford not to have social workers involved, and we can use the tools that we already have that work quite effectively: advocacy, disruption, pushing back. We need to employ all of that.
Desmond Patton: Yeah, and facilitating conversations, right?
Courtney Cogburn: And facilitating conversations.
Richard Hara: At the table, at the table. Can technology and data science be used to influence more people-oriented social welfare policy?
Desmond Patton: I think so. One of the things that I think is really important about a data science approach is that it can elucidate patterns in larger data that may currently be unheard or not discussed, because traditional methods like surveys may not get at some of the root causes of problems. What I really appreciate about social media and data science is that we get to see multiple selves. We get to see multiple identities, multiple ways of thinking and knowing, and we can then harness that to really get at the root of how people are thinking about an issue. And I think there’s a big entry point for social work, and for policy, in that space.
Richard Hara: Okay.
Courtney Cogburn: Yes, I agree and won’t go into a longer answer.
Richard Hara: No one has a monopoly, right, on knowledge?
Courtney Cogburn: I think the policy question is around helping people understand social problems that they don’t currently understand. If you don’t understand the root causes of mass incarceration, then you’re going to come to solutions that have nothing to do with the actual causes of mass incarceration. So yes, I think technology can be used in a number of ways to support better thinking and understanding of the problem, so we get to better policies and solutions.
Richard Hara: Sources of data etc., so on to consider, so wonderful. Do you have any tools for broaching this subject with tech leadership who might not consider this conversation a priority? I mean what’s—how do you get the foot in the door so to speak?
Courtney Cogburn: We have a whole workshop on that. I have generally found that once people understand what social work is, they’re very receptive to it, because they’re grappling with, at minimum, the priority of not causing damage to their company by something they do or choose to do. That’s a low-bar priority. Others actually express interest in doing good: social justice, social impact. They have no idea how to actually do that; they don’t have anyone on their team that does the “for good” portion of that. So I have found people to be quite receptive, at least on the surface.
Desmond Patton: I think that’s clutch: understanding the language that maps really well onto social work and translating it for us. We have the answers, and we can easily be in those conversations and provide frameworks and practices and thoughts and ideas, whereas most of those conversations are very surface-level. And I think finding where you fit in is a really great entry point.
Courtney Cogburn: Yeah, and honestly, I think the default is to assume that engineers are creating the technology, so the question becomes how we help engineering become more humanist. But maybe engineers, in isolation, are not the people you want addressing humanist problems. Maybe you need humanists, the social justice community, community-based people working in collaboration—
Desmond Patton: Folks save that, underscore that, highlight that, that’s a really important point.
Courtney Cogburn: The core engineering curriculum does not substantively address human behavior, it does not substantively address complex social issues, so why would engineers be in a position to develop solutions to those really complicated problems? You need people who are really grounded in the complexities of human behavior, society, oppression, etc. They need to be core to technology solutions.
Desmond Patton: 100%.
Richard Hara: At the end of the day we’re all people. So that concludes today’s episode. We’ll be joined next week by Columbia School of Social Work faculty member Richard Beck on the need for therapists to seek self-care. So I hope to see you then. Thanks and have a great week.