
Transcript

How social workers use AI to help unhoused teens | Anamika Barman-Adhikari | TEDxMileHigh

URL: https://www.youtube.com/watch?v=1E7BoJQf050
Video ID: 1E7BoJQf050
============================================================

Transcriber: Gefi George
Reviewer: Pari Szi

Even when you might not know it, we have all experienced how artificial intelligence
can make our lives easier and more fun. Machine learning algorithms are the reason why a horror movie 
will never cross my Netflix screen. It's Schitt's Creek, The Office, or nothing else. These sneaky algorithms have a way of figuring out what I want even before I know it. Then they help me find those products, and then those exact products follow me from website to website, day and night, until, in a moment of weakness, I click the ad and purchase the product. But shopping and scrolling are
just the tip of the iceberg. As a social work scholar, I have seen how artificial intelligence 
can help provide solutions to some of the biggest problems
facing our society. Problems like homelessness,
poverty, addiction. For the last seven years, I have been working
with a computer scientist and technologist to develop algorithms that can help some of the most vulnerable
members of our society. Young people experiencing homelessness. Among the countless issues
that they experience, substance use is almost an epidemic
among this group of young people. Studies show that almost 50% of all young people
experiencing homelessness use some kind of illicit substance. In comparison, less than 5 to 10% of young people who do not experience homelessness do the same. Statistics can, however, be numbing. So let me explain to you why substance use is so pervasive
among this group of young people in a manner that perhaps
can be more relatable. Imagine you’re 18. You have nowhere to stay. You cannot call your mom or your dad. Everything that you own, 
you have to carry on you at all times, either in a backpack or a shopping cart. And you have to find a new place
to sleep in every night. Take a look at this picture taken by a 19-year-old girl experiencing homelessness right here in Denver. This is what it's like: sleeping on a dirty, stained mattress against a cold cinder-block wall, trash littered everywhere, used condoms and heroin needles on the floor. I would be terrified in a place like this, wouldn't you? You would do anything to escape this reality and numb your pain. And that's what
a lot of these young people do. They often use substances
to cope with their trauma, to feel comfortable
in an uncomfortable world. Now, researchers have tried to tackle
this problem for decades using a method  that sounds great 
on the surface: peer-led groups. For 6 to 8 weeks,
these young people come together to talk about their problems, get advice and positive support
from their peers, and work on developing 
better coping skills. These groups are relatively
inexpensive to operate, which is a big deal 
for cash-strapped nonprofits, who are often the ones running them. And the reason these groups work is that they're designed
to be developmentally appropriate. Adolescents are almost biologically 
hardwired to question, resist and doubt what adults tell them to do. If you're a parent or a teacher, you
know what I'm talking about. But let’s be clear. Our friends can sometimes also be toxic. I mean, I literally smoked
a banana peel once. (laughter) Because my friends told me 
that it would get me high and it was the cool thing to do. Any kids out there, including my own, please do not repeat 
the same mistakes I did. But you get the point. Let’s put a group of young people who might mimic each other’s negative 
behaviors in a group for 6 to 8 weeks. What could go wrong? A lot of you might have guessed it: instead of going down, substance use actually increases, along with other negative behaviors. Until recently, researchers didn't have
a reliable way of predicting which groups would work and which groups would spiral out of control. Social workers are usually the ones putting these groups together, either assigning young people
to these groups randomly or allowing them 
to self-select into those groups. Unfortunately, neither scenario is ideal. Let's take a look at this graphic from
my research to better understand why. In case you’re wondering, this is a social network map 
of 160 young people experiencing homelessness
in Los Angeles, California. Every dot on this picture is one young person
experiencing homelessness. And the links between
them are shown by the lines. The youth who are very popular and well-connected find themselves at the center of this network, and the youth who are not as popular or well-connected are pushed to the periphery
of this network. Now notice the red dots on this picture. These are the young people who use meth. And more importantly, 
notice how tightly connected they are. You know what that tells me? It tells me that if we put these young people in a group together, meth use will probably go up instead of down, because they'll just reinforce each other's meth-use behaviors, and the grey dots will catch up to what everyone else has been doing. Randomly assigning them to groups is less harmful, but no better, because it ignores natural
social influence. Who are you more likely to listen to? Your friends, or some random stranger? So I started wondering: what if you could use
artificial intelligence and machine learning to figure out what’s the best way
to configure these groups? In order to do that, I partnered with researchers from USC’s Center 
for Artificial Intelligence and Society and got permission 
from a group of young people experiencing homelessness to use their social network data 
and basic personal information. Then we fed the machines this data
about their social relationships, their substance use behaviors
and other relevant behaviors. Then the machines would run iteratively, testing 30 to 50 groups at a time, to find the best configuration. How did the algorithm do this? The algorithm found
a way of mathematically maximizing the number of existing 
friendships in each group while minimizing
the number of substance users. Not only that, the algorithm also found 
a way of accounting for how people’s relationships
would change in these groups in response to the intervention: that positive relationships would develop and deepen, and that negative relationships would subside in influence over the course of the intervention. You might ask,
why can’t a person do this? Well, to start with, social services are notoriously
understaffed and overworked. It would be extremely time-consuming and mind-numbing for a person to do all these calculations, especially for a large group of people. Second, a person, unlike a machine, cannot predict in advance which groups
would actually reduce substance use. The results of the algorithm
were astounding. Compared to randomly allocated groups, these AI-generated groups
were able to reduce substance use by almost 40 to 70%. That’s right. 40 to 70%. That’s huge. Results like these could mean that social workers can use their time
and resources more efficiently to help more youth faster. And for the youth themselves, getting sober can be life changing. Staying sober can be one of their
pathways out of homelessness. Now, this is just one example of many of how technology can improve social services. I'll provide a few others. In Allegheny County, social workers are using a predictive analytics algorithm to screen for and triage
the most serious child abuse cases. Researchers in New York are studying young people's tweets to develop an algorithm
to screen for signs of loss, aggression and substance use 
as a way of reducing youth violence. In one of my own studies, I’ve been using 
young people's Facebook posts to develop an algorithm that could predict their likelihood
of developing substance use disorders. Compared to traditional methods
that can predict substance use accurately, maybe 30% of the time, this algorithm can predict substance use 
almost 80% of the time. This level of accuracy could mean that social workers are better 
able to prioritize their caseload. Instead of expending their limited time and resources on people who might not need it at that moment, they can help people
who need it the most. And that’s the goal. This is not about machines
replacing social workers. Algorithms are not perfect. In the wrong hands, they can actually exacerbate bias and systemic racism. However, done right, they can be a powerful tool
in the hands of social workers to empower them
to do what they do best, to use their intuition, their empathy, and their experience to help people
overcome life's most difficult challenges. And that’s the kind of future
I wish to see. And now I hope you do as well. Thank you so much for your time. (Applause)