About this episode’s guest:
Renée Cummings is a criminologist and international criminal justice consultant who specializes in Artificial Intelligence (AI): ethical AI, bias in AI, diversity and inclusion in AI, algorithmic authenticity and accountability, data integrity and equity, AI for social good, and social justice in AI policy and governance.
Foreseeing trends and anticipating disruptions, she’s committed to diverse and inclusive AI strategy development; using AI to empower and transform communities and cultures; securing diverse and inclusive participation in the 4IR, helping companies navigate the AI landscape and developing future AI leaders.
A multicultural cross-connector of multiple fields and an innovative collaborator, her passion is forming connections and unifying people and technologies; enhancing quality of life and economic prosperity. She’s also a criminal psychologist, therapeutic jurisprudence and rehabilitation specialist, substance abuse therapist, crisis intelligence, crisis communication and media specialist, creative science communicator and journalist.
She has a solid background in government relations, public affairs, reputation management and litigation PR. A sought-after thought leader, inspirational motivational speaker and mentor, Ms. Cummings is also a Columbia University community scholar.
She tweets as @CummingsRenee.
This episode streamed live on Thursday, September 17, 2020. Here’s an archive of the show on YouTube:
About the show:
The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience. Hosted by Kate O’Neill.
Subscribe to The Tech Humanist Show hosted by Kate O’Neill channel on YouTube for updates.
Transcript
Kate O'Neill: Hey humans! Hello, how are you? I'm really glad to see some of you tuning in. I want to hear from you, those of you who are online: let me know where you're watching from, and tell me how the weather is there as we're moving into fall. Really glad to have you here. You're tuned into The Tech Humanist Show. It's a multimedia-format program exploring how data and technology shape the human condition. I'm your host, Kate O'Neill, and I hope you'll subscribe and follow wherever you're tuning in from.

So today we are talking with Renée Cummings. Wait, let me... before I read the fantabulous bio for our guest, I'm just going to do one little retweet here and make sure folks are able to catch us from the Tech Humanist Twitter account. Hey, Stu Saunders checks in and says, "Great to see your smile." Thank you, Stu. I wish I could be seeing your smile too, but oh look, you're there in your icon, so I've got you. Who else is out there? Tell me where you're tuning in from. I'm really excited; this is going to be a really important and great discussion today, so I think you're really going to enjoy it. Let me go ahead and introduce our guest, and everybody else, feel free to keep checking in and let me know where you are and where you're tuning in from.
Today we are talking with Renée Cummings, who's a criminologist and international criminal justice consultant who specializes in artificial intelligence: ethical AI, bias in AI, diversity and inclusion in AI, algorithmic authenticity and accountability, data integrity and equity, AI for social good, and social justice in AI policy and governance. Foreseeing trends and anticipating disruptions, she is committed to diverse and inclusive AI strategy development; using AI to empower and transform communities and cultures; securing diverse and inclusive participation in the 4IR; helping companies navigate the AI landscape; and developing future AI leaders. A multicultural cross-connector of multiple fields (evidently, as we can hear) and an innovative collaborator, her passion is forming connections and unifying people and technologies, enhancing quality of life and economic prosperity. She's also a criminal psychologist; a therapeutic jurisprudence and rehabilitation specialist; a substance abuse therapist; a crisis intelligence, crisis communication, and media specialist; a creative science communicator; and a journalist. That's an impressive set of skills, if you ask me. She has a solid background in government relations, public affairs, reputation management, and litigation PR. She's a sought-after thought leader, which I can vouch for: following her on LinkedIn and Twitter, all I see all day long is "this is where I'm going to be, this is where I'm going to be." I'm like, how do you have the time and energy in the day? Which is why I'm really, really glad she can join us today. She's an inspirational motivational speaker and mentor, which I think you'll get from listening to her today, and Ms. Cummings is also a Columbia University community scholar.

So, audience, please start getting your questions ready for our incredible guest. Please do note that, as a live show, while we'll do our best to vet comments and questions in real time, we might not get to all of them, but we very much appreciate you being here and participating in the show.
So with that, please help me welcome Renée Cummings. Renée, you are on the show! Let me have you lift up your camera just a little bit, or bring yourself up in the... there we go. Yeah, I don't want you cut off on the screen there, Renée. Thank you so much for being here.

Renée Cummings: Thank you so much for inviting me, Kate. It's certainly an honor and a pleasure to be with you.

Kate O'Neill: Oh my god, it's my honor, thank you. A little note of administration: let me have you do a little bit more. If it's a laptop camera, if you can bring it down just a little bit; we're cutting off... you're, like, here. Oh, perfect, perfect. I don't want you to look back at it and see just this. Now I see this much; can we do a little more? Perfect. I'm sorry, we should have caught this beforehand. Yes, it's much better. So, Renée, after reading that incredible list of accomplishments and distinctions, I also just noticed on LinkedIn earlier this week that you used to be a sportscaster. Is that right?

Renée Cummings: That's true, a very long time ago.

Kate O'Neill: That's an amazing career trajectory. How did that happen?
Renée Cummings: Well, I started as a print journalist, and then I went into broadcast, and while I was in broadcast... I've always had this love for sport, and the station that I was working at did not have any woman doing sport, so I saw it as an opportunity to do something different, to have a little more fun on the job, and really put women out there on camera doing sport. It was a fantastic experience. It contributed so much to who I am, and it keeps me alive, because I still love sport, and I think it certainly gives me a particular energy that keeps me going.
Kate O'Neill: Yeah, well, you have such a presence; you have a great voice and presence for being on screen anyway, so I can see where that carries through to this day. What's it like for you now, as a fan of sport, watching all of these athletes compete with no audiences? What does that feel like?
Renée Cummings: Well, that's where we are now, and it's so many things that have created that. It's so funny that what's happening in sport really informs the work that I do when it comes to social justice, racial justice, and even looking at things like racism and COVID, and how COVID-19 is changing the ways in which we engage and we interact, and how AI is now really creating a new type of context for what sport is going to look like and what fans are going to look like moving forward.
Kate O'Neill: Yeah, it's also been really interesting to see the intersection between sport and social justice playing out over the last few months, right? With the sort of conscience coming from, I think, the WNBA; it seems like that league, and then women's soccer, have been doing a really great job of sort of leading the path, and then the NBA following, and other leagues stepping up and saying, well, we're going to either strike, or we're bringing the protest to the court or the field. That's really encouraging, it seems to me. How do you feel about that?
Renée Cummings: Well, it is, and I think for a very long time we've celebrated men of color in sport, on the court and on the field, and I think what they're saying now is: on the street, we need that kind of respect as well. So I think it is really powerful, and I think it's something that we've got to celebrate and something that we've got to support.
Kate O'Neill: Yeah, wonderful thought. In fact, it's a really great transition, it seems, to the topic of AI and technology, because I've seen where you have said the line, the phrase, "criminology must be the conscience of AI," and I found that profound. I think, certainly for me, it seems like discussions of algorithmic bias are always sort of bouncing around near the areas of law enforcement and criminal justice, and that's an overlap area in those conversations, but it's rarer, in my experience, to plant the conversation there from the beginning and say: let's design from here outwards. And yet once you establish that orientation, it makes perfect sense. So how did you arrive at that insight in your career?
Renée Cummings: Well, I think I entered AI from the perspective of what was happening in the courts, and how risk assessment tools were being used to give a score as to whether or not someone was going to reoffend, and attaching risk assessment tools to recidivism rates. I felt that to be something that was a bit too shaky. What we saw with criminal justice is that many of the high-impact decisions happen there, when it comes to life or liberty, when it comes to life or death. And if you're planning to use an AI or an algorithmic decision-making tool in the criminal justice system, you cannot be using a tool that is so opaque, or a tool that's providing predictions that are overestimating recidivism.

I felt that what I was seeing was not really a conscience when it came to the use of artificial intelligence in the criminal justice system, because too many of the tools that were being designed were being designed from a place of biased data, a place where data was discriminating. What I realized I was seeing is what I called "guilty conscience AI," because now we're seeing the pullback: we're seeing the moratorium on predictive policing, on facial recognition technologies, and on many of the technologies that are being designed to lead what they call intelligence-led policing or big data policing. So I said that what we really need are not only data scientists designing tools for criminal justice, but criminologists and criminalists and criminal psychologists and other individuals involved in the criminal justice system working alongside those data scientists. It was really a call for a multidisciplinary approach to how we do AI in criminal justice, and the fact that what we don't want to continue making in the criminal justice system would be the mistakes that we've made in the past. So why use new data, why use new technology, to amplify old historic biases that have kept us trapped?
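To make the audit Cummings is describing concrete, here is a minimal sketch, in Python, of one check a multidisciplinary review team might run on a risk assessment tool: comparing false positive rates (people flagged high-risk who did not reoffend) across demographic groups. Everything here is invented for illustration: the records, the group labels, and the 0.7 threshold. A real audit would involve far more than a single metric.

    from collections import defaultdict

    # Hypothetical audit records: (group, risk_score, reoffended).
    # All values are invented for illustration only.
    records = [
        ("group_a", 0.82, False), ("group_a", 0.75, False),
        ("group_a", 0.91, True),  ("group_a", 0.40, False),
        ("group_b", 0.35, False), ("group_b", 0.62, False),
        ("group_b", 0.88, True),  ("group_b", 0.55, False),
    ]
    THRESHOLD = 0.7  # score at or above which the tool labels someone "high risk"

    def false_positive_rate(rows):
        """Share of people who did NOT reoffend but were still flagged high risk."""
        non_reoffenders = [r for r in rows if not r[2]]
        if not non_reoffenders:
            return 0.0
        flagged = sum(1 for r in non_reoffenders if r[1] >= THRESHOLD)
        return flagged / len(non_reoffenders)

    by_group = defaultdict(list)
    for row in records:
        by_group[row[0]].append(row)

    for group, rows in sorted(by_group.items()):
        print(f"{group}: false positive rate = {false_positive_rate(rows):.0%}")
    # Output: group_a is wrongly flagged 67% of the time, group_b 0%.
    # A gap like that is the overestimation of risk described above,
    # and it stays invisible unless someone checks group by group.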
Kate O'Neill: Yeah, what a brilliant observation that is. It seems to me that it may be possible to think of other applications of AI and algorithmic decision-making that stand to affect more people, in a broader range of ways, but it would be hard to think of an application that has a greater chance of doing more harm to an individual's rights, liberty, and quality of life than how law enforcement and criminal justice use AI and other technologies, right? Is that fair?
Renée Cummings: When you look at it, when someone is arrested, when someone is incarcerated, when someone has a sentence to serve, think of how the families are impacted. Think about what happens to a child when a parent is incarcerated: what happens to the trajectory of that child's life? Then you start to see the social cost of crime. It's just not that one individual who is incarcerated; it's families and generations that are impacted by that. So this is why I speak a lot about intergenerational trauma in data, and the fact that we've got to look at the data and understand the history of the data, the psychology, the sociology behind it. It's just not numbers; it's lives we're playing with.
Kate O'Neill: Yeah. And so when I think about the taxonomy, of sorts, of AI or algorithmic systems and how they intersect with criminal justice, I sort of imagine a few broad categories: I think about law enforcement, I think about courts, and then I think about detention or supervision, sort of parole and that sort of thing. So first of all, my question to you is: is that a fair taxonomy in terms of the breakdown of how AI and other technologies intersect with law enforcement and criminal justice?
Renée Cummings: Yeah, definitely. It's going to be the police, it's going to be the courts, and it's going to be corrections. That's the triad in criminal justice. You're going to find that those are definitely the big categories where AI would be applied.
Kate O'Neill: Okay, and then within that, do you find that you have equal focus across each of those areas? Or is your focus more, let's say, on law enforcement, because it's sort of, you know, top of the funnel and the most urgent problem of policing and racial disparities?
Renée Cummings: Well, I think what we're seeing right now would be policing and the courts, because those are the two that are up front. We're not seeing much of what's happening in corrections, because most of us don't exist behind the prison walls. But definitely there are things that are happening there, and what's happening now, and it's something that I keep speaking about, is that we're creating digital prisons. We're using AI to create digital prisons, so you don't have to be behind the prison walls to be serving a sentence with this technology. That's how we're using it: when it comes to algorithmic policing, when it comes to the algorithmic decision-making systems in the judicial system, or when it comes to e-incarceration and what's happening there, we are creating those digital prisons outside. We are using the technology to create what I call the digital chokehold, and this is why I've been speaking so much about how we fuse AI in criminal justice.

But something that we don't think about when it comes to these risk assessment tools: they're already being used in a juvenile setting. So how is that going to change the trajectory of a child's life? It's being used in a child protection case. So are we going to use an algorithm to separate a parent and a child? We're not seeing those other cases, but those are indeed as traumatic, and will have extraordinary impact on the life outcomes of children and families and generations.
Kate O'Neill: Yeah, indeed. That really sounds like a very serious and urgent consideration. What are the problems at each of the other levels? So in court systems: bias in prediction models and things like that. How is AI entering into that process, or that part of the trial?
Renée Cummings: It's probably frustrating due process, particularly in the judicial system, and it's overestimating risk for black and brown people. So that is really a challenge, given the system itself. In corrections, what we're seeing is that people are now challenging algorithms to gain parole, to gain their freedom, and that really is unfortunate. You've had a few cases where persons who were formerly incarcerated, returning citizens, had to challenge these vendors to get their freedom, to get the parole board to say, you know, we're going to sign that release. So those are really critical things that we're not thinking about when it comes to AI: whether or not this technology is accountable, whether or not it is transparent, whether or not it is explainable. Do we have rights to peep into the black box? If someone is incarcerated, and an algorithm keeps saying that you're a risk when you've done 15 years of good behavior, when you are ready to reconnect with your family and to build back your life, now you have to face an algorithm that is unfair. Indeed, that's why I say that criminal justice must be the conscience of AI: because those are some big decisions, and we can't leave those life and liberty decisions to an algorithm.
Kate O'Neill: Yeah, or even, it seems like, to a set of, you know, software developers and designers who are bringing their own biases into the design of those systems, right? So that is one of the ideas that seems like it needs to be exposed: that the data models, of course, and the biases that go into the data models themselves, must be uncovered even before we can talk about the bias in the designs of algorithms in terms of logic and flow and, you know, rules and so on. So that is part of the discussion that happens, I know, within AI in a larger sense. How often is that part of the work that you're doing within these different pieces of the triad in criminology?
Renée Cummings: Well, it's the work that I'm doing now, and this is where my life is situated: really looking at data, really trying to understand what we are doing with this data, and understanding that if people of color had trust issues with the criminal justice system before AI, how can we trust the data that we're using now? So one of the things that I like to ask, you know, when I look at bias and discrimination and systemic racism in the data, and how it's baked into the data, and how we keep using this data to design tools for the criminal justice system: if you're using data that's been gathered from communities that have been over-policed, of course the data is not going to be as accurate as you want that data to be. But I always ask, you know, when you look at the history in the country of enslavement, when you think about things like the slave codes and the Fugitive Slave Act, or you think of the 13th Amendment, or the one-drop rule, or the three-fifths compromise, and Jim Crow and segregation: how do you remove that from the data? How do you take that history of systemic racism out of that data? And if that history is baked into your data sets, what are you going to produce with that?
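One way to see why data gathered from over-policed communities "is not going to be as accurate as you want," as Cummings puts it, is to simulate the collection process. This is a minimal sketch with entirely made-up numbers: two neighborhoods with the same underlying offense rate, one patrolled heavily and one lightly. The recorded counts then differ roughly threefold, and any model trained on those records learns the patrol pattern, not behavior.

    import random

    random.seed(42)
    TRUE_OFFENSE_RATE = 0.05   # assumed identical in both neighborhoods
    RESIDENTS = 10_000

    # Probability an offense is ever recorded depends on patrol intensity,
    # not on behavior. These figures are invented for illustration.
    detection_by_patrol = {"heavily_policed": 0.60, "lightly_policed": 0.20}

    for neighborhood, detect_p in detection_by_patrol.items():
        offenses = sum(random.random() < TRUE_OFFENSE_RATE for _ in range(RESIDENTS))
        recorded = sum(random.random() < detect_p for _ in range(offenses))
        print(f"{neighborhood}: true offenses = {offenses}, recorded = {recorded}")
    # Both neighborhoods offend at the same rate, but the heavily policed one
    # shows about three times the recorded crime. Train a risk model on
    # "recorded" and the historic bias is baked in, exactly as described above.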
Kate O'Neill: Yeah, that was actually a question I had for you, because I noticed there was this article that surfaced last week in The Root, about a judge asking Harvard to find out why so many Black people were in prison, and they could only find one answer: systemic racism. So while that's not a tremendous surprise, I think, to anyone who's been paying attention, it's a striking thing to have on record. And I wonder, when that kind of data exists in the larger system around us, are there AI approaches and solutions that could help potentially overcome that? Or is it just a matter of governance and oversight, you know, in the development of AI systems, to make sure that that's not furthering and deepening those issues?
Renée Cummings: Well, I think there are many companies right now, particularly at this moment where systemic racism seems to be on the front burner, you know, in every conversation; there are many designers who are presenting tools that are looking for risks, you know, detecting and monitoring and managing risks, and they have bias and discrimination in there, and racism. But it's bigger than that. You've got to go back to the subconscious. You've got to go back to the design consciousness. You've got to go back to the millions of dollars companies have spent on implicit bias training that really has not borne any type of fruit. So you've got to think about that. It's going to take a combination of thinking. It's going to take a new type of consciousness that we've got to develop. It's going to take a lot of due diligence at the design table. It's going to take vigilance, because you've got to be constantly aware; we all have biases, we all have prejudices, so we've got to be constantly aware of these things. But it also calls for diversity and equity and inclusion as a risk management strategy. It's not going to be perfect.
Kate O'Neill: Yeah, there's no way... I was just going to say, there's no way for people to understand the diversity of experiences unless there's a diverse group of people at the table, participating in building systems, right? I mean, that makes sense to me.
Renée Cummings: Diversity in itself just doesn't create an AI system that's perfect, but at least we will know that certain checks and balances have been applied; at least we know that in what we are producing, a certain, or requisite, level of due diligence has been applied. But I think, from being in this area right now, when it comes to diversity, equity, and inclusion in a digital space, I keep asking myself: what is it about diversity that intimidates? What is it about inclusion that breeds fear? What is it about equity that makes us so territorial and tribal? Because we are many years along, but we still seem to need a different kind of formula, a different kind of engagement, to get these things right.
Kate O'Neill: So, you had a story. I heard on one interview, one discussion, you talked about a study that had been done at Stanford. Can you... you seem like you know the study I'm talking about. Can you give us the lay of the land on that?
Renée Cummings: Well, it was just a basic study that looked at implicit bias, and I think what the scholars were trying to do, and this is what they do now in law enforcement as well, when they do this shoot/don't-shoot sort of simulation, was to introduce positive images of Black men, to see the kind of response that you're going to get. So maybe if you were to put up the face of, you know, a non-threatening Black man; maybe, I don't know, Will Smith, right? So they were putting up friendlier faces, trying to see what the impact would be, and they realized that when they introduced these friendly faces, handsome faces, good-looking, you know, young, agile men, young Black men, the perspective changed. But how long did it last? That is what they were saying: it lasted for a week, or two weeks, or three weeks, and then we fall back into our behavior, because that's how the psyche works, and that's how we work. Think about it like a diet: you know, sometimes you start a diet and it's working, and then we see a piece of chocolate cake, or whatever it is that we like, and we slip right back into old behavior.

So this is why I'm saying it calls for a different kind of consciousness. It calls for an eternal kind of vigilance. It calls for that kind of due diligence. It calls for, at the design table, you know, intellectual curiosity and intellectual confrontation, to ensure that those things are part of the risk management strategy. So it's not only designing a risk culture, or encouraging a risk culture, in a tech organization; it's about drilling down deeper into that, and understanding that we've got to ask the tough questions, understanding that the answers are going to make us very uncomfortable. But it is in the uncomfortable answers, in that kind of conflict, that we can really build the systems that we need.
Kate O'Neill: Yeah, I feel very called out about the chocolate cake, though. Especially during COVID times, right? Hey, so our friends at All Tech Is Human have piped in on YouTube, and they ask: is there a visual way we can better understand the concerns around the power of an algorithm to shape our society? And then they elaborate and say: for example, every news story runs with a pic like Terminator, or Pepper, or Sophia, or a handshake with a robot. What's the best way to represent what we're talking about here? Is there a metaphor? Is there a visual that you use or can think of?
Renée Cummings: Well, actually, I usually speak about that, you know: that robot stepping out of the sky, or the two hands that are going to judge. I think we've got to think about how we rebrand that, because so many people are using the technology right now, but they don't know that they are using it. So most people are still waiting for AI to come. AI is here. It is super powerful, super pervasive, and I think people have got to understand that this is something that can have an extraordinary impact on society. I am very passionate about the technology, but passionate about the ethical use of the technology, the responsible use of the technology, AI we can trust. So I think we've got to build images that look as though trust is at the center, that look as though they can build public confidence. Because I think, to ensure AI maturity, we have got to ensure diversity, equity, and inclusion is part of that, or else we're going to have a technology that is very immature. So I think: things that build confidence, things that build trust, and really things that show that each of us has the opportunity to use this technology to do things that are better.
Kate O'Neill: I love that, because it's actually one of the recurring questions that I ask guests from time to time: you know, what makes you most optimistic when you think about tech and the future of human experiences? Are there applications of tech that you get really excited about, and that maybe even fill you with hope in the good that they can do? I want to hear your answer, but I also wanted to note that it sounds like you have already given at least an indication: you know, the idea that AI can help us lead better lives, right? What is it that makes you hopeful about that?
Renée Cummings: Well, it's a very powerful suite of technologies. When you think of artificial intelligence, what it can do when it comes to communication and connectivity, and connecting people who are apart, I mean, that's brilliant. What it could do in healthcare is absolutely amazing, the kind of change that AI can make in healthcare. In just about every industry, AI can have an extraordinary impact that could really create a world post-COVID that, you know, can be much better than the world that we were in before. So I'm very passionate about it, very passionate. But I'm also passionate about doing it right, not creating any harm with the technology. Inclusive innovation, ethical technology: those are the things that I'm passionate about, about AI justice. And my passion is about using AI to create justice at scale, using AI in positive ways in the criminal justice system. You know, I think what we're doing now with it, just trying to catch criminals... you know, it's just the basics. Let's take it to the next level. It's so unimaginative.
Kate O'Neill: Yeah, and that's beautiful. I mean, you really touched me when you said "justice at scale," because that really feels like an important way to think about what's possible. A lot of my work I talk about as human experience at scale, or meaningful human experience at scale, so justice at scale makes complete sense to me. And I would love to push you a little further on that: what does justice at scale look like in applications? What would we see if we saw AI and emerging technologies helping shape a world around us that had justice at scale?
Renée Cummings: I think what we will see is this technology, and technologists working with data-driven technologies, understanding that every data point, every data set, has a certain amount of tragedy and trauma attached to it. We've got to think about that, and that's something I learned working in homicide. You know, you would see the statistics, you know, the homicide rate; but then, behind every individual there was a partner, a family, a generation attached to that. And much of the work that I did with therapeutic jurisprudence, which is pretty much using the law in a therapeutic way, looked at the impact on children: how a child's life changes if a parent or a caregiver is murdered and there is no justice, or if justice is delayed. You know, we don't think about how it impacts generations, and intergenerational trauma in data is something that I am very, very passionate about.
Kate O'Neill: Oh, that's interesting. Yeah, I mean, we hear about intergenerational trauma, but I hadn't thought of intergenerational trauma in data. How do you see that modeled in data?
Renée Cummings: Well, this is what I'm working on now, this is what I'm thinking about, and this is where I am when it comes to understanding that. This is how criminology and criminal psychology and therapeutic jurisprudence, and the work that I've done as a substance abuse therapist working in rehabilitation, fuse with AI: to really think of new ways of ensuring that justice is served in real time, ensuring that children don't become collateral damage, and understanding that we really have got to use this technology in the criminal justice system in more positive ways, looking for positive outcomes, as opposed to what we're seeing right now.
Kate O'Neill: I love the imagination you use with that, but it's fused with empathy, right? And that brings it all right back around to that whole idea that criminology is the conscience of AI, or should be.

Renée Cummings: So that's... everything that I do comes from a place of compassion and empathy, and I think we've got to think about also fusing that with technology.
Kate O'Neill: I love that. So I think, too, about, you know, you talked earlier about surveillance and other deployments of what really is AI at some level. You're the CEO of Urban AI, right? So clearly a lot of predictive policing is deployed through the mesh of city infrastructure, and increasingly the promise of "smart cities," quote-unquote. So how do we go about ensuring that we can reap the benefits of AI and emerging tech in cities, which, you know, I think anyone can agree that there's some benefit that can happen from that, the smart monitoring of resource availability and utilities and sanitation and so on, without further enabling surveillance and broken-windows policing and disproportionate policing of marginalized communities? What's the work that we need to do there?
Renée Cummings: Well, you know, COVID-19 has sort of toppled that smart cities model, and it's now a model that's trying to rebrand at this moment, because so much of cities is going to have to be contactless, and what you're seeing is an escape out of the cities, as opposed to what we had prior. So cities may be a pretty empty place as we move forward. But when it comes to smart cities, and when it comes to that whole concept of urban AI: urban AI is really about bringing AI into an urban space. So there are two things. There's a part that really looks at how we use AI to ensure that resources in cities are equitable; that the individuals who are living in the cities are not just existing there, but have a stake in what the city looks like and what the city feels like. And the only thing that we're going to do in the cities would be surveillance? No. The only application so far, when it came to crime and criminal justice and criminality in the smart city space, was surveillance technology, and it's much more than that. But it's also about understanding the demographics of the cities, and understanding who lives in the cities, and understanding that we've got to empower all people, all demographics. We've got to look, because it's in our cities that we find our disenfranchised, you know, underserved, under-resourced, disinvested, marginalized communities. So what the smart city model was doing was building a city within a city, because it did not include those marginalized communities in its plans. And if you don't invite people to the party, what happens? It's not a very good party.
Kate O'Neill: I mean, yeah, and that's a really important observation. You and I are both in New York City, right?

Renée Cummings: Yeah.

Kate O'Neill: So we're both here in a big city that is, as you say, a little emptied out in the wake of COVID. Or during COVID; I don't like it when people talk about COVID in the past tense. It's like, it's still going on, people.

Renée Cummings: No, it's still here. Right, right.

Kate O'Neill: But I love the idea of thinking about the more creative, imaginative ways that we can use AI and bring that equity to different communities. One of the things that calls to mind is, of course, the big movement right now for defunding and even maybe abolishing police in different communities, and then the reinvestment into community policing, or community, um, you know, management policies...

Renée Cummings: Okay, perfect: community-led policing.

Kate O'Neill: Yeah, community legislation, so I think... and, I don't know, other community resources, right?
Renée Cummings: Definitely. So I think what we need to see, and what I'm hoping to do with some of the projects that I'm working on right now, is empowering. I think what AI also needs is more diverse stakeholder engagement. So far, we've been designing technologies that re-victimize and re-marginalize communities, when you're thinking about the deployment of surveillance technologies into a disinvested community. What we do need is to educate people in communities about all these technologies that are being used against them. They need to understand the kind of digital force that's now being applied against them. One of the things that I have been seeing, when it comes to algorithmic policing, would be the digital subpoenas: doing things like, you know, geofencing warrants and geolocation warrants. We're creating what I call, you know, the digital handcuffs and the digital chokehold. We're doing all of these things with algorithms, but all of it is focused on marginalized communities, disinvested communities, communities that are already culturally alienated, communities that need resources. So what we need to use this technology to do is to really empower communities, because what we've been doing in these early stages of AI would be disempowering communities. What we really need to do is democratize this technology, so people in communities understand how their data could be weaponized against them. People need to understand the power of their data. This is why data advocacy is so important: because data is a part of your civil rights, it's a part of your human rights, and most people are just not there yet when it comes to understanding, as we would say, the social dilemma of data. So people have got to think about that.

But definitely, more community-led policing, which means communities need to decide what type of resources are required in the communities. Because, I mean, I've been in policing for a very long time: it really doesn't make sense, when a community is disenfranchised, when a family is already traumatized, where you have intergenerational trauma and free-floating anger, to apply law enforcement to that. That is not the solution that you're looking for. So definitely community-led policing strategies, as well as greater stakeholder engagement between communities and law enforcement in designing the kinds of strategies that are required.
Kate O'Neill: Yeah, that's a really super important area, and it seems like such a wonderful time for it right now: that intersection is happening between the growth in the discourse around AI and this movement around reforming police, and thinking more creatively about community solutions, and reinvesting in those communities.
Renée Cummings: Definitely, because it's about racial justice, and it's about social justice, and it's really about policing rethinking itself. Every change in policing has been attached to a social movement. So when we think about the war, when we think about Vietnam, and how police had to change their strategies; when we think about the gay rights liberation movement, and Stonewall, and how that impacted law enforcement; and when you think about civil rights, and the movement pre-civil rights, and how that impacted law enforcement: this is impacting law enforcement as well. So law enforcement also has to rebrand, has to rethink, reposition, and understand that you have got to treat communities with dignity and respect, and I think we've got to bring that back into the question.
Kate O'Neill: I love that. That's so important. When you think about regulations and protections, those that are in place now and those that you think are most vital to get into place: which ones come to mind as the most important to protect the most vulnerable and marginalized from wrongful prosecution, whether it be through faulty facial recognition, or flawed algorithmic models that disproportionately identify black and brown suspects, or anything like that? What do we need still?
Renée Cummings: I think what we need at this stage is to do an audit of all the technologies that are out there, when it comes to surveillance and law enforcement, how technologies are being used, because I don't think we have done that audit just yet. I think there needs to be an education, so people have got to be educated about their digital rights, as you're educated about your civil rights and your human rights. You've got to be educated about how AI informs your rights. I think there are frameworks out there that we have been using, but they've been very broad. So we have the GDPR, you know, the General Data Protection Regulation; we have the California consumer privacy law; we have the Illinois one; there are several, you know; there's the algorithmic accountability one. So there are all of these things that are out there, and there are so many: every day it's a new framework, every day it's a new ethical framework that's being applied. But one of the things that I keep realizing from some of these frameworks is that they mention diversity, equity, and inclusion, but they all come from a very Western and Eurocentric perspective. So when you look at the committees, and you look at the groups, and you look at the designers, there's a total lack of diversity, equity, and inclusion. They've become great slogans, you know; everybody's talking about it. But how do we operationalize this in real time? How do we make this change meaningful and sustainable? And how do we do things that speak to a particular kind of truth? I think we're just not there yet.

So when it comes to what's out there, we have got to understand that many of these technologies are being designed, not clandestinely, exactly, but not in the open. You know, there's a lack of... many of them are not done through public procurement, or many of them are being funded by private agencies, private companies, or brands. So we really don't know what law enforcement, what the justice system, or what the companies who are designing for them are coming up with. So this is why it's so critical for individuals to be educated on their digital rights, and to build that kind of public understanding. It's only going to make the technology better.
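As one concrete starting point for the audit Cummings calls for, here is a minimal sketch of a widely used first-pass screen: the "four-fifths rule" borrowed from US employment law, which compares the rate at which each group receives the favorable outcome and flags the tool if any group's rate falls below 80% of the highest group's. The counts are invented for illustration, and applying this rule to criminal justice tools is an assumption of the sketch; a real audit would examine many metrics, plus the data and process behind them.

    # Hypothetical outputs of an algorithmic screening tool:
    # group -> (favorable outcomes, total cases reviewed). Invented numbers.
    outcomes = {
        "group_a": (45, 100),
        "group_b": (28, 100),
    }

    rates = {g: fav / total for g, (fav, total) in outcomes.items()}
    best_rate = max(rates.values())

    for group, rate in sorted(rates.items()):
        impact_ratio = rate / best_rate
        verdict = "FLAG for review" if impact_ratio < 0.8 else "ok"
        print(f"{group}: favorable rate {rate:.0%}, "
              f"impact ratio {impact_ratio:.2f} ({verdict})")
    # group_b's impact ratio here is 0.62, well below the 0.8 threshold.
    # That is a signal to investigate the tool and its training data,
    # not a verdict on its own.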
Kate O'Neill: Yeah, I agree. That makes total sense, and it's long overdue. And I love your point that with so many of, you know, GDPR and California and all of the other things, there is, as you say, this kind of glancing blow at diversity. Like, oh, let's include diversity in there, but it's not...

Renée Cummings: Everybody's talking about it now.

Kate O'Neill: Yeah, everybody's talking about it, but it's not oriented from there. It's not saying, hey, let's make sure that this is a truly inclusive framework, and start from there, and then build out, then design out from there.

Renée Cummings: Definitely.
Kate O'Neill: Yeah. So, that's beautiful. You know, a lot of your work seems to have to do, and even your background includes psychology and so on, so it seems to have to do with understanding sociology and neuroscience and so on. With so much focus at the moment within companies and organizations on trying to improve diversity and inclusion and reduce biases, and you also made a reference a little while ago to how, you know, some of that bias training has been ineffective: what do you find about whether we can unlearn bias and privilege and even racism? And what do we need to do to do that? What's the emphasis that we need to put forward in society for that?
Renée Cummings: It's not easy. No, it's certainly not going to be easy. It's like unlearning violence, and that's where I work, you know, really trying to bring people back, people who are using violence, children who are using violence. But it comes down to exposure, and one of the things we say when it comes to violence is that the greatest predictor of someone going on to use violence in their adult life is early exposure to violence. So think about it when it comes to things like systemic racism: it really goes back into the individual, the family, the mind. And I think what we've got to do, again, particularly, is understand that a half-day course in implicit bias or subconscious bias is not going to do it. It's not going to do it. Or just saying that I'm not racist, or saying that I'm anti-racist, or just saying things: we've got to start to see action, because now there's a lot of talk. There's a lot of talk in all these companies; everybody wants to invest now in Black excellence, and that's a great thing, because there's a lot of Black excellence to invest in. But what we really need to see is the systems changing. We really need to see the organizations changing. We really need to see the country thinking in a different way as well. It's hard work, but I think we all can do it. I'm very optimistic. I'm very optimistic. I think what we need is a little more imagination. We've got to look at the imagination of the individuals who are designing, and how implicit bias informs the subconscious imagination. I think there are ways in which we can really deconstruct and rebuild, but I think, as I say, we've got to be bold enough to have those uncomfortable conversations, and we've got to be brave enough to create room at the table and invite people who don't look like us to sit, discuss, build together.

Kate O'Neill: Make it a better party, as you say.
Renée Cummings: Right.

Kate O'Neill: I'm all for a better party. What seems like one of the complementary issues there is that of representation, right? So, people too often seeing people who look like them imprisoned or arrested, or depicted as being imprisoned or arrested; it seems like that must do a number on people's psyches and identities, I would imagine.
Renée Cummings: Well, it's more than that. You know, everything is attached to history, and when you look at some of the ads, the advertisements from the time just after enslavement, emancipation, and the Fugitive Slave Act, the kinds of ads that were created when an African American individual ran away from the plantation, and you look at the kinds of ads that were put out there: there are certain images that were created, certain stereotypes of individuals that were created. So I think there's a stereotype sometimes in people's minds that when we see Black or brown, we see criminal. And for a lot of people, that changes, particularly if you have diverse experiences, particularly if you've traveled to other countries. I mean, I am originally from Trinidad and Tobago, so I come from a country that is very diverse, a country that's very cosmopolitan. In my own family, I'm a mixture of Afro-Trinidadian and Asian-Trinidadian, so I have a history of my family having lives before in India as well as in Africa. So I come from a very diverse perspective, which makes me understand the beauty, and I see beauty in all things. So I think we really need to embrace a more multicultural perspective. And I always believe in honesty, and sometimes honesty hurts, but you have to go through that process. You've got to go through that process, and this is why I'm saying: empathy, compassion, in AI. This is why I'm saying we've got to reimagine AI ethics. Some of it looks really good on paper; it sounds really exciting on paper. But what you're seeing at the companies, what you're seeing on the advisory boards, what you're seeing, is not diversity, equity, and inclusion. So we really, you know, let's start from a place of honesty, and I think we can build from there.
Kate O'Neill: Yeah, that's fair enough. It also seems like, you know, there's a wide range of places where representation can be improved, right? Like VR, we talked about this in a recent episode; gaming, I think I heard you talk about games and, you know, the difference in representation between the developers versus the consumers and players of those games. You had a great quote, something about "why can't a Black man or a Black boy ride a dragon?" I think that was...
Renée Cummings: Well, that was something that I actually read in a magazine; I had quoted that. And it really was, you know, what about a Black superhero? I mean, we saw a Black superhero in Black Panther; you know, bless his soul, may he rest in peace, Chadwick. But, you know, when are we going to get some more of those? And really, it's about the imagination, and this is what I keep saying: what informs the imagination? Does implicit bias inform the imagination? So if we're imagining, if we're using this great technology to design something that is new, why do we keep replicating old stories, biased traditions, old ways of thinking and operating, that we are using new technology to create?
Kate O'Neill: Yeah, and it seems like, especially, you know, we have evidence now. I mean, Black Panther was such a highly grossing film, and the Spider-Verse movie did really, really well, and there have been so many great examples of, you know, if you build it, they will come, right? Like, we need to actually just acknowledge that there's a market dying for it, and plenty of multicultural consumers will be out there. So hopefully, fingers crossed. You know, you talked earlier about so much of the emphasis with AI and criminology being on policing and communities and street crime, but I also noticed that when we think about that intersection of AI and crime, there's also the angle of maybe new crimes that can be committed through the interventions of technology, right? So identity fraud, cybersecurity, financial schemes, and so on. I actually heard you use the expression "crimes of the suites," in contrast to "crimes of the streets," which is brilliant; I love a little wordplay. Do you spend much of your time in that space, examining, you know, whatever types of new crimes AI could help facilitate, and helping build that sort of infrastructure to deal with that?
Renée Cummings: Well, actually, I've not been doing a lot of that right now, but it's something that I had done, you know, I've done in the past. So I do look at new crimes; it's something that I'm just doing on my own, looking at new crimes. But also, I did a lot of work on investigating white-collar crime, and the psychology of white-collar crime, and profiling white-collar criminals. And this is why I'm saying that so many of the tools that are being designed are all focused on the street, you know, the low-hanging fruit. Let's use the technology to really look at other spaces where crimes are happening. And of course, you know, there's much work happening in cybersecurity, looking for those bad actors. But there are other things that we can do when it comes to using the technology, and not just investing in, you know, over-policing communities that have been so traumatized by law enforcement in the past. There's so much more money to be made from prosecuting those big, big crimes anyway.

Kate O'Neill: But you wouldn't think that, right? One is just on the streets. Yeah, it really does... it's like one note; it doesn't play.
Now, it seems like also, with the release of The Social Dilemma, as you alluded to earlier, a lot of people are chattering about tech addiction and how we need to break our harmful relationships with social media. But I've also heard a number of compelling arguments from underrepresented communities about how important social media has been in offering access and reach and amplification to, you know, voices who might not have otherwise had a platform. So do you have strong feelings about the healthiest framing here, of the balance between those sorts of pulls, that dichotomy?
48:42
well it’s about balance it’s definitely
48:44
about balance but i think what we saw
48:46
in particular when it came to the social
48:48
dilemma
48:49
would be a different kind of form of
48:51
enslavement digital enslavement that’s
48:53
what
48:53
that was presenting but we also saw
48:56
undue influence
48:57
and manipulation and exploitation and
49:00
grooming
49:01
of children and that’s a particular kind
49:04
of abuse so
49:05
when you’re looking at designing these
49:07
ethical frameworks you’ve got to really
49:09
look at the
49:10
impact of technology that level of undue
49:12
influence that technology has
49:14
but of course it’s very i mean social
49:16
media is very powerful
49:18
uh and it’s done extremely uh positive
49:20
things so
49:21
in everything it’s about balance it’s
49:23
about bringing the requisite kind of
49:25
balance this is why i’m saying that
49:26
um it’s about data sovereignty it’s it’s
49:29
about
49:30
that and it’s about data ownership as
49:32
well and as much as we have
49:34
conversations about civil rights and
49:36
about
49:36
human rights we’ve got to start to have
49:38
conversations
49:39
about digital rights data is a new
49:42
language that we are using
49:44
uh most of us uh you know speak whatever
49:47
is the language of our mother tongue we
49:49
speak that
49:50
but we’re also speaking another language
49:52
and we’ve got to look at data and
49:53
algorithms
49:54
as a new language and we’ve got to
49:56
create the requisite uh kind of literacy
49:58
programs
49:59
to empower people so they can understand
50:01
what’s going on with their data
50:03
yeah it’s like in that metaphor i guess
50:04
you know the language is being
50:06
spoken all around us
50:07
and some of us have the privilege of
50:09
understanding that language but many
50:11
many people
50:12
who are subject to the whims of that
50:14
language do not speak that language do
50:16
not understand that language and haven’t
50:18
had it explained to them so
50:20
that seems like a really important point
50:22
i think what all tech is human was
50:24
asking for earlier with a sort of
50:26
a visual there’s a metaphor
50:28
that seems like it’s a really useful one
50:32
yeah so i also noticed a line in one of
50:34
your bios that described what you do as
50:36
using
50:36
ai to save lives which is a beautiful
50:39
turn of phrase
50:41
i wondered about the model that you
50:43
conceive of or the framework in your
50:45
mind as you think about that i often
50:46
refer to the united nations
50:48
sustainable development goals as a model
50:50
for what i think ai can align with
50:52
in trying to create a better world do
50:54
you use any different
50:55
framework or taxonomy of the ways in
50:56
which ai can be used to save lives
50:59
well i use the taxonomy of the
51:01
streets as a criminologist when it comes
51:04
to saving lives and saving lives is not
51:06
just about
51:07
life and death there are some lives that
51:09
are existing right now
51:10
that need to be saved lives that have
51:12
been impacted by the criminal justice
51:14
system
51:15
lives that have been impacted by
51:17
homicide and by violence
51:19
and by trauma and intergenerational
51:21
trauma there are lots of people walking
51:23
around
51:24
with a lot of pain and those lives are
51:26
lives that feel as though
51:28
they’ve been lost so yes i think about
51:31
ai within the concept of doing no harm
51:33
thinking about things like autonomous
51:35
vehicles and autonomous uh
51:37
weaponry and how these things could
51:39
impact and cause death
51:40
that’s real that’s real and but i also
51:43
think about trauma
51:44
and and saving lives from trauma and not
51:47
re-traumatizing communities and and
51:50
continuing to re-victimize communities
51:52
with data
51:53
and this is where i come back to uh
51:56
finding ways to now look
51:58
at intergenerational trauma in data
52:01
and how do we remove that trauma how do
52:04
we create
52:04
a risk assessment tool that looks at
52:07
that and that’s why it’s so important so
52:09
definitely using ai to save lives and
52:12
really using ai
52:13
to build better futures yeah that gives
52:15
me chills on the back of my head that’s
52:17
such an
52:17
it’s such an imaginative uh sort of
52:20
combination of the idea of that
52:21
intergenerational trauma and the idea of
52:23
data and technology and trying to find
52:25
you know what is a pattern that can be
52:28
understood and how can we
52:30
how can we manage this and then pull
52:31
this out and uh that’s
52:33
it’s brilliant and so i’m so impressed
52:35
with you that’s where i’m
52:37
doing my work right now and that’s where
52:38
my mind is right now
52:41
well you know one other question i have
52:44
for you is
52:45
when you think about what we could do in
52:47
culture to stand a better chance of
52:49
bringing about the best futures
52:51
with technology rather than the worst
52:53
futures you know everybody seems to love
52:54
this kind of framing of dystopia versus
52:56
utopia and
52:57
i think we’re in alignment on the idea
52:59
that it’s it’s not
53:00
either or it’s kind of always both and
53:02
you you have to do the work
53:04
to steer it toward the best outcome so
53:06
what what do you think we can do in
53:08
culture
53:09
to to bring about those better those
53:11
better futures
53:13
i think we’ve got to understand uh of
53:15
course uh representation
53:17
is critical we’ve got to amplify our
53:19
voices
53:20
all voices need to be heard i believe
53:23
always in collective
53:24
responsibility and individual
53:26
responsibility but
53:28
also it it really calls for
53:31
greater stakeholder engagement and
53:33
that’s something that i’ve not been
53:35
seeing
53:35
with ai i keep saying that ai speaks to
53:38
ai
53:39
data scientists speak to data scientists
53:42
and that’s what we’ve been seeing
53:43
so we definitely need a more
53:46
multi-disciplinary
53:47
approach we need more criminologists
53:49
more social workers more educators more
53:51
psychologists
53:52
we need to bring different types of
53:56
artists different types of musicians you
53:59
know
53:59
into the conversation because this is a
54:03
technology as i said to you i’m so
54:04
passionate about it
54:06
its power its potential its
54:07
pervasiveness and its promise
54:10
its extraordinary promise but we’ve got
54:13
to ensure that promise is for
54:14
all and we’ve got to look at the
54:17
inherent uh privilege
54:19
and power in this technology and we’ve
54:22
got to
54:23
try to find more ways to do more
54:26
diverse
54:28
stakeholder engagement in ai
54:31
and we’ve got to bring more diverse
54:33
voices in the technology
54:35
and i think because of the lack of
54:38
diversity
54:39
and inclusion i i think uh
54:42
you know the imagination of ai
54:46
is being cheated
54:49
yeah that i like the way you put that
54:52
because it does feel like to me some of
54:53
the things that
54:54
we’ve talked about here today and that
54:55
i’ve heard you speak about since i’ve
54:57
been
54:57
following your work are far more
54:59
imaginative and creative than i hear
55:02
from any other expert in
55:05
any other circle
55:06
and i think it does probably stem from
55:08
you know this
55:09
reorientation you saying criminology has
55:13
to be the conscience of ai
55:14
and what i feel like you mean and this
55:16
may not be correct but i’m going to
55:18
tell you what i
55:19
hear is the idea that we need to
55:23
really understand the humanity of those
55:26
in the criminal justice system is that a
55:29
fair way to characterize that
55:30
we’ve got to we’ve got to and how do
55:33
they enter the criminal justice system
55:35
how do they enter and why is it that a
55:37
particular group
55:39
enters faster than any other group and
55:41
spends more time
55:42
in there why is that what is wrong with
55:45
the system what is wrong with society
55:48
and i think we don’t understand the
55:51
impact of
55:52
trauma but we experience it
55:56
we don’t understand how it impacts a
55:59
generation
56:01
how it impacts a generation and how many
56:04
people in a family
56:08
become victims of that experience and i
56:12
think
56:12
one of the things that we’re seeing yes
56:15
when it comes to ai
56:16
is that you know we think that somehow
56:19
uh criminal justice and persons who are
56:21
incarcerated
56:22
and persons with a propensity or
56:24
proclivity to
56:26
you know to do crime or people who’ve
56:28
done
56:29
crime before or people who have a
56:31
criminal record people who are known to
56:33
the police
56:33
somehow it makes them open season and it
56:36
doesn’t
56:36
it doesn’t make them open season to
56:38
create technologies
56:40
uh to further uh
56:43
traumatize or victimize those
56:46
communities it doesn’t
56:47
and i think we’ve got to think about
56:49
that we’ve got to think about that and
56:51
when we look at the criminal justice
56:53
data it’s data that
56:54
i’ve worked with for many years how do
56:56
you enter a gang database
56:58
how do you get into that you could be in
57:00
the community you could be on the street
57:02
you could be at the street corner
57:03
where there’s an arrest done uh police you
57:06
know there’s sometimes
57:07
there could be victimization included in
57:10
that data
57:11
mistaken identity how does a 10 year old
57:14
or 11 year old
57:15
end up in a gang database or the fbi
57:17
database maybe we’ve got to think about
57:20
how we’ve been collecting criminal
57:22
justice data
57:23
and if we’re using that to design
57:25
technology what are we going to get
57:28
yeah yeah i i think
57:31
what i wanted to also point out or
57:33
include here is a link to our peers
57:35
our colleagues over at
57:37
all tech is human have published this
57:40
responsible tech
57:41
guide in which you are featured and i am
57:44
but you are
57:45
you’re profiled in this uh in this guide
57:48
and i think there’s gonna be some value
57:50
for uh
57:52
for our listeners for our audience here
57:54
to
57:55
to look into this and understand a wider
57:58
array
57:59
of you know the kind of brilliance it’s
58:01
a brilliant document i call it a jewel i
58:03
think david has given us a gift
58:05
and i really appreciate the work that
58:06
he’s been doing and his great team at
58:08
all tech
58:09
is human and we’ve got to think about
58:11
responsibility
58:13
individual collective responsibility
58:16
what is responsible tech and you’ve got
58:18
to think about accountability and
58:20
transparency and explainability and
58:22
auditability and all of these great
58:24
things
58:25
and you’ve got to always remember due
58:27
process
58:28
due process is so critical when it comes
58:31
to data
58:32
and ai and duty of care when it comes
58:35
to data and ai and this is where i come
58:38
from
58:39
that that space of understanding that
58:41
the people have rights
58:43
yeah people deserve justice it feels
58:45
like too when you’re talking about that
58:47
the the one thing that
58:48
i think it was even mentioned in the in
58:50
the movie 13th uh
58:52
that um prisoners’ rights are sort
58:55
of the last
58:56
frontier of human rights you know
58:57
there’s so many as you said people seem
58:59
to
59:00
write off people who have been
59:01
incarcerated or who have been through
59:03
the criminal justice
59:04
system and it seems it’s so critically
59:06
important that we
59:07
move our minds into that space to be
59:09
able to
59:10
accept the full humanity of every human
59:13
every person
59:14
in the criminal justice system and
59:16
beyond and if we’re talking about
59:18
rehabilitation and if we’re talking about
59:20
restorative justice and
59:21
and reentry and these individuals are
59:23
returning citizens and what do they get
59:25
when they return i mean
59:26
we’ve got to come from a place of
59:28
compassion and my criminology comes from
59:30
a place
59:31
of compassion because i have seen pain i
59:33
have seen pain in the criminal justice
59:35
system and when you
59:36
work with children in the criminal
59:38
justice system
59:39
it really brings you to a place of of
59:41
asking
59:42
certain questions and when i see
59:44
something like an
59:45
algorithmic decision-making system being
59:48
used
59:48
when it comes to juvenile justice and
59:51
being applied to
59:52
who’s deemed a juvenile delinquent and
59:54
who gets the chance and who has to go
59:56
through the criminal justice system
59:58
you don’t want to put someone through
59:59
the criminal justice system what you
60:01
want to do
60:02
is reduce contact with the criminal
60:04
justice system
60:05
and those are the things we need to be
60:07
thinking about using ai to do
60:09
not to over-surveil communities and
60:11
to use facial recognition to misidentify
60:14
and mistakenly identify and use
60:16
technologies to
60:17
trap and to incarcerate um yes those
60:20
things
60:21
are there as part of the legal system
60:24
but
60:25
let’s use our imagination use our
60:28
imagination
60:29
i love that it’s a very boring
60:31
interpretation of it if you just do it
60:33
that way i love your interpretation i
60:35
think that’s beautiful
60:36
i have on the screen the link to
60:39
responsibletechguide.com
60:41
and since this will be an audio podcast
60:42
that will be helpful for people
60:44
uh please uh feel free to go and check
60:46
that out read
60:47
uh renee’s profile it’s brilliant she’s
60:50
been interviewed in that and featured
60:52
and it’s it’s
60:53
it’s gorgeous and i think you’ll find
60:55
just as much inspiration
60:57
there as you have from hearing her here
60:58
today although
61:00
i’m honored that i got to have this
61:01
conversation with you directly
61:03
so last question is how can people find
61:05
and follow you
61:06
and your work online well i’m on all
61:09
social media platforms most people reach
61:11
out to me uh via linkedin uh
61:13
linkedin uh twitter instagram
61:17
facebook wherever i always respond and i
61:20
i do uh some mentoring uh with uh
61:23
you know women in ai ethics another
61:25
great organization that has been
61:26
bringing a lot of diversity equity and
61:28
inclusion
61:29
to the space and another organization
61:31
that has been celebrating of course
61:33
black women in ai which is
61:35
uh so important black and brown women in
61:37
ai which is really critical at this time
61:39
so
61:40
uh yeah so whenever they reach out i
61:42
always respond so
61:43
wonderful you you heard it here people
61:45
you can reach out
61:46
uh do your own interview bring renee to
61:49
more stages and more platforms because
61:51
this creativity needs to be heard i
61:53
i love the compassion and the creative
61:55
approach
61:56
that you bring to this and obviously
61:57
your passion it’s it’s so clear what
62:00
comes through so
62:01
i want to thank you very very much for
62:02
joining me here today
62:04
and thank you to all our viewers and to
62:06
our listeners on the podcast
62:09
uh everyone i hope that you have a
62:11
beautiful week and weekend ahead
62:13
and thank you again renee thank you for
62:16
the extraordinary work that you’re doing
62:17
it was an honor being with you
62:19
thank you very kind thank you bye-bye
62:21
everyone
62:23
bye-bye