The Tech Humanist Show: Episode 5 – Dr. Safiya U. Noble

About this episode’s guest:

Dr. Safiya Umoja Noble is an Associate Professor at the University of California, Los Angeles (UCLA) in the Department of Information Studies where she serves as the Co-Director of the UCLA Center for Critical Internet Inquiry. She is the author of a best-selling book on racist and sexist algorithmic bias in commercial search engines, Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press).

She tweets as @safiyanoble.

This episode streamed live on Thursday, August 13, 2020. Here’s an archive of the show on YouTube:

Podcast Highlights:

2:00 What has it been like in your life and work to have authored a category-defining book?
4:16 how the conversation has changed
6:57 career arc
7:06 theater!
09:09 influences
10:55 audience question: when you’re teaching on this, what activities resonate with your students
16:36 “what the humanities and social sciences do is they give you a really great vocabulary for talking about the things you care about and for you know looking at them closely”
17:36 algorithms offline?
19:38 what is the Center for Critical Internet Inquiry at UCLA doing? (site: c2i2.ucla.edu)
20:17 big announcement!
29:07 the challenges for companies that want to address the oppression in their own tech
47:56 what makes you hopeful? (BEAUTIFUL answer)

About the show:

The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience. Hosted by Kate O’Neill.

Subscribe to The Tech Humanist Show, hosted by Kate O’Neill, on YouTube for updates.

Full transcript:

02:06
all right hi everyone
02:08
i hope we’ve got some folks we do have
02:10
some folks already turning out
02:12
uh give me a little comment or thumbs up
02:14
and make sure we can hear the audio
02:16
since we have had some
02:18
recurring issues with audio
02:22
let’s get some comments going in from
02:25
folks say hi
02:26
i want to hear who’s out there and make
02:29
sure we’re
02:30
we’re coming across so um we’ll get
02:33
rolling in just a moment here
02:36
as soon as oh we’re getting more folks
02:39
some signals starting to come in
02:44
glad to see folks turning up i know
02:46
you’re all excited
02:47
i’m excited too i’m about to introduce
02:51
the guest here in just a moment but um
02:54
yeah just make sure let me know you can
02:56
hear me
02:57
little thumbs up that thumbs up
03:03
ah thank you john frater thumbs up all
03:05
right yay
03:07
hi sebastian thank you for being on i’m
03:09
glad that you all are hey
03:11
melton all right everybody’s turning up
03:13
hey we got you guys are coming in from
03:14
linkedin live i’m so glad that um we
03:16
just got that set up today
03:18
so it’s exciting that got a new channel
03:20
added to the live stream
03:22
all right and mark bernhardt says he’s
03:24
tuning in from wisconsin all right we’ve
03:26
got we’re everywhere right now
03:27
chris mclean thanks for the thumbs up
03:29
we’re gonna go ahead and get started
03:30
then
03:31
i’m glad you guys can hear me i’m glad
03:33
that sounds coming through and
03:35
we’re good to go so
03:39
pull up my notes here
03:43
all right i know that you all are
03:44
excited because the moment the
03:46
announcement went out
03:47
about today’s guest there was
03:50
so much uh joy and so much excitement
03:53
and everybody was
03:54
retweeting and sharing it like crazy so
03:56
i i know you all are excited to hear
03:58
from
03:59
dr safiya umoja noble who is an
04:01
associate professor
04:02
at the university of california los
04:04
angeles ucla
04:06
as it’s better known in the department of
04:08
information studies where she serves as
04:10
the co-director of ucla center for
04:13
critical
04:13
internet inquiry she’s the author of a
04:16
best-selling book on racist and sexist
04:18
algorithmic bias in commercial search
04:20
engines
04:20
algorithms of oppression how search
04:22
engines reinforce
04:24
racism that’s from nyu press and we have
04:27
Safiya with us
04:29
right now hi safiya you’re on thank you
04:31
so much for being here
04:33
hi kate it’s so great to see you and i’m
04:35
so happy to be here
04:37
wow it’s such a thrill and it’s such a
04:39
fun thing that everybody got real
04:41
excited you know i think it’s just
04:42
everybody knows that these are important
04:44
conversations and then of course to have
04:47
people
04:47
you know who i i’m able to bring some
04:49
folks into the conversation uh
04:52
in these last few weeks who are leading
04:54
this conversation truly and you have
04:55
been leading this conversation
04:57
you came out with this book what is it a
04:59
couple years ago now right
05:03
what was it 2018 uh
05:06
yeah yeah but you know it has been
05:10
uh it was a work in progress for a long
05:12
time before that
05:13
yeah i know it must have been because
05:15
you even say in the book that you know
05:17
you were tracking these issues
05:18
for a few years uh and so it’s it’s one
05:22
of the first i think that comes to mind
05:23
for people when they think
05:24
about algorithmic bias so that’s really
05:28
cool what’s that how’s that been like
05:29
for you in your own life since you wrote
05:31
sort of the best-selling category
05:33
defining book on a topic like what’s
05:35
that done for you in your in your life
05:37
and your work
05:38
it’s really been uh shocking to be
05:41
honest
05:42
because when i was writing the
05:45
dissertation
05:46
which eventually a version of that
05:48
turned into this book
05:50
i there was very little agreement in the
05:52
world uh
05:53
or with me that technologies could be
05:56
racist and sexist i mean there were
05:58
people like
06:00
wendy chun who is you know hero of mine
06:03
she’s at
06:03
simon fraser university who were writing
06:06
about the implications of software
06:08
and um you know there were like a
06:11
handful of women of color
06:13
really lisa nakamura um
06:17
anna everett kind of talking about all
06:19
these different cultural dimensions of
06:21
technology
06:22
but when when it got to talking about
06:25
that the code
06:26
could in fact be implicated in
06:28
structuring
06:29
racism and sexism there was really very
06:32
little agreement that that was even
06:33
plausible and so it’s
06:35
it’s strange to go from uh
06:38
writing about it a decade ago in you
06:41
know in the face of no agreement and
06:43
really having a hard time
06:44
getting people to be in a conversation
06:48
with me about that to now
06:51
you know i mean i’m at the beauty shop
06:54
and i meet somebody and they ask me what
06:56
i work on and i say oh
06:57
you know i do i study and research
07:00
racist and sexist
07:01
algorithmic bias and discrimination and
07:03
they’re like oh yeah let me tell you
07:04
about this
07:06
[Applause]
07:07
okay show me about it so
07:10
in that way it’s really it’s joyous to
07:13
to know that
07:15
there are a lot of people who want to
07:17
have this conversation now and i’m
07:19
really
07:20
um grateful because there’s the
07:22
implications of
07:23
of what’s happening is so dangerous and
07:26
it’s so important
07:27
that many many many people need to be
07:29
thinking about it and talking about it
07:31
yeah i mean i’d argue everyone who
07:33
especially who works in technology but
07:36
beyond technology too because we’re all
07:37
affected by it
07:39
certainly yeah and so what have you seen
07:41
change in the industry since then since
07:43
2018 your book comes out
07:46
you know literally the conversation’s
07:48
changing you helped change that
07:49
conversation
07:50
so now how do you feel like that
07:54
discourse
07:54
exists what’s changed about that well
07:57
this is the interesting part
07:59
that i think the conversation
08:03
has become more mainstream in the tech
08:06
sector let’s say and also maybe a little
08:08
bit more
08:09
in academia which is of course deeply
08:11
tied to the sector
08:13
uh but i’m noticing that
08:16
the real uh critique of
08:19
the implications of the work is getting
08:22
kind of defanged and de-politicized
08:24
and so how you see that now is that
08:26
people are talk instead of talking about
08:29
algorithmic discrimination or oppression
08:31
which are
08:32
what kinds of words i use people are
08:34
talking about things like bias
08:36
um and i think one of the things that
08:39
that does
08:40
is it really um
08:43
you know it neutralizes the power of the
08:46
critique
08:47
by kind of devolving it into a
08:50
set of arguments that you know everybody
08:53
is biased everything is biased
08:56
and that’s not helpful when we’re
08:58
talking about
08:59
the implications of like life or death
09:01
technologies
09:02
and also these things are structural
09:05
they’re not just
09:07
um you know living at the individual
09:10
level of how a coder
09:12
thinks right or how a programming team
09:14
thinks or
09:15
how an engineering team is oriented
09:18
and i think that is one of the things
09:20
that’s really changed in the last 10
09:22
years is that now
09:24
people talk about ai and ethics but they
09:27
want to talk about it
09:29
i think sometimes in a very thin
09:31
register and
09:32
so you know we have a lot of work to do
09:35
and this is like many
09:36
many kinds of movements in the world
09:38
where um
09:40
you know you’re trying to argue
09:43
about how health disparities
09:47
impact women differently and more
09:49
powerfully how
09:51
women are more likely to be let’s say
09:53
you know um
09:54
victims of um cancer reproductive
09:57
cancers
09:58
and then next thing you know like a pink
10:00
ribbon is on
10:01
an iced tea bottle and you’re like
10:03
that’s that’s actually not what we’re
10:05
trying to talk about that’s all we’re
10:07
gonna do
10:08
that’s not it so i think that’s the
10:11
thing to be watching for right now
10:13
is the way in which these conversations
10:16
are increasingly being
10:17
de-politicized that’s such an important
10:19
insight and you so you had a journey
10:21
yourself getting to
10:22
this level of awareness right you came
10:24
from advertising
10:26
and to come into academia and to be so
10:29
fully entrenched and aware
10:30
of what’s going on within that space how
10:33
did you
10:33
make that transition how did you get
10:35
from where you know you were working in
10:37
that advertising world to where
10:39
you know you’re on the front lines of
10:40
making sure that people
10:42
recognize the weaponization and the the
10:44
oppression that’s happening there
10:46
yeah well you know when i was an
10:48
undergrad i studied
10:50
sociology and i was really into kind of
10:52
the social sciences and humanities i did
10:55
theater
10:55
i was like a theater nerd too and i knew
10:58
we were going to be good friends
10:59
[Laughter]
11:02
i’m like i you’re getting me to tell all
11:04
my secrets now
11:07
okay so i i understood the power of
11:10
things like art
11:12
and also statistics to stories
11:16
and i was um i went into corporate
11:18
america really
11:19
idealistic about the kind of change i
11:22
felt that people who were
11:24
activists and um really like coming out
11:27
of the fields of ethnic studies women’s
11:29
studies were the people who could really
11:30
make the difference in corporate america
11:32
and i was motivated and oriented that
11:34
way and
11:35
you know um yeah
11:38
corporate america can really dull your
11:40
knife you know if you
11:42
aren’t careful and so by the time i was
11:45
leaving the industry of course i was
11:47
um i had been on the internet since
11:49
probably i don’t know 88 or 89
11:52
and i was seeing all the changes that
11:55
were happening
11:56
inside the industry and when i went back
11:59
to school which is where i really felt
12:00
like i belonged and kind of the place
12:02
i’ve always felt at home
12:06
i just saw what a disconnect there is
12:08
between
12:09
how people are making in corporate
12:11
america
12:12
and how products and services come to
12:14
market and how
12:16
academia is sometimes lagging woefully
12:19
behind
12:20
um in in an assessment and so that’s
12:23
kind
12:24
of it was just like i don’t know two
12:26
worlds kind of colliding and i felt like
12:28
in academia though i would
12:29
have the space to interrogate
12:33
and maybe make a difference about some
12:35
of the things i was seeing in a way that
12:36
i
12:36
felt i couldn’t do when i was in my
12:39
corporate job
12:40
but it’s such an interesting background
12:41
to bring to it because you are able to
12:43
bring
12:44
uh such a truth-telling you know clarity
12:47
to to the work that you’re doing where
12:49
you can cut through and say
12:51
this is not just a search engine this is
12:52
not just a social platform it is an
12:54
advertising platform and that’s
12:56
fundamentally what’s what’s happening on
12:58
what’s underlying
13:00
you know the the complications of the
13:02
matter right and that seems like that’s
13:03
part of what makes
13:04
your work so so potent is that you have
13:07
that perspective
13:08
yeah well you know there were so many
13:10
people who influenced me when i was in
13:13
graduate school that i was reading i
13:14
mean i read this
13:15
book you know siva’s the googlization of
13:18
everything
13:19
really had a huge impact on me because i
13:21
felt like he was
13:23
writing what was happening what i was
13:25
witnessing and of course his you know
13:27
his new book um anti-social media you
13:30
know i i felt like there were people who
13:32
were writing i mean
13:33
frank pasquale there were a lot of men
13:36
quite frankly who were writing
13:37
um in ways that touched me and
13:41
yet i felt that there was this dimension
13:43
of like race and gender that was kind of
13:45
missing
13:46
or you know the way in which scholars
13:48
sometimes write
13:49
in a universal kind of paradigm about
13:52
you know what happens to society
13:54
but of course my own life living in
13:57
society you know i know that there are
13:58
many worlds that are very different many
14:00
realities that are different
14:02
i occupy different worlds and realities
14:04
from
14:05
from people who aren’t black and people
14:07
who aren’t women
14:08
and so that was the part i thought i
14:11
could contribute
14:12
um kind of in dialogue with all of these
14:15
other amazing brilliant people who
14:17
laid the groundwork but i think you know
14:21
uh again sometimes when women and people
14:25
of color when we’re pushing a boulder up
14:27
a mountain which is what it feels like
14:30
you know you’re not sure if that
14:31
boulder’s going to roll back on you and
14:32
crush you
14:33
or if you’re actually going to get it
14:34
over and kind of you know
14:37
get some like momentum and you know
14:39
velocity and so
14:40
i’m really grateful that i could kind of
14:42
get this conversation
14:44
up over the mountain yeah i think a lot
14:46
of us are grateful too
14:48
first of all uh nicole radziwill says
14:49
nice shout out to siva
14:52
so yay everybody’s appreciating that i
14:54
have a question from
14:56
dr siv brown who says when you’re
14:57
teaching on this topic what activities
14:59
resonate most with your students and
15:01
audience
15:03
well that’s such a great question okay
15:06
so
15:07
on a basic level around search i’ll
15:09
often have my students
15:11
do their own searches on google
15:14
and you know other large commercial
15:19
search engines and i’ll ask them to look
15:22
for identities that matter to them that
15:24
are kind of
15:25
like either their own identities or
15:27
identities they care about
15:29
um and it’s interesting to see the kinds
15:31
of searches they do so
15:32
i say you know go do these searches come
15:34
back next class session we’re gonna
15:36
discuss everybody’s gonna get a chance
15:37
to talk about what they found and what
15:38
it means to them
15:40
and you know for some people they you
15:42
know i will never forget what i was
15:44
teaching at the university of illinois
15:46
at urbana champaign
15:47
i had just like a disproportionately
15:50
high number
15:52
of white women in predominantly white
15:55
sororities
15:56
and i think almost all of them had done
15:58
a search on
16:00
sorority girl and were pissed
16:04
[Applause]
16:05
so where i don’t think like necessarily
16:08
that like searches on black girls and
16:10
latina girls and asian girls that would
16:12
surface porn
16:14
um at the in those years um
16:17
meant that much to them or like they
16:19
couldn’t think that they were
16:19
sympathetic but not empathetic when they
16:22
looked
16:22
for their own identity they were
16:24
disgusted
16:25
so you know often it’s just those kinds
16:28
of
16:29
uh experiences that are really helpful
16:32
to
16:33
help students feel the impact
16:36
of the the work that you’re talking
16:38
about but you know i also do things like
16:40
i teach my students a lot about museums
16:43
and libraries i’ll have them go look for
16:46
the same
16:47
uh people and communities that they care
16:49
about in the library
16:50
many of my students believe it or not
16:52
have never even walked the stacks
16:54
in a major research library um
16:57
so i said go do that and this is also a
17:00
place where they
17:01
see the subjectivity of knowledge
17:04
where they start to realize oh you know
17:07
i was looking
17:08
up like um my sexuality
17:12
and it was in all like in a
17:15
cluster of books about sexual deviance
17:19
and i’m not feeling that right yeah it’s
17:22
right
17:22
and so then they start to see that like
17:25
knowledge
17:26
is subjective and it’s political and
17:28
it’s meaningful
17:30
and this destabilizes their trust in
17:33
just like getting an answer in 0.03
17:35
seconds that’s so
17:36
great to create an experience for people
17:39
where they can truly dimensionalize and
17:42
create the empathy that they may not
17:44
have been able to connect with uh
17:46
otherwise and i feel like i it’s my
17:49
experience and
17:49
tell me if it’s yours that once you kind
17:52
of can once you can connect that empathy
17:54
it stays with you and you can you can
17:56
extend it in different ways in ways that
17:58
you may not have been able to before
18:02
i think so you know i i had a colleague
18:05
say to me once
18:06
you know if you can just like touch one
18:08
student in a class
18:10
that and inspire them to like maybe go
18:12
on to graduate school or
18:14
to like really care about the things
18:15
you’re teaching them about
18:17
that’s you know like you’ve done your
18:18
job and i and i
18:20
think more than one like i need to
18:22
return greater than one
18:23
but i could be meant by it um but i i
18:27
definitely see
18:28
that you know getting to know your
18:30
students
18:31
is a great way to then pull in resources
18:35
that are relevant to them so i always
18:37
try to know
18:38
all the majors for example that are
18:40
represented in my class
18:42
and i will adjust the reading schedule
18:44
right after the first day of class
18:46
to start to match up with things that
18:49
they care about so that they can see
18:51
like oh you you want to do cognitive
18:54
science
18:56
why don’t you take a look at these
18:57
things um that are always kind of
18:59
bringing
19:00
them back to a critical interrogation of
19:03
their own work and then the things they
19:05
care about but also of these systems
19:08
of control surveillance and power in our
19:10
society
19:11
because in many ways those things are
19:13
going to have a huge impact
19:15
on their work and on their own personal
19:16
lives and i i feel they have to leave
19:19
the university knowledgeable about that
19:21
that’s so smart and we’ve talked about
19:23
this a couple times on this show before
19:24
about how you can almost
19:26
matrix out all the different facets of
19:28
life and then put it alongside
19:29
technology and say
19:31
you need to know about how technology is
19:33
going to impact economics across
19:35
different sectors of different segments
19:37
of society you need to know how it’s
19:38
going to impact politics across
19:40
different segments of society so it’s
19:42
it’s brilliant to take people’s majors
19:43
and what they’ve already sort of
19:45
declared an interest in and say now
19:47
here’s how this is going to play out or
19:49
allow them to do the discovery about how
19:51
it’s going to play out in their field
19:53
that’s right i mean i always for example
19:55
with with their final papers in my class
19:58
ask them to take up the readings that
20:01
we’ve looked at and the kind of critical
20:03
thinking skills that i’ve taught them
20:04
about this
20:06
this domain and apply it to
20:10
something that’s in their field of study
20:11
or their expertise so that the computer
20:13
science students
20:14
and the engineering students get to
20:16
write about the things they care about
20:19
but interrogate them differently in a
20:20
way that i know they’re not doing in
20:22
their computer science courses
20:26
i know hopefully they will though i mean
20:28
i feel like that’s the change that
20:29
you know your work and and other folks
20:32
in the space that are creating that
20:33
momentum that
20:35
that kind of conversation will be part
20:37
of the technology
20:38
uh majors the the computer science
20:41
majors and all that
20:42
yeah well you know what the humanities
20:44
and social sciences do is they give you
20:46
a really
20:47
great vocabulary for talking about the
20:50
things
20:51
you care about and for you know looking
20:54
at them closely
20:55
having a close reading really being able
20:57
to articulate and
20:58
storytell what matters in
21:02
in the moment right or on the project
21:04
and i feel like
21:06
students and and people not just people
21:09
in the university but
21:10
all of us need to be able to better
21:12
storytell
21:13
why things work and why things don’t
21:17
work
21:17
so that we can influence and persuade
21:19
each other
21:20
to to move in directions that are
21:23
that are better you know better for for
21:26
everyone
21:27
not just for a small sector and um that’s
21:30
hard work
21:31
but i um you know i i feel really lucky
21:35
to get to try to
21:37
do that kind of work yeah and so do i i
21:40
think that’s
21:40
that’s the most powerful work you know
21:43
one thing i wanted to ask you though it
21:44
seems like
21:45
when you wrote the book and you know in
21:47
in the time that you were doing the
21:48
research on it
21:49
it seems like we were mostly
21:51
concentrating on algorithms online and
21:53
you know kind of the
21:54
the experience of being at a browser or
21:57
whatever and typing in a search and
21:59
getting the query back
22:00
of course now we know that all our
22:02
experiences are so much more hybridized
22:05
so i wonder you know are you doing new
22:07
research that looks at that inner
22:08
integrated space and how
22:10
how you know that bias and the
22:12
oppression that comes into those
22:14
through those algorithms it’s affecting
22:15
people in their everyday lives
22:18
yes and i i really appreciate you
22:20
bringing that up you know
22:21
when i was writing algorithms of
22:23
oppression it was at the same time cathy
22:25
o’neil was writing her book weapons of
22:27
math destruction
22:28
and i met her at this convening that
22:31
meredith
22:32
whittaker um and kate crawford did
22:35
that was in partnership with the white
22:37
house this was during the obama
22:39
administration
22:40
and you know she was talking about um
22:43
how algorithms and ai are so deeply
22:46
embedded in like banking and finance and
22:48
all kinds of you know predictive
22:50
technologies and i was like
22:52
i’m writing about that too and so you
22:54
know we both of us our books were about
22:57
to come out i think hers came out
22:58
a few months before mine and um wow the
23:02
landscape has changed so much i mean now
23:03
there are so many more people
23:05
writing brilliant books um
23:08
that just are amazing you know i think
23:11
of ruha benjamin’s you know
23:13
new book race after technology that is
23:16
so
23:16
beautifully accessible and um and and
23:19
people
23:20
are studying virginia eubanks you know
23:22
her work on
23:24
social welfare systems so yeah
23:27
there’s no shortage
23:28
of predictive analytics that are working
23:31
across every sector of our society
23:33
and um from the time that algorithms of
23:37
oppression came out
23:38
you know until now uh that the speed
23:43
by which that has happened has been uh
23:46
um remarkable um
23:50
astounding so i’m thinking about um i
23:53
think people are doing really good
23:55
you know jobs with that uh at the center
23:57
for critical internet inquiry at ucla
24:00
you know i’m i’ve been looking at things
24:02
like
24:03
how the tech sector um
24:06
undermines democracy by doing things
24:08
like not paying taxes
24:10
by you know pulling the cream of the
24:12
crop of the best students in the country
24:14
into their projects how it abuses
24:16
workers um
24:18
you know just the uh how it unleashes
24:20
products
24:21
on society with no oversight
24:24
by anyone um other than
24:27
you know themselves and and what the
24:30
implications of that will
24:32
be long term so those are kind of the
24:34
things that i’m i’m working on and
24:36
writing about right now
24:37
so you just had a big announcement today
24:39
relating to your work there
24:41
yeah so we did so we are really grateful
24:45
um
24:46
we’ve been working with julia powles
24:47
dr julia powles and bianca wylie
24:50
who are um at the minderoo foundation
24:53
who have um established a global network
24:58
of critical scholars who are kind of
25:00
taking on big tech or the implications
25:02
of big tech let’s say
25:04
and uh we’re so at ucla we are going to
25:07
be one of the nodes in the network and
25:08
we
25:09
were just um gifted 2.9 million dollars
25:13
over five years to stand up this
25:16
initiative
25:17
and one of the things we’ll be doing is
25:19
working on policy
25:21
around these kinds of things that i’m
25:22
talking about but also
25:24
culture culture making counterculture
25:26
making activities because we feel like
25:28
you know the sector has really um
25:32
created a culture that fetishizes itself
25:35
right and we’d like to introduce
25:39
you know other ways and other kind of
25:41
cultural takes hot takes on what on what
25:44
the implications of these
25:45
projects are so um i hope that people
25:49
who are interested in this will like go
25:51
to our site at c2i2.ucla.edu
25:54
subscribe to our mailing list follow us
25:57
on twitter
25:58
and just um be in conversation with us
26:00
because we really want to
26:03
do culture jamming and and policy
26:06
jamming kinds of work and i think it’s
26:08
going to be a very experimental
26:11
and exciting time to to be working on on
26:14
these things
26:15
for sure and congratulations by the way
26:17
we have uh dr siv
26:18
again pops up to say awesome
26:21
congratulations
26:22
in all caps so uh thank you for sharing
26:25
your congratulations dr siv
26:26
uh yeah i noticed that your agenda as
26:30
stated on
26:30
on the site is to tackle lawlessness
26:33
right
26:34
empower workers and reimagine tech and
26:37
those are
26:37
three powerful declarations of what you
26:40
plan to focus on yeah i mean my
26:44
focus in all of that is i’m thinking
26:47
about things like
26:48
the lawlessness of tech uh in
26:51
again kind of tax evasion um
26:55
you know undermining democracy
26:59
not just in the united states but in
27:00
many modern democracies around the world
27:03
what does the sector owe back to the
27:06
public
27:07
in terms of repair and restoration
27:11
and so i’ll be working on things like
27:13
restoration
27:14
and reparations and imagining
27:18
um and you know again i will put it like
27:20
in this paradigm
27:21
um and kind of where i’m writing right
27:23
now um there was a time when
27:26
people couldn’t imagine the american
27:28
economy without big cotton
27:30
and the labor relations of enslaved
27:32
africans and
27:34
of um occupation of indigenous lands
27:37
like that was the model
27:38
and it was um no one could imagine
27:41
beyond it but there was a small group of
27:43
people who were abolitionists and
27:45
i certainly probably would characterize
27:48
myself as a
27:50
i fancy myself a tech abolitionist and
27:52
an abolitionist
27:54
i mean i love it and that you know
27:56
there’s just a small group of people who
27:58
are always saying like this
28:00
the morality of this isn’t right the
28:03
immorality
28:04
we have to reimagine the american
28:08
economy
28:08
and you know i i’ve been um laughing you
28:11
know i’ve been telling the story
28:13
that i’m sure my doctor when my mom was
28:15
delivering me
28:16
in the hospital in fresno where i grew
28:18
up where i was born
28:20
um had like a cigarette hanging from his
28:22
mouth you know i mean there’s like
28:23
no question that um you know
28:27
the 70s were just all about that and um
28:30
in the 80s and
28:32
and then we had a paradigm shift about
28:34
big tobacco
28:36
and the people who had been doing the
28:38
the activist work and the researchers
28:40
who had been saying
28:41
big tobacco is creating
28:43
a public health crisis
28:44
we can’t afford this it’s too extractive
28:48
it’s it we’re paying too high a price
28:50
they shifted the paradigm and you know
28:52
my students now they can’t imagine
28:54
that people were like chain smoking in
28:57
the hospitals
28:58
right you know in all kinds of places so
29:01
i think
29:02
about big tech you know in in that model
29:05
that
29:05
um how could we look at other historical
29:08
moments and take a longer view
29:10
on this era that we’re in and maybe
29:12
reimagine
29:13
something uh far less harmful and maybe
29:16
even
29:17
helpful wonderful we have a question
29:19
from the audience
29:20
laura says i’m curious if
29:24
you think that predictive analytics
29:26
can be used ethically emphasis on can
29:28
our company has shied away from using
29:30
them but it can make it challenging to
29:31
compete with companies that do use them
29:33
heavily
29:35
this is the challenge with predictive
29:37
analytics okay of all
29:38
all technologies and systems that rely
29:42
upon classification
29:43
and categorization systems and this to
29:45
me i mean this is where
29:47
my library science nerd is really
29:50
uh has me kind of anchored to
29:54
the fact that all forms of
29:56
classification and categorization
29:59
um have implications so the question
30:03
is what are the implications of the
30:05
kinds of classification systems that
30:07
you’re using
30:08
in order to make your predictive
30:10
analytics most of the ways in which the
30:12
technology is oriented
30:14
is really around like binary
30:16
classification systems
30:18
um you know binary code and that it
30:22
already um is a problem
30:25
it’s certainly a problem around gender
30:27
it’s a problem around
30:28
um uh race um the
30:31
the things that these kinds of
30:33
classification systems do to reify
30:36
you know power imbalances um and
30:39
exploitation
30:40
are very important so i think the
30:42
question is is it possible
30:44
to make classification systems that are
30:47
not
30:47
um harmful and that is actually
30:52
probably a more important you know
30:54
question we have to ask before we can
30:55
get to the deployment of the predictive
30:57
analytic
30:58
yeah it really is an interesting uh
31:00
question there because so i’ve said for
31:01
years that my favorite book title is
31:03
george lakoff’s
31:04
women fire and dangerous things what
31:07
categories reveal about the mind
31:09
and that taxonomies are not neutral like
31:10
there’s nothing neutral about any kind
31:12
of categorization or classification you
31:14
can ever do you’re imposing some sort of
31:16
opinion
31:17
judgment whatever into the
31:19
categorization
31:20
so yeah it’s an important point you make
31:23
that’s really
31:23
it and you’re also creating social
31:26
structure
31:27
through those categories and what we
31:30
know is that those categories
31:32
have always existed at least in the
31:34
western context
31:35
as hierarchical so if your
31:38
categorization system
31:39
puts you know it has you have a racial
31:42
classification system like we have in the
31:43
united states and many other parts of
31:45
the world
31:46
where white is the highest valued
31:49
and most um resourced and most powerful
31:53
and black is the antithesis of that and
31:55
the binary and
31:56
everything in between is vying for its
31:59
relationship
32:00
to power or powerlessness
32:03
um that those systems become real
32:06
so the question is um you know how do we
32:10
create systems that aren’t hierarchical
32:13
and where power is not distributed
32:16
along those lines of classification or
32:18
categorization
32:19
and we have not solved that um instead
32:22
we are reinforcing those systems of
32:24
power over and over and over again
32:26
right right so i know a lot of companies
32:29
want to try to do something
32:31
about this within their own systems or
32:33
at least there’s
32:34
lip service given to wanting to try to
32:36
do something about
32:37
the bias and the the oppression that
32:39
happens within their systems but
32:41
i i saw that in 2019 according to the
32:43
artificial intelligence index report
32:46
uh put out by the stanford university’s
32:49
human centered ai institute
32:50
it said only 19 percent of large companies
32:53
surveyed said their organizations are
32:55
taking steps
32:56
to mitigate risks associated with the
32:57
explainability of their algorithms
32:59
and 13 percent are mitigating risks to equity
33:02
and fairness such as algorithmic bias
33:04
and discrimination so clearly
33:06
even if there’s a lot of lip service to
33:08
it there isn’t a lot of action and i
33:10
wonder if you
33:11
have um concrete steps or
33:14
recommendations that you’re able to
33:16
offer and
33:16
academia doesn’t typically offer you
33:19
know concrete steps into
33:20
corporations but i wonder if you have
33:23
recommendations for
33:24
for companies that want to you know do
33:26
some mitigation and make sure that
33:28
they’re
33:28
they’re taking steps yeah i it’s
33:32
it’s difficult because the algorithms
33:34
and the ai and the predictive analytics
33:36
that are coming out of
33:37
industry are optimized for profit so
33:41
that’s one of the challenges here is
33:43
that
33:44
let’s say capitalism and
33:48
multiracial democracy might be at odds
33:51
with one another
33:52
um so we have to figure out the degree
33:55
to which
33:56
industry feels um it must be responsible
33:59
for the ways in which it’s shaping our societies
34:02
um and we haven’t done so well there
34:08
so i think you know if companies are
34:10
serious about this
34:11
then they also have to be serious about
34:13
the role that their companies are
34:15
playing
34:15
in the world um you know should it be
34:19
profit at all costs
34:20
um when is enough enough um uh you know
34:24
when is being profitable enough
34:26
enough and i think that of course there
34:29
are many great scholars
34:30
out here i think for example of a a
34:34
group of scholars uh that i work with
34:36
out of nyu
34:38
um called the center for um critical
34:41
race and digital studies
34:43
and um you can find us
34:46
with a kind of simple query on nyu’s
34:50
uh center for critical race and i’ll
34:51
send this out and digital studies
34:53
we have a lot of readings there um
34:56
where you can get educated people who
34:59
work in industry
35:00
and tech leaders can get deeply educated
35:03
about the implications of their work
35:05
um at c2i2 on our resource page we also
35:08
have kind of 15 plus books
35:10
at the intersection of race and
35:11
technology that we think people in the
35:13
industry
35:14
a variety of industries should be
35:15
reading i know they’re in
35:17
you know since the week of june 8th and
35:19
um you know in this
35:20
moment where we’re calling for um
35:23
justice for george floyd and brianna
35:25
taylor
35:26
and organizing uh in this new civil
35:29
rights you know extended let’s say
35:31
civil rights movement um of black lives
35:34
matter
35:35
that many tech companies in particular
35:37
but also other companies financial tech
35:39
and others
35:40
are reaching out to scholars and
35:43
bringing us in and asking us to educate
35:45
them
35:46
so i think you know it takes that and it
35:48
also takes a real
35:49
you know a longer term concerted effort
35:52
to figure out
35:53
um you know the ethical
35:56
um framework uh that a company is
35:59
is working in and um those are
36:02
complicated
36:02
long term conversations yeah that makes
36:05
me it makes me think
36:06
makes me wonder how much of the
36:08
underlying issues and the reluctance to
36:10
mitigate
36:10
those issues have to do with either
36:13
incentives
36:14
or the transparency that it requires or
36:16
the equity that isn’t shared
36:18
at the table as opposed to say the
36:20
technical difficulty
36:22
that it would take to go through and
36:23
sort of clean up that system yeah i
36:26
think it’s both
36:27
i mean i think the fact that we have no
36:29
meaningful regulatory framework right
36:31
now in the united states is a huge
36:33
issue because we know for example that
36:36
most companies
36:37
that have historically been implicated
36:39
in um
36:41
uh discrimination all the way to the
36:44
technical level which i would say would
36:46
be by
36:47
banking finance insurance right as like
36:50
right out
36:50
right off the bat um those companies
36:54
those industries didn’t really
36:55
start to um address redlining um in
36:58
their products and services
37:00
until it was against the law so we know
37:03
we have to have more than just kind of
37:05
the fox guarding the hen house we need
37:08
regulators to get serious about the
37:10
discriminatory effects
37:12
of many of these technologies and that
37:13
will be one way that companies will be
37:16
forced to kind of comply with the law
37:19
i mean facebook you know itself never it
37:22
you know we know that like they hadn’t
37:24
even considered things like eeoc
37:26
or civil rights act um or housing
37:28
discrimination
37:29
law in their own products and services
37:32
and it wasn’t until
37:33
um you know congress got serious about
37:34
calling them to the carpet or the
37:36
federal trade commission
37:37
got serious about that that they um
37:40
started to
37:41
um examine their products at a technical
37:43
level
37:44
and of course at a technical level the
37:46
challenge now is that for the big tech
37:48
companies
37:49
they um they don’t know how to fix
37:52
some of these problems um they they
37:56
think or maybe let’s say they um profess
37:59
that they will solve these things with
38:01
ai that they will kind of
38:02
automate the fixes but we know that in
38:05
fact their automation
38:06
are human beings and this is where the
38:08
work of people like my colleague and
38:10
collaborator sarah roberts and her work
38:12
on
38:12
you know helping us understand this
38:14
these armies of content moderators for
38:16
example
38:17
around the world but it’s human beings
38:19
who are implementing these decisions
38:22
and um the real policy decisions are
38:24
getting made by the lowest paid most
38:27
vulnerable workers who touch content
38:29
that might be discriminatory
38:31
these cannot be fixed necessarily at a
38:33
technical level
38:34
and um this is where i think you know
38:38
we’ll have to figure out are these
38:40
companies in fact just
38:41
too big um to fix
38:44
the problems of their own making
38:47
yeah
38:48
that’s a huge conversation we got into
38:49
it uh the last few episodes too about
38:52
the human moderators and the work that
38:53
casey newton and other
38:55
uh investigative journalists have done
38:57
in exposing some of the conditions that
38:58
some of the facebook moderators work in
39:00
and
39:01
for example but it seems like i mean it
39:03
is going to be a real thorny
39:05
thing to try to figure out how to create
39:08
the right kinds of regulations or what
39:09
what is
39:10
what how much of it is going to be about
39:12
breaking up pieces of the companies how
39:14
much of it is going to be about
39:15
creating the right incentives or
39:17
disincentives how much is going to be
39:18
about requiring the right kinds of
39:21
uh transparency in the algorithms and
39:24
the ai
39:25
uh and and and more i’m sure
39:28
more than what i’m i’m thinking of yeah
39:30
i mean the challenge here
39:32
is that really in the in the
39:35
social media space in the search space
39:38
um
39:38
you know we’re talking about uh and in
39:41
the hardware i would say
39:42
space you know we’re talking about
39:44
monopolies
39:45
and um so the very um
39:49
you know ground upon which
39:52
internet-based companies
39:53
came to the fore was in the wake of
39:56
breaking up
39:57
big um telecom monopolies right
40:00
and so i think it’s interesting now that
40:03
um
40:03
you know that internet-based um
40:06
companies are
40:07
in fact monopolies and this means that
40:12
consumers have very little choice it’s
40:14
very difficult
40:15
to have harms addressed or redressed
40:19
because we don’t have
40:20
a legislative apparatus that is
40:24
literate enough quite frankly to even
40:26
understand what these technologies are
40:28
and what their harms are and i think
40:30
that you know that could change
40:32
um so you know we have a lot of work to
40:35
do
40:36
uh to not only help the public
40:38
understand what’s at stake but
40:40
you know when when social media
40:43
companies and and big tech companies
40:45
like google
40:46
say you know youtube you know say that
40:48
they’re not media companies
40:49
and try to skirt responsibility for the
40:51
content that moves through their
40:52
platforms you know where
40:54
you know we’re in dangerous territory
40:55
because um
40:57
they indeed are responsible for the
41:01
content that moves through their
41:02
platforms and um
41:04
much of their content i mean one of the
41:05
most dangerous things we can see right
41:07
now
41:07
is the flood of disinformation that’s
41:10
moving through
41:11
these platforms as we uh uh rapidly
41:15
uh you know pummel toward the
41:17
presidential election
41:19
and all the down ballot elections and
41:22
um uh a lot is at stake
41:26
if we aren’t serious about looking at
41:28
this sector
41:29
yeah and so i’ve been uh advising
41:33
groups in other countries as well as
41:35
they kind of hurtle toward regulations i
41:37
know brazil has been
41:39
uh dancing around different kinds of
41:41
regulations and they’re having a lot of
41:42
the same issues around misinformation
41:44
and coming up on
41:45
on various political elections that are
41:47
are critical to
41:48
to those countries so uh a comment from
41:52
bruce celery says crazy to think that
41:53
yesterday’s disruptors are today’s
41:55
monopolies but that’s exactly right
41:57
it was a great point safiya
42:00
yes that is the situation and you know
42:03
it’s uh
42:04
when i think about brazil and the united
42:06
states and other places
42:07
around the world you know the uk um and
42:10
the disruption to kind of the way
42:13
democracy works and
42:14
in all of our countries um
42:18
you know people thought that when the
42:20
cambridge analytica
42:22
um i mean i wouldn’t even call it a
42:25
scandal because it’s just like the
42:27
the business operations of cambridge
42:30
analytica
42:31
when that came to the fore people were
42:33
you know like stunned
42:35
and i thought i remember first reading
42:37
about cambridge analytica before it
42:39
became a big story
42:40
and i remember kind of watching them and
42:44
i was like the whole internet is
42:46
cambridge analytica
42:49
internet is brokering and selling
42:52
and making data profiles about us and
42:54
micro targeting
42:56
us and making digital profiles about us
42:59
that we’ll never know about
43:00
we’ll never be able to see that we can’t
43:02
intervene upon and that are
43:04
um making again opportunities and
43:07
foreclosing others
43:09
and this is where you know shoshana
43:11
zuboff’s you know book the age of
43:12
surveillance capitalism
43:14
is so um important you know i always
43:16
thought that my book might be a
43:17
book that you shouldn’t read at night
43:19
before you go to bed because it might
43:20
give you nightmares but then i read
43:21
shoshana’s book and i was like well
43:23
well well wow that category
43:28
and you know again the thing that i
43:30
would add to
43:31
her critiques about how these predictive
43:34
technologies
43:35
are so um opaque and embedded in in
43:39
every move all the way down to like
43:41
you know our ovulation uh you know
43:45
tracking our ovulation tracking you know
43:48
every pimple
43:48
i don’t know tracking every question you
43:50
answer right or wrong in the learning
43:52
management system when you’re
43:54
nine years old in the fourth grade um
43:56
and and what those will
43:58
amount to is really really um
44:00
frightening but then when you overlay to
44:02
me
44:03
the racialized and gendered power
44:06
systems in our society and you know then
44:08
that this will have
44:10
far more severe consequence for people
44:13
who are already poor
44:14
who are already marginalized who are
44:16
already living um
44:18
you know under uh you know uh threats of
44:21
civil and human rights
44:22
just by virtue of who they are in the
44:25
world
44:26
um then i think you know to me there’s
44:28
nothing more interesting to talk about
44:30
and to to work on and um
44:33
and of course this includes working with
44:36
artists
44:37
and people like you and you know smart
44:39
people just to have these conversations
44:41
i mean these are the kinds of
44:42
conversations everybody should be able
44:44
to have
44:45
at the dinner table um because they
44:47
really affect all of us
44:48
uh you know imagine what it’s like for
44:50
your your four-year-old
44:52
whose life is being documented on the
44:54
web in every possible way
44:56
and you know that you’re unknowingly
44:58
creating digital profile for that child
45:01
you know imagine what it’s like that you
45:04
will inherit
45:05
the digital legacy of your family
45:06
members and that that will
45:08
affect your ability to get a mortgage or
45:10
go to college or
45:11
or do things that you want to do in the
45:13
world because somebody in your social
45:14
network
45:15
said something against the government
45:17
and of course we already see these
45:18
things happening
45:20
so it’s not uh the dystopian future
45:22
we’re just talking about right now
45:25
yeah yeah i think what’s really
45:26
important is that you have brought such
45:28
clear language to it because as you say
45:30
you know it’s the sort of you have the
45:32
opportunity to talk about things like
45:35
bias or you have the opportunity to talk
45:36
about them in terms that actually call
45:38
them out as
45:39
oppression and i what i fear is that
45:42
what it takes to really understand what
45:44
happens at that whole the whole internet
45:46
is cambridge analytica level
45:49
is that you have to be able to think
45:50
about meta systems upon meta systems and
45:53
understand not just how you know the
45:56
internet is connected and how
45:57
data collection and monetization systems
46:00
happen but
46:01
obviously of course underneath that the
46:03
the societal structures of systemic
46:04
inequities and everything that happens
46:07
at that meta system level and so to have
46:09
those kinds of conversations in
46:11
an articulate disciplined way
46:14
you need to be able to call them out as
46:16
the truths they are and i think you know
46:18
using words like oppression certainly
46:20
help bring that clarity
46:22
to to it yeah i really agree
46:25
that um all of these are interlocking
46:29
systems of oppression this is one of the
46:32
reasons why
46:33
i think people who are like
46:35
intersectional you know who use
46:37
intersectionality
46:38
as theory to understand or feminism or
46:41
black feminism or critical race
46:43
you know that’s what we study and write
46:46
about
46:47
our interlocking systems of oppression
46:49
and
46:50
it’s very important that we have um
46:53
clear vocabulary words so we all know
46:55
what we’re talking about
46:58
and of course our words are in dialogue
47:01
with other people’s words
47:03
like technology is liberatory
47:07
or that it engenders more freedom or
47:09
more connectivity
47:11
i mean it was astounding to watch mark
47:13
zuckerberg
47:14
in the anti-trust hearings um uh a
47:17
couple weeks ago
47:18
where while the senator you know that
47:20
this the members are saying you know
47:23
could you talk to us about the way in
47:25
which your products have collapsed
47:26
democracy
47:28
and you know the answer is like more
47:29
connectivity
47:31
i mean it’s like could you talk to us
47:33
about you know
47:35
um discrimination that happens on your
47:37
platform
47:38
and the ways in which um people have
47:40
been threatened with genocide
47:42
through your your project and he’s like
47:44
we’re connecting more people
47:47
so it’s like the discourse that’s coming
47:49
out of these spaces are really really
47:51
powerful
47:52
and amplified so much uh more powerfully
47:55
than you know what i’m saying so i think
47:58
we have to be really clear and get our
48:00
vocabulary words um straight
48:02
as we do this and it also seems like
48:04
because when you start
48:06
evaluating discussions around things
48:08
like how are we going to tackle the
48:10
issue of content moderation and you
48:12
talked about human content moderators
48:14
but then you think about
48:16
when there really are ai solutions which
48:19
you know
48:19
mostly there are not yet but but as
48:22
there start to be
48:23
uh more and more machine learning uh
48:26
solutions deployed against this
48:28
i worry about that too i worry about the
48:30
encoding of
48:32
you know biases there and what’s going
48:34
to happen when
48:36
uh you know using gans for example to to
48:38
try to simulate bad behavior as
48:40
just came out a few weeks ago that
48:42
facebook’s doing uh to try to create
48:45
some rules that they’re able to process
48:47
against
48:48
which in theory sounds great on one
48:50
level of abstraction and then when you
48:52
think about what are you really
48:53
codifying and what are you bringing into
48:55
scale
48:56
that’s what i worry about so yeah
48:58
there’s these systems upon systems
49:00
yeah yeah i mean you’re right to worry
49:02
about that because
49:04
at the level right now of a business
49:08
operation
49:08
used you know across many of these
49:10
platforms youtube
49:12
i mean all kind of many many different
49:14
major platforms
49:17
the human beings can’t get it right
49:20
so if you already have you know a
49:24
a corpus of decisions that have been
49:26
made
49:27
by moderators and those have not been
49:30
good decisions
49:31
well those are also the decisions or the
49:33
actions that that are informing your
49:35
machine learning model
49:37
so of course we know and you know this
49:39
one of the things i really try to
49:40
impress
49:42
just like make this point in the in my
49:44
book is
49:46
if you are a part of an oppressed class
49:50
let’s say you’re black in america but
49:52
you’re only 13 percent
49:54
of the population you actually will
49:56
never be able to impact
49:58
the algorithm even if all 13 percent of
50:00
the population
50:01
of the black population was in agreement
50:04
which we are not
50:05
about adjudicating harm racist
50:08
propaganda disinformation this kind of
50:10
thing
50:11
still can’t even impact the broader
50:16
whole so even using these like so-called
50:19
democratic methods
50:21
are really um unfair um
50:24
so we have to be more complex uh about
50:26
what we’re talking about and i’ll tell
50:28
you
50:28
the thing that’s been interesting about
50:30
watching countries like germany and
50:32
france
50:32
as they’re working through regulating
50:34
things like hate speech or propaganda
50:37
and these big platforms is that they
50:39
have a much closer
50:41
relationship in history to the holocaust
50:43
and understanding for example the role
50:45
that
50:47
anti-semitic anti-gay um
50:50
anti-black propaganda played in
50:54
the relationship like the the the
50:56
legitimation
50:58
of cost right in the in the dulling of
51:01
the senses of the public
51:03
that to the point that when people would
51:05
stand on train platforms
51:07
going to work in in um
51:10
germany jewish people would be
51:14
being loaded onto train cars to be sent
51:17
to
51:17
concentration camps while other germans
51:19
were getting on train cars and going to
51:21
work
51:22
imagine that level of desensitization
51:25
that would have to happen and that
51:27
happens through
51:29
fascist propaganda racist anti-semitic
51:32
these kinds of violent subtle forms of
51:36
of discourse in a society and that’s
51:39
what’s happening
51:40
in these platforms and it’s being fully
51:43
normalized and of course it’s having a
51:44
tremendous impact
51:46
i was just gonna say it’s the
51:46
normalization of that speech and you
51:48
just hit it there at the end yeah
51:50
it’s it’s having that just be
51:51
surrounding you all the time and it’d be
51:53
normalized into how can anybody have the
51:56
stamina
51:57
to keep up with fighting against that
51:59
level of disinformation and
52:00
misinformation
52:02
yeah so it’s it’s all that’s all very
52:06
bleak and i know
52:07
we we’re working very hard to try to
52:09
keep the worst from happening but
52:11
what do you think that we have to be
52:13
optimistic about
52:14
what what do you look at in this space
52:17
and what makes you feel hopeful
52:19
well i feel so hopeful because i am
52:24
uh you know part of a community
52:27
of people in the united states who have
52:31
um you know gone from
52:34
uh you know lynching um public lynchings
52:38
um you know chattel slavery
52:42
not being considered human to at least a
52:45
greater
52:46
possibility for our humanity to be
52:48
realized and so
52:49
that is always for me so in the
52:52
forefront of my own
52:53
lived experience and my family
52:55
experience and so
52:56
how can i not be hopeful because i feel
52:59
like
53:00
black people and black women in
53:02
particular have been at the forefront of
53:05
so many important movements for justice
53:07
in this world and i feel really
53:09
grateful i was born into this package
53:11
even though it’s
53:12
also been difficult and um and that
53:15
makes me feel i i feel hopeful that
53:18
there has been hope for hundreds of
53:21
years in this country
53:23
to make change and to realize change
53:27
and that’s the kind of legacy that i
53:29
feel like my work is
53:30
is in or tradition that my work is in
53:33
and so how can again i
53:34
i feel um grateful
53:38
and and um and um
53:42
open-hearted that the world
53:46
will be better that it can be better
53:49
that it has already become better
53:51
in some ways and i also know that these
53:53
struggles
53:54
are not um static it’s not like you win
53:57
your civil rights
53:58
and you keep them forever um you know
54:01
african-americans had tremendous
54:04
civil rights the the moment in our
54:06
history where we had the most
54:08
rights was the period of
54:10
reconstruction
54:11
after the civil war a lot of people
54:13
don’t even realize that
54:14
um and um then those rights were rolled
54:17
back through jim crow
54:18
um laws and um night riders and the
54:22
klan and
54:23
and terrors terrorizing our communities
54:25
and burning down
54:26
our cities and our our banking
54:28
institutions and stealing our wealth and
54:31
we had another civil rights movement in
54:32
the 1960s
54:34
um and we’re undergoing another civil
54:36
rights movement right now after the
54:38
rollbacks of the civil rights
54:39
that’s it that have happened over quite
54:41
frankly the last four years most
54:43
intensively
54:44
so um you know we should never take our
54:47
rights for granted
54:48
and i think we should read and learn
54:50
from history and
54:52
that is a source of strength and power
54:54
for all of us
54:55
that’s beautiful i think it’s also such
54:57
an important point that you made earlier
54:59
that progress and freedom for
55:02
everyone is not going to happen as a
55:04
result of you know
55:06
majority rules like the the people who
55:08
need the power
55:10
because they need you know equity aren’t
55:13
going to necessarily be able to take
55:14
that
55:15
by by power it needs to be something
55:17
that everyone
55:18
kind of awakens to and that there’s uh
55:22
an evolution of we we
55:25
evolve to our higher selves that we we
55:27
think about uh you know our roles in
55:29
society and culture
55:31
in in that more evolved way and i
55:34
have to uh bring up a comment that just
55:36
got posted
55:37
never stop talking safiya never ever
55:40
please and thank you
55:42
yeah thank you it’s really hard and i’m
55:45
really grateful
55:46
that um i i’ll just say every people
55:49
don’t even realize
55:50
every little tweet at me and every nice
55:53
thing
55:53
is very encouraging because um you know
55:57
you never know people’s private
55:58
struggles and what they’re you know the
56:00
interior of their life is like and
56:02
i will tell you that there was a time
56:04
when i was so
56:06
painfully powerfully insecure to speak
56:08
my thoughts
56:09
and um it’s really been um
56:12
a journey and i’ll just say that anybody
56:15
who’s ever felt
56:17
small um should try to let that go
56:21
and just speak your heart because um
56:24
that’s that’s what we need in the world
56:26
and i’m really glad that
56:28
uh you know hugs by uh like a thousand
56:31
hugs and and and comments and
56:33
encouragements
56:34
really as is what has gotten me here
56:37
that’s wonderful well thank you so much
56:39
for that transparency and that
56:41
that uh that kind open-heartedness too
56:44
uh we’re getting such love from the
56:47
comments by the way uh dr siv says this
56:50
is such a necessary conversation can’t
56:52
wait to share with belmont students and
56:54
to that end she also asked earlier
56:56
uh can you recommend websites for
56:58
educators to use to teach about
57:00
algorithms of bias and oppression i use
57:02
the digital divide filter bubble and
57:04
facial recognition resources greatly
57:06
appreciated so
57:06
if you think of some right now that you
57:08
can share that’d be great if you just
57:09
want to send me some and i can include
57:11
that in show notes uh that would be
57:12
awesome too anything come to mind right
57:14
away
57:15
yeah so the two we have some syllabi um
57:19
at the um critical race and digital
57:21
studies um
57:22
website for edit for nyu and i think
57:25
that the url
57:26
is criticalracedigitalstudies.com
57:28
and um
57:30
and there’s some great syllabi there and
57:32
of course people have been
57:34
um organizing um many syllabi through
57:36
twitter
57:37
um and so i would say uh we also have
57:41
resources up
57:42
on our resource page at c2i2.ucla.edu
57:46
um and you know we’ll keep populating
57:49
that
57:49
um in the weeks and months to come
57:51
because uh
57:54
it’s fun to teach students these things
57:57
it does seem like doom and gloom but it
57:59
really isn’t because with knowledge
58:01
comes power
58:02
and people do feel empowered when they
58:04
know more
58:05
i tell my students i just try to i want
58:06
you to be the most interesting person at
58:08
a cocktail party
58:09
i know you’re 18 and you’re not at
58:11
cocktail parties yet but give it a
58:12
minute
58:13
um you know you need to know things and
58:15
to be an interesting person and so
58:17
i i would say um those two places are
58:20
good places to look for
58:21
articles and books and videos and things
58:23
that we think um
58:24
are important yeah i feel like i talk
58:26
about this a lot too that
58:27
it’s not necessarily that being i talk
58:30
about being an optimist and it’s not
58:31
that
58:32
being an optimist means you don’t
58:33
acknowledge the bad things that happen
58:36
or can happen or are happening right
58:38
like i think it’s really important
58:40
to be fully realistic and conscious of
58:43
everything that’s happening but the work
58:45
of optimism is to recognize the good
58:47
that can happen and then steer our work
58:50
in that direction
58:51
and i feel like there is no one who does
58:53
that more clearly and with more eyes
58:55
wide open
58:56
uh goodness than you and i appreciate so
58:59
much you coming on and talking to us on
59:01
the show
59:01
thank you safiya i’m happy to be here
59:04
anytime
59:05
i really appreciate that this
59:06
conversation and
59:08
i’m going to have a great day i’m so
59:10
glad can you let our viewers know where
59:12
they can find more about your programs i
59:14
know you’ve given a few urls already but
59:15
just
59:16
where can they find you and your work
59:18
yeah you can find me
59:19
um i like to hang out on the internet uh
59:22
at safiya noble on twitter where i
59:25
always try to retweet
59:27
and send out good things that i think
59:29
are happening and if you catch me on a
59:31
late night tip i might be
59:33
being a little bit of a smart mouth so
59:34
just just leave it
59:36
and i’m on instagram safiya.noble.phd
59:42
and i try to post some things there too
59:44
and share out and
59:46
of course you can always email me center
59:49
staff
59:50
ucla edu and um
59:54
uh i think wait go to our twitter
59:58
because i could be lying about that
59:59
that’s a brand new
60:01
we’ll see c2i2.ucla.edu
60:04
is the center’s contact us
60:08
and you’ll get right to me awesome thank
60:10
you again thank you so much
60:11
thanks to all of you for tuning in and
60:13
thanks to our listeners out there
60:15
on the podcast when that comes out
60:18
safiya
60:19
have a beautiful rest of the day we’re
60:21
going to disconnect thank you everyone
60:23
thanks
