Podcast: Play in new window | Download
About this episode’s guest:
Dr. Chris Gilliard is a writer, professor and speaker. His scholarship concentrates on digital privacy, and the intersections of race, class, and technology. He is an advocate for critical and equity-focused approaches to tech in education. His work has been featured in The Chronicle of Higher Ed, EDUCAUSE Review, Fast Company, Vice, and Real Life Magazine.
He tweets as @hypervisible.
This episode streamed live on Thursday, July 16, 2020. Here’s an archive of the show on YouTube:
Episode highlights:
9:35 on his recent article about Facebook profiting from hate (“Facebook Cannot Separate Itself From the Hate It Spreads”: https://onezero.medium.com/facebook-cannot-separate-itself-from-the-hate-it-spreads-967ec9b8793c)
16:51 on regulations
17:58 “the notion that people should be able to throw something out into the wild, see what damage it does, and then maybe clean up some of it afterward is pretty disturbing and faulty”
19:41 “there have been many people who for years have been saying that facial recognition should either not exist or should be heavily regulated and for years many of those people have been told ‘the horse is already out of the barn’ — that basically that a thing can’t be banned once it’s been invented… which is patently false.”
25:21 “many folks are actively engaged in remaking society right now”
30:04 on digital redlining
About the show:
The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience. Hosted by Kate O’Neill.
Subscribe to The Tech Humanist Show channel, hosted by Kate O’Neill, on YouTube for updates.
Full transcript:
00:46
and I think that we are
01:39
Dr. Chris am I pronouncing your last name right yeah it’s Gilliard Gilliard Gilliard
01:44
so sorry Gilliard who is a writer professor and speaker his scholarship
01:50
concentrates on digital privacy and the intersections of race class and
01:54
technology. He is an advocate for critical and equity-focused approaches to tech in
01:59
education and his work has been featured in The Chronicle of Higher Ed, EDUCAUSE
02:03
Review, Fast Company, Vice, and Real Life Magazine. Chris, I have been looking so
02:09
forward to having this conversation with you thank you so much for being here
02:12
oh thanks thanks for having me I really appreciate it it’s like I’ve been
02:16
admiring your presence on Twitter and really like the sort of dedication you
02:21
have to chronicling what’s going on with the intersection of data and privacy and
02:27
race and social justice and kind of overall protection of humanity within
02:31
areas like Amazon Ring and Rekognition sort of facial recognition so I’m so
02:39
excited because I think that you’ve done an amazing job of carving out this space
02:44
and so I wanted to dissect that a little bit with you so how did you find
02:48
yourself getting into that intersection of justice and technology uh it’s purely
02:53
accidental to be honest you know I I originally I mean so I have a love-hate
03:01
relationship with Twitter as many of us do, you know, and I originally got on Twitter
03:08
just for just to learn right just for community just to find other people who
03:15
were interested in some of the same things often and you know found a lot of
03:23
them, you know, like Audrey Watters and Frank Pasquale and David Golumbia and
03:28
like tons and tons of people and we’re all like very cool uh very gracious and
03:35
very open about sharing their scholarship
03:41
and I don’t know I mean I just I’m I guess I was on there enough that once in
03:50
a while people started paying attention to what I said, and I don't even
03:55
know how it happened, it's purely not planned at all. And you got yourself a little blue
03:59
checkmark, which kept you out there. Certainly other people are far more, you
04:14
know, deserving of it. It's supposedly not some indication of... deserve's got
04:21
nothing to do with it. Did you find that surveillance technology, for example, was
04:31
an area that was always a concern for you or is it something that you know as
04:36
technology became more sophisticated and incorporated more, like, machine learning
04:39
and things like that that you found yourself being drawn more and more into
04:43
that area I mean it’s definitely always been a concern you know I grew up in in
04:49
Detroit, and the example I always use is there was a vice group in
04:57
Detroit called STRESS which stood for Stop The Robberies Enjoy Safe Streets
05:02
and there’s been a lot of good work on that but it was a group that of police
05:09
and law enforcement that their job was and in their mind their job was to to
05:15
clean up the city of Detroit but basically what it wound up doing is
05:20
surveilling and profiling lots of black folks and you know by the time they were
05:26
disbanded I think had killed and I don’t know the exact number but if like they
05:33
had killed 14 people and 13 of them were black and this is in the span of a
05:38
couple years and you know I mean I’ve obviously learned a lot more and you
05:47
know so for instance I always point to Simone Browne’s work Dark Matters that
05:53
talks about the history of surveillance and black
05:56
folks and you know a thing again I think it’s important to note that marginalized
06:04
communities actually are no stranger to surveillance it’s just that the
06:09
technology has changed right and so it’s always been something at the forefront
06:15
of my mind but as the technologies change and become more pervasive and
06:22
more insidious in some ways I think it’s important to chronicle that or pay
06:28
attention to it a lot of people think this stuff is new it’s not new
06:33
like the way it it takes shape is new but many of these things have been going
06:39
on for hundreds of years right I mean it seems like you could talk about
06:42
surveillance in the context of you know us and sort of corporate presence of
06:47
surveillance or citizen surveillance and things like that but it obviously now
06:51
that you have the the pervasiveness as you said of things like the Amazon Ring
06:56
doorbell and you’d have I mean even just people carrying around phones with
07:01
cameras on them all the time everywhere which is you know kind of been a mixed
07:05
blessing right we’ve got that constant surveillance right it’s a it’s changed
07:11
the game for I think for human experience as a whole but certainly as
07:17
you say marginalized communities are seeing the rough end of that it seems
07:21
like right absolutely and I you know I think again a lot of times I think the
07:29
the line that comes from tech companies often is about the you know say Facebook
07:36
about giving people voice or if it’s you know Apple about you know the the
07:41
benefits of having a camera in your pocket but we want to be you know are at
07:45
least I encourage people to be super careful about what that means you know I
07:49
was just reading an article today an interview with someone about Facebook
07:55
and you know the the so I’m paraphrasing but essentially you know Facebook’s
08:06
claim is that what they do is connect people and so they'll cite you know
08:12
and Zuckerberg has many times they’ll cite the fact that the George Floyd
08:17
video was was put on Facebook this is true but it’s also the central
08:24
organizing site for vast swathes of you know hate groups and misinformation and
08:31
things like that. And you know you wrote a really great piece on Medium about the
08:39
profitability of hate for Facebook. Talk us through sort of the main
08:44
points there ’cause I know this big this big ad boycott was supposed to I think
08:48
bring visibility to, you know, this ecosystem of advertisers and what is
08:56
what kind of approved of and and sort of subsidized by the ecosystem and by what
09:03
society says like oh we understand Facebook you have to have you know ads
09:06
to support your business and all that but really the hate that’s going on
09:11
that’s happening and is being pushed into sort of more remote corners of
09:16
Facebook because they’re they’re pushing it into groups and things like that
09:19
that’s it’s getting harder to justify and I think you did a really brilliant
09:23
job in that piece sort of laying out the claims that are happening from
09:28
officially from within Facebook but also you know kind of the realities that
09:31
check against that. Yeah, yeah, so you know it's driven by rage, you know, as
09:38
most of my writing is you know I was really tired of seeing you know Facebook
09:44
or Zuckerberg or Sandberg or various other people um you know who represent
09:50
the company spout sort of the standard lines you know about connection you know
09:55
as as a good within itself or you know the other line that they don’t profit
10:02
from hate and so I started out by comparing Facebook to a factory you know
10:08
a factory that not only promises to store individuals' toxic waste
10:14
but multiplies that waste and continuously you know disseminates it
10:19
into you know the environment
10:23
and then way later on you know after much of that waste has been
10:31
distributed you know makes a claim that
10:36
they've cleaned up 90% of it and thinks that that should be applauded, and
10:43
so what remains, when it happens to be the largest factory, is still more than the universe
10:52
has ever been exposed to in one concentrated amount right right and so
10:56
the thing is, you know, they always say we don't profit from it, so like I
11:02
think that's a lie, you know, and that's in print, you know, but I think it's a lie,
11:09
but like even if you accept what they’re saying so there are things that follow
11:14
from that so if they don’t profit from it there’s only a couple of other
11:19
explanations they keep it up even though it’s not profitable right so I mean the
11:26
implication there is not good, or that they are unable to eliminate it, and we
11:35
don’t accept that in most other industries you know we wouldn’t accept
11:41
it you know we don’t accept it for food when I accept it for you know car safety
11:50
well you don’t you know like there’s there are very few things and and the
11:55
other thing is like they caused the problem it’s right there’s no industry
12:01
I’m aware of I mean and again like there’s there’s limits and weaknesses in
12:06
this analogy right I mean we still have companies that pollute and we still have
12:13
you know meatpacking plants with all kinds of problems and things like that
12:17
right I acknowledge that um yeah you’re not you’re not advocating those things
12:23
by using the analogy those yes it’s fair yeah
12:27
they caused a problem that is of such magnitude that they can't fix it
12:34
and so again, at least the way I view it is, if you caused such
12:44
massive damage to society, and this isn't like my opinion or something, I
12:48
mean like we look at Myanmar or the Philippines or you know recent elections
12:55
or you know misinformation around the coronavirus or vaccinations you know on
13:01
and on and on right there they’re kind of missteps or mistakes as they call
13:07
them are well documented so they caused the problem but they’re by their own
13:14
admission um it’s a thing that they’re unable to eliminate altogether and so if
13:22
you think is doing something good right that the existence of Facebook is a good
13:27
yeah like it’s not necessarily a position that would take good then the
13:45
question becomes you know what how much evil are we willing to accept for
13:50
whatever good they do yeah you know how much hate you know is how much hate is
13:58
acceptable all right and yeah I got some people who commented on my piece you
14:04
know and again sort of using that same articulation well how many parts per
14:09
million of hate you know would be acceptable you know thanks or well you
14:14
know that there’s a certain amount of rat feces that’s allowed in food… people’s gymnastics
14:23
are very interesting. Yeah, what is really interesting about the
14:29
dilemma is that it’s very easy to see how mechanics like outrage are part of
14:37
the the underlying algorithmic sort of optimization of Facebook right you
14:42
definitely can feel that the things that get the most traction feel like they’re
14:48
the things that get the “angry” responses and the “wow” responses and things like
14:53
that so you know there’s there’s already this this capacity for outrage that’s
14:58
that’s sort of built into the optimization of the feed and what what
15:03
gets propagated I think you just have to ask what is the difference sort of
15:07
experientially between outrage and hate and how do you how long how many steps
15:12
does it take you to get from propagating outrage to propagating hate it’s not a
15:17
very difficult equation it seems like yeah and even by their own metrics their
15:23
own internal studies, you know, I forget the exact number, sixty-four
15:30
percent or something like that, in one of their own studies, sixty-four percent
15:34
of the people who joined these extremist groups had them recommended to them by
15:38
Facebook you know and again like it’s a long list I mean in terms of like you
15:45
know auto-generating— algorithmically generating categories like “Jew haters” I mean they
15:51
like the list is almost endless of the things they’ve done and yeah I
16:00
and so it’s not even that again to go back to that factory analogy it’s time
16:07
that they store it right they multiply it and amplify it and and recruit people
16:13
to add to it and so it would be one thing you know I mean the old sort of
16:17
publisher you know publisher a printing press analogy which is super— right
16:26
it’s not even that they just like give it a place to exist they magnify it and
16:31
push it out right this there’s no policy that comes close to the skill right like
16:36
in my work I approach these topics often from the perspective of
16:41
capacity and scale and and when you have the kind of scale that they have you
16:45
also have a responsibility at scale that has to be at least proportional in some
16:51
way right, so that comes to the topic of, like, regulation and
16:55
responsibility, and, you know, I know we were just talking before we got
16:59
live that you testified before the House, was it the Financial Services Committee? oh
17:04
sorry, yeah, yeah, that was November, mm-hm, and I know you were talking there a
17:09
little bit about regulations what do you see the well first of all I guess what
17:14
would you see as the threshold for regulations when does an issue become
17:19
sort of worthy of consideration of having regulations around it does that
17:24
that make sense as a question yeah I I mean I’ll be perfectly honest I mean I
17:30
think many of the models for this stuff are extremely broken and you could argue
17:36
that they’re not broken because they’re working to the benefit of the powerful
17:40
which is what they’re supposed to do but um I think they’re extremely broken and
17:47
that many of these products you know the so the idea or so so and this is very
17:56
simplified but the the notion that people should be able to throw something
18:02
out into the wild see what damage it does and then you know maybe clean up
18:07
some of it afterwards it’s pretty disturbing and faulty yeah you know and
18:13
again like we typically don’t allow that in other in other arenas or in the to
18:22
the extent that we have allowed that society as you know often realized that
18:27
that was a mistake and done it differently right close we are to the
18:34
experience or to sort of the innate understanding of what
18:38
thing is like food it’s very easy to connect with on a you know organism
18:43
level and understand that we need safety and protocols in place there and I think
18:49
technology just feels so abstract to so many people that it’s hard to get you
18:53
know kind of a consensus about where that threshold is where it starts to tip
18:58
over into you know damaging or harmful but at a consensus level obviously I
19:03
think when you’re working in it and around it as you and I do we understand
19:08
that there’s no neutrality here it’s not a neutral thing you’re amplifying your
19:14
own biases you’re amplifying you know the values they have but but I think
19:18
it’s really hard to create the the social discourse that sort of that
19:22
accepts that there’s a need for it for that for it for regulation but and then
19:28
again we’ve seen the Overton Window moved so much in just the last few weeks
19:32
right right right yeah I mean I and this is an example you know certainly we have
19:38
a long way to go but you know there have been many people who for years have been
19:45
saying that facial recognition should either not exist or should be heavily
19:54
regulated and for years many of those people have been told that you know the
20:03
horse is already out of the barn and like you can't, you know, you put a tech,
20:07
you know, like once a tech is out in society, like, you can't take it back, and
20:10
you know whatever formulation you want to use but that basically like that a
20:17
thing can’t be banned once it’s been invented which is patently false right
20:21
um but but it you know not even a year or two ago you know many of us are being
20:30
told that this was a ridiculous proposition, that it had been
20:36
invented and that there was nothing we could do about it
20:38
but we’re seeing different cities townships municipalities you know we’re
20:44
banned facial recognition we’re seeing you know and again however pessimistic
20:51
or or you want to be about it we were seeing
20:54
companies step back in some form or another from facial recognition, not to
21:15
congratulate them or you know slap Jeff Bezos on the back or anything like that um but
21:22
you know we’re seeing cracks however small you know fissures and and having
21:30
some victories you know activists journalists
21:34
technologists you know on and on who pushed back against these things and so
21:39
now we see a moment where it does seem possible and when I say that again like
21:47
many people recognized that it was possible and have been fighting super
21:51
hard to make that a reality, but more people are coming to the
21:56
realization that it is possible and probably necessary to think about
22:03
some of these technologies as things that need to be you know abolished
22:08
controlled regulated you know however you want to think about that but that as
22:13
a society we’re not just stuck you know once some clown like put something out
22:19
there like we’re not just stuck with it and whatever way that that clown
22:24
decides that it should exist right like that generally not how um I mean
22:30
well I was gonna say how societies work like so we’re seeing the ill effects
22:37
that when society works that way like we’re seeing a lot of the negative
22:44
consequences of that and so I think you have this moment which is revolutionary
22:50
in in so many different ways I think that is an important realization that
22:56
many people are coming to — yeah I want to ask you about that if that’s alright
23:01
it feels like the moment, 'the moment' is sort of what everybody kind of
23:07
keeps euphemistically referring to like that’s the sort of window since
23:11
George Floyd’s death and and the black lives matter protests and sort of the
23:16
rise around the world really of a sort of consciousness raising in a sense of
23:21
like oh you know the anti-colonialism and anti racism and and what’s been
23:27
interesting of course I think about about that is well as you said this has
23:32
been going on for years the works been being done for years but I think also
23:36
that as I mentioned earlier you know the defund the police / abolish the police
23:41
discourse was was certainly present but it feels like that Overton Window has
23:47
moved in that those those positions have become more defensible and sensible to
23:53
people like there’s more of an acceptance that these are not radical or
23:59
revolutionary ideas, or they are revolutionary ideas but their time has
24:02
come perhaps I don’t know exactly how you know you might characterize it in
24:06
the social understanding but what I’m looking for is you know do you see
24:10
opportunities to seed other ideas similarly like to take advantage of this
24:17
momentum and try to create you know sort of a social acceptance for other what
24:23
might seem revolutionary radical ideas like the banning of certain types of
24:27
technologies would you see yeah I mean um you know so I’ve said this before so
24:33
apologies to anybody who listens to me on other things but you know what I
24:40
would say is that I’ve heard a lot of people and maybe less so now because it
24:48
seems so I’ve heard a lot of people say well what is society going to look like
24:52
after the virus and I’ve heard fewer people say that late because judging by
24:59
the inaction of many of who are supposed to be our leaders like there may not be
25:05
an after the virus or it may not be for a long long time in America anyway but
25:14
what I think people many people do recognize but what I encourage people to
25:19
think about is that many folks are actively engaged in sort of remaking
25:26
society right now you know and and and some in ways that you know want to
25:34
sacrifice us to capitalism or make our lives worse you know in in some very
25:40
concrete and specific ways, and so I think, without sounding sort of
25:51
conspiratorial like there’s terrible people like that right and they’re one
25:56
of them, and so we are all kind of deciding in different ways, we're all
26:06
kind of deciding what we want society to look like moving forward and and then it
26:12
can again it’s not like we’re not always doing this but this this is a very time
26:19
that certainly in my lifetime I’ve never seen um when radical what we’re seeing
26:25
as radical propositions are now seen as much more acceptable and possible yeah
26:31
so whether that’s you know defunding the police or whether that’s you know
26:38
abolishing certain technologies or you know I mean there’s lots of different
26:44
things that in ways of thinking that people previously hadn’t accepted that
26:50
now seemed again not only possible but desirable and I think we have to take
26:58
advantage of that to the extent that we can
27:00
yeah so horrifying and I mean just like it’s an atrocity and you know I’m trying
27:10
to say this the right way, so the moment that we are in right now, like,
27:18
stems from an atrocity, right, it stems from the murder of not only one person,
27:24
you know, George Floyd, but many others, and so I don't want to in any
27:30
way like George Floyd I don’t think you know I can’t speak for him but like he
27:35
didn’t want to die for this like I don’t you know you know I don’t know him but
27:41
right it shouldn’t it shouldn’t take that for us to per society to change
27:49
like globally the outpouring of activism from this is a moment and again it’s
27:58
probably not even the right word moment but it’s yeah and so I think to the
28:13
extent that we, and again when I say we, what I tend to mean is activists,
28:21
scholars and journalists like people anybody who’s interested in kind of a
28:26
more equitable or just society, it's an opportunity to push those ideas
28:33
forward, because certainly the people who would like society to be worse are not
28:40
resting. So, yeah, in your hearing before the House Financial Services Committee about the
28:52
problematic nature of algorithmic bias, I know you talked about two concepts, one
28:56
was digital redlining and one was predatory inclusion and you know I wonder if you
29:02
could just sort of recap briefly the steps for you know people who are
29:06
listening and watching and may not have encountered those those concepts before
29:10
yeah so predatory inclusion as I came to understand it is mainly through reading
29:20
the work of Louise Seamster and I’ve only said this name out loud a couple
29:28
times so I apologize if I don’t say it properly
29:32
Raphaël Charron-Chénier — I think that's how you say it, and I don't know,
29:39
so if
29:40
I said it improperly I apologize, but I came to that notion through their work,
29:46
and so I would do a terrible job of defining it on the fly, so I encourage
29:54
people to read their work. Yeah, and digital redlining is more an area that you focus on? yeah, yeah,
30:03
and so just a little bit of background I mean if people aren’t familiar with the
30:09
history of redlining in this country again something worth reading up on
30:15
I think Richard Rothstein’s book The Color of Law does like a really deep
30:20
dive into it but there’s also yeah if you’ve taken from I think right like far
30:30
be it for me to encourage people to just google something but there’s lots of
30:34
scholarship one on redlining um but like having grown up again in Detroit like
30:40
it’s a place where the long-term effects of that are very visible
30:48
people know I’ve heard the term 8 Mile or have heard of the place
30:53
8 Mile as it’s associated with Detroit sort of like one of the dividing
30:58
lines between what’s Detroit and not Detroit and if you drive along 8
31:03
Mile or some other roads in in Detroit proper like Mack Avenue for instance
31:10
it’s very clear sort of you know 50 60 70 years later what it looks like still
31:17
what the after-effects are of these housing policies and so I I teach at a
31:30
community college and I started to see through a lot of the work with students
31:34
some of the ways that those effects became digital whether that was lack of
31:41
access to broadband or a lack of access to certain scholarly publications and
31:48
things like that and that you know there are a lot of ways that these these
31:54
things right so lack of access to broadband and again like this was true
31:59
before the pandemic, right, and it has become even more true: lack of access to
32:04
the Internet can be tied to health outcomes, can be tied to, you know,
32:10
long-term educational outcomes or employment opportunities and things like
32:15
that and often if you looked at it say a map a redlining map of the city of
32:26
Detroit and the suburbs or of Cleveland and the areas around it or of certain
32:33
parts of Coppa that many of the ways that these maps were drawn like
32:41
redlining maps like those a lot of the what would I call it the sort of
32:50
disproportionate effects of discrimination are still being felt by
32:55
the populations yeah in the areas and so what I call that is
33:01
Digital redlining and again um I call it that in part because people tend to
33:09
chalk these things up to sort of chance or luck or something other than direct
33:17
government policy you know something other than decisions
33:20
by, you know, by platforms and companies. You know, so again, the classic example I
33:27
use is how ProPublica determined that people are using
33:33
Facebook or could use Facebook to violate the Fair Housing Act right so
33:39
they and I don’t know if you want me to this is great I know I’ll just interject
33:46
that you know to me this is really fascinating because you know my previous
33:50
book before Tech Humanist, the book that I had just written, was Pixels and
33:54
Place, and to me this is a very Pixels and Place discussion, right, because not only
33:59
as you say is it that people would chalk it up to you know just bad government or
34:04
kind of that sort of thing but it also feels like people trivialize these
34:08
impacts because it’s just like well it’s just internet how could that possibly
34:11
have the impact that you’re really talking about but no I think this this
34:16
seems like a critical issue to understand fully as especially as we get
34:21
further into you know if we spend a little more time talking about you know
34:25
the algorithmic bias and how that amplifies through throughout the
34:29
different platforms and systems that that we’ve looked at and and how this
34:34
plays out across the ecosystems of surveillance Tech and in facial
34:39
recognition and so on so please please continue yeah and so I mean again if you
34:45
just looked up ProPublica and Facebook and redlining, um, it'll pop up,
34:50
but you know ProPublica did a really deep dive into some of the ways that
34:56
people could use Facebook targeting to discriminate against people in terms of
35:04
housing. And so again, Facebook, their big thing is ad targeting, and for
35:12
some products, like, again, let me say first of all, like, I think, again,
35:18
targeted advertising should be abolished, like, it should not exist. Okay,
35:23
there's our radical take for the day, or maybe one of many radical stands
35:30
you'll take. But there are some products for which you might be able to make the
35:38
argument that it's relatively harmless, right, existing in a vacuum,
35:42
okay, it doesn't, but for things like jobs and housing and things like that
35:54
it not only is severely problematic but actually is against the law and so
36:02
ProPublica essentially found that if you wanted to that it was possible to target
36:11
say a housing ad and have certain groups not see that ad and and again in one of
36:22
the most pernicious effects of it or parts of this is that um only so people
36:29
don’t know what they’re not seeing and so it took technologists and journalists
36:37
and and researchers to find what’s happening so that I wouldn’t know so I
36:43
don’t use Facebook but someone using Facebook wouldn’t know that they’re not
36:49
seeing an ad for housing, or that at a certain age they wouldn't see ads for
36:54
certain jobs or things like that, right, it would take a dig into Facebook by people
37:00
who know how to do that to find that out and so what again so I would lump that
37:06
under what I call digital redlining, again, like, this isn't an accident, I mean
37:12
Facebook would call it a mistake but but I think it’s incumbent upon us to to
37:21
realize that these are the products of decisions that Facebook made yeah and
37:26
also a severe lack of diversity in companies like them, I
37:34
mean. And it's actually interesting, that relates to a
37:38
comment we got on the feed, that digital redlining is a verb, digital divide and
37:43
other concepts that describe this are nouns,
37:45
someone made choices that made this and that that feels like a really nuanced
37:49
observation that there is this as you say this kind of active agency involved
37:54
in the process of that kind of suppression it’s it’s not an accident
37:59
that that those ads don’t get shown that to certain populations it’s not an
38:04
accident that you know that these these divides exist you know they’re they’re
38:09
being put into place yeah with the decisions they’re being
38:13
made you know and yeah and I think and and again to that’s a reason I find
38:20
discussions about redlining so important is that you know why you know so people
38:27
act as if certain groups of people just decided to live near the incinerator you
38:33
know or certain groups of people just you know I mean with no or you know
38:38
don’t know the history of how for so long in America the way generate a
38:44
central way that generational wealth was accumulated it was through housing right
38:49
but you know large segments of the population were denied that through
38:53
government policy you know and so by the time so an example I use is that I when
39:00
I went to college it wasn’t uncommon for my my people who are were in the same
39:07
year as me or things like that like their parents might give them a down
39:12
payment on a house or give them their old house or something like well there’s
39:16
a reason they were able to do that yeah and there’s and there’s a reason that it
39:23
was much more likely that their parents were able to do that, if they were white,
39:26
than my parents were able to do that, me being Black. yeah, sure, do you feel like
39:33
there’s a parallel to within that generational wealth within housing to
39:38
what we see within the digital ecosystem and technology like the the Internet age
39:42
is there is there a sort of landed gentry of a sort you know within the the
39:48
technology space within the digital ecosystem that you have you seen that as
39:53
you apply the concept of digital redlining well it’s not unrelated in
39:57
that I mean if you think about who they tend to hire and you know or what
40:02
schools they tend to hire from for instance and something just came out
40:07
with Facebook’s hiring numbers and I’m not you know I
40:10
pick on Facebook because they’re one of the biggest and and easily you know it’s
40:15
easy I can’t take on them right I’m one person and it’s a multi-billion dollar
40:19
company but well I use them as an example often because but it’s not
40:25
confined to them I mean Uber Lyft or Airbnb or Twitter or
40:29
you know whoever you cannot answer whoever you want in there and actually
40:34
want to get to that because I think it’s important that we do kind of look at the
40:37
parallels across these different platforms and technologies but but
40:41
continue with your thought, please, I do think you're getting onto a really important
40:44
point there. I mean, something, oh gosh, I wish I had it
40:49
right in front of me I don’t know the exact number but there you know Facebook
40:54
just released their diversity numbers and so I just posted about this on
41:01
Twitter not too long ago, yesterday, and the headline... so I'll grab that off of the
41:06
hypervisible Twitter feed and just post it in the comments and we'll see, we'll
41:10
have it in real time, that would be great. Something like Facebook inching towards
41:16
their diversity goals and they had gone from like 4% to 5% or something like
41:24
that you know and over time they’ve offered many different excuses for why
41:31
their diversity numbers are what they are you know including like blaming it
41:36
on on like a pipeline issue and things like that of course yeah of course yeah
41:42
but you know I mean it’s always amazing that these companies who on the one hand
41:49
want us to believe that they can achieve the impossible, right, through
41:54
code, but on the other hand it's, like, super impossible to, like,
41:58
find Black people who can work there, you know. right, no, it's an important lie to
42:06
surface I I so but I also interested in in you know as we mentioned kind of
42:11
looking at not just I think you’re right it is easy to use Facebook as the
42:16
example of all that is bad and unholy but you know we know that there’s so
42:21
many other examples too of of this kind of algorithmic bias and the kind of
42:27
scale even that’s affecting people in their daily lives like I know you talk a
42:32
lot and you even had your username at one point on Twitter reflecting the the
42:37
Amazon Ring doorbell like “one Ring Doorbell to rule them all” I think was what you had for a while? so there is this seems to me there’s this whole
42:46
ecosystem of Amazon their Rekognition algorithm, facial recognition package and then
42:56
the Ring doorbell so that’s that’s distributing through the internet of
43:01
things you know into consumer hands the surveillance technology that that’s
43:05
feeding a whole bunch of reference information back into, you know,
43:10
the system, and then, yeah, NextDoor, which I know there have been partnerships,
43:15
right, like with that company, and with the weird dynamics that take place
43:22
socially within that, the sort of 'Karen-ing' that takes place on NextDoor,
43:27
is there more to that ecosystem that I'm... I mean, gosh, I mean
43:35
there’s there’s Clear you know Thomson Reuters you know Motorola’s and this you
43:41
know Axon you know Vigilant Solutions you know I mean Palantir you know again
43:52
it’s a long long list and you know the there’s some weird carve-outs ending
44:01
again weird is if wasn’t planned right there are a lot of carve out so
44:06
that and private companies are allowed to do things
44:10
some law enforcement are not allowed to do but then law enforcement that can
44:17
then go to the private company and just purchase that information and so or even
44:24
yeah if we if we talk about Ring and Neighbors and NextDoor who have
44:30
actively cultivated these relationships, not only to have sort of a consumer-facing
44:37
product but to make the connection, or the access, to law enforcement,
44:44
you know, frictionless. Okay, these companies have come out with statements, like
45:10
NextDoor came out with their Black Lives Matter statement, which, I think... and I wrote
45:16
something on this, too, which I have taken to calling “black power washing” that
45:24
Great term
45:26
where companies say that they stand with the Black community, and I don't
45:34
really know what they think that means, but the products themselves are
45:39
products that undermine that every step of the way, you know, I think NextDoor
45:45
being an example that and again much like Facebook you could point to
45:52
some positive things that have come from NextDoor
45:54
but at its root it’s a it’s a snitch app — right — it’s a app that or platform is to
46:01
people’s anxieties about who should and shouldn’t be in their neighborhoods and
46:06
what kinds of activities they think are — acceptable or not acceptable
46:11
neighborhood and to make the connection between that and local law enforcement
46:18
you know super easy and you know it’s not hyperbolic to say
46:24
endangering the lives of Black folks and Brown folks yeah yeah it just seems like
46:32
it’s a really important concept for people to have an understanding of
46:36
because I feel like that that is not if the conversation reaches the masses
46:42
about algorithmic bias and and about the the surveillance technology it feels
46:48
like mostly it’s about very abstract applications of AI and recognition like
46:56
people don’t necessarily tie it to the doorbell that they bought that they
47:01
loved and that you know gives them updates of who’s delivering packages and
47:05
when or whatever you know they don’t necessarily make that connection so I
47:08
think it’s a really a really clear important relationship to draw but it
47:14
also feels like there’s there’s even more right like when we think about I
47:18
mean even Twitter which you and I both actively use and Google which you know
47:22
we’re sort of offhandedly sending people to to go search for for research there’s
47:30
there’s always going to be some conflict complexity and complication to the
47:34
relationship between what that technology is doing at scale and how it
47:39
affects the populations who use it right is that yeah absolutely and again I
47:47
haven’t mentioned this but if people haven’t read Algorithms of Oppression
47:51
by Safiya Noble — she just agreed to be a guest on this show so she’ll be on here in a few weeks —
47:59
yeah so but and so part of a big part of the problem
48:03
unfortunately, though, is that a lot of these effects are
48:09
invisible; they disproportionately or disparately impact marginalized
48:15
populations so it’s hard to so I have talked spoken and written about ring and
48:25
you know Amazon Ring a lot a time the problem with talking about it is that
48:31
most people rate and this is pre-pandemic and and through many people
48:38
who would invest in a Ring in that you know I’ll just be exclusive of this many
48:43
white people who would invest in a Ring only think of law enforcement in terms
48:48
of a institution or body that works for them yeah never as one that’s going to
48:54
work against them sure and so it’s very hard and so what I’m what and so they
49:01
place often place the safety of their packages that’s more important than the
49:06
safety of Black lives — right I mean yeah good but many people even
49:15
even if you explicitly ask them — they would still say well I need to get
49:22
you know whatever product it is you know and so until that changes there until we
49:33
can get that formulation to change and some of these probably aren’t going to
49:38
change and I realize that’s a very pessimistic statement that’s okay you’re
49:44
talking to the the “optimistic futurist” so we’ll balance each other out a little bit well
49:48
and that leads me to I
49:49
guess you know as we’re sort of wrapping into an hour time frame I want to make
49:54
sure we’re sort of summarizing some of this for folks do you feel like there
49:59
are technologies that you are optimistic about in terms of the the impact that
50:03
they can have at scale and and what would those be Oh
50:08
so my short answer is no I mean I think and so the the problem is or the way I
50:16
approach it right is I think that scale what would be the right way to say this
50:23
scale in itself creates many of the problems that we’re dealing with you
50:31
know whether it’s Facebook be having being too big to eliminate hate from
50:36
their platform I don’t think you or you know before so if we talk about
50:42
Neighbors or NextDoor or something like that right if I saw someone driving down
50:47
my street or riding a bike or selling water or something like that
50:51
I didn’t think that they should be doing that I didn’t have the ability to
50:56
broadcast that to everyone in my zip code so many of these things because
51:01
like the sheer scale and scope of them actually magnify whatever was
51:07
problematic behavior to begin with and also creates a lack of the ability or
51:16
incentive to police it, and that's not the right word, right, but to monitor it
51:24
in ways that would keep people safe so and the other thing is that I’m very
51:30
wary of a notion that we can take some of these tools and just point them in
51:35
the other direction and they’re going to all of a sudden be in service of not
51:40
power right yeah I mean body cams being it being an example right that’s a great
51:47
example yeah and so yeah I I think that that scale is
51:53
in itself often a big part of the problem because it the notion of growth
52:01
right and the the and also that most of these companies I’m sorry there’s like
52:08
not a short answer to this yeah there isn’t a short answer I mean even what
52:12
like and I’ll just interject that you know when when I’m you know talking
52:16
about the the framework of optimism it’s about work like the the idea that if you
52:21
if you can see something can be better that you’re ethically obligated to make
52:26
it better and and that’s that’s not a way of letting people laugh the hook or
52:30
letting platforms off the hook and saying like oh we can change the
52:32
direction and you know sail it toward something that’s positive and and
52:37
everything will be wonderful, because we have, you know, we're amplifying human
52:41
issues and humans are inherently flawed right like we have those we have those
52:45
problems and our society has those problems the systems we built have those
52:49
problems so we’re going to have those problems but I think the the work of
52:53
someone like you I hope that my work is helping contribute to you know a wider
52:58
awareness of some of this and I think it’s it’s really important that we have
53:03
these kinds of conversations. by the way, one of the comments we got
53:05
is, uh, I know Chris is gonna say something good when he says I don't know
53:09
how to say this which I love I think it’s literally you’ve said some very
53:16
profound powerful things during this discussion so are there are there things
53:22
you think that that we need to be more cautious about I mean you’re raising a
53:27
lot of sort of red flags on a lot of the surveillance technology we talked about
53:32
you know big data as it relates to the financial model all right there are
53:37
other things that you feel like just don’t get nearly enough air time in
53:40
terms of the the maybe even the existential threat that they pose to to
53:45
people yeah I mean oh gosh I mean you know I like I feel like we’re we’re you
53:52
know sort of the two sides of the coin because I mean in as much as I I think I
53:59
have a job you know sort of that is not teaching you know that’s not my my
54:04
normal job it’s to point out that these things are terrible oh and so but I
54:12
think one of the things if I can find some optimism right is the notion that
54:19
from before, that is becoming more real, which is that we really need to think
54:24
about the effects of these things, or potential effects. so, oh gosh, I wish
54:31
I could remember this person’s name I know them from Twitter but they talk
54:33
they do a lot of speculative work I think it’s Casey does a lot of
54:39
speculative work with technology like what are the potential harms of this
54:43
thing I think before you put it out right so when Zoom came out and said we
54:48
had no idea that people would use it to spread racism and you know misogyny yes
54:57
yeah right like they could they could have done that work right right and I
55:03
think one of the things I’ve seen that does give me a little bit of hope is
55:08
there are more and more people not only saying that we have to do that work but
55:13
being inside these companies and holding them accountable for doing it you know
55:18
not after the harm is done, right, not after the toxins have
55:21
been released but before they're released. yeah, great, great summation
55:26
statement I think honestly because it seems like people are always looking for
55:29
you know what’s the practical thing that what can I take back to the work that
55:33
we’re doing in my company what can I take back you know to how I’m practicing
55:38
you know technology and developing technology and it does seem like that
55:42
that is the thing, right, like we need to have a more robust way of examining what
55:48
we’re building for the downstream consequences like how this is going to
55:52
play at scale as you say like that’s where the problems creep up is it scale
55:56
yeah, wonderful. yeah, Kim Crayton, I don't know if you know her, but she has, like, four
56:02
principles, and again, like, I feel like I know one by heart but now that I'm
56:06
under pressure I can't name it... yeah, is a risk... oh gosh, but sorry,
56:18
Kim, we should have... but here's a good opportunity to plug: she does great
56:26
anti-racism workshops virtually, she did one that I did a while ago, everyone should look for
56:32
Kim Crayton and sign up for those because it’s super important that we’re
56:35
having a shared vocabulary and understanding about what it is we’re
56:38
talking about when we talk about anti racism and how to build you know better
56:42
technologies and make sure we’re keeping as much of the racism and other problems
56:48
out of our technologies. so, yeah, I think... so I'm glad you gave us a mention of
56:53
something, but basically, to sum it up really quickly, how I take that is that
56:59
if Zoom had had people, you know, if they had had a better set of folks working on
57:05
these things they wouldn’t have to later on say we
57:08
didn’t know people were going to do X right because black folks know right
57:11
black women know that’s like marginalized populations know that these
57:15
things are gonna happen yeah instead it’s not just like we want diversity
57:20
numbers so we can have better diversity numbers like there’s ways that it’s
57:25
going to make the company work better, right, and, or, the people who are
57:30
recognizing the potential problems are not being listened to
57:32
in the organization, right. so that's a, that's a note to take back to the
57:36
organization – right, like, listen when people surface the issues before they become
57:40
huge — yeah, exactly. I think this has been a fantastic conversation, I hope
57:45
everyone who’s tuned in has gotten a lot out of it
57:49
Chris where can people find you online and follow your your wonderful wisdom uh
57:54
well I’ve been entirely too much time on Twitter and as “@hypervisible” I
58:01
occasionally write as you said I had a recent piece of Medium and that’s it
58:08
yeah you if they just follow you on twitter that you linked to that Medium
58:12
piece right from there too right yeah yeah my it’s my pin tweet right now and
58:17
just one last little note here that I see in comments it says Kate’s mom is
58:22
watching so hi mom thanks for being in the audience it’s wonderful Chris I
58:28
cannot thank you enough for being part of my little experiment here and and
58:32
finally getting a chance I think for us to have this conversation you know face
58:37
to face quote-unquote yeah well yeah thanks for having me my pleasure and
58:42
well I’m think we’ll do this again I I would love to have you back on sometime
58:46
in the future and if everybody has any comments or any questions feel free to
58:52
follow up with me or with Chris. I'm “@kateo” on Twitter, Chris is “@hypervisible”, and I
58:58
will see you guys next time next week we’ll be back and I hope to see you then
59:02
bye bye