About this episode’s guest:
David Ryan Polgar is a leading voice in the areas of tech ethics, digital citizenship, and what it means to be human in the digital age. David is a global speaker, a regular media commentator for national & international press, and a frequent advisor & consultant on building a better tech future. He is the co-host/co-creator of Funny as Tech, a NYC-based podcast & occasional live show that deals with our messy relationship with technology, and is the founder of All Tech Is Human, an accelerator for tech consideration & hub for the Responsible Tech movement. David recently developed a digital citizenship class for adults and a tech ethics hub for college students. He serves as a founding member of TikTok’s Content Advisory Council, along with the Technology & Adolescent Mental Wellness (TAM) program.
He tweets as @techethicist.
This episode streamed live on Thursday, August 6, 2020. Here’s an archive of the show on YouTube:
Highlights:
1:20 David Ryan Polgar intro
3:21 weird coincidence?!
4:40 and a tornado?!
6:05 previous podcast discussion — will update here with a link when it goes live!
7:23 attorney and educator?!
10:44 “no application without representation”
11:56 the politics of technology
15:55 impact over intent
16:25 social media and free speech online
21:13 content moderation: humans and AI
24:32 the role of friction in tech
27:32 distinguishing between thought and action in law
28:24 “your unfiltered brain is not what should be out on the internet”
28:50 brain to text
30:59 “are we an algorithm”
37:14 “do we even want these systems”
46:05 “I wanted to put the agency back on us”
46:28 “the future is not written”
53:55 “everybody needs to add their voice”
54:54 How can people find you and follow your work? (alltechishuman.org, [email protected]; funnyastech.com; @techethicist; David Ryan Polgar on LinkedIn; techethicist.com; davidryanpolgar.com)
About the show:
The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience. Hosted by Kate O’Neill.
Subscribe to The Tech Humanist Show, hosted by Kate O’Neill, on YouTube for updates.
Full transcript:
00:14
all right
00:15
welcome to the tech humanist show uh
00:19
i hope that we’ve got some folks
00:21
watching uh where there’s a couple
00:22
people starting to show up
00:24
if you hear me give me a little
00:28
thumbs up or smiley face or some kind of
00:30
okay in the comments
00:32
you know that if you’ve been tuned in
00:34
the last few weeks you know that we’ve
00:35
been having
00:36
some weird audio issues so we’ll just
00:38
make sure that everything is
00:40
golden and we’re off to the races
00:44
and i’m excited that we get to have
00:47
today our guest today
00:50
is uh oh wait hang on one second i’m
00:53
getting i’m getting
00:54
signals getting signals
00:58
i want to make sure we’re all set here
01:02
all right
01:09
there we go brian shulman we got you
01:11
cato rob left thumbs up blend it hand
01:13
loud and clear
01:14
loud and clear glad to hear it folks
01:17
thank you so much you are
01:19
life savers all right well i want to
01:21
bring on
01:22
my guest david ryan polgar oh my gosh
01:26
one of my favs one of my faves and i
01:28
know he’s a lot of your faves too
01:30
because when i made the announcement
01:31
that he was going to be on the show this
01:32
week there were like
01:33
there was much rejoicing there was uh
01:35
there was all this joy from the land
01:38
so let me just tell you a little bit
01:41
about david ryan polgar he is a leading
01:44
voice
01:44
in the area of tech ethics digital
01:47
citizenship
01:47
and what it means to be human in the
01:49
digital age is that perfect or what
01:52
david’s a global speaker he’s a regular
01:54
media commentator for national
01:55
international press
01:56
he’s a frequent advisor and consultant
01:58
on building a better tech future
02:00
he’s the co-host and creator of funny as
02:02
tech a new york city-based podcast an
02:04
occasional live show
02:06
back in the pre-covid days at least that
02:08
deals with our messy relationship with
02:10
technology
02:11
and is the founder of all tech is human
02:13
an accelerator for tech consideration
02:15
and hub for the responsible tech
02:17
movement david recently developed a
02:19
digital citizenship class for adults and
02:21
a tech ethics
02:22
hub for college students he serves as a
02:24
founding member
02:25
of tiktok’s content advisory council
02:28
along with the technology and adolescent
02:30
mental wellness
02:31
program so please help me welcome
02:34
david ryan polgar uh you are on my
02:38
friend
02:38
thank you so much for being here thank
02:40
you and you should have mentioned too
02:42
you were also a guest
02:43
live on stage in new york for funny as
02:46
tech so i always appreciated
02:47
uh not only kind of how you’re right
02:50
operating on the intersection of tech
02:51
and humanity
02:52
but you also uh you know apply certain
02:54
levity to that and i think that’s what
02:55
we need so that’s what i’ve been really
02:57
enjoying watching
02:58
uh the tech humanist show so far so uh
03:00
thrilled to be a guest and i know we
03:02
have
03:02
a lot to talk about so that’s why yeah
03:05
we do we really do we’re doing this for
03:06
seven hours right
03:07
seven hours is the plan yeah so i hope i
03:10
hope you’re all out there in the
03:11
audience land like
03:13
really ready to go seven hours get your
03:15
energy drink
03:16
you know uh you need two of the five
03:18
hour energy drinks and then you’ll have
03:19
a little left over
03:20
yeah yeah so what’s really funny is you
03:23
know i think
03:24
very few people know uh is that
03:28
i actually tried to start a podcast
03:31
about three years ago i got a few
03:33
recordings in
03:35
and then i was hit by a car and so i put
03:38
put the project aside and i didn’t come
03:40
back to it but few people know
03:42
that my very first guest was the david
03:45
ryan polgar that you see here today
03:48
but david i don’t know if you remember
03:49
this but
03:51
i went back and listened to that audio
03:53
it would be interesting to go back to
03:54
see it the other day changed right
03:56
you did go back wow the archives and i
03:59
found something
04:00
i found something very funny given given
04:02
what happened to me a few months after
04:04
uh we got started
04:07
let’s say if somebody is a bad driver
04:10
can you all hear this
04:12
that doesn’t mean they shouldn’t drive a
04:14
car there’s there’s car accidents all
04:16
the time it doesn’t mean that
04:18
you don’t teach we are talking about the
04:21
car driving analogy
04:22
uh because i think uh social media and
04:24
technology
04:26
there’s a lot of good things that can
04:27
happen so are you saying that he’s
04:29
really good i caused
04:30
yeah i think you caused it is what i
04:33
think happened
04:34
yeah life life is not final destination
04:36
although
04:37
2020 i think feels like final
04:38
destination so
04:40
what very few people also know is that
04:42
you just had a tornado
04:44
uh affect your oh my god i know yeah i
04:47
mean in my life i balance between
04:49
uh manhattan where all tech is human
04:51
the organization is based
04:52
and then also uh living in in
04:55
connecticut and connecticut got slammed
04:58
with that storm so
04:59
it’s like you know i’m kind of giving up
05:01
on 2020. and uh yeah knocked down a
05:03
bunch of trees
05:04
so hey i’m uh i’m actually i i like call
05:07
up uh you know
05:08
my mom uh i’m sitting at her house right
05:10
now because i think i’m out of power for
05:12
for a week so
05:13
hey the bonus is that i get to see my
05:15
mom more so that’s uh
05:16
i’m gonna go with that yeah we family is
05:18
important yeah family is important and i
05:20
gotta say
05:21
your mom’s bookcase is a pretty good
05:23
bookcase it looks looks really she’s a
05:25
reader
05:25
i gotta say so uh i i’ve seen it a lot
05:28
i’ve learned a lot
05:29
uh you know i’m trying to get uh get it
05:32
through osmosis my mom’s
05:33
uh she’s brilliant she’s she’s really good
05:35
at trivial pursuit she’s a
05:36
great reader uh so yeah definitely a
05:38
very thoughtful woman yeah i feel like
05:40
this is an 11 out of 10 on one of those
05:41
rate my bookshelf
05:42
twitter accounts um we can just submit
05:45
it to you and see what they say
05:47
but i did i didn’t scan what actual
05:49
books are behind me so hopefully there’s
05:50
nothing kind of like embarrassing but i
05:52
think i think i think she’s
05:54
she’s above board that’s good
05:57
well you know i think this is really
05:58
it’s it’s going to be so fun to talk to
06:00
you and i know we’re going to cut into
06:02
so much of our time with just joking
06:03
around with each other it was really
06:05
also funny uh just to just to point this
06:08
out too
06:08
is that the last time that uh david and
06:11
i saw each other
06:12
it was days before the lockdown began
06:16
and we recorded and as far as i know as
06:18
yet to be
06:19
aired you know heard show
06:22
uh but it was a fantastic discussion it
06:24
was supposed to be i loved it
06:25
yeah it was like tech ethicist versus
06:28
tech humanist
06:29
right although i think the versus did a
06:31
lot of work in that statement because
06:32
there’s
06:34
very little versus that is actually
06:36
coming coming up because
06:37
uh when you reached out my way i said
06:39
wait a minute kate and i we recorded
06:42
a podcast together in in march and
06:44
that’s when everybody was doing the
06:45
elbow bump remember that was a thing for
06:46
a while
06:48
so we did that when we saw each other uh
06:50
but no i guess it’s uh
06:52
it’s being fine-tuned so that’s going to
06:54
come out so cool
06:55
so we’ll include that in show notes uh
06:57
for folks if you know if you want to
06:58
check that out once once that’s live
07:00
i’ll make sure to circle back and
07:02
include a link to that
07:03
uh so yeah so what you know what’s
07:06
really interesting about you
07:07
david is not only do you have you know
07:09
this great tech ethics
07:11
perspective and you’re you know you yeah
07:13
as you mentioned
07:14
there’s this levity uh that you know you
07:16
you complimented me on that but of
07:18
course that’s right back at you and
07:19
and of course this human-centric
07:21
approach to what you do
07:22
but you’re also i think folks may not
07:25
know that you’re also
07:26
an attorney an educator by background
07:28
right so you’re bringing so
07:30
many perspectives yeah well i gotta say
07:33
too
07:33
something that doesn’t get discussed a
07:35
lot in the media is that there
07:36
are actually a lot of attorneys involved
07:39
in the responsible tech space
07:40
so for example with my role with
07:42
tiktok i’m dealing with
07:44
trust and safety content moderation types
07:46
of teams
07:47
the vast majority of people i interact
07:49
with are are attorneys and
07:50
one of the issues is sometimes when we
07:52
think of attorneys we think of it in
07:53
terms of like
07:54
you can’t do this a compliance type of
07:56
mentality right however when you really
07:58
think about the law
08:00
one the law is steeped in in the concept
08:02
of ethics but two
08:03
it’s also all about critical thinking so
08:06
when you think about your last
08:07
conversation you had with an attorney
08:09
guess what they tend to do they tend to
08:10
think of worst case scenario they tend
08:12
to think of permutations they say well
08:13
okay i see what you’re trying to do but
08:15
here’s how you could
08:17
this could go wrong well one of the
08:19
reasons why we actually
08:20
need more people like that in tech is
08:23
because
08:24
a lot of times a technologist would have
08:26
a certain bias and everybody has bias
08:27
right i have my own bias
08:29
but they would have a certain kind of
08:30
bias to saying okay here’s what i’m
08:32
creating as a product
08:33
and here’s the intended use of it
08:35
whereas a lot of the work that i try to
08:37
do and try to bring is saying
08:39
yes we we need to focus on the intended
08:41
use but also bring in the idea of the
08:43
unintended consequences of it the
08:45
downstream
08:46
impact the negative externalities so you
08:48
do actually need some worst case
08:50
scenario people in here because for
08:52
example when you take a a a
08:54
product like twitter jack dorsey has
08:56
come out in recent years when we’ve seen
08:58
how authoritarian governments have
09:00
exploited
09:00
the platform and he said well you know i
09:03
kind of wish i designed this differently
09:05
in other words it would have been
09:06
beneficial to have other people who say
09:08
hey i get how this can lead to the
09:10
spread of democracy but how would it be
09:11
used by authoritarian governments who
09:13
can utilize this to track
09:15
uh dissidents political dissidents how
09:17
would this be used when the actual
09:19
uh you know authoritarian type of
09:21
government is the one who’s clamping
09:23
down on speech
09:24
uh and and that’s something where we
09:26
need to incorporate
09:27
more ideas and more more voices yeah how
09:30
can this be weaponized right
09:32
that’s the big question across a lot of
09:34
technology and i think that’s
09:36
as i keep coming back to and i know you
09:38
you must as well is one of the reasons
09:39
why it’s so important to have diverse
09:41
teams and diverse perspectives that are
09:43
weighing in on the development of
09:45
technology
09:46
yeah well that’s the big thing kate that
09:49
i think we should always focus on is
09:50
that
09:51
we all have our limitations and one of
09:53
the mistakes that i
09:54
want to kind of point out that i still
09:56
see that it’s happening right now
09:58
is that we still focus on saying like
10:00
hey how can we get jack dorsey to do a
10:02
better job how can we
10:03
get mark zuckerberg to be more ethical
10:05
and yes that’s important
10:06
you so you want you know those people in
10:08
that position of power
10:10
to to to be more ethical and responsible
10:12
but you also need to say
10:14
it’s not about one person it from from
10:16
an american perspective
10:18
right that’s very dangerous to have a
10:19
consolidation of power
10:21
right one person should not decide the
10:23
fate of three billion people like that’s
10:25
antithetical
10:26
to the very idea of what it means to be
10:28
a
10:29
an american like that american kind of
10:30
kind of experience so i think
10:32
a lot of uh social media let’s say
10:35
what it’s going to do in the coming
10:37
years is it’s going to start
10:39
integrating itself into the social
10:41
structures to actually say like how can
10:43
we have representation
10:44
the quote that i’m kind of using lately
10:46
is i like to say no
10:47
application without representation right
10:51
well because think about it like the
10:52
phone that i’m carrying around in my
10:54
pocket
10:55
that impacts the people i interact with
10:57
it impacts the news that i get which is
10:59
really
10:59
how i view the world it’s it’s altering
11:02
the human condition
11:03
that is not like selling somebody tacos
11:06
right that’s not something that we
11:07
normally say this is a private company
11:08
they can do what they want go to another
11:09
taco stand
11:10
it’s not it’s actually impacting how i
11:13
see the world
11:14
that is so consequential so my big big
11:17
pitch is that
11:19
emerging technology especially social
11:20
media has such a profound
11:22
impact on us on an individual level and
11:25
also societal
11:26
that it behooves us to to be more
11:28
thoughtful about it and also incorporate
11:31
more of society in this process
11:34
so if your social media company there
11:35
needs to be more transparency there
11:37
needs to be more involvement
11:38
we’re starting to see uh a push that
11:41
that direction
11:42
well it has become so entrenched in the
11:45
political conversation right it’s it’s
11:47
now inextricable like the social media
11:49
platforms
11:50
and politics are inextricable from one
11:52
another in ways that they weren’t even
11:54
just a few years ago
11:56
we’ve entered i think the politics of
11:57
technology that’s the phrase that i kind
11:59
of think about is that
12:02
oftentimes we we like to say well you
12:04
know our congress they’re just not
12:05
technical enough and they’re they’re too
12:07
old right
12:08
okay yes we want to have more technical
12:10
uh expertise but even as we saw with the
12:12
recent anti-trust
12:14
hearing well guess what the our congress
12:16
now is more educated on these issues
12:18
because they have a lot of
12:19
you know talented advisers that are some
12:21
of them some of them
12:22
but but the the bigger point is that
12:25
when you take a an issue let’s say like
12:28
um facial recognition
12:30
that’s not about understanding what
12:33
facial recognition
12:34
is it’s about determining not only the
12:37
reliability and
12:38
bias behind it but also whether or not
12:41
we want to incorporate that
12:43
into communities that are going to be
12:44
impacted by it
12:46
so these are actually social problems we
12:48
like to frame them as technical issues
12:50
oh you know a it’s a tech question it’s
12:52
actually not it’s it’s a question of
12:54
power
12:54
and equity and access equality
12:57
these are these are classic issues so a
13:00
lot of what i think
13:00
we need to do is incorporate the
13:02
learnings that we’ve had throughout
13:04
society in hundreds of years and
13:06
and say wait these seem like new topics
13:08
we’re talking about social media but
13:09
but uh they’re not actually new topics
13:11
they’re they’re about access and power
13:14
it’s almost like you could matrix out
13:16
all the different facets of you know
13:17
technology and then different facets of
13:19
society and and
13:21
kind of mix and match and say well we
13:22
need to make sure that tech and politics
13:24
have
13:25
you know the right kind of mixing and
13:27
oversight we need to make sure that tech
13:29
and the economy have the right kind of
13:30
mixing and oversight from a social
13:32
justice perspective from an equality
13:34
perspective from
13:35
you know a an ethics perspective and
13:38
what we want the the future of humanity
13:40
to look like
13:41
well that’s the big thing i’m starting
13:43
to really uh think about with social
13:45
media platforms because
13:47
when you read a terms of service you
13:49
know traditionally
13:50
any social media company is going to
13:52
have the terms of service and their
13:53
community guidelines
13:54
and oftentimes when you read them if
13:56
you’ve actually gone through and said
13:57
hey let me let me
13:58
read what i agreed to uh it’s going to
14:01
talk about these community standards and
14:03
not offending the community standards
14:05
well what i’ve really been kind of
14:07
grappling with is
14:08
we’re assuming a lot right we’re
14:10
assuming that we understand the
14:12
community
14:12
so i i think people are starting to call
14:15
push back on that idea a little bit and
14:16
say okay well if this is truly a
14:18
community standard
14:19
then we actually need to understand what
14:21
the community desires
14:23
because at the end of the day it’s
14:25
usually a select few people who are
14:27
making decisions
14:28
in the name of community and i think a
14:30
lot of what’s going to have to happen is
14:31
we’re going to say
14:32
okay well what does the community
14:34
actually think about these issues
14:36
right so and that’s where it is the
14:38
messiness of a democracy we need to come
14:40
up with a system
14:41
that that understands it a little better
14:43
and i think we have to ask whether we’re
14:45
making
14:45
the concept of community do a little too
14:47
much work there in terms of a platform
14:49
right like
14:50
right community really is about some on
14:52
some level of shared identity and if
14:54
you’re talking about a platform that’s
14:56
supposed to host
14:57
a wide variety of identities and a wide
14:59
variety of discussions it feels like
15:01
you know maybe that’s more of a more
15:03
macro sense than what community is meant
15:06
to
15:06
be the model or metaphor for yeah well
15:09
and that’s where a lot of people
15:11
kind of use reddit and subreddits as an
15:13
example where you almost have these
15:15
micro communities that can have their
15:16
own
15:17
individual types of standards and then
15:19
within the umbrella of reddit you would
15:21
have
15:22
a thousand different standards so that’s
15:24
kind of interesting because
15:26
i do think it’s very difficult to come
15:28
up with one
15:29
one community standard because even if
15:31
you look across different platforms like
15:33
or even any platform it’s got people
15:35
across
15:36
geography across ages you know uh
15:39
different different uh
15:40
different backgrounds and that’s gonna
15:41
influence what their expectation
15:44
of of uh inappropriate behavior
15:47
is sure and then i think you know i
15:50
think you sort of alluded to this in in
15:52
uh some
15:52
conversation we had before the show
15:54
began that you know there’s
15:56
there’s a concept in in social justice
15:57
about impact being more important than
15:59
intent right and so i think when we talk
16:01
about you know the nature of uh
16:04
community and content moderation which
16:05
i’d like to get into a little more
16:07
it feels like we’re really concerned or
16:10
we should be
16:11
really more concerned with the impact on
16:13
the humans who are
16:14
part of right those communities are part
16:16
of the platform who are experiencing
16:19
what the platform you know is kind of
16:21
serving up to them
16:23
well we have to in what’s what’s tricky
16:25
about
16:26
social media and speech online is we
16:28
tend to apply
16:30
our kind of free speech first amendment
16:33
uh understanding of it and one this is
16:35
complicated by the fact that
16:36
social media companies are private
16:38
companies and if you you know in the
16:39
constitution
16:40
what the first amendment actually
16:41
applies to is what the government cannot
16:43
do
16:44
so this is all tricky and it’s getting
16:45
into a lot of the issues with trump and
16:47
twitter and issues like that
16:49
but but anyways even if you take this
16:51
first amendment understanding
16:53
when you think about speech and this is
16:54
something that doesn’t get pointed out
16:56
enough
16:58
it’s always actually a balance between a
17:00
individualistic standpoint and a
17:02
collectivist
17:03
standpoint and a classic example that
17:04
everybody most everybody knows
17:07
is we like to say you cannot yell fire
17:09
in a crowded theater
17:10
but let’s back up that statement a
17:12
little more nobody in the world is
17:14
saying that you cannot yell
17:15
fire so when you say you cannot yell
17:17
fire in a crowded theater
17:19
crowd the crowded theater is is is very
17:23
contextual
17:23
saying that well if you’re yelling fire
17:25
inside of a crowded theater
17:26
then it’s going to have a foreseeable
17:29
impact on the broader community to
17:31
create chaos which would lead to to
17:33
violence and an injury therefore we’re
17:36
going to weigh your right as a speaker
17:38
to yell fire
17:39
and then say what’s the impact of it
17:41
which then is going to overwhelm it and
17:43
say don’t yell fire in a crowded theater
17:45
again nobody said you can’t yell fire
17:47
totally different things and
17:49
what we’re starting to see with social
17:50
media is people oftentimes get confused
17:53
because they say well wait a minute this
17:54
is this is my right
17:56
but the very idea of being in society
18:00
means that you’re constantly in a
18:02
compromise of
18:03
what you want to do versus what society
18:06
determines right so like if you have as
18:08
an individual say
18:10
i want to be a nudist right you can
18:12
think that but but if you start
18:14
walking in your community without
18:15
clothes on uh
18:17
but maybe a mask right hey you’re you’re
18:20
healthy that way
18:21
but but you had no clothes 2020
18:23
appropriate here 2020 appropriate so
18:25
so you just have you just have a mask on
18:27
but no clothes
18:28
well guess what you’re going to get in
18:29
trouble because even though your
18:31
individual right is being infringed on
18:34
because it is
18:35
right because your determination of what
18:36
is is right
18:38
it still is balanced against a community
18:40
type of type of standard
18:42
so that’s the the tricky part with
18:44
social media is
18:45
if we’re going to determine more of this
18:47
community standard
18:48
it’s going to have to be more on a uh
18:51
elected type of type of mentality so for
18:53
example
18:54
we make our or our laws based on saying
18:58
like okay let’s have
18:59
a you know a congress that is the
19:01
legislative branch and then the
19:03
executive branch carries it out and the
19:04
judicial branch interprets the law
19:07
and i think if you apply that to social
19:09
media people aren’t talking about this
19:11
now but i think they will
19:12
in that the problem that’s actually
19:15
happening
19:15
that people get frustrated with with
19:17
facebook and twitter
19:18
and and snap and and uh tiktok is
19:22
they say well wait a minute
19:23
you’re creating a law and then you’re
19:26
enforcing the law
19:27
and you’re interpreting the law that
19:29
that’s highly
19:31
that’s highly problematic in the sense
19:32
that it creates a little bit of like
19:35
reverse engineering and that’s why
19:36
people a lot of times say well wait a
19:38
minute like
19:38
how are you treating you’re treating
19:40
people differently are you treating
19:41
trump differently
19:42
than you would different than you treat
19:44
somebody else and the reason why that’s
19:46
important is because
19:47
when we think about free speech we like
19:50
to say
19:50
that you cannot be arbitrary and
19:52
capricious right so
19:54
so you have to treat like you and i have
19:56
to be treated the same
19:57
under the law and if we’re not then then
19:59
the law should be struck down
20:01
likewise people are starting to relate
20:04
to social media laws as as being
20:08
consequential to their life and whereas
20:10
we have not recognized them we’re saying
20:11
wait
20:12
it’s just a social media company guess
20:13
what as soon as i uh joined uh
20:16
tiktok’s content advisory council i
20:18
got slammed with so many emails
20:21
from from people who say wait a minute
20:23
my account just got closed down
20:25
and i know everybody on the board is
20:26
facing this and
20:28
and my takeaway from that was one
20:31
now my inbox is too full but two two
20:34
is that i said wow like this is really
20:37
important to people
20:38
i mean that’s something we cannot forget
20:40
is that every time we’re making a
20:42
decision
20:43
it’s impacting somebody’s ability to
20:45
communicate it’s impacting their ability
20:47
to connect
20:48
with their friends it’s impacting
20:49
frankly a lot of times their ability to
20:52
to make money because a lot of people
20:53
are making money on you know influencer
20:55
economy and stuff like that so
20:57
it’s a big deal so so you can’t be
21:00
you can’t be flippant about the fact
21:02
that that that
21:04
social media companies have a lot of
21:06
power and i don’t think frankly they
21:07
want this power but right now they have
21:09
a lot of power
21:10
and we’re trying to decide how best to
21:12
to structure that
21:13
so then the job of content moderation
21:15
becomes really complicated and i think
21:17
we’ve seen
21:18
you know this the the evaluation of how
21:21
that content moderation should be done
21:23
in terms of what you just discussed like
21:26
the
21:26
the social rules and the law and how
21:29
that’s applied
21:31
but in terms of the the actual tactics
21:34
of how content moderation is done
21:36
you know in terms of human moderation
21:38
versus ai
21:39
and then human review you know what are
21:42
you seeing in terms of the trends there
21:44
and and how does it play out to have uh
21:47
human moderators versus artificial
21:50
intelligence in the process
21:52
well uh people have exaggerated
21:56
the ability for ai to handle speech
21:59
online
22:00
uh it it frankly uh does not do a good
22:03
job it does a great job with like
22:04
pornography
22:05
right because you can start training it
22:07
on images and they have like well that’s
22:09
where it started i think a lot of
22:11
content moderation
22:12
really began there with microsoft and
22:14
that was actually a good example of
22:15
where they had industry sharing so
22:17
microsoft created this photodna system
22:19
uh for for uh child pornography and then
22:22
shared it with the larger system because
22:23
of its
22:24
major societal uh value for it
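A rough illustration of the industry-sharing pattern David describes: platforms check each upload against a shared database of fingerprints of known illegal images. Real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding; the sketch below uses a plain cryptographic hash and made-up names purely to show the shared-lookup pattern, not Microsoft’s actual implementation.

```python
import hashlib

# Toy sketch of matching uploads against a shared database of known-bad
# image fingerprints. Real systems (e.g., PhotoDNA) use perceptual hashes
# that tolerate resizing and re-encoding; a plain SHA-256 is used here only
# to show the lookup pattern. All names and values are illustrative.

SHARED_HASH_DATABASE = {
    # fingerprints contributed and shared across the industry would live here
    "placeholder-fingerprint-1",
}

def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(image_bytes: bytes) -> str:
    """Return 'block' on a match (and escalate for reporting), else 'allow'."""
    if fingerprint(image_bytes) in SHARED_HASH_DATABASE:
        return "block"
    return "allow"

print(screen_upload(b"harmless example bytes"))  # -> "allow"
```

The point of the pattern is that one company’s detection work protects every platform that checks the same shared database.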
22:27
but with with ai and speech
22:30
uh it it’s very hard to to to do so i
22:33
think it
22:34
can be helpful to to limit the amount of
22:37
human exposure to
22:38
violent activities right because nobody
22:40
should have to be looking through
22:41
beheading videos but but guess what
22:43
there are people who frankly usually
22:45
don’t get paid a lot of money who are
22:47
sifting through
22:48
the garbage the sewer of the internet i
22:50
mean think about it like
22:51
we we don’t like to think about it
22:53
because it’s very it’s very dour
22:55
but it’s still it’s still something that
22:57
is being done so like
22:59
all of this stuff that the worst the
23:00
worst that’s going on it’s still usually
23:02
somebody’s job to
23:03
to actually say okay is this is this
23:05
inappropriate is this something that
23:06
needs to be
23:08
reported to to to the authorities how do
23:10
you how do you get people involved so
23:11
it’s always been a promise
23:13
to have more ai involved with that
23:16
because then you would have less
23:18
less focus uh but as we’ve seen in a lot
23:20
of the reporting with
23:21
casey newton at the verge and then even
23:23
before that with the
23:24
researcher out of ucla who’s done a lot
23:26
of great work on this sarah t
23:28
roberts uh on the really impact of it
23:32
now and i know when i talk to people in
23:34
that space they
23:35
they have a lot of uh methods for um
23:38
almost like uh i don’t know like
23:41
mindfulness types of
23:42
methods about how you deal with the fact
23:44
that you’re seeing things that
23:45
that no individual should should have to
23:48
see it can be very
23:49
we shouldn’t have to see one of them let
23:51
alone over and over and over
23:53
be subjected to that well and and that
23:56
is why
23:57
uh that’s why a lot of my focus has
23:59
actually been on saying okay
24:01
we tend to view it as human moderation
24:03
versus ai
24:04
and the reason why and this doesn’t get
24:07
pointed out enough but the reason why
24:08
social media
24:09
is is different than a media company is
24:11
because if you’re the wall street
24:12
journal
24:12
you filter and then you publish which
24:14
means you have a lot of control over
24:16
what gets published
24:17
whereas social media companies because
24:19
they’ve been so hyper focused on the
24:20
scalability of their
24:22
their product they say well let’s just
24:24
publish and then filter
24:26
and then they use some of the community
24:27
to help filter and then they have a
24:28
content moderation team mixed with human
24:30
and ai but where i’m obsessed with and
24:33
something i’m i’m writing about lately
24:35
and doing some work on
24:36
is that if we could incorporate a
24:39
greater level of friction
24:41
in the original process you would you
24:43
potentially could
24:44
massively lower the amount of initial
24:47
toxic
24:48
behavior that gets pushed out into the
24:50
system because we like to think of
24:53
and this is where you need psychologists
24:54
right but because we like to think of
24:56
the what’s on social media as like
24:58
humans are humans
24:59
and i would say that’s completely and
25:02
utterly
25:02
not how how this works because we’re
25:05
influenced by other people
25:07
we’re influenced by our environment
25:09
right so and that’s why we study
25:10
like what is what does the text box say
25:12
does it say like say something nice
25:14
or does it not say anything if it
25:15
doesn’t say anything you know where’s
25:16
your brain go
25:17
so what you’re going to see over over
25:20
time
25:20
is more of a focus to say can we lessen
25:24
the initial bad behavior that goes out
25:27
on the web and if we could then you
25:29
would have less people who
25:30
have to look at terrible behavior online
25:33
uh and then less dependence on
25:35
on ai systems so and less misinformation
25:37
i think you know unless
25:38
yes you published an article recently
25:40
about this idea
25:42
of friction and technology and you know
25:44
you had the example of
25:45
uh instagram asking you are you sure you
25:47
want to post this which
25:49
speaks to the the first example right
25:51
about cyber bullying and so on but the
25:53
the second one the second example you
25:55
gave was about twitter
25:56
having the uh the introduction into the
25:59
process
26:00
of retweeting where hey it’s like hey do
26:02
you want to read this
26:03
before you retweet it so yeah so
26:06
misinformation is is being slowed or
26:08
potentially being slowed a little bit
26:09
there
26:10
too well i’m glad you pointed that out
26:12
because
26:13
one of the things that that i i feel
26:15
like we need to have a different analogy
26:17
to
26:17
is oftentimes we like to say that
26:19
twitter is a megaphone
26:21
and i would say that’s not actually how
26:23
it works
26:24
right like twitter is not a megaphone
26:26
for trump or for other like heavy actors
26:29
on social media platforms it actually
26:31
works more like a symphony
26:32
where you have the one player who is a
26:34
conductor and then you have all these
26:36
other individuals who are playing
26:37
instruments a lot of times they’re
26:39
unknowingly adding
26:41
to the to the overall noise that a
26:44
person has so that’s why
26:45
if you’re twitter you would say we want
26:46
to prompt you to actually say did you
26:48
read this because
26:49
even though you you say you you care
26:51
about you know
26:52
misinformation and things like that you
26:55
might unintentionally
26:57
be part of the the the larger issue
27:00
right and this is where we’re talking
27:01
about the overall kind of health of the
27:02
information
27:03
ecosystem so that’s going to be that’s
27:06
going to be a major part
27:07
because i bet you you know if you’re the
27:09
same way
27:10
uh when you were growing up you probably
27:12
watched uh you know like jiminy cricket
27:14
remember the idea we had jiminy cricket
27:16
who was on your
27:17
your shoulder yeah right yes so what was
27:19
the idea like you had
27:21
you had the devil you had the good and
27:23
you say well what what do you think
27:24
about this
27:25
the very concept there is is supposed to
27:27
be about reflection
27:29
right and that reflection allows more
27:32
time to
27:32
to consider if something is good or not
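To make the friction idea concrete, here is a minimal sketch, in Python with invented function names (not Instagram’s or Twitter’s actual code), of where a reflection prompt sits in a posting flow: between the impulse to post or retweet and the moment anything is published.

```python
# Minimal sketch of "friction" in a posting flow -- invented names, not any
# platform's real API. The reflection step sits between the impulse and
# publication, so some toxic posts and unread retweets never go out at all.

def looks_hurtful(text: str) -> bool:
    """Stand-in for a real classifier that flags likely-hurtful wording."""
    return any(word in text.lower() for word in ("idiot", "moron"))  # toy list

def confirm(prompt: str) -> bool:
    """Pause and ask the user to reconsider; True means 'post anyway'."""
    return input(f"{prompt} (y/n) ").strip().lower() == "y"

def publish(text: str) -> None:
    print("published:", text)

def submit(text: str, is_retweet: bool = False, link_opened: bool = True) -> None:
    if looks_hurtful(text) and not confirm("Are you sure you want to post this?"):
        return  # the user rethought it; nothing reaches the feed or the moderators
    if is_retweet and not link_opened and not confirm("Want to read the article before retweeting?"):
        return
    publish(text)
```

Whether prompts like these measurably cut down on toxic posts or blind retweets is an empirical question for the platforms; the sketch only shows where the pause lives.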
27:36
and and frankly that is something that
27:38
is a part of
27:39
all of of law when you think about
27:41
criminal law
27:42
in any type like follow any murder trial
27:44
they’re always going to focus on the
27:46
actus reus and the mens rea like the
27:48
act and the mental state
27:50
and we like to say in in law that
27:52
there’s there’s no thought police
27:54
right and the idea behind there is that
27:57
you don’t want to be able to read
27:58
somebody’s brain because their brain is
28:00
constantly filled with terrible
28:02
thoughts and that it only matters if
28:04
they act upon that like there’s a
28:05
concert between
28:06
the action and their mental state but
28:09
where
28:10
where a lot of emerging technology is
28:11
getting in the way of something we’ve
28:13
known for hundreds of years
28:15
is that the more you get closer to your
28:18
your actual initial impulse the worse
28:21
it’s going to be i mean
28:23
it’s painfully obvious because your
28:25
unfiltered brain
28:27
is not what should be out on the the
28:30
internet we know that like an example i
28:32
i always like to give uh in like
28:35
speaking and stuff is i say well
28:36
imagine if i had a device where i could
28:38
read everybody’s thoughts here
28:40
and people get really kind of like
28:41
embarrasses oh my god and they kind of
28:43
like snickered
28:44
like wait a minute like i probably
28:46
thought like these terrible things that
28:47
i should be arrested for
28:49
everybody has and what a lot of emerging
28:52
technology is doing especially if you
28:54
talk about
28:54
brain to text which is something you
28:56
know facebook and others are working on
28:58
frankly they’ve made amazing kind of
29:00
innovations on this right
29:02
is that if you can get closer to your
29:05
head
29:05
right right but the less distance
29:07
between head and side
29:09
the more problematic it’s going to be so
29:11
uh yeah i think uh
29:13
brain to text there’s going to be
29:14
obviously a lot of a lot of
29:16
fun issues with that because uh yeah we
29:19
we don’t want to be
29:20
mel gibson and read somebody’s uh no
29:23
right but
29:23
i think it’s also maybe let’s just cut
29:25
it there you don’t want to be mel gibson
29:26
don’t be mel gibson that’s the end
29:28
of the podcast
29:29
no i think it’s a really interesting
29:32
point you know brain to text is a really
29:35
compelling idea for an interface and i
29:37
think you know anyone can imagine what
29:39
some of the
29:40
the potential there is on on a good side
29:43
right like it seems like
29:44
a lot of productivity uh but it’s very
29:46
very easy to see where that can be
29:48
weaponized and go wrong of course
29:49
the the other thing that i’m thinking
29:51
about is that we know
29:53
that predictive policing is increasingly
29:55
a field that’s uh that’s problematic and
29:57
that we’ve had
29:58
predictive analytics kind of tracking
30:00
everything that we’ve been doing
30:02
online and constructing these sort of uh
30:06
projected three-dimensional ideas of who
30:09
we are and what our aspirations are
30:11
uh based on sort of targeted advertising
30:14
but now it’s being taken into the realm
30:16
of criminal justice and and there’s
30:18
predictive policing that’s playing out
30:20
uh and of course that’s
30:21
disproportionately affecting communities
30:23
of color
30:24
uh so when we think about you know brain
30:26
to text and we think about you know
30:28
those types of applications
30:30
those are for sure the the applications
30:32
where we’re going to need to think about
30:34
weaponizing
30:34
you know how that gets weaponized and
30:36
what the impact of the experience of on
30:38
people is going to be so that’s a super
30:40
important
30:41
point well and what i would love to say
30:43
with that too
30:44
is that brings up something that we’re
30:46
obviously both
30:47
very very active in with this
30:49
intersection between technology and
30:50
humanity
30:51
and i think that the common question
30:53
that it’s actually getting at which is
30:54
something that people
30:56
dramatically disagree about is are we
30:59
an algorithm or are we kind of unique
31:03
right
31:03
like like i always joke like since i’m
31:05
i’m sitting right here my mother’s
31:07
library i can say this right my mom
31:09
always said right like i was a snowflake
31:11
i mean
31:11
i’m unique right there’s no nothing like
31:13
you and
31:15
there is nothing like that
31:20
and by the way you’re the jiminy cricket
31:21
on my shoulder i always think
31:23
what would david do when i go to post
31:25
something
31:26
well that’s why you know with so many
31:28
speaking gigs canceled uh what i’m
31:29
working on right now
31:30
is a a miniature uh david ryan polgar
31:33
that’s about six inches tall and then
31:34
you
31:35
you put it on your shoulder and perfect
31:37
and then you’re gonna be the first
31:38
person
31:39
so you’re gonna have no idea how badly
31:41
i’ve needed this
31:43
you know i have it with like a large
31:44
head so it looks a little more comical
31:45
and then you just put on your shoulder
31:47
and then anytime you’re having a bad
31:48
thought he says well no no no
31:50
you know yeah and then you pull the
31:52
string on the back it has a couple
31:54
uh catch phrases but from head to side
31:57
was that was that what you said earlier
31:59
like that’s going to be one of the
32:00
phrases
32:01
well but i think the other one is going
32:02
to be are you
32:04
an algorithm or are you unique because
32:07
because when you mention predictive
32:08
policing
32:09
and even when you think about you know
32:12
amazon and recommendations
32:14
it’s it’s thinking that your past is
32:17
predicting
32:17
your your future and that with enough
32:20
data
32:21
we can uh you know accurately determine
32:24
where your next step is right or even
32:26
with auto suggestion and things like
32:28
that
32:28
well what’s getting tricky is is is that
32:32
true
32:32
or is it subtly going to be off and then
32:35
the other
32:36
other part that i always like to bring
32:37
up which frankly doesn’t get enough
32:39
attention in the media
32:41
is that with a lot of these auto
32:44
suggestions based on our previous
32:46
behavior let’s say
32:46
like text right so like if i if i’m
32:48
texting somebody
32:50
it’s going to say okay this is how you
32:52
want to end your your statement this is
32:53
probably what you’re going to say
32:55
well the question i always like to think
32:56
about is how often
32:58
am i influenced by what they what they
33:01
said i should say
33:02
so if i want to write like have a and it
33:05
says great day
33:07
well maybe i was thinking great day but
33:09
maybe i was going to say
33:10
good day and it’s subtly different but
33:12
it it’s also influencing kind of my
33:14
my volition right and that’s i think a
33:16
larger larger question
33:18
is that now we’re being influenced by
33:21
the very
33:21
technology that’s pushing us in a
33:23
certain direction
33:25
and we like to think of it well it’s
33:26
already based on you but then
33:28
but then that has a certain cyclical
33:30
nature to actually
33:31
almost like um quantum human
33:34
consciousness or something
33:35
exactly like the moment you observe it
33:37
it’s changed right
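David’s point about autosuggest is easy to see in a toy next-word suggester: it proposes whatever most often followed your phrase in your past messages, so the suggestion both reflects your history and then nudges your next message back toward it. A minimal sketch with invented data, not any vendor’s actual model:

```python
from collections import Counter

# Toy next-word suggester: propose the word that most often followed the
# current phrase in past messages. The history is invented for illustration;
# real keyboards use much richer language models.

past_messages = ["have a great day", "have a great day", "have a good day"]

def suggest_next(prefix, history):
    counts = Counter()
    p = prefix.lower().split()
    for msg in history:
        w = msg.lower().split()
        for i in range(len(w) - len(p)):
            if w[i:i + len(p)] == p:
                counts[w[i + len(p)]] += 1
    return counts.most_common(1)[0][0] if counts else None

print(suggest_next("have a", past_messages))  # -> "great" (it won 2 of 3 times)
```

Each accepted suggestion adds another “great day” to the history, which makes “great” even more likely to be offered next time — the cyclical influence being described here.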
33:39
this is why it’s going to be a seven
33:40
hour discussion because now i i think we
33:42
brought up more
33:43
questions than uh than answers well you
33:45
know i
33:46
also want to point out so uh lee odden
33:48
commented and asked predictive policing
33:50
as in minority report and i i think
33:52
there may well be people who are
33:53
watching who aren’t aware
33:55
of this trend of predictive policing so
33:57
i want to make sure that people
33:58
are aware that you know we’re not uh
34:00
speaking hyperbolically like there
34:02
there are you know kind of constructed
34:04
algorithms that are looking across
34:06
patterns of behavior and drawing lines
34:08
to demographics and enforcing you know
34:11
it’s kind of it’s broken windows
34:12
policing but with algorithms applied
34:15
so so there’s police police departments
34:18
that are enforcing
34:19
their or their deploying their units
34:21
more
34:22
based on what they expect is going to be
34:24
happening and of course it
34:26
disproportionately affects
34:27
uh communities of color and poorer
34:30
neighborhoods
34:30
so you know you know wherever you see
34:32
the police deployed more
34:34
often there are bound to be enforcing
34:36
more often so you get things like
34:38
this is the term broken windows policing
34:40
if there’s a broken window
34:42
you’re going to get arrested for that
34:43
are you going to get a ticket for for
34:44
something that
34:45
you know other people wouldn’t get a
34:46
ticket for white people driving down the
34:48
street aren’t going to get pulled over
34:49
for something
34:50
that a black driver being going down the
34:52
street might get pulled over for
34:54
so it’s a it’s a real concern and it’s
34:55
it’s a real issue
34:57
that uh that’s growing so i’m really
34:59
glad that lee you asked for that
35:00
clarification because i think
35:02
folks do need to know that that that’s
35:04
really happening out there
35:06
uh yeah in in amongst all of the other
35:08
kind of algorithmic bias
35:10
and predictive analytics uh kind of
35:12
quagmire that we’re dealing with
35:13
that that big over difficulty is
35:17
happening as well
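For anyone who wants to see the mechanism rather than take it on faith, here is a deliberately oversimplified simulation of the feedback loop: two neighborhoods with the same underlying incident rate, but patrols are sent wherever past recorded incidents are highest, and incidents only enter the data where patrols are. All numbers are invented for illustration.

```python
import random

# Deliberately oversimplified feedback-loop simulation: neighborhoods A and B
# have the SAME true incident rate, but patrols go wherever past *recorded*
# incidents are highest, and only patrolled incidents get recorded.

random.seed(0)
TRUE_RATE = 0.3                     # identical underlying behavior everywhere
recorded = {"A": 1, "B": 0}         # one early arrest in A seeds the loop

for day in range(200):
    hot_spot = max(recorded, key=recorded.get)  # deploy to the apparent hot spot
    if random.random() < TRUE_RATE:             # an incident happens there...
        recorded[hot_spot] += 1                 # ...and, being patrolled, gets recorded

print(recorded)  # something like {'A': 60, 'B': 0}: the data now "proves" A is the problem
```

Both neighborhoods behaved identically; the disparity in the recorded data comes entirely from where the system chose to look.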
35:18
yeah well but i think the the the other
35:20
part about predictive policing or just
35:22
that larger kind of concept with it is
35:25
one of the things that i’ve kind of
35:26
noticed uh with with all tech is human
35:29
is that there’s a real frustration
35:32
because people feel like they’re being
35:34
impacted by the technology on their
35:36
communities
35:37
before they have the ability to voice
35:39
their concerns
35:40
so one of the things that i’ve kind of
35:42
taken away from that is that
35:44
really the frustration and we like to
35:46
call it tech lash whatever
35:48
but the frustration that a lot of
35:49
society feels
35:51
is seemingly the gulf between
35:54
how fast technology moves and how slow
35:57
our consideration
35:58
of technology moves so that’s why you
36:01
know i like to say hey we’re forming
36:02
this accelerator for tech consideration
36:04
because
36:05
one of the traps that we oftentimes get
36:06
into is we like to say
36:08
well you can’t slow down innovation i
36:10
mean how many times have you heard that
36:11
well you can’t do that and it’s true
36:13
like you don’t want to put a lot of
36:14
uh a lot of like hurdles in the way of
36:17
an innovator right because
36:18
they want to be unchecked with within
36:20
term with in terms of their innovation
36:22
and creativity
36:23
yeah again we’re making innovation do a
36:25
lot of work as a word
36:26
oh true that’s true that’s true that is
36:29
a very
36:30
very large catch-all now but i think the
36:33
key part
36:34
is to say well what if we could also
36:35
speed up our ability
36:37
to think about how this technology would
36:39
be utilized
36:40
so before predictive policing
36:43
is happening in the communities and now
36:45
you know we’re talking about that it is
36:47
we would need to say okay
36:49
this needs to be in front of our you
36:51
know our elected officials we need to
36:53
have a political
36:54
conversation about these because again
36:56
these aren’t technical issues
36:57
these are issues about what do we want
37:00
so this is where a lot of predictive
37:02
policing
37:03
and uh you know facial recognition it’s
37:05
actually a two-part question because
37:07
it’s
37:07
it’s one about making these systems more
37:09
fair but then
37:11
the larger question before you even get
37:13
to the fairness is
37:14
do we even want these these systems
37:16
right because you you could decide as a
37:18
you know society that we said well no
37:19
this is something that we
37:21
we don’t want to have and that’s
37:23
obviously happened with a lot of um you
37:24
know
37:25
cities and facial recognition right now
37:26
saying let’s put it on pause till we
37:28
really
37:29
figure this out a little more yeah there’s the
37:30
technology itself and then there’s how
37:32
it affects human experience and that’s
37:34
kind of the under underpinning of this
37:36
show is to examine
37:37
you know how data and big data and and
37:40
technology sort of especially emerging
37:42
technology
37:43
kind of impacts humanity at scale and
37:46
when you think about the deployment
37:47
across society and all these different
37:49
facets that you’re talking about
37:50
politics
37:51
and how you know sort of social
37:54
experiences or
37:55
experiences in community in in
37:57
neighborhoods there’s so many different
37:59
facets for this and it goes so far
38:01
beyond
38:02
what the technology is capable of
38:04
believe me technology is capable of so
38:06
much right now and i say believe me to
38:08
our audience i know of course
38:10
you are very familiar so i actually want
38:11
to sort of steer there because you you
38:13
mentioned
38:14
with um yeah all tech is human and and
38:16
the uh
38:18
curating and accelerating all these
38:20
ideas i wonder
38:22
it strikes me that the one of the things
38:24
you and i have in common is that we are
38:26
surrounding ourselves with people who
38:28
are big thinkers
38:29
in technology and so it’s it’s not just
38:31
that we’re
38:32
thinking and speaking and working in the
38:34
space but that we are curators of the
38:36
space in a sense so i wonder
38:37
in your work curating both funny as tech
38:40
and all tech is human
38:42
what have been some of the ideas that
38:44
you’ve encountered
38:45
from some of your guests that have been
38:47
the most
38:48
mind-blowing or cool or that you might
38:51
not have otherwise encountered
38:53
well actually the the biggest thing and
38:55
i say this because this is like such my
38:57
obsession lately you know so i’m trying
38:59
to do more mono tasking in life
39:01
the biggest takeaway that i’ve learned
39:03
just from kind of interacting with a lot
39:05
of people is
39:06
lately i have a lot of college and grad
39:08
students who are reaching out because
39:09
they want to get involved in the space
39:12
and i was having a conversation recently
39:14
and they said well you know like how did
39:15
you
39:16
how did you get involved in this space
39:17
and i said well you know kind of by like
39:19
brute force
39:20
like you know i just kind of said this
39:21
this needs to be a thing and let’s make
39:23
it a thing over over years
39:25
and then they they kind of push back on
39:27
that a little bit and they said well
39:29
that that probably also a little bit of
39:31
like privilege behind that because uh
39:33
you know you assume that you could just
39:35
make a space or do
39:36
do something or create your own kind of
39:38
career
39:40
and i really took that to heart to say
39:42
wow you know
39:43
that is that is kind of kind of true so
39:46
uh
39:47
something i’ve been really kind of
39:48
working on is like well then
39:50
what does this space look like how do
39:52
people actually get involved because
39:54
we’re all saying hey we want more people
39:55
who are
39:56
thinking about responsible tech stuff
39:57
like that but one of the things that
39:59
i noticed by having all these
40:01
conversations is that people
40:02
said that sounds great i’m excited i you
40:05
know i see all this media coverage
40:07
but nowhere in there did they say like
40:10
what are the jobs what what kind of
40:12
school what are the skills that i
40:14
that i need like like now you see all
40:16
these like responsible innovation like
40:17
facebook just announced this and you see
40:19
salesforce they’ve got a lot humane tech
40:21
and stuff like that that they’re doing
40:23
and they say okay well how do i how do i
40:25
do that
40:26
and one of the other things that i that
40:28
i have learned over time
40:30
is that so many people who are actually
40:32
in those positions of
40:34
power you know i’m having a conversation
40:36
with them and they say
40:38
uh well yeah i kind of have like a funny
40:40
background it’s not necessarily just a
40:41
traditional tech background and i said
40:43
you’re perfect for this
40:44
exactly exactly because that is
40:47
something that we need
40:48
to emphasize over and over and over
40:51
again because
40:52
the problem that is is still occurring
40:54
is that
40:55
people think that they’re not welcome
40:58
and i and
40:58
you know i really want to emphasize that
41:00
because technology is affecting the
41:03
human condition right now emerging
41:04
technology right in social media
41:06
so we need the saints and the poets and
41:09
the attorneys
41:10
and the psychologists and the
41:11
sociologists and the social workers we
41:13
need everybody because these are
41:14
social problems we frame them as
41:16
technical issues and you want to solve
41:18
the technical issues
41:19
but but really these are social problems
41:22
therefore we actually need more social
41:24
problem solvers
41:25
and then it’s going to be like a hybrid
41:27
so you know i recently saw that like uh
41:29
you know a lot of organizations like af
41:31
were all they’re focused on like okay
41:33
how can we kind of like incorporate
41:35
a broader degree of like uh competence
41:38
in
41:38
certain areas and i think that’s the
41:40
future is that you’re going to have
41:42
these kind of like hybrid
41:43
hybrid types of careers so to kind of
41:45
fully answer your questions like
41:47
i’ve learned from from a lot of people
41:49
that uh
41:50
we need actual onboarding uh that was
41:53
like the big aha
41:54
it’s like nobody has focused uh on
41:57
onboarding people
41:58
into the responsible tech ecosystem so
42:01
uh that’s where that’s what i’m trying
42:03
to do with all tech is human we’re you
42:04
know putting out this like guide in
42:05
september and
42:07
uh you know it’s gonna focus on that to
42:09
actually say like here’s somebody’s
42:10
education journey here’s somebody who’s
42:12
getting a phd in this right now
42:13
here’s somebody who who works at a
42:15
center or wherever you know
42:17
focus on ai ethics and then here’s job
42:20
titles that
42:21
are coming up here’s the organization
42:22
here’s here’s like 500 organizations in
42:24
the space let’s hear from them
42:26
because we tend not to have the human
42:28
side of it we you know we have these
42:29
like media stories around it but nobody
42:31
actually says like
42:32
how do i do that we need to have
42:35
like like think about all the problems
42:37
we have today we need
42:39
the the 20 year old who’s like excited
42:41
about this to say
42:43
i have a path that there’s an actual
42:45
pathway
42:46
to to to be a uh you know to be involved
42:49
so so basically i’m just trying to like
42:51
you know find all my replacements
42:53
yeah and that’s great i think that’s so
42:55
important and i get asked that question
42:57
a lot too
42:58
and there’s there’s never a good answer
42:59
so i’m i’m
43:01
just fascinated and excited that you’re
43:03
putting that together i
43:04
my answer is often that you know
43:06
whatever field you came up through
43:08
is the field right to kind of come from
43:12
in order to do this you just need to
43:13
bring this well-rounded perspective
43:16
that you have and a lot of times it
43:18
feels like the people that i know
43:20
that have similar uh kinds of focus to
43:23
to you and
43:24
my and to myself are coming out of user
43:27
experience backgrounds or product
43:29
backgrounds and that’s wonderful i think
43:30
those are great
43:31
kind of holistic backgrounds within tech
43:34
but of course as you’re saying i think
43:35
we need the humanist perspective
43:37
from beyond tech as well and it needs to
43:40
overlap with that
43:41
well i mean one of the big kind of
43:44
classic points that happened
43:45
let’s say with social media uh for this
43:48
was a bunch of years ago you know you
43:49
had this father whose daughter died and
43:52
then
43:52
he kept on seeing photos of his of his
43:55
uh deceased daughter and it really
43:56
bothered him because he said well i
43:58
didn’t i don’t want to see this
43:59
but because we weren’t thinking about
44:01
the the kind of
44:02
impact of over you know seeing somebody
44:05
uh somebody’s loved one who’s now
44:07
deceased
44:08
we were just thinking like this happened
44:09
a year ago therefore i’m going to
44:11
display this
44:12
and that’s where like at facebook at the
44:14
time they had to hire their compassion
44:15
team that they call it which since
44:17
changed its name but it really pointed
44:19
out that the need to say
44:21
wait a minute like anything we’re
44:23
working on and it’s like
44:24
you mentioned some of the the you know
44:27
really interesting but
44:28
uh also kind of like hey we’re gonna
44:30
have a lot of issues technology like
44:31
uh you know uh brain computer interface
44:34
and then you know we didn’t talk about
44:36
virtual reality but obviously with
44:37
virtual reality and augmented reality
44:39
and xr
44:40
uh there’s there’s a whole host of
44:42
issues that are totally like
44:44
unsolved right now but that’s also like
44:46
it’s it’s scary but also exciting
44:48
it’s exciting because somebody listening
44:50
to this right now
44:52
could actually be the person who like
44:55
really helps out and solves the problem
44:57
and that’s something that i think
44:58
sometimes we forget it’s like it’s like
45:00
the classic idea it’s like
45:02
somebody’s got to do this so right now
45:04
we have major issues that need to be
45:06
solved
45:06
and it’s like somebody needs to solve
45:08
these we need more people who are like
45:11
let’s get in the mud and let’s have
45:13
these these important conversations
45:14
yeah i think it’s really always
45:16
important to to be able to think about
45:17
the full range of possibilities
45:19
with tech or with anything and know that
45:22
this could go really badly
45:24
and also this could go really well and
45:26
in my mind
45:27
if once you see that range of
45:29
possibilities it is your
45:30
ethical obligation to work toward making
45:33
it go really well
45:34
you just want to plan out what do we
45:36
have to do to make it go really well and
45:38
avoid the really bad
45:40
well i remember when uh clearview ai
45:43
there was a kashmir
45:44
hill article in the new york times a few
45:46
months back and that was that one that
45:48
kind of
45:48
you know presented a lot of problems
45:49
with like facial recognition and one of
45:51
the
45:52
i believe investors was quoted in the
45:54
piece and he said something akin to
45:57
well you know it might lead to a
45:59
dystopian future but
46:00
what can i do and i said well that’s the
46:02
exact problem that’s so defeatist
46:05
and one of the reasons why we you know
46:07
and i named it all tech is human
46:09
is i wanted to put the agency back on us
46:11
because this is not coming from the
46:13
heavens
46:14
right anything that we create is usually
46:16
within the larger structure of deciding
46:18
what we want to incorporate in our
46:19
society what we want to allow what we
46:21
don’t want to allow
46:23
we should not forget we should not kind
46:25
of like forget our responsibility
46:27
behind this the future is not written if
46:30
the future was written
46:31
then i would be riding around on my 3d
46:34
printed segway
46:35
wearing my google glass that i bought
46:36
with my cryptocurrency but guess what
46:39
guess what we’re using skype i think
46:40
that was like invented in estonia like
46:42
15 years ago
46:43
right so the future is not written
46:46
we are the ones who are determining the
46:48
future let’s not forget that
46:51
yeah that’s a that’s why i have david
46:53
ryan polgar on my shoulder
46:54
by the way everyone yes you’re waiting
46:56
for that you’re
46:58
such an important point and i also think
47:00
you know it’s also good to be
47:02
able to be optimistic about technologies
47:05
and think about what the
47:06
the opportunity is you know while as you
47:09
say
47:10
you know kind of containing the the bad
47:12
outcomes so
47:13
what is it that you are most optimistic
47:15
about with
47:16
the future of technology or the future
47:19
of human experiences that technology can
47:21
provide i mean it’s hard it’s obviously
47:24
hard to find a silver lining with with
47:26
everything going on with covid which is
47:28
obviously
47:28
a you know a major major kind of
47:30
emotional and physical
47:32
disaster for for so many so many of us
47:34
but one thing that i i
47:35
do think it is causing is to say what do
47:38
we want from our technology
47:39
so for example over over the last couple
47:41
years we’ve talked about like let’s say
47:42
telepresence machine
47:44
or you know using robotics like spot the
47:46
dog from boston dynamics
47:48
but now it’s actually a time where we
47:49
need that stuff like if you had
47:51
somebody’s loved one
47:52
in the hospital that you couldn’t visit
47:54
because because you would get sick
47:56
that is actually a time where you would
47:58
want a telepresence machine that is
48:00
effectively like a robot that’s got an
48:01
ipad attached to it that would zip in there
48:03
right so a lot of these technologies
48:07
can enhance these these these moments or
48:10
can extract
48:10
more more value from it so i think now
48:13
is also a time where we’re saying okay
48:15
how can we then better align
48:17
you know you know what i’m saying like
48:18
the the best and brightest of our
48:20
generation
48:21
should not be focused on getting us to
48:23
to you know click on an ad
48:25
it should be focused on improving
48:27
humanity and if there’s one thing that’s
48:28
pointed out
48:29
with 2020 as it’s gone so far it’s that
48:32
we have
48:33
major societal issues but we also have
48:36
the talent and
48:37
you know expertise to solve some of
48:39
these and another area
48:40
that that i think should be focused on a
48:42
little more
48:43
is we are really missing out on human
48:47
touch
48:47
and i think that you know if this
48:49
continues for a couple months
48:50
it’s really going to start impacting us
48:52
i mean i frankly i feel impacted by it
48:54
because we we need to hug one another we
48:57
need to
48:58
we need to shake hands as americans and
49:00
i know some people you know
49:02
would disagree but we need
49:05
warmth we need presence of somebody and
49:08
i would love it right now if there was a
49:11
way that like
49:12
we ended this this conversation and like
49:15
we had some type of haptic feedback
49:17
where like you could like pat me on the
49:19
shoulder or something like that
49:20
because everybody right now is an
49:24
avatar so i’ve gotta i’ve gotta have
49:26
something to say like
49:27
kate like you and i are our friends we
49:30
know each other
49:31
right so i want a greater connection
49:33
with you
49:34
than than with any other video that i
49:36
can watch online right now
49:38
you are more important than that other
49:40
video but right now it’s still very
49:43
very two-dimensional and i’m not feeling
49:45
anything from you and i think
49:47
there’s going to have to be a lot more
49:48
focus on saying how can i feel this
49:50
conversation more
49:51
because i mean listen i mean people are
49:53
sick and tired right now
49:54
like not another zoom call you know it
49:57
but if if there was some kind of feeling
49:59
behind it
50:00
then you would say okay i feel nourished
50:02
whereas right now
50:03
you sometimes feel exhausted yeah if i felt
50:06
nourished like i
50:07
felt somebody then that that actually
50:09
could be really powerful
50:11
someone just shared and i i retweeted it
50:14
but someone shared with me
50:15
on twitter uh a denim jacket i think it
50:18
was
50:18
maybe zara or ikea or h&m or someone i
50:22
can’t remember who was pushing it
50:24
but it was one of these devices where
50:26
you know you wear the jacket and someone
50:28
else wears the jacket and
50:29
if you sort of tap the shoulder it’s
50:32
going to tap the shoulder of the person
50:33
who’s wearing the jacket
50:34
i don’t know but i see a lot of these
50:36
kinds of things yes
50:37
gifting like there’s a whole kind of
50:40
trend
50:41
in gifts that there’s haptic and and
50:44
sort of visual feedback like if i give
50:46
you a device that sits on your table and
50:48
i have one sitting on my table
50:50
anytime i think of you if i just you
50:52
know tap the device it lights up in your
50:54
room
50:55
oh kate’s thinking of me that’s nice
50:57
right right there’s a lot of promise for
50:59
it because again
51:00
we’re not trying to replace humanity but
51:02
what we’re always trying to do
51:04
is no matter where you stand on an issue
51:07
at the end of the day we’re actually
51:08
pretty basic we want more friends
51:11
we want more love right they’re actual
51:14
base emotions
51:15
and and i think covid has really set
51:17
that in motion to say
51:18
hey we can disagree about a lot in life
51:20
but what we’re trying
51:22
to do is is get more value be happier
51:25
as as humans and be more fulfilled
51:29
be more kind of educated and like
51:31
stimulated
51:32
and technology has a major role in that
51:35
and now it’s about saying like
51:37
how can it be more focused on that as
51:39
opposed to something that is more kind
51:40
of an
51:41
extractive in nature that’s wonderful
51:43
and as obviously you know as i would say
51:45
it would be about meaning and meaningful
51:46
experiences and that’s
51:48
that’s what for me it always comes back to
51:50
is there a concept that it always comes
51:52
back to for you
51:52
is there kind of this one core theme
51:54
that you always find yourself returning
51:56
to
51:58
uh well i think that’s why i went with
51:59
all tech is human because it did come
52:01
down to that because
52:03
when i was thinking about algorithm bias
52:05
things of that nature it always would
52:07
come down to say
52:09
it’s about our human agency it’s about
52:12
what we’re going to allow
52:14
this is not happening to us this is us
52:18
we are part of the process we are not
52:20
just like magically watching something
52:22
take place and i think
52:23
we oftentimes forget that so yeah for me
52:25
i always just
52:26
would come back to say hey all tech is
52:28
human so i’m like you know
52:29
i’m gonna run run with that as an
52:31
organization hey
52:32
if you’re a brander you might say that’s
52:34
a terrible idea you got to put
52:35
foundation or something like that in
52:37
in the name but uh you know i’m gonna
52:39
stick with it
52:40
i think it’s great of course you know
52:42
it’s so similar to my own work so i’ve
52:44
always loved it
52:46
we’re so aligned in that uh but you know
52:48
you talk you’ve been talking a lot about
52:50
this idea that
52:51
you know tech is not neutral and we
52:52
quote kim crayton a lot on this show
52:54
it’s
52:55
happened several times over the the last
52:57
few episodes that we’ve talked about kim
52:58
crayton and her her
53:00
principles and tech is not neutral is
53:02
one of those principles so
53:03
i want to make sure that folks are aware
53:05
of the wonderful work and scholarship
53:06
she’s done
53:07
and promoting that notion so uh
53:10
hey what um what do you think that we
53:13
could do
53:14
in culture just in general to stand a
53:16
better chance of bringing about
53:18
the best futures with tech as opposed to
53:20
the worst futures
53:21
yeah well what we can do is actually get
53:25
more involved in the process one i think
53:27
that part of the issue is we don’t
53:28
feel like we can
53:30
but we’re going to have to demand that
53:31
we are and i think a lot of this
53:33
frankly is going to come down to
53:34
political involvement to say that
53:37
we want we want these these
53:38
conversations to to to
53:40
be happening we don’t want uh it to be
53:43
going in two different directions we
53:44
don’t want something to be adopted
53:46
and deployed right before we’ve had a
53:48
chance to actually say what
53:50
what do we actually desire so that’s the
53:52
biggest part is that
53:54
uh everybody needs to to add their voice
53:57
uh because these are political issues uh
53:59
and right now sometimes people think
54:01
well i’m not
54:01
i’m not a techie uh guess what i mean
54:04
listen we’re uh
54:05
if you’re carrying around a smartphone all the
54:06
more reason
54:07
[Laughter]
54:09
exactly we need everybody so that’s the
54:12
biggest thing and i say that because
54:14
that’s probably the most common uh
54:16
misconception that that happens all the
54:17
time
54:17
you know i i had somebody that the other
54:19
day who who said uh you know something
54:21
like
54:22
yeah you know like with my background as
54:23
social like i didn’t think that that i
54:25
could get involved with something with
54:26
like machine learning it’s like
54:27
then they were like excited that it’s
54:29
like wait there are these
54:31
opportunities right because technology
54:34
is is much larger so technology is
54:35
society right so it’s like these are
54:37
actual like
54:37
social issues and i think once we start
54:40
start
54:40
uh applying that then we start saying
54:42
yeah yeah i can
54:44
i can get involved uh you know and and
54:46
that’s what we need to
54:47
to really do i i think as a society is
54:50
like
54:51
get plugged in and be part of the
54:53
process how can people get involved with
54:55
uh all tech as human or funny as tech
54:57
how can people follow
54:59
and sure what’s going on there well uh
55:01
if you want to get involved i mean i i
55:03
would say with with all tech is human
55:04
you go to alltechishuman.org
55:06
or write us at hello at
55:09
alltechishuman.org
55:10
uh funny as tech is just funny as tech
55:12
dot com uh
55:14
obviously we’re doing the live shows on
55:16
pause and and just doing regular kind of
55:18
podcast uh recordings but with with all
55:20
tech is human there are a lot of opportunities to
55:22
get involved so even right now with this
55:24
guide
55:24
we’ve got i think about now 20 25
55:27
collaborators all throughout the globe
55:29
who are piecing together all these
55:30
different organizations and job titles
55:32
and doing interviews things like that
55:34
so it’s becoming a very kind of like
55:36
global effort that we’re looking
55:38
for these different backgrounds so for
55:40
example like we just added somebody uh
55:42
earlier today from the uk
55:43
and then you know she was looking at
55:45
some of the raw material we had said hey
55:46
you forgot these
55:47
three or four organizations it’s like
55:49
that’s exactly why we need all these
55:50
different people because
55:52
i’m not in the uk right like you want
55:54
all these people who say wait
55:55
don’t forget this part you know so we
55:57
just got a few people from india and
55:59
other locations
56:00
and that’s what you need to basically
56:02
say like well don’t forget this
56:04
this perspective because that’s
56:06
something else i learned is
56:07
as i know when we were doing a talk
56:09
recently for all tech is human
56:10
on balancing uh handling like
56:13
misinformation versus like
56:14
free expression promoting free
56:16
expression uh you know somebody from the
56:18
uk said to me well
56:19
that’s kind of like a very american kind
56:21
of idea
56:22
you know in the uk we’re going to
56:24
balance that that differently and it’s
56:25
like yeah that’s that’s that’s a good
56:26
point
56:27
we shouldn’t we shouldn’t forget our our
56:29
limitations or our kind of bias
56:31
that we’re going to have so anybody no
56:34
matter where you are
56:35
please uh please get involved and you
56:37
can also kind of reach out to me
56:38
um yeah honestly on linkedin i’m
56:40
probably the most active uh i’m not as
56:42
good
56:42
as you on on twitter but yeah i do i am
56:44
at tech ethicist
56:46
um but yeah they have great handles that
56:48
that’s easy to write thank you
56:49
thank you yes but uh that yeah that that
56:52
is
56:52
uh although like people have a a tough
56:55
time saying the word ethicist
56:57
so i tend to get involved with uh with
56:58
with terms that that
57:00
are tongue twisters like i also started
57:02
something before all tech is human called the
57:03
digital citizenship summit
57:05
and often times i’d be on tv and the
57:07
newscaster would say like the digital
57:09
citizenship oh my god like
57:11
come up with a better term man i mean
57:14
they didn’t say that
57:15
yes i don’t know i was i was reading
57:17
their brain to text and that’s frankly
57:18
what they
57:19
what they’re saying to me like get this
57:21
clown
57:22
off and like have them come up with
57:24
better
57:26
short distance from head to side in that
57:28
case
57:29
exactly i had to say digital citizenship
57:32
and then they had to say tech ethicist
57:33
they’re like my god is this guy just
57:35
giving me tongue twisters
57:37
and then they had to say your name it’s
57:38
like how many syllables are you going to
57:40
throw into this oh
57:41
well i’m saying thank you for saying
57:43
like the whole name i just didn’t want a
57:44
google ganger because there’s another
57:45
david polgar out there who does some
57:47
some acting
57:48
so i was like you know what i’m going to
57:49
do david ryan polgar but i’ve learned
57:50
over the years that
57:52
social media is not made for the middle
57:54
name like there’s like nowhere
57:56
on like uh let’s say linkedin to even
57:58
put your middle name and
57:59
people get very confused there’s i
58:01
actually have thought of doing
58:02
a podcast that’s just me interviewing
58:04
all the other kate o’neill’s out there
58:06
because
58:07
that’d really screw with google but i i
58:09
love the fact that there’s uh kate
58:11
o’neil who’s a
58:12
professor affiliated with berkeley and
58:14
she wrote a book last year called waste
58:16
so she’s a climate expert
58:18
and because i’ve done some work around
58:20
tech and climate we
58:22
get confused all the time sorry
58:25
and then there’s cathy o’neil who wrote
58:28
weapons of math destruction
58:29
so you know we get confused all the time
58:31
so i think it would just be really fun
58:32
to do a show
58:33
that is only me interviewing all the
58:36
other similar names out there
58:38
it just creates more confusion i love it
58:40
i love it
58:41
and so as far as confusion with you and
58:43
not not confusing you
58:45
people can find you at david ryan polgar
58:48
on linkedin
58:48
at tech ethicist on twitter uh also your
58:52
techethicist.com right i am i am okay
58:55
so
58:55
i try to make it a little easier to find
58:57
me that way and hey i had to block out
58:59
the other david polgar so i own
59:01
davidryanpolgar.com techethicist.com and
59:04
uh you know whatever daveryanpolgar.com
59:06
so it’s like
59:07
that’s modern modern life man you know
59:10
it’s all about the url
59:11
excellent yeah of course it is i i i
59:13
think i buy about
59:14
two urls a week so
59:18
i’m the same way like like an idea pops
59:20
in you’re like
59:21
this could be brilliant or this could be
59:22
a terrible idea yeah i don’t know
59:24
it’s 20 bucks you know so yeah geez if
59:27
you’re registering
59:28
two domains a week i told you this we
59:30
said this was going to go on for seven
59:32
hours
59:32
now our audience is like okay you
59:33
weren’t kidding yeah yeah
59:36
i just want to thank you so much for i
59:38
know you know
59:39
folks don’t know but we’ve known each
59:41
other for years we’ve
59:43
worked with each other’s programs and
59:44
we’ve been on each other’s shows and i
59:46
just i’m so grateful
59:47
for your friendship and for the work
59:49
that you do so thank you so much for
59:50
being on the show
59:51
david kate you know kate
59:53
o’neill’s always bringing the ko always
59:55
doing a knockout job
59:56
uh and there’s only one kate o’neill in
59:59
my book so uh thank you for
60:00
doing such a great job with the tech
60:02
humanist show and these are important
60:03
conversations that people need to have
60:05
absolutely thanks so much i appreciate
60:07
you being here to help have them
60:09
well thank you