
Douglas Rushkoff: 2019 National Book Festival



>>Steven Levingston:
Good evening and welcome. I’m Steven Levingston,
nonfiction editor of the “Washington Post,” which
is a charter sponsor of the National Book Festival. Before we begin, I’ve
been asked to remind you that these events are being
recorded by the Library of Congress, which may broadcast
them at some future date. After our author’s
presentation, there will be time for questions, but if you
don’t want to become part of the library’s permanent
collection, it may be best to let someone else
ask the questions. Douglas Rushkoff spends
a lot of time thinking about our digital
lives, where we are now, where we’re heading,
and he’s concerned. During a radio interview,
he once was asked, what is it you’re
most concerned about? What scares you? What keeps you up at night? And he said, instead of
people using technology, technology is using people. As a media theorist, author, and
teacher, Rushkoff is concerned about how the digital age is
affecting not just each one of us individually, but
the nature of society. He worries that people have lost
the ability to think and connect and act in constructive ways. In a world dominated
by technology, we’ve become isolated
from one another. The solution, he argues,
requires nothing less than a remaking of modern life. And that’s the subject of his
latest book, “Team Human.” Our “Washington Post”
reviewer called it a manifesto for change. In the book, Rushkoff contends
that our technologies are, in his words, paralyzing
our ability to connect meaningfully
or act purposefully. The solution is for us to
understand that, as he puts it, we cannot be fully human alone. We must recognize, he says, that
being human is a team sport. His vision is not
apocalyptic, but hopeful. He wants us to understand
that we do not have to surrender ourselves
to technology. He’s not telling us to toss our
cell phones into the Potomac; just use them a little more
wisely and maybe a little less. Rediscover real social
interaction. Choose to meet up
face-to-face instead of texting. Sit across from each
other and talk. Connect without digital
technology. Connect as humans. Well, Rushkoff has much, much
more to say about all this, and he’ll say it much
better than I do. So please join me in
welcoming Douglas Rushkoff. [ Applause ]>>Douglas Rushkoff:
Thanks a lot for that. That’s humbling just
that someone from the “Washington Post”
knows who I am. And, you know, that’s
a custom intro, right, the custom intro
written by a real human who actually looked
at stuff about me. So it’s humbling. It’s not always the case. It’s funny. The thing that got me on the map
most recently was this TED Talk I did where I was recounting
having been invited to do a talk for what I thought was this,
you know, a bunch of investors or whatever about
the digital future, what people like me
get called to do. And I was waiting in
the green room to go on, and these five guys were brought
into the green room with me. And it turned out, there
was no talk at all. They just sat around the
table in the green room and started asking me these
questions about the future. Like, bitcoin or
Ethereum, right, virtual reality or
augmented reality. Like, they wanted to know,
you know, what to bet on. And then, eventually, they got
around to their real question of concern, which was Alaska
or New Zealand, right. They wanted strategies on where
to put their doomsday bunkers, you know, for the event or
whatever was going to happen that would wipe out
civilization as we know it. And the question they ended
up spending the majority of the hour on, which was
really telling for me, was how do I maintain
control of my security force after the event, right, because
their money would be worthless. And they were concerned
that their staff, their security staff
might not defend them when the throngs come. You know, the security
guys might just take over. You know, I told them,
jokingly, they should just pay for the security staff leader
guy’s daughter’s bat mitzvah or something now, you know, that the guy later
would then remember and not want to kill you. But it revealed something about
the way these people think, right, that they
are the wealthiest, most powerful technology
kingpins in the world, yet they feel utterly powerless
to influence the future, right, that the best thing
they can do is prepare for the inevitable collapse
and arm themselves against us, right, separate themselves,
either get a rocket ship or something to get
away from humanity. And I was re-reminded
of that episode because the one other experience
I’ve had with billionaires in my life was I
was invited to — My literary agent
used to be famous for throwing these
billionaire dinners. And they were dinners where
they would have the biggest scientists, living
scientists, like, you know, Crick and Watson then Dawkins
and all those scientists and then like Steve Case
and, you know, Jeff Bezos and those guys at these dinners. And I got invited to
one of them, right. And the problem was
— It was funny. It was me and Naomi Wolf. I don’t know if you know
her, a feminist author. And it was she and
I were arguing against 20 or 30 scientists. And we were arguing that
there’s something else going on here besides what science, cause-and-effect
science can explain, that there’s something
animating humans, that consciousness is a mystery,
that there’s some strange thing, dare we call it soul,
animating life. And they just ridiculed
and ridiculed and teased us for this. And, finally, I was
arguing that, you know, that there’s reason to
be ethical in this world, to be compassionate with
one another that leads to a better — It changes the
way that the universe works if we treat each
other nicely and well. And then they all accused
me of being a moralist. That’s the word they used. Oh, you’re a moralist
as if that’s dismissive. And now, I find out that these
same scientists were going on the plane to Epstein’s
Island on the Lolita Express. Not that it’s directly
connected, but I’d rather be a moralist
than an amoralist, right. And that’s my fear
about where technology, and as long as we’re here,
and science mean to take us in an amoral direction
if we don’t retrieve and bring our humanity
along with us on this techno-scientific
journey that we’re on; that the religion of science has
become a religion of control. And it’s not just
recently that it’s done it. As I will at the original
science text, the invention of the scientific model, Francis
Bacon back in the Renaissance, that Francis Bacon
defined science. And I know he’s a great man
and did all this great stuff, but he defined science as our
ability to finally take nature by the hair, hold her down,
and subdue her to our will. So the original metaphor for science is a
rape-and-control fantasy. It makes sense on a
certain level, right? These guys were all scared of
women in darkness and nature and animals and feelings
and stuff. So science, we’re going to just, and like the enlightenment
itself. We’re going to cast light
on all these dark places and make them light
and evident to us. And we’re going to
quantify everything. Once we have a number
for it, it exists. It’s scientific, and it’s real. If it doesn’t have a quantity,
if we can’t quantify it, it may as well not be here. And this was a vast but
comforting oversimplification of the world we live in. We’re seeing this ethos
retrieved and amplified by digital technology
in the digital ethos. The digital ethos is
to quantize everything. Everything is either
here or there. It’s like the snap to
grid on a graphics program or when you’re recording
digital music. You don’t have the space between
this sample and that sample. You can sample lots of little
samples, but it’s blank between each one, right. There’s a number. It’s either here or
here, here or here. If it doesn’t fall on one of the quantized
marks, it doesn’t exist. So technologists will say,
oh, technology is neutral. It’s all totally neutral. It’s just how we use it, right. We can use it for good
or use it for bad. No. Technology is not neutral. It is a quantized symbol system. You know, you could say
guns don’t kill people, people kill people. Yeah. But guns are more
biased toward killing people than say pillows, right? You can kill someone
with a pillow, right, but it’s not biased
toward that, right. And all of digital technology
is biased in a certain way, the same way an MP3 file
of a song is not the song. An MP3 file of a song is a
symbol system representing a song in ways that fool the ear
into thinking it’s listening to the song, but you’re
not listening to the song. You’re listening to an MP3 file. It’s a different thing. And partly, this is
a market phenomenon. It dovetails really nicely
with the needs of the market. And we don’t have time
to really get into it, but in the late Middle Ages,
before the enlightenment, before the Renaissance,
before central currency, before the corporation,
before chartered monopolies, we had the beginnings
of the same kind of peer-to-peer marketplace
that we saw at the beginning
of the Internet. It was like an eBay-like
economy, where people were trading things
back and forth with each other. They had local currencies that
were optimized for transaction. And the former peasants
of feudalism were starting to get wealthy, which is
why it had to be ended. It had to be squashed. And we ended up getting
central currency, which is really an extractive
growth-based operating system. Central currency means that one
bank in the middle lends money to people if they want to
transact, but they have to pay back more
than they borrowed. What does that build into
the operating system? Growth. You need to grow in
order for the economy to work. And so, we move from kind of an
eBay economy to a PayPal economy where the people making money
were the people making money off our transactions, rather than
by enabling our transaction. And the people who make money in
this economy are not the people who are creating and
exchanging value. They’re the people one level
of abstraction removed from it. It’s the people doing
the finance on top of it. And if you don’t believe that,
look at what just happened to the New York Stock Exchange. The New York Stock
Exchange has been purchased by its derivatives exchange. What does that mean? It means the New
York Stock Exchange, which is already an
abstraction of the marketplace, which is already an abstraction
of people transacting, has been consumed by
its own abstraction. That’s a weird place
to go, right? That’s weird to go. And that’s, in some ways,
what this desire so many of us have now, to
return to permaculture and indigenous logic and crop
rotation, all these things that seem so much more real,
human contact, eye contact, people in spaces
with one another. It’s because we have a sense that these abstractions are
consuming us, just like all of our zombie movie fantasies. We’re being consumed by this
thing, and it’s a little scary. It’s strange. You know, the Native Americans on encountering white European
people for the first time, colonialists who are ripping
down forests and killing people and raping people and enslaving others, they had so much faith in humans that they didn't
even see us as bad. They thought we had a disease. They called it wetiko,
which is a spiritual disease that made us want
to just take things and destroy things,
to colonialize. And we’ve colonized, and
we’ve colonized for 600 years. You know, we colonized
really well right up — And my brother wrote a
lot about Eisenhower. Right up until about
Eisenhower’s presidency. That was when all the colonies
started pushing back really, right after World War II. It’s like no, you
can’t colonize anymore. We’re done. We want to be our own nations. So what happened? It’s interesting. The technologist from
the war, Vannevar Bush, one of the men responsible for
computers and cryptography, he wrote an essay for
Atlantic, the Atlantic magazine, called As We May Think. And it looks like — It’s
basically a pitch for computers. He’s saying we can take these
technologies that we used for war and now use them
to grow the economy; that even though you can’t
colonize anymore territory, we can colonize human
consciousness. We can colonize human time. And these machines will
basically colonize memory. So what we’ve done, and
now we see it really with the surveillance economy with the extractive surveillance
economy of Google and Facebook, that we’re doing
wetiko to ourselves. We are colonizing our
own consciousness. We are colonizing our own time. We are experiencing ourselves, even though we're just human, as the indigenous species of the planet being colonized
by our own digital abstractions; that we’re being consumed by our digital devices the same
way the New York Stock Exchange was consumed by ultra-fast
trading and algorithms. And what does that look like? Well, what it looks
like is a world where humans are no longer
the users of our technology. We are the used. We are not the message. We are the medium. We are being played. There’s been a reversal
of figure and ground. You know, your smartphone
gets smarter about you every time you swipe
it, and you get dumber about it. And if you want to get smart
about it, you’ll go in there and find out, oh, all these
are proprietary algorithms in black boxes that I’m legally
prevented from even finding out what my phone
is doing to me. Well, it knows everything. It’s on right now. It knows where we go. It knows what we do. Just try turning that off. What used to be computer
science at Stanford, there’s a division
now called captology, run by a guy named BJ Fogg. The Captology Department
at Stanford is teaching how to make these devices addictive. That's where the streak
feature on Snapchat came from. They take the algorithms
from Las Vegas slot machines and use them to make our
newsfeeds more addictive. It’s not conspiracy theory. It’s what they do. There’s a book, how
to take the algorithms of Las Vegas slot machines
and put them in your newsfeed. You know, Facebook, we all
know how Facebook works, right? It’s not there to
help us make friends. Facebook takes the data from
your past and uses it to put you in a statistical bucket, then
fills your newsfeed with things to make you more likely
to behave consistently with your consumer profile. It’s not about advertising
to you anymore. It’s about getting
you to be true to who you’re supposed to be. They are auto-tuning
us essentially. Now, my problem with
auto-tuning is that it no longer recognizes
the most important part of the singer. What makes the singer matter
is not that they hit an F sharp. It's how did they
get to that F sharp. Did they come up from under it? Did they come down from over it? Quantizing it. No. I mean, where is James
Brown in an auto-tuned universe? It’s gone. Now what’s that stuff called by
the technologist, that reaching up for the note, that
coming down over the note? That’s called noise. Where I come from, that’s
the signal; that’s the human; that’s the part that
we’re trying to erase. And the well-meaning
folks now who are finally from the computer industry
finally, saying, oh, you know, we’ve made some problems here. These technologies are
kind of screwing people up. What did they start? They started something
called the Center for Humane Technology. It’s a really nice
phrase, humane technology. It makes me think
of the labeling on cage-free chicken
eggs, right. We were humane to these
chickens all the way from birth to their slaughter. The problem is, again, the
figure and ground are reversed. Humane technology is about how
the technology’s treating the humans, rather than how the
humans are using the technology. We’re going to use
captology for good. You know, they’re
just really trying to soften digital industrialism
while staying invested in all of those companies. That’s why I’m promoting
this team human idea. No, no. Be human. Humans are not compatible with exponential growth
of the marketplace. Digital technology is compatible
with exponential growth because it’s exponential. Humans are not. We’re here. There’s one instance of me. You know, and it ends up, this
exponential idea is perfect. When you have exponential
scientists with exponential
digital developers with exponential billionaires, they all support one
another’s religious fantasy. And that’s how we get the chief
scientist at Google believing and working towards the day when technology surpasses
human beings. They call it the singularity. I was on a panel with
one of these guys. And he was talking about how
human beings should just pass the evolutionary torch to
our digital successors, accept our inevitable
extinction with humility. And I said no, human
beings are special. We’re weird. I gave the whole thing about how
we live in the liminal spaces between yes and no
and one and zero; that human beings are special
and all about our noise; that we can watch
a David Lynch movie and not understand what it
means and still experience that it’s pleasurable. Computers can’t do that. They can’t sustain
paradox like we can. And he said, oh,
Rushkoff, you’re just saying that because you’re a human,
right, like it was hubris. And that’s when I said
fine, I’m on team human, you know, guilty, team human. It’s fine. And I will fight for team human. You know, I do believe
we deserve a place in the digital future. It’s not a matter of
getting rid of technology. It’s a matter of retrieving
human values and embedding them in the techno-scientific
future and the real values like you alluded to,
the real human values. And it was a coincidence
that I said team human, but being human is a team sport. It’s something that
we do together. You know, there’s a bastardized
misinterpretation of Darwin going around
that evolution is the story of survival of the
fittest individual in competition with
all the others. You know, if you actually read
the Darwin, what he’s saying, what he’s marveling
at, page after page, is the way that different
species collaborate and cooperate in order to
ensure mutual survival. They told me in middle school,
John, I don’t know if you were in my middle school too, in
Cooper House, they told us when they taught us evolution
that trees are competing for sunlight and that the
big tree gets the sunlight, shades the little
tree, which then dies. Turns out, that's not true. The big tree is sharing
nutrients with the little tree through a network of
mycelium in the soil, which turns out to be alive. When the big tree loses
its leaves in the winter, the little tree, which
is usually an evergreen, passes the nutrients back
up to the big tree, right. They’re actually working
together through systemic forces that don’t quite fit into
the simple dissecting cause-and-effect science
of Francis Bacon or Google or the zombie-worshiping
Google scientists. You know, there is an
intelligence to nature and to one another that
is rendered inaccessible by technology. That is okay as long
as we can accept that quantifying the
entirety of reality, even with good intentions,
will only cut us off from each other and
from life itself. Okay, thank you. [ Applause ] That was the right amount? All right. Good. So we have
time for — Oh, wow. I’m sorry for the rapidity. I was talking really fast, and
I felt bad about that for you. But I had so much to say. So we get to talk
with each other. Do people have questions,
concerns? Yeah. [ Inaudible Comment ] Oh. We have two beautiful
microphones for people who want them. One here and one there.>>Speaker one: Excellent. Let me ask you about a point
I think you were getting to at the end there, which
is the goal of trying to, I think as you were
saying, implant human values in this technology-driven
future. So what are some of the ways
that you think we can do that? Because I think like
you, a lot of us feel as though the technology
is sort of controlling us. And it’s hard to imagine how to
have the strength and strategic, you know, wherewithal to carry
those human values forward into this more and more
technology-dominated world.>>Douglas Rushkoff: Education
is one place to start. I mean, I was talking about the
reversal of figure and ground. Education is a terrific example
of a modern reversal of figure and ground, of cause-and-effect. Public education was originally born, believe it or not, as compensation for workers. The idea was that
these guys are working in the coal mines all day. They deserve the human dignity
of being able to return home and read a novel and
understand what it said or read the newspaper and
participate in democracy. So it was about compensation. It was about human dignity. Now, what is education? Education is to help
someone get a job. So we have the principals of
high schools and presidents of universities meeting with
the CEOs of corporations to find out what do you want
our students to know. Do you want them to know Excel or do you want them
to know blockchain? Should they learn Python? Should they learn Java? What do you want in
your future worker? As if the public school
system is now an externality for worker training
by the market. And what do kids get? All that liberal arts, all that critical thinking, all of those human things that you should learn in the luxury of a public education system get squished. They go away so we can learn,
but so we can learn STEM, not even STEAM, so we can learn
STEM and then have great jobs and go into the market. So that’s one. The second is we are denying
young people the value of engagement and mimesis in the
classroom, that live encounter. I’ve been teaching at Queens
College, it’s a city university of New York, public
college in New York, for the last five years. So every single semester, I
get more notes on the first day of class from — The
student will hand me a note from their psychiatrist that
says, please excuse Johnny from class participation because
he’s got, you know, social fears or don’t make him
do any presentations or anything like that. And I’m thinking, what happened
to this kid, K through 12? What were they doing
in the classroom? They were putting the
kid on the freaking iPad and teaching them some facts so
they could do an assessment, also on the iPad, of what knowledge and data and statistics they got. But what they didn't get was
the experience of learning. The beauty of being
in a classroom, with a teacher is not what
the teacher is telling you factually. It’s not the content. The medium, like McLuhan would
say, the medium is the message. In a classroom, the student
gets to model the behavior of somebody who’s dedicated
their lives to learning. What does a learning
person look like and sound like and act like? They’re modeling the teacher. That’s mimesis. That’s the way we
actually develop as humans. So now, we grow up. We’re unable to establish
rapport. We spend all our time
trying to talk to people on Skype or on video.
are getting bigger or smaller, if their breathing is
syncing up with yours. All of the painstakingly evolved
mechanisms for social cohesion and cooperation are
defeated in this environment. And then we end up
getting off the call. The person says they
agreed with me, but I didn’t feel it in my body. My mirror neurons didn’t fire. My oxytocin didn’t
go through my blood. And now, rather than blaming
the technology, I don’t know from technology, not on a body
level, I blame the other person. I think that person’s
less trustworthy. And then the feedback
loop continues. I start appreciating
and respecting the tech, and I start looking at
people more and more and more suspiciously. So it’s basically human rapport
is a prerequisite to solidarity, which is the prerequisite
to power. We can’t develop rapport if
we’re afraid of each other. So it starts in education. But then for us, spend real time
in real places with real people. The word “conspiracy”
literally means to conspire, to breathe together. It's funny that breathing
together with other people is now
considered conspiracy. That’s the way we think
of it, but that’s it. That’s how team human organizes. That’s how we foist
our little revolution against these stultifying
antihuman forces is as simple as having the courage, you know, to look someone else
in the eyes again. But, you know, there’s a lot. There’s a lot. Yeah.>>Speaker two: Hi. Thank you for joining us.>>Douglas Rushkoff: Hey.>>Speaker two: So it seems like
throughout all of human history, there have been some
individuals or groups that were fine treating other
human beings as livestock or a feeding source and that
technology has enabled those people, perhaps supercharged
their ability to do that on a massive, worldwide
scale while also getting those people who are being
used to thank them. So I’m very curious
about your thoughts on universal human
value and in an economy where people are frequently
valued based on their ability to produce money, how that
works and is there anything that we can do to try and recognize the
value of all people? Some people throw universal
basic income into that bucket, taking money away from tech
transactions and putting them into the pockets
of real-life people so that they can interact,
make art, do whatever. I’m curious what your
thoughts are on that.>>Douglas Rushkoff: Yeah. I mean, this is not new
to digital technology. If you look at the first uses — I mean, the Library
of Congress is here. Look at the first
uses of written text. It was to keep track of
people’s possessions and slaves. And so, it turbocharged slavery. You could only really run a good
slave empire if you were able to keep track of it in writing. So since the beginning,
literacy, technology, you’re right, it’s
what it always tends to turbocharge first. Well, not at the very beginning,
but soon after it gets used for these more nefarious
purposes. Yeah. You know, and I hate
to get religious on you, particularly when we’re
on the science stage. But there’s a kind of a
Mr. Rogers like simplicity to respecting the dignity
of other human beings, to teaching kids you’re
okay just the way you are without doing anything. I feel like what happened in the
beginning of the industrial age, once we shifted from being
craftspeople who made stuff and went to the market and
exchanged with other people. Those businesses were
basically made illegal through a process of
chartered monopolies. Now, we had to go
to the city and work for a chartered monopoly. We were selling our time, rather
than the value that we created. That’s when human beings became
valued for our utility value, rather than any essential value. And the scientists
will tell you, there is no such thing
as essential value. When I talk about
essential value, they call me an essentialist,
which is supposed to be a bad thing, right,
that there’s something here that we have intrinsic worth. You sound a little like
a Yang Gang-er on the UBI.>>Speaker two: Guilty.>>Douglas Rushkoff: Yeah. Which is all right. And he's right. If you actually get to the
place where you realize, we have all our politicians
talking about getting people jobs,
we want to get people jobs. Why? Because there’s this
work that needs to be done. That’s not why we want
to get people jobs. We have houses in California
that we are right now tearing down even though
there’s homeless people because those houses
are in foreclosure. The USDA burns food every week
because we can’t just dump it on the market or prices will
go down while people go hungry. So the reason to
have a job is not because we need more
houses built or we need more food grown. The reason to have a job is so that they justify
letting you participate in the spoils of capitalism. So they’ll figure out some
ridiculous plastic doodad for you to make in a factory, and then other people will make TV
commercials to convince people that they need this thing
and make them feel awful about themselves in the process,
make them buy the thing, and then throw it out
and stick it in landfill. And then Greta’s got
to come from Denmark and tell us we’re
all going to die. So, you know what I mean? There’s an ass backwardness
to that, right. So UBI, and the reason why
— And for much of my career, I’ve supported UBI, the
idea if we’ve got the stuff, just let people have it. And I love the idea of UBI. And right until I went to
Uber and started talking about all this stuff, and
the guys at Ubers said, oh, we love UBI because then we
cannot pay our drivers anything because they’re going to be
getting universal basic income. It’s the sort of the Walmart
model of let’s pay our people, you know, this much so
they go on social services. So it's tricky. But yeah, if we respected
basic human dignity, it’d be a lot easier. You know, and right
now, it’s funny. In the early Internet days we
did, there was a moment when, in the early Internet
days when people would like do this thing
called anthropomorphism. Anthropomorphism is where
you see the human being in your technology. So they’d put like
little rabbit ears on their Mac classic
computer and see it like as this little person and
give it a name and all that. And now, we’ve gotten
so far into tech, we’re not doing
anthropomorphism. We’re doing what I’m calling
mechanomorphism where — Yeah. I want to give him
a second to say it. Mechanomorphism where we want to make ourselves
more like machines. Like, I can’t process that. I’m multitasking, you know. That’s the way we view
ourselves and we aspire to be more computerlike. And that’s because we think
our value is our utility value. So it’s going to
come down, I mean, it may come down to the
way we raise our kids, the way we teach them in school. It feels like it’s our sense of self-worth no
longer feels intrinsic. It’s like what have
you done for me today. And what I’m trying to do
is help people see that, no, no, you’re great. You are just great right now. Do nothing. You know, that’s why
sitting is so great. Meditating is great because
you’re just like here. And it’s not saying
you’re spoiled or whatever. But no, you paid the price of
admission already for getting through the birth canal, right? You’re here. You should be celebrated
just for your existence. And once you start with that, you’re starting no
longer at a deficit. You’re starting with a surfeit. And then it changes. But obviously, it’s
not just about you. It’s something we
have to do together. But I agree. That’s the essential
challenge of our time. That’s why I’m arguing,
you know. I’m not arguing about
what we do with tech. You know, people always
ask, so how do I do this, how do I do that, and
what do we make tech and how do we change
the OS and all. And it’s like let
someone else’s problem. I don’t want to think about it from the technology
side of the equation. I want to think about how
do we increase our sort of immune response. What’s the more homeopathic
approach, rather than the allopathic
hit the disease? How do we strengthen
our human fiber? How do we increase
our sense of value in the human project
itself, you know? And that’s by connecting with
nature, connecting with one another,
and neighborhood and community and something else opens up. It’s just the digital
is so bad for intimacy. It’s so bad for it. And slowly, over time, kids raised on this stuff
prefer it, prefer it. They’ll prefer even the,
you know, Internet porn to the real thing because
Internet porn, it’s dry; it’s easy; no one to
deal with, no mystery. And once we’re preferring
this to that, then, you know, then really, that’s
game over, you know, to use a screenager word for it. Thanks.>>Speaker two: Thank you.>>Douglas Rushkoff: Yeah.>>Speaker three: Thank you
for the cautionary tale. I feel like I’ve watched every
Black Mirror episode five times over. You did sort of say
that you’re focusing on the human or on the person. But do you have any thoughts on
if we don’t give up technology on what it could look like
where it wouldn’t be negative, or have you seen
any technologists that actually are looking to
design in a way that isn’t just to make something that’s
horribly addictive?>>Douglas Rushkoff: Yeah. I mean, the whole challenge is
you’re 19, 20, 21 years old. You’re in college. You come up with an idea for a
great app that’s really going to help people and do
these wonderful things. Then you take some VC and, all
of a sudden, this guy wants you to pivot to something
that’s completely different and extractive. I mean, I still think if Mark
Zuckerberg hadn’t turned, you know, to Peter Thiel
for money, you know, Facebook might have ended
up being a social network, which would have
been interesting. I’m thinking that what we
should do is use the tactic with engineering schools
that radical, libertarian, right-wing capitalists
used with economics schools. You know, pretty much, all economics departments ended up dominated by extremely right-wing money and trained economists to think
of things in pretty extractive, you know, mono currency,
monocultural ways. If we invested in technology
departments and tried to plant ethics teachers
and teachers of humanism and human values in technology
departments, then we could kind of seed that world
with human values, rather than just
moneymaking values. You know, the other sign
of hope I see is that a lot of young people, they’ve seen
what happened to the first and second generation of
technology developers, and now they’re trying
to develop technology, take less venture capital
at lower valuations, right. The big joke is, the lower
the valuation you take for your company, the less
money you have to pay back, the less money you have to earn. And they finally kind of put that equation together
in their head. So that’s letting them
develop technologies. This is when I go to
talk to these kids. What I always say is, look, would you be happy just having tens of millions of dollars? Could you make yourself
okay with tens of millions, rather than hundreds of
millions or a billion? If you’re okay with
tens of millions, you can develop really
differently fun stuff that’s good for people. If you have to pivot,
you know, if you’re going to do a good thing for humanity and make a billion dollars,
may not be possible. And if you take the billion
dollars like Zuckerberg and everybody else and then like
him or Gates or Buffett, oh, now I’m going to give back
90% of my money to charity. It’s like it’s too late. You know, you took too
much to begin with. Now you’re just going
to kind of shove it back in where you think
it’s supposed to go. Again with more
techno-solutionist programmatic understandings of reality. It goes the other way. Yeah.>>Speaker Four:
Thank you very much.>>Douglas Rushkoff: Thank you.>>Speaker Four:
I'm struggling to come up with a way to say the question. But it seems to me, a lot of what you're talking about has to do with metrics, and coming up with metrics that actually mean anything, and how many of the things that are really essential are not things that are quantifiable. In countless fields, really, performance is boiled down to metrics, and metrics that do not necessarily measure what is essential in those fields. It's an example that I live every single day. I'm a physician. Everything is now
down to metrics. But a lot of what is essential
to medicine is really an art that is not quantifiable. And I’m sure that countless
other people can come up with parallels
in many other fields where metrics are really
killing what’s essential about those fields. So that’s a comment and
your thoughts about it.>>Douglas Rushkoff: Yeah. I mean, this is what Jesus was
saying about Judaism, you know, and Hillel before him. What concerned Hillel, the rabbi from the tradition that led to Jesus, was that with the law, as well-meaning as the law is, and the Talmudic quest to get ever so granular about what's ethical and what's not, eventually you
run into a wall. It’s like calculus. It’s like you can’t keep
going halfway to the wall. It’s like, you know,
that at some point, you’ve got to just say look,
we may not be able to figure out everything to write down. It’s going to come
down to your heart. You know, can you just
default to the loving posture? You know, if you
default to love, then all the laws are going
to take care of themselves. And I think that’s kind
of the place that we’re in as a digital society, just
realizing, wait a minute, we can’t quantize everything,
however beautiful a goal it was to quantize and decentralize and all that. Then in the end, it's going to
come down to human compassion. And the more time we spend
in the digital environments, the less capable we are of engaging those painstakingly
evolved social mechanisms for establishing
rapport, bonding, and experiencing
compassion and empathy. It’s like if you don’t
experience compassion for someone seeing it on TV, a 3-D immersive virtual reality
simulation’s not going to do it. You know what I mean? It should work at 640
by 480 pixels, you know. Yeah.>>Speaker Five: Hi. My name is Kathleen.>>Douglas Rushkoff: Hey.>>Speaker Five: And I
have always been interested in puppets. And I think that there
is a very thin line between puppets and robots. And it has to do with who’s
controlling what in part. And as you’ve been talking, I’ve
also been thinking about slavery and human beings have enslaved
people and rendered them things. And now we have, in
a sense, technology, which also takes other forms, such as robots, which have replaced human beings
and rendered people useless. And I’m just interested in your
thoughts about that whole issue. I think robots are
fascinating and are sort of a form of puppetry. But we’ve lost control.>>Douglas Rushkoff: Right. Well, and the robots
don’t render us useless. They just replace certain
utilitarian activities, you know, the things
that the robots can’t do, like hold your hand and
love you and really be there and establish rapport and be
a heart beating next to yours and help balance your
nervous system just by. You know what I mean? When you’re unbalanced, your
nervous system’s unbalanced so you’re having an upset or
kids going through a tantrum, just having your nervous system
next to them helps calm them. So the interesting thing
about robots is our fantasies about robots. Our movies about robots are always about the robot slaves having a revolt, right, because they get mad. And it's so interesting to me that we can make all these
movies about robots coming back and getting us, yet we’re still
ignoring slavery, which was less than a few generations ago. So it’s like it’s easier for,
you know, and I’m not going to pick on, you know,
white men anymore because they’re getting picked
on a lot, but it’s easier for our society to
imagine paying penance to the artificial
intelligences that we’re mean to than the people that we
enslaved just yesterday. And that’s because we’ve
got this forward momentum. It’s like everything in America
is new and the next frontier, and we never look back. That's part of wetiko. That's part of that disease. We never look at what
we just steamrolled, the people we just killed. It’s like the technologists
want to be able to build a car that could drive fast
enough so you never have to smell the fumes
of your own exhaust. And they don’t get that,
eventually, you’re going to come back around the other
side of the world and be smack in the middle of your
own fumes, you know? And that’s really the difference
between puppets and robots. Puppets are with you. Puppets stay engaged
with your will. The robot, you know,
you set off. Whatever you program
the robot to do, it’s going to then
keep on doing that. And when I look at the
current social, cultural, economic programs that we’re
invested in, if we embed those in the robots and AIs and
algorithms that we set off and unleash onto the
world, we’re all screwed. We’re taking the same program
that we used to enslave Africans in America and using that
on society writ large. It’s the same program. We have not reformed, you know. And it’s crucial that we do so
in the next very few years or, boy oh boy, you're not going to like the society that is imposed on us. Wrap it up. I got to wrap it up. Yeah. Maybe both say something
and then I’ll wrap it up. Yeah.>>Speaker Six: So
I’m a person of youth. And as I go through the school
system from where I’m from, I see that a lot of people
my age, with our interaction with and dependence on social media and technology, it has made myself, as well as many of my other peers, tend to be more cynical, you
know, with all these things about expectations and how
social media has designed like sort of these
slot machines. It creates this positive
feedback loop in that sense. And the thing is that in
the way that the Internet and whatnot works is
that we just sort of type in a search bar and, you know,
we get an answer, and it sort of reduces our ability to think critically. And so, it becomes harder to sort of break down and challenge this sort of feedback loop
and that dependence. So how would you
propose or think about challenging this problem in the people of the future, essentially?>>Douglas Rushkoff: And?>>Speaker Seven: My
question is more personal. I’m just curious what
gives you so much energy?>>Douglas Rushkoff: Oh. I would say it’s the same. It’s good. It’s the same answer. The one thing I’m sure
of that humans can do that machines can’t is embrace
paradox, is sustain paradox and uncertainty over
time, is not resolve. So the Internet is telling
you the answer is here, the answer is here, the
answer is here, right? And you get stuck
in that feedback. Here’s another answer. Here’s another answer. Here’s another answer. And what we have to
train ourselves to do is to resist those answers. You don’t want the answer. What’s going to make
you alive, what’s going to give you energy
are the questions, are the uncertainties, are
the stuff we don’t know. Every answer that we
grab is another question that we lose, right? The question, what does
it mean to be human? Where did we come from? Is evolution really right or
does the Cambrian age have just so much innovation on a species
level that it can’t be explained through information
theory anymore? I mean, let’s play with
the real questions. Where are we going? How are we going to
keep the ship going? How are we going to stay alive? You know, those are
the questions. There’s no Google
answer for that. I promise you that. There’s no Google answer. And if you stay with those
questions and keep them alive and live in that place
of total uncertainty, unknown strangeness, that
David Lynch-y and Yeats and Joyce strangeness, you know, then you'll rediscover
the aliveness that they’re trying
to stamp out of you. Okay. Thanks. Thanks so much for coming. [ Applause ]
