Freedom™ — Surveillance in 2030

– This is how we’re gonna start. So back in the 18th century, this guy Jeremy Bentham came up with the idea of the panopticon. The panopticon was a theorized prison where one or two prison
guards could literally be in the center of a circular prison and see all of the prisoners. The effect of that was
that the prisoners felt that they were always being
watched even when they weren’t. And then they fell into line through discipline, based on their own thoughts and fears around doing something wrong and not getting out in time. And I actually think that
we’re in a modern world where the panopticon is
kind of surrounding us. And this is what this
presentation’s a little bit about. But Foucault is another social theorist, who came in the 20th century. And he started to look at what
Bentham was talking about, and started to think that
actually surveillance is a permanent thing. And that even when it’s discontinued, people still feel
that they’re being watched. So they normalize their behaviors. And what’s really interesting is, back in the day and even
before the 18th century, there were a number of different ways that we ensured that kind of discipline and structure in society, in a way where people behaved themselves. We are actually now living in a world where these old ways of control, which back in the day were around police, and school, and discipline and torture, have been replaced with a subtle, calculated technology of subjection. We know that technology
is taking over the world. We’re surrounded by it at
every single point of our day. Every second of our movements
are generally being observed or we feel that they’re being observed and they’re being monetized
in some sort of way. And you know, as privacy and security professionals, this is something that we
really need to be aware of. But also come back to thinking about, what can we do in this modern world? What is our role? There’s been some amazing
sessions today already. I look forward to coming again tomorrow and seeing what more is being spoken about, but I’m just gonna take us
on a little bit of a ride into what I’m calling
the signals of change. The signals of change are these things that I see every single day on the streets, or read about in articles, and hear from the people that I have lots of discussions with around privacy, security, big tech companies and government. They’re indicating to me that we’re surrounded in this world, and that the idea of the panopticon is prevalent in our society. So it all kind of starts, when you think about computers, with this guy. This is Douglas Engelbart; in 1968 he did something
called the mother of all demos. It was at Stanford. It ran for a couple of days, and over a thousand people, very much like this room actually, watched him give a demonstration of the world’s first personal computer. Now that’s 50 years ago. If you actually think of the acceleration between then and now,
it’s been incredible. Most people couldn’t
even fathom that we would have this kind of technology in our homes. And not many people could really understand the impact of computers. This is actually an
article taken from 1983. They very sort of optimistically said, why the computer will
reduce political upheavals. Now hands up, who thinks
that computers have reduced political upheavals? (laughing) It’s because they didn’t
realize what was coming next. And this is what was
coming next, the Internet. So the commercial availability, the public Internet as it
kicked in in sort of ’92, ’93. And I got on the Internet round about 1993. So Tim Berners-Lee helped usher that in. And now we’re in a position
where the entire world is connected by cables under the sea. Data is flying through the air. And suddenly we’re
generating more data today than we ever have done. By 2025 they actually think we’re gonna generate about 44
zettabytes of information on a yearly basis. That’s the equivalent of trillions of high-definition movies. Today it’s only 4.4 zettabytes. So that acceleration comes from the technology that surrounds us and the way that our cities, our homes, and our offices are actually changing. But no one really understood
how mobile was going to really accelerate as well. Now I normally ask you, hands up, who doesn’t have a smartphone? It’s a massive audience
full of security people. I know you’ve got burner
phones, it’s all good. (laughing) I normally make jokes about
you’re the security guy, and there’s like, okay there’s
500 of you, I love you. (laughing) But now we’re at a point where there’s six billion
smartphones in the world. This is how people do their work. Who would have thought that Douglas Engelbart would’ve brought in this new age, and that in 2007 Steve Jobs would say, in my hand I’ve got an Internet communicator, a music player, and a phone that lets you have conversations. But now having conversations
is probably the fifth or sixth most likely thing
that you’re gonna actually do on a smartphone, right? So that’s now led us into a world of the last sort of 12 to
15 years of social media. Hands up, who’s on Facebook in the room? I love it, I love coming here, because it’s like not many of you. And it’s because it’s
actually a terrible platform in terms of your rights
as a human and a user. And what’s really interesting
is like the evolution of the like button was this, right? So you like, you love,
haha, wow, sad, angry. It’s the architecture of
a modern relationship. And not a lot of people
actually understand that this is exactly what happens. These systems fit human behavioral norms and those behavioral norms
change to fit the system. We’re shaped by our tools, as Marshall McLuhan said. And I was in a session earlier and someone held up an Amazon Alexa, and they did a whole routine where they were talking about healthcare. And the Amazon Alexa was
responding and being friendly and collecting information
and then transmitting it to this gentleman’s doctor. And that’s really interesting to me. There have been 10s of
millions of these devices sold in North America. They’re actually like triple,
or quadruple the amount of Google devices been sold. And there’s Apple devices
and there’s more coming out of China of various
brands that are copycats. I find this to be one
of the most pervasive and world changing technologies, just because we don’t really know that it’s there most of the time. But if you have children,
and you have an Amazon Alexa in the room, you ultimately
have a new big sister for them. In fact Amazon came out with a version, the Echo Dot for Kids where it would teach your kids manners. So fundamentally this kind of device is replacing you as a parent. Now I don’t have one of
these devices at home. I feel that it is too
intrusive in my life. But there’s something really interesting. This is the first time
we’ve really seen the input which is voice and
environment and sound, coming into a device, a central repository where artificial intelligence and machine learning can train themselves and then, at scale, push out responses matching the behaviors that we want to see these machines have. In fact there’s a lot of experimentation that’s going on with these. And Amazon ran a competition, and one particular group trained it based on all of the conversations
that were happening on Reddit and across a number of other social platforms as well, and in one case it actually told someone to go and kill their foster parents. So this is actually
really concerning to me. Because it can take the good and the bad of who we are as humans, and it can feed it back to us. But not necessarily to
us as the people that were the origin of the thought or the sentiment, but at scale across millions of people, if not billions of people eventually. So when I start to see this technology, I start to understand
that maybe you’re gonna start tiptoeing around
and not talking so much, because Amazon Alexa is
probably listening to you. And then when we hit our high streets, we’re starting to see an evolution of how we’re just gonna operate with normal daily chores, like going to the supermarket. Within three miles of every single Whole Foods live the majority of Americans that have both Amazon Prime and over $100,000 in household income. And what’s really interesting
about this technology is it’s enabled by
cameras and sensors, RFID, and a number of different kinds
of artificial intelligence that really watch what we do
when we walk into a store. So take a look. (upbeat music) – [Narrator] Four years
ago we started to wonder, what would shopping look
like if you could walk into a store, grab what
you want, and just go? What if we could weave the most
advanced machine learning, computer vision, and AI into
the very fabric of a store so you never have to wait in line. No lines, no checkouts, no registers. – So if we’re in this
particular kind of system and ultimately we are part of the product, we’re in a system that can
influence how we behave in these stores. Now if you think about the ideas of fast and slow thinking
and what Daniel Kahneman is very well known for, with reflexive and reflective thinking, we’re not really gonna consider what we’re buying half as much as we would if it was going into our basket and we were walking around the supermarket with it looking back at us in the face. So you’re probably gonna spend about
30 to 40% more every time you visit an Amazon store. But really you walk in,
you pick up what you want, you walk out. I used to call it shoplifting and today we’re calling
it modern life, you know? (audience laughing) So it’s kind of interesting where we are. And now you’ve got Walmart and Microsoft, and lots of other supermarkets
who’ve got competitors to this kind of technology, because they want to remove humans from the mix of the shopping aisles. And we’re gonna have humans that are literally concierges to help you spend more money, versus just ringing it up at the till and helping you pack your bags. And then on the streets we are starting to see some really
interesting things happen in terms of CCTV and surveillance cameras. I grew up in the U.K. I worked for many years in London, I’m literally used to waking up every day and having dozens of cameras in my face. It’s called safety. I’ve almost been blown up three times so I don’t mind it so much. There’s some urgency around this. But these are surveillance, high-definition surveillance
cameras in China. And if you jaywalk, within like a minute of you jaywalking you will actually be issued with a fine. And you’ll go into a database, and potentially there’s gonna be an impact on your social credit score, which has got a larger impact on how you can operate in China. I’m not gonna talk about the social credit
system so much today, because it’s kind of undefined as we sit. But it’s something
definitely worth looking at. And then I started to think yeah, okay, so once we’re not just
volunteering our information, what other things are happening? And this is actually a program that’s happening down in California. (electronic music) – [Narrator] The array of
cameras on this aircraft records high resolution images of the 25 square mile
area for up to six hours. It can track every person
and vehicle on the ground, beaming back the pictures in realtime. It’s city wide surveillance
on an unprecedented scale. – What we essentially do is a
live version of Google Earth, only with a full TiVo capability. It allows us to rewind time, and go back and see events
that we didn’t know occurred. – I think that this is a
normalized kind of behavior around surveillance that
we’re gonna see going ahead. That collection of
information in realtime, we’re no longer gonna be
seeing those Google cars driving around collecting
their information about how the streets work. By the way, Google Maps isn’t very good to use in Victoria, as I found out yesterday as well. (audience laughing) But the knock-on effect of this is that collection of data from surveillance cameras, from these kinds of eyes in the sky, actually feeding into predictive policing models. And dozens of cities in North America have deployed predictive policing programs to try and work out the hotspots of where something is
likely to happen next. Has anyone seen Minority Report? Maybe it’s not like three people with psychic abilities in a flotation tank. But it’s much more scientific and rigid, and something that we can apply
artificial intelligence to. And then it gets really really interesting when I start to think oh
what about sending hundreds of satellites into space, and them taking high-definition pictures of us down on the ground? I mean, they can actually identify us as individuals by the gait of our walk. So you can do that from a satellite. This is actually a company called Planet. I think they’re out of California as well. They’ve got 150 of these satellites. These are Dove satellites
and together they take 1.3 million images total per day. So they can actually take high-definition images, with only a 24-hour latency, of almost the entire Earth. So the power of that in terms
of understanding who we are as a society and the power of that in terms of law enforcement,
intelligence and such like is very compelling for everyone except for the people on the ground. And once people start to understand this, people start staying at home. But then it’s okay because
you’ve got your Amazon Alexa at home that records
everything that you do. So it’s an interesting world. But we’re careering towards a world with self-driving vehicles. The department of transportation
down in the U.S. actually thinks that by 2023, 2024, we’re going to see more and more people starting to use applications on their phones to call self-driving vehicles that will take them wherever they want. Instead of owning a car, you will literally have an application. You’ll pay $300 a month, and the car will come to you, and there’ll never have to be a driver to have a conversation with. And you can probably choose the music that you want to listen
to as you drive around. I kind of don’t think
it’s gonna be a good idea to let this artificial intelligence try and interact with you. Because based on biases and scaling up, you might have to listen to John Denver when you don’t like John Denver, or maybe some techno when you don’t really appreciate that so much
at seven in the morning. But like the self-driving vehicles are gonna come and they’re
going to come at scale. A lot of people didn’t think it was gonna happen this quickly. I actually think that the models for GM and Ford, and we’re already seeing it with Tesla, but also companies like Lyft and Uber, are all gonna shift to the provision of transportation services. The auto industry is dying. And then beyond that, we’re gonna start moving
away from the cell phones and the rectangles in our pocket to wearing headsets. Now this is actually
the Microsoft HoloLens. A little known fact, the
Microsoft HoloLens was actually developed in a secret laboratory over several years in Victoria. And what’s really interesting about this is it’s unwieldy today, and it’s kind of used by architects and engineers, and maybe on the shop floor in a factory. And it’s kind of not
really in the mainstream, because it kind of looks like this. If you have ever used it, it’s still low quality, it still hasn’t quite
found its application. But trust me, it is coming. I know people that are working in Facebook and Google, in Magic
Leap, and in Microsoft, and the amount of
investment that’s happening into the augmented reality
field is incredible. And it leads us into this modern world. And this is a video by a
guy called Keiichi Matsuda. I was actually turned
on to Keiichi Matsuda by Nori Young, who was on a panel earlier; he’s a good friend of mine. And he created this vision of what our world could look like with that augmented reality. He went down to Medellín in Colombia. He thought, okay, if I was gonna wear this headset, what does that augmented vision look like? You know, cats in the sky, information about the streets. Information about certain individuals. When you’re in the supermarket, it will tell you the nutritional value of the banana you’re about to
buy and where it came from, and then you’ll ultimately be hacked, all of your loyalty points stolen, and then you have to recalibrate everyday because it’s gonna be a dangerous world. It’s about 12 minutes for this video. I compel you to watch it.
Keiichi Matsuda, Hyper-Reality. But this was one of the
most fascinating things. We think about high-tech and how we deploy levels of surveillance watching us on the streets. And China’s kinda good because they think, well, what have we got an abundance of? Like older women that sit around and like to poke their nose into everyone’s business on the streets. So these are the Chinese security patrols that actually operate in large cities. And if they see someone that’s not acting in a way that’s appropriate, they will call it in to the police. So not only do we have satellites and listening devices, cell
phones, self-driving vehicles, eyes in the sky, social credit systems. We can’t even trust our grandmothers. So we’re living in a very strange world. And if we go back to the
idea of the panopticon, we’re just gonna have to
normalize our behaviors, to act very sensibly and without moving outside of boundaries that are defined by the companies and by the governments whose operating system, as it were, we operate within. So when I think about this, it’s like, how do we get responsibility into this entire dilemma of, we don’t wanna be watched, we want our own sort
of place in the world. We don’t wanna just be the product. When we’ve got so many
different layers of abstraction. The layers of abstraction have made it wholly complex, by just the terms and conditions that we put around every single layer. The average person would have to spend 76 working days reading all of the digital privacy policies they agree to in the span of a year. Does this concern anyone? It would take you nine hours to read Amazon’s privacy policy out loud. I think the article I was reading, it was in The New York Times, and they were saying you might as well, instead of like the I agree button, it should be the meh, whatever button. (audience laughing) Because what can we do? But this is it, cloud computing, layer upon layer of applications, APIs. Everything’s got terms and conditions that relate to other terms and conditions and other systems, and suddenly, even in something like a self-driving car, you’ve got 26 layers of
different applications and different technologies and we don’t know who’s accountable for a problem in the
system or a breach of data. We get the sharp end of the stick, but then they bounce it all the way through their technological stack. That brings me to the idea of ubiquity. I’ve talked about
technologies that surround us. So we’re gonna be in a world where we’re not walking down
looking at the triangles, sorry, the rectangles, in our hands. We might have augmented reality, but really we’re gonna operate in a way that feels very human and very natural; we’re just not gonna
understand why our behaviors are being controlled. The ubiquity in the world means that we’re being surrounded by the system. So where does the fightback come from? It comes from us. Not only the people in charge of privacy and security, but us as the humans, the mothers and fathers, brothers, sisters, and such like. One of the big sort of stories that I love to talk about is when Google
employees basically said that they refused to work on the Pentagon’s 10-billion-dollar JEDI cloud services project. And literally, Google walked away, because tens of thousands of their employees basically threatened to walk out. And they said, we do not stand for this. This is not how we want our artificial
intelligence platforms, our data platforms to be used
for military intelligence. Which is actually a really positive thing. And then the ACLU wrote letters to about a hundred different executives, including the CEOs of Apple and Microsoft and Amazon and Facebook, to actually say they want to have guardrails around facial recognition technology, which is probably gonna be one of the most powerful technologies for keeping an eye on what’s
happening in the world. And then you’ve got these big thinkers. So if you’ve never heard
of these three guys and the Electronic Frontier Foundation, go and find out who they are. So the top left is a
guy called Jaron Lanier. And he talks about how
privacy and information has been skewed by social
networking and modern services and how we need to take back control. How the modern world is about
us having control of data, us owning who we are, and actually being able to use that in a micro-transactional world. Then we’ve got Douglas Rushkoff; in 1994 I read his book Cyberia. It fundamentally changed how I look at the world. He’s just written a new
book called Team Human. Go and read about how the world’s culture operates within technology. Douglas Rushkoff’s incredible. He’s also got a Team Human podcast, go and check that out. And obviously Tim Berners-Lee. And he came out with a new
technology called Solid, about a decentralized Internet, last year; that’s awesome. And obviously I really support the Electronic
Frontier Foundation, because digital rights are human rights. (audience applauding) And then we’ve got this, thank you. I didn’t expect to get a
clap for that, it’s awesome. But yeah I actually donate
every year to the EFF. I urge everyone to support
people at the ACLU and EFF. But then this kind of thing happens, like the government wants to
fight back against big tech, that’s great. Did anybody watch the
Zuckerberg Senate Hearings? – [Man] Yes. – Let’s put a bunch of people that don’t understand how to
put socks on in the morning to have a conversation
about social networking and the Internet with the guy that basically created the new Internet. What was really interesting
about watching this last year was it was big government,
the U.S. government against big tech, more than
it was against Facebook. And the U.K. has gone one step further, by even intercepting documents from their lawyers at the airport before they could actually leave the country, and using those, because Facebook wouldn’t even turn up to the hearings in the U.K., alongside Google and other companies. It’s time for these companies
to be held to account. Let’s just put the right
people in the room with them, maybe people in the room here, and not some crazy senator from Tennessee. Seriously, seriously. So I normally talk for about
one to two hours every time, so I don’t have much time today and we’re coming towards
the end of our talk. But I try to think about
justice and who we are. Our human rights are fundamentally undermined by most terms and conditions that we sign up to. But where are the companies
actually stepping in and working with us? Well I think it’s very good that we’ve got over a thousand people in this room that deeply deeply care about our rights. People that really want to
work with the tech companies and with government and with the users, to understand what balancing the world looks like, and how we can empower
people that use technology, because a world where
that empowerment happens is a world that’s
enabled from a financial, social, environmental perspective. And this is what I think we can do. I think we can work with
these tech companies to try and redefine what it means to own the data that we generate. I’d love to see a world
in the next 10 to 15 years where conversations are happening, and people like Google and Facebook are actually saying, you
know, you own your data to this extent. We need your permission to use it in a certain number of different fields. We will pay you for the use of that, and we will give you full transparency of how this is working. So ownership is really important. Understanding that ubiquity, and everyone coming to
the forum and giving reports to the people actually out in the world about the transparency of every single significant system out there, instead of hiding in the shadows and not saying how the world is actually being tracked. And then the security around that, because if we’re putting in information, if we’re generating information
through our behaviors, how that actually goes
in and stays secure. And we can actually control
levels of security as well at the administration level. And that final idea of transparency is really really important. And it’s hard. It boils my brain to read all these things and even write these presentations and try to understand
the impact in the world. Or just to tell a quick story about this. A few years ago my girlfriend popped into the living room at 7 A.M. I get up at five every morning to work. And my head was in my hands; she goes, what’s wrong? I said, I’ve learned too much. I’ve learned too much about how these systems operate,
where the data goes. How that works in intelligence, how that works internally
in the organizations which is even scarier to me. And how that’s fed back out into the world to productize humanity. And it’s kind of interesting. Those four pieces actually
create the word OUST: ownership, ubiquity, security, transparency. And what does oust mean? To deprive someone or exclude someone from possession of something. I think that’s what we need to do: exclude the tech companies and large organizations from the ownership of us as humans. And we need to give them
permission to interact with us, and we should give that
power back to humans. So at the end of the talk, I’ve removed the trademark from freedom, and it’s really interesting, the idea of saying that
freedom no longer needs to be spoken about if
freedom truly exists. Unfortunately for the next 10 to 15 years, probably the next 200 years, we’re still gonna have
to talk about democracy, because it’s not really working properly. We’re still gonna have
to talk about freedom, and we’re still gonna have to do our jobs in privacy and security, administration, regulations, and policy. I thank you for everything that you do. It’s a really really tough situation. It’s like wresting an
alligator that’s got tentacles. And on each ends of the
tentacles are more alligators. (audience laughing) So thank you, that’s my talk. I’m gonna be around for Q&A after this, thank you very much, cheers. (audience applauding)
