You Can’t Make This Up

The Great Hack

Episode Summary

Remember back in 2016 when Cambridge Analytica admitted to "hacking" Facebook? In this episode, Aminatou Sow (Call Your Girlfriend) talks with the directors of The Great Hack, Jehane Noujaim and Karim Amer. They get into data rights, persuasion on social media, and how one company used personal information as a means to a political end.

Episode Transcription

Rae: Welcome to You Can’t Make This Up, a companion podcast from Netflix.    


 

[Music]


 

Rae: I’m Rae Votta, and I’m hosting this week’s episode.  Here on You Can’t Make This Up, we go behind the scenes of Netflix original true crime stories with special guests.  This month, we’re talking about The Great Hack.  Remember back in 2016 when Facebook got quote/unquote “hacked?” Well, there’s so much more to that story.  The Great Hack follows the rise and fall of Cambridge Analytica and how they used personal data as a means to a political end.  Directors Jehane Noujaim and Karim Amer are known for their award-winning documentaries like The Square.  In this episode, they’ll be interviewed by Aminatou Sow, journalist, digital strategist, and cohost of the podcast Call Your Girlfriend.  Here is The Great Hack.


 

Aminatou: You both obviously didn’t find out about this story in the news like some of us did, waking up to it one panicked morning.  Can you tell me a little more about the timeline of events for you?


 

Jehane: The timeline of the story for me, personally, starts way back in 2001, when I came out with Startup.com, a film I made with D.A. Pennebaker and Chris Hegedus, and this was during the dot-com boom, following two people who were starting an internet company.  And at that point, we could be God online, right?  And we thought that we could start these internet companies that were going to make everything speedier, faster, and everything work better.  Then fast-forward to a film I made in 2004 called Control Room, which was about coverage of the Iraq War.  And depending on the news station that you were watching, you had a completely different view of reality.


 

And we’re kind of—at this time, it feels like Control Room on steroids, where you could be in the same household looking at a different newsfeed and not able to have a conversation with a parent or a sibling or a friend depending on what newsfeed you’re getting.  And so even though we had started this project about the Sony hack, which was a physical hack, we quickly began to realize that the much more interesting story was not about the actual physical hack but about the hack of our minds.  And at that time, Karim got to meet Carole Cadwalladr and Chris Wylie in the UK, and that must have been the beginning of 2016.  Am I getting that right?


 

Karim: No.  It was 2017.


 

Jehane: It was about a year before their story came out.  But at that time, Chris had not let anybody know who he was or what was happening, and they were going to be releasing the story—they thought—pretty soon, but it took a while for it to come out.  And in that time, we met David Carroll.  We look for stories that are about a topic but where you can find characters that are about to go on a journey, because we try to make films that are as close as possible to a fiction story so that people feel like they’re in it.  It’s not just a talking head telling you what to think.


 

So we really wanted to follow somebody on a journey, and we met David Carroll around the same time that we met Carole and Chris.  And while their story was happening in a very undercover way and they were developing it, David was basically asking this question.  This company is claiming to have 5,000 data points on me.  What do they know about me, and do I have the right to know that?  And I’m going to go and try to figure that out and sue them for this information if I don’t get it.  So he put in a subject access request for the data that they had on him.  Eventually, he got back a few of the data points, but very far from the 5,000 data points that they said they had.


 

Aminatou: Can you give me a sense of just like how aware you were?  It seems like you were pretty immersed in this world, but were you surprised at how little you knew about how our data was being used and how it was targeted?  Did you care at all?  And, also, like how has it changed your own use of technology and social media, specifically?


 

Jehane: Absolutely.  I was definitely surprised.  I did not know how our data was being used or how my personal data was being used, and when the topic of privacy came up, I was one of those people that said, okay, I understand if I post something, you know, on the Middle East, it’s—we live in a bit of a different world where, if you have protested in the square or you post something that’s anti-government, that can get you into a lot of trouble.  But in the US, I kind of felt like, okay, you know, my privacy—I’m not doing anything that’s illegal here, so who cares what I post?  And it wasn’t until I got deeper into this story that I realized, you know, how our data is collected and how there is this boomerang effect where a voodoo doll of you is created in order to predict your behavior and to target you with advertising.  And it gets to, basically, the ultimate question of our free will and whether, if we continue along this pathway, we are making true choices in our lives.


 

Aminatou: Can you tell me more about the visual language of the film?  Because I greatly enjoyed it.  I also think it is very hard to make a documentary about technology that doesn’t seem cheesy, you know.  It’s like the CGI, the animation, the just all of that working together.  And I love the story that you were telling there also, so I would love if you could talk about some of the choices that you made there.


 

Jehane: In this case, we wanted a visual language that would really show that we’re leaking data everywhere we go, in every little mundane action that we do, whether it’s buying a coffee or going into the subway, that it’s kind of leaking off of us.  And we had an incredible team.  These films are not made just by the two of us.  It’s the most collaborative art form there is, making a film, and that’s what I love about it.  And so we had Pedro Kos, who is our partner, and Judy Korin, another partner who is a graphics whiz, and she basically said, “I’m going to interview some of the top graphics and animation people in the US.”  And we worked with a group called Shy Kids—


 

Karim: And Ash Thorp.


 

Jehane: And Ash Thorp.  And the idea was it shouldn’t be this sort of scary CGI cheesy kind of thing, but it—


 

Karim: Like government surveillance with the black and green screen.


 

Jehane: …it should come from our emojis and the happy faces and the thumbs up and, you know, and that should—that is potentially turning against us.


 

Aminatou: But that’s what’s so disarming about it because you’re like, this is how life is.


 

Jehane: It’s where we post our baby pictures.  It’s where we connect with our friends.  So it should really give that feeling of like this happy, fun place that is potentially turning against us.


 

Karim: And has turned dark.  And what we wanted to show is how the algorithm itself sees us, right?  Like, what is the POV of the algorithm, what does it look like, and how does it see us?  And I think the purpose of that was to remind people that we are moral creatures, and that’s both our beauty and our limitation.  And we are now being, in many ways, governed by this superstructure, an algorithm that we know very little about and that, in its very nature, is amoral.  And what happens to a moral society when it’s being shaped by such a force that we know very little about?


 

And so I think—but to begin to have that conversation, we had to first create an imagery and a language for how we’re interacting with this space that we don’t really see yet is always seeing us.  So I think this is the beginning of a conversation, as David says, to make the invisible visible.  That was a part of our inspiration and goal, and, as Jehane said, it took an amazing team of Judy, Pedro, Erin Barnett, Ash Thorp, the Shy Kids, and also Gil Talmi, who was our composer, who took sounds of our everyday tech interactions—you know, the tweets, the swipes, the this and that—and worked them into the score of the film, which, again, was asking, what’s the soundscape of this world that’s around us, that is so a part of every interaction we make, and how can we make that soundscape part of the storytelling?


 

Aminatou: Yeah. You know, and one of the things I think that is so compelling in the documentary are the people that you meet because it’s hard to make up your mind about who they are, right?  It’s like they’re sharing information, and then there is just so much that has been reported about all of them and how you interact with them in the film.  And so, you know, like Chris Wylie is somebody who’s been described as a compulsive bullshitter by his employer, you know, but at the same time, I’m like, you’re saying things that people need to hear.  And so I’m just wondering if you could talk about what you think his motives were in whistleblowing.


 

Karim: I think that, you know, speaking on Chris’ motives is something that Chris should answer himself because I’m not, you know—I wasn’t with him when he made the decision.  What I can say is when I did meet Chris, before they came out publicly, he was very shaken.  I mean, I could feel that he really felt disturbed by what he felt he’d been a part of.  And he felt shame, and he felt remorse and was upset about the state of the world.  That being said, I do think that we have a very romantic view, in my opinion, of who whistleblowers are and who they aren’t.  I think that, more often than not, people who do blow the whistle are in complicated situations, you know, like even Edward Snowden.  He wasn’t exactly, you know, a boy scout selling cookies by the side of the road.  Like, he was a military contractor working on things that were quite suspect to begin with.


 

So, you know, you kind of have to be in a complicated place, oftentimes, to blow the whistle.  But, you know, I think it’s important for us to create an environment where people who have been a part of things that are operating in darkness and non-transparent environments can come forward, and I think we, as a society, have to ask ourselves which is more important?  Being part of an online mob that destroys people as they’re emerging to bring the world evidence about what’s going on, or actually creating a space to hear from people who were in the sausage factory where disinformation was being concocted and cooked, about how it works?  And that’s not to say that people shouldn’t be held accountable, but I think we need to figure out the system by which we do so.


 

Aminatou: Right.  And I think people are also very desperate for a simple story, right?  Like, a weapon that you can’t see is something that is really hard to deal with.  And so even for those of us who are fairly skeptical and, you know, critical about everything, it’s still easier to say, “My phone is spying on me” than to be like, actually, the choices that I am making online every day are part of this, you know, this detritus that is following me, that is creating a full picture of who I am.


 

Karim: Exactly.  No one wants to believe that they’re persuadable.  We all like to think, especially in Western society, that we are governed solely by our free will, that everything we do is our choice, and that we’re in full control of all our decisions because we’re told that we are the masters of our universe.  And the reality is that we live in a world now where we’ve given up so much of ourselves without really understanding it.  So many of our footsteps are tracked, measured, collected, and we’ve kind of been doing it with a bit of a, well, what’s the big deal? as Jehane was saying.  This is just the admission fee to entering the connected world, and the connected world is this lovely place where the Gods of Silicon Valley have just given us all these free services, and all they’re asking for is our data.  What’s the big deal?


 

Well, the thing is, data is recordable human behavior, right?  So we need to think about what aspects of our lives we’re totally comfortable having bought and sold in a commodity market that is trying to affect us, understand us, and predetermine our choices.  So, you know, is our sexual history something that we’re comfortable being traded and sold?  Is our medical record?  Is our political view?  And, if so, I think, do we get the right to have a more consensual relationship where we get to say, you know what?  I’m okay with, you know, being aware of how my data’s being used in this way, and I get that it’s a trade.  I’m getting these services for free.


 

I’m not okay with this because you told me it was a game called Candy Crush, and it turns out you’re really just tracking me and using details about my life.  I’m okay with Netflix, maybe, because it’s clear.  I pay a monthly fee, and I know what I’m involved with.  But Facebook, I thought this was just a free service, but it turns out you actually own my photos, and I’ve given up all these rights without really understanding how they work.  So I think we’re at this reckoning moment where we’re trying to find where the lines of consent are on our end, and I think it’s not coincidental that we’re doing that as a society in all kinds of other environments, and it’s now entering into the space of data.


 

Aminatou: You know, another person I want to talk about in the documentary is obviously Brittany Kaiser.  And I think that, for everybody who watches, she will be a lightning rod.  Can you talk a little bit about what Brittany’s narrative is in this context?


 

Jehane: Brittany was somebody who started as an idealistic intern for the Obama campaign and then worked in human rights and then was pitched by Alexander Nix to work for Cambridge Analytica.  And at the time, I think she was really excited by the fact that she could use data to track changes and progress and really use data to have a concrete effect.  And so, you know, she said to us that there were many times when, if she questioned a campaign that she was to take on, she had executives basically telling her, “Look.  This is work.  You’re like a lawyer.  It’s not your right to decide whether we take on this campaign or that campaign.  This is what you’ve chosen to do as a job.”  So I do think that she was complicit.  I don’t think she knew, when she left, exactly what she had done.  I think all of us are figuring out the line right now, and we are in this time when there is a deficit of language about this whole subject, because so many things that were done were not actually illegal at the time.


 

Karim: And that’s an important point to realize because, you know, the big issue to me, where this comes back to, is that we need technology to expand our capacity as humans, right, and advance, but we need ethics to preserve our humanity.  And the problem is that technology, and the mechanisms by which we create ethics, which are primarily legal mechanisms of enforcement, are operating at two completely different speeds.  So what’s happening in this gap is that a lot of things may not be illegal when you’re at the forefront of a new technology, like we saw with data, where I think this time period is going to be called the Wild West of data, where you could just grab everyone’s stuff and people just give it to you and you can do whatever the hell you want with it, and everything’s great, and, you know, nobody cares, and there’s no regulation.


 

So, at the time that the story was taking place, you know, most of these things weren’t illegal, but the question becomes: just because something wasn’t illegal when you did it, does that make it okay?  So I think what was so attractive about Brittany as a character in the film is that her story very much is about power, and it’s about what happens to those who enter into the halls of power and into the most dark and least understood hallways of that power.  How does it affect them?  How does it shape them, and what can they tell us?  And I think that we naturally have a, you know, an obsession with these types of stories.  If you look at House of Cards, Succession, these types of stories out there, they all are about that, but rarely do we have the opportunity in a nonfiction story to track someone really going through this journey of how power bends us and shapes us.


 

And so here was an opportunity in one person’s story to take you from the Obama campaign to Brexit to Trump’s campaign, you know, to then having links with Assange and then being investigated by Mueller.  So it was really an incredible arc to follow because it could take you through, you know, such an interconnected hallway of power in the digital era.  And that in itself we felt was a unique journey that could allow audiences to feel that they were going on a film’s journey and not being lectured to about how data does or doesn’t work, which would be quite boring.


 

Jehane: And our journey really echoes the journey that we’ve taken with technology and social media platforms from the hope of Obama and the use of technology there to the use of these digital platforms and technology to spread fear and divide us.  So, yeah, with Brittany, it was incredible to find somebody who could take you into these rooms in a real way, in a human way, that we would normally never get to see.


 

Aminatou: Yeah.  I’m glad that you pointed out, you know, the legality versus the ethics of it, because I know that for me, the biggest reaction that I get when I talk to people who have watched it is, how is nobody in jail?  Again, it’s like, people want the simple story, and also, like, real harm has been done.  And so a thing that I have been thinking about a lot, especially, like, after viewing, is that there has to be somebody where the buck stops, right?  And there has to be responsibility that’s taken.  So even when I hear you say, like, we’re all complicit, I’m like, well, I’m not a data miner, so I’m not as complicit, you know.


 

My behavior is part of it, and I also think that the public is not as educated as it should be, which is not an abdication of, you know, the responsibility of knowing what you’re doing, but the truth is also that the platforms that you sign up for are not the ones that you end up using.  Like, the terms of service change from under you; what the platform is doing changes radically.  And so I just wonder if you have any ideas about that.  It’s like, yes, sure, we’re all in this murky place, but somebody has to take responsibility, and ultimately, where does that responsibility lie?


 

Karim: I completely agree with you.  I think that, you know, you can’t just point it back to users and say, well, the users are going to just fix this, you know, by being more aware.  That’s like saying we’re going to all solve climate change by using, you know, metal straws, right?  Like, that’s just not going to happen.  Like, we need a societal shift that allows for all the major stakeholders, from users to tech platforms to government, to come together and figure this out.  Because what we’re seeing is data has surpassed oil as the most valuable commodity on earth, right?  And in the era of the oil economy, we used to see wreckage sites where an oil spill was destroying marine life.  If you could see the environmental waste, then you could see that there was a cost to the spillage of industrial waste on our environment.


 

Similarly, I think we’re now beginning to understand that there is a cost to the toxic information spaces, to the leakage of data in an unhygienic way, and we’re coming up with the language, terms, and imagery to start to have that conversation.  And I think that’s, to me, what’s been most exciting about the release of the film is that, as Jehane was saying, there’s this deficit of language, and people are reacting around the world, starting to say, well, what are my data rights, and what can I do about it, and have I been targeted by misinformation, and how do I spot it, and where do I, you know, flag it, and what does all this mean, and who’s right and who’s wrong, and who’s responsible?


 

So the fact that we’re even having this conversation, for me, is a great measure of success because that’s kind of the point of the movie: not to prescribe a solution but to begin this conversation about this era that we’ve entered into and to start bringing it to life, because the stakes couldn’t be higher, you know.  As Carole Cadwalladr, one of the characters, says in the film towards the end, you know, “This is about whether we can have a free and fair election ever again.” It’s not about left or right or red state or blue state or remain or leave.  It’s about the fundamental fabric of society.  And we’re, you know, less than 600 days away from the next election, and not much has changed.


 

Aminatou: I mean, I would say nothing has changed, right, in the sense where, watching it and hearing Professor Carroll say “information warfare,” or Brittany Kaiser say that the techniques used in the UK are weapons-grade, we’re literally using the language of war to talk about what is going on, and seeing that there’s not been, you know, a significant shift.  Like, I don’t have any evidence that our legislators are taking it more seriously than they were years ago.  There hasn’t been a big shift in media talking about these issues.  People have not changed their behaviors.  I’m just thinking about the Instagram hoax that we were all subjected to by, you know, mostly people who were celebrities, who were sharing misinformation.  And so, when I think about something like that, thinking about the election, like, that scares me a lot because it’s been four years, and nothing has changed.


 

Jehane: We have a long way to go.  That’s why we made the movie, so that we could have a conversation about this.  Brad Parscale, who’s the person running Trump’s campaign, tweeted about the film and said, “That’s exactly right.  We are using the same methods, and four more years,” basically.  So, you know, this is an absolutely urgent conversation.


 

Aminatou: Well, the film is bookended by Professor David Carroll’s quest to get his own data back, essentially, from Cambridge Analytica.  That’s going to give me nightmares probably forever, thinking about my own behavior on social media.  But I am just wondering if you can expand on that a little bit: the choice of showcasing that so prominently and what you are hoping for viewers to get out of that experience.


 

Karim: Sure.  I mean, I think for us, David Carroll provided an amazing vehicle for telling the story because here was one man saying, you know, I am a pretty ordinary person, a professor at The New School, and I have a gut feeling that something’s wrong.  And I want to know what my rights are, and I’m not going to wait for the Mueller investigation to end.  I’m not going to wait for a piece of legislation to change.  I feel the need to know more.  And he goes on this journey where he basically asks a simple question.  Do I have the right to know what you know about me?  And the answer is no, he doesn’t.


 

Like, the laws in this country simply do not.  They allow you to be surveilled under the auspices of surveillance capitalism, which is what we’re currently operating in, where all of your transactions, commercially, can be tracked, monitored, and collected, but you don’t actually have the right to know what that voodoo doll of you, as Jehane described earlier, looks like, what it’s saying about you, whether you agree with what it’s collecting about you or not, and how it’s profiling you.  And that’s the heart of what David’s question is about: what rights do we have?  And I think the beauty of his story is that he keeps going.  He shows that—you know what?—I’m going to jump in the ring here, and I’m going to actually expose myself to risk, and I’m going to sue Cambridge Analytica, which is a military contractor operating in the UK, and I’m going to see how far I can push this.


 

And his case has led to major changes in this conversation, and it reminds us, once again, that everybody can play a role, and that an everyday citizen has rights that can be enforced.  So we can use the system to hack the system back.  And what’s beautiful about David’s story as well is he finds this band of, you know—we call them the Digital Justice League in our office—of people online who have never met each other but are all active proponents of this conversation on Twitter.  And together, they uncover more and more truths about Cambridge Analytica’s operations.  They support each other.  They helped create the space for people like Chris Wylie to come forward, and I think that collective action, in many ways, is responsible for why Cambridge Analytica no longer exists today.


 

So we do have the power to fight back.  We do have the power to band together and create civil disobedience and hold power accountable in the digital era, and, you know, stand up against disinformation.  And I think we have the ability to show that, you know, in this era of information warfare, we can play the game back, and I think that’s what the future’s going to look like.  But we hope, you know—our film that we made before was about people gathering together in a physical square.  It was called The Square, and it took place in Tahrir Square.  This film, in many ways, is about the destruction of the public square online but the attempt of people to reclaim it, and I hope that we continue to see different people and stories being told about that because it is integral for us to do so.


 

I mean, it doesn’t matter what topic you care about.  You could be coming at it from the environment.  You could be coming at it from racial inequality.  You could be coming at it from social justice of any kind.  If you don’t solve the way in which information is collected, harvested, and used, you cannot fight any of those issues, because at the core of this issue is the right to assemble and the validity of freedom of speech on the information highways that connect us all.


 

[Music]


 

Rae: That was Aminatou Sow speaking with Jehane Noujaim and Karim Amer, and that’s all for this week’s episode.  We’ll be back next month with a new series or film for you to add to your watch list.  You can find this show on Apple Podcasts, Stitcher, Google Play, Spotify, and wherever else you get your podcasts.  Make sure to subscribe, rate, and review the show.  That way you can help other people find it.  You Can’t Make This Up is a production of Pineapple Street Media and Netflix.  Our music is by Hansdale Hsu.  I’m Rae Votta, and keep on streaming.


 


 

[End of Audio]