Conspiracy! The Show 264: The Great Hack

November 1, 2023

Adam welcomes Jack Kelly and Jenn Scott to discuss a Netflix documentary about how social media stole the 2016 election and why it could very much happen again in 2024.

1
00:00:10,640 –> 00:00:15,720
Welcome to Conspiracy the Show,
the world’s most trusted

2
00:00:15,760 –> 00:00:21,600
conspiracy theory podcast, with
your host Adam Todd Brown and

3
00:00:21,600 –> 00:00:35,280
Olivia Hydar.
Hey everybody, Welcome to

4
00:00:35,280 –> 00:00:39,040
Conspiracy the show.
I’m your host Adam Todd Brown.

5
00:00:39,040 –> 00:00:42,720
Joining me as co-host this week.
Oh, my favorite co-host of all.

6
00:00:42,720 –> 00:00:47,240
No co-host, but I do have a
couple of guests.

7
00:00:47,480 –> 00:00:50,400
Case in point, Jack Kelly’s
here.

8
00:00:50,600 –> 00:00:53,560
Hello Jack.
Have you ever been on Conspiracy

9
00:00:53,560 –> 00:00:57,000
the show?
I don’t think I have, actually.

10
00:00:57,000 –> 00:01:00,680
I’ve been on a few others, but I
have not been on conspiracy.

11
00:01:00,840 –> 00:01:02,480
Well, welcome to the big
leagues.

12
00:01:02,680 –> 00:01:11,560
You didn’t... Jen Scott’s here too.
What was that?

13
00:01:11,920 –> 00:01:12,520
Hi.
Hi.

14
00:01:12,520 –> 00:01:13,120
Hi.
Hi.

15
00:01:13,120 –> 00:01:15,400
It’s me.
I’ve been on the show before.

16
00:01:15,520 –> 00:01:18,880
I know all the conspiracies.
Jen believes them all.

17
00:01:19,440 –> 00:01:21,480
Maybe not all.
Yes, that’s that’s well known

18
00:01:21,480 –> 00:01:23,280
about me.
If you tell me anything, I’ll

19
00:01:23,280 –> 00:01:27,840
believe it.
Hey, thank you both for doing

20
00:01:27,840 –> 00:01:29,680
the Pod.
I appreciate it.

21
00:01:29,840 –> 00:01:31,960
We’re talking about some shit
today.

22
00:01:32,120 –> 00:01:35,040
We’re talking about a
documentary called The Great

23
00:01:35,120 –> 00:01:37,400
Hack.
Because I don’t know if people

24
00:01:37,400 –> 00:01:40,440
know this, but we are looking
down the barrel of our next

25
00:01:40,440 –> 00:01:43,400
presidential election right now.
It is

26
00:01:43,400 –> 00:01:48,800
I don’t want it.
just a little over a year from

27
00:01:48,800 –> 00:01:51,120
now, and it
looks like we’re still, we,

28
00:01:51,120 –> 00:01:56,280
meaning people who aren’t Trump
supporters, are just riding with

29
00:01:56,280 –> 00:02:01,520
Biden again in 24, which I’m
sure that’s going to go.

30
00:02:01,760 –> 00:02:03,440
Great.
The thing that doesn’t

31
00:02:03,440 –> 00:02:07,560
translate, unfortunately for a
podcast is just the absolute

32
00:02:07,560 –> 00:02:11,160
exhaustion and the dismay on my
face.

33
00:02:11,760 –> 00:02:15,560
I just have no words for a lot
of these feelings I have except

34
00:02:15,560 –> 00:02:18,280
for just like.
Profound sadness.

35
00:02:19,640 –> 00:02:22,640
Like, yeah, I can just.
I can describe Jack’s face in

36
00:02:22,640 –> 00:02:23,720
one word.
And it was.

37
00:02:23,720 –> 00:02:24,600
Distraught.
I don’t.

38
00:02:26,760 –> 00:02:31,960
I don’t want to do this again.
No one does.

39
00:02:32,480 –> 00:02:36,160
Yeah, we should just go without
a president for a while.

40
00:02:36,400 –> 00:02:39,600
Yeah, like.
Until we can figure out the rest

41
00:02:39,600 –> 00:02:43,040
of the government like we don’t
need a fucking king on top of

42
00:02:43,040 –> 00:02:45,600
everything else, like.
It’s all broken.

43
00:02:46,080 –> 00:02:48,040
We just.
I mean, maybe we do need a king.

44
00:02:48,040 –> 00:02:50,040
Maybe that’s what we need.
Maybe I want to be king.

45
00:02:50,280 –> 00:02:54,080
I just can’t wait to be king.
I can’t wait for you to be king,

46
00:02:54,400 –> 00:02:56,400
honestly.
Thank you so much.

47
00:02:56,640 –> 00:02:58,640
It’s the world.
All I ever wanted in life, it’s.

48
00:02:58,920 –> 00:03:01,440
The world we need, the goals.
I know.

49
00:03:01,440 –> 00:03:03,600
I agree.
I think I could do better.

50
00:03:03,840 –> 00:03:07,720
I think a lot of people could do
better than what we have right

51
00:03:07,720 –> 00:03:10,600
now.
Unfortunately, everybody that

52
00:03:10,600 –> 00:03:13,880
would and could do better has no
interest in the job because they

53
00:03:13,880 –> 00:03:15,840
don’t
want to be president because

54
00:03:15,840 –> 00:03:18,000
they know how awful the job is.
Yeah.

55
00:03:18,160 –> 00:03:21,400
And anybody who wants the job
has something fundamentally

56
00:03:21,400 –> 00:03:24,960
broken in them.
And it’s, yeah, no, that much

57
00:03:24,960 –> 00:03:26,960
power is awful and you shouldn’t
want it.

58
00:03:27,120 –> 00:03:30,440
I’d be taking one for the team.
I’d be like I hate this the

59
00:03:30,440 –> 00:03:33,960
whole time.
Yeah, I mean, that’s what, like

60
00:03:33,960 –> 00:03:37,320
Congress used to be, was that
people would basically like, do

61
00:03:37,320 –> 00:03:40,240
their two years in the House and
be like, I don’t wanna fucking

62
00:03:40,240 –> 00:03:42,000
do this and then go back to
their community.

63
00:03:42,200 –> 00:03:45,000
Like, that was what you were
supposed to do, and now people

64
00:03:45,000 –> 00:03:48,360
made a career out of it because
they realized what power it

65
00:03:48,400 –> 00:03:51,200
yields.
Yeah, so we’re covering a

66
00:03:51,200 –> 00:03:53,560
documentary called The Great
Hack.

67
00:03:53,960 –> 00:03:59,400
It is a 2019 Netflix documentary
about the Cambridge Analytica

68
00:03:59,600 –> 00:04:01,840
scandal.
And if you don’t know what that

69
00:04:01,840 –> 00:04:05,200
is, I guess you have landed on
the perfect episode of

70
00:04:05,200 –> 00:04:08,360
Conspiracy, the show for you,
because we’re going to talk

71
00:04:08,360 –> 00:04:13,200
about that by way of talking
about this documentary Cambridge

72
00:04:13,240 –> 00:04:17,040
Analytica was hugely influential
in.

73
00:04:17,320 –> 00:04:23,400
The 2016 election and a lot of
that happened on Facebook, and I

74
00:04:23,400 –> 00:04:27,760
feel like enough people since
then have stopped using Facebook

75
00:04:27,920 –> 00:04:32,680
that it seems like this is less
of a threat now when that’s

76
00:04:32,760 –> 00:04:35,960
absolutely not the case.
Like people are still using

77
00:04:35,960 –> 00:04:38,400
Facebook, people are still going
to use it to.

78
00:04:38,400 –> 00:04:41,120
I guess it depends on what you
define as people.

79
00:04:41,800 –> 00:04:44,560
Yeah, when you say people are
still using Facebook.

80
00:04:44,800 –> 00:04:48,720
What do you define as people?
I think that I think that

81
00:04:48,720 –> 00:04:53,200
millennials and Gen
Z have mostly exited the

82
00:04:53,200 –> 00:04:55,680
platform.
I think younger Gen

83
00:04:55,880 –> 00:04:59,560
Xers have also left the
platform, but I mean, that’s

84
00:04:59,560 –> 00:05:03,720
where Boomers and older Gen
Xers are still hanging out, so.

85
00:05:03,920 –> 00:05:06,080
And they vote.
And they vote.

86
00:05:06,200 –> 00:05:09,400
They vote.
And also to be fair, Cambridge

87
00:05:09,400 –> 00:05:12,040
Analytica is a beautiful name
for a baby girl.

88
00:05:12,360 –> 00:05:15,960
It is a nice name, like, yeah,
you should be a musician with a

89
00:05:15,960 –> 00:05:19,680
name like that, not a company
that engages in espionage.

90
00:05:19,840 –> 00:05:23,520
Cambridge Analytica.
But instead they do

91
00:05:23,520 –> 00:05:26,600
psychological warfare against
people to influence elections.

92
00:05:29,280 –> 00:05:32,600
Yeah, bummer.
I’m not going to say anything.

93
00:05:33,240 –> 00:05:39,360
Truly controversial right now.
This documentary made me so mad,

94
00:05:40,560 –> 00:05:45,840
Just big mad the whole time.
And just like just the face I

95
00:05:45,840 –> 00:05:47,960
was making, the distraught face
I was making talking about

96
00:05:47,960 –> 00:05:50,680
American politics and just the
upcoming election.

97
00:05:50,880 –> 00:05:52,960
That’s how I felt, the whole
documentary.

98
00:05:52,960 –> 00:05:56,400
And I’m pretty sure I dissociated
for part of this documentary.

99
00:05:56,400 –> 00:05:59,720
So I was like, I can’t listen to
this, I can’t deal with this.

100
00:06:00,520 –> 00:06:03,000
Yeah, this
could have been a true crime

101
00:06:03,280 –> 00:06:06,400
episode just as easily as a
conspiracy episode.

102
00:06:06,720 –> 00:06:10,000
It’s crazy how little it’s
treated as a crime.

103
00:06:10,200 –> 00:06:13,280
Yeah, like someone does.
We just did an oopsies.

104
00:06:13,640 –> 00:06:19,440
Someone does confess to a crime
at the end, but even then they

105
00:06:19,440 –> 00:06:24,040
do it just because, well, that’s
easier than having to comply.

106
00:06:24,040 –> 00:06:27,480
It’s not like anyone’s going to
go to jail over that shit.

107
00:06:28,600 –> 00:06:35,080
It opens with a girl named
Brittany Kaiser, who is not being

108
00:06:35,080 –> 00:06:39,080
dramatic at all
when we see her writing

109
00:06:39,080 –> 00:06:43,080
Cambridge Analytica on some
fucking statue at Burning Man,

110
00:06:43,320 –> 00:06:47,280
it’s like, OK, are you trying to
score points here by being at

111
00:06:47,280 –> 00:06:48,680
Burning Man?
Do you think that makes you

112
00:06:48,680 –> 00:06:51,000
relatable?
Brittany, ’cause it sure doesn’t.

113
00:06:51,200 –> 00:06:54,240
It sure sure does not.
That shit confused me so much.

114
00:06:54,240 –> 00:06:56,880
I was like, why are we at
Burning Man right now?

115
00:06:57,040 –> 00:06:59,840
And we never go back.
We never see that red wig again.

116
00:07:00,200 –> 00:07:02,280
Is that why?
Is that why your hair’s red?

117
00:07:02,280 –> 00:07:04,280
Jen ’cause you watch this
documentary and you’re?

118
00:07:04,360 –> 00:07:06,960
Like, great big Brittany Kaiser
fan in the House.

119
00:07:07,120 –> 00:07:09,760
Yeah, that’s why I had to do it.
I had to Rep Brit.

120
00:07:10,480 –> 00:07:13,200
Well, and then well, and then I
will say the next time we see

121
00:07:13,200 –> 00:07:16,160
her, she’s like on vacation in
Thailand.

122
00:07:16,200 –> 00:07:20,920
So like, she’s fine, and
she’s like, with my family.

123
00:07:21,920 –> 00:07:26,880
Yeah, she’s not a very likable
figure, at least not to me.

124
00:07:27,120 –> 00:07:30,120
I was trying to like her the
whole time because I was like,

125
00:07:30,120 –> 00:07:33,120
maybe she’s doing the right
thing and this is trying to

126
00:07:33,120 –> 00:07:35,960
paint her as not doing the right
thing, ’cause she’s the only

127
00:07:35,960 –> 00:07:39,800
person who actually said
anything was bad or condemned

128
00:07:39,800 –> 00:07:42,000
it.
But then I texted Adam being

129
00:07:42,000 –> 00:07:45,280
like, do we hate her, or do we
just want to hate her?

130
00:07:45,520 –> 00:07:47,520
And I, I... We hate her.
We hate

131
00:07:47,520 –> 00:07:48,160
her.
I hate her.

132
00:07:48,160 –> 00:07:50,080
She’s essentially a war
criminal.

133
00:07:50,280 –> 00:07:53,520
Just because she said it was bad
doesn’t make it OK that she did

134
00:07:53,520 –> 00:07:56,240
it.
Yeah, and like, I get that we’ve

135
00:07:56,240 –> 00:08:00,720
all made mistakes in our youth,
but most of our mistakes don’t

136
00:08:00,720 –> 00:08:03,200
involve toppling foreign
governments.

137
00:08:05,480 –> 00:08:09,880
You don’t know me, you know.
Yeah, it’s.

138
00:08:09,920 –> 00:08:11,560
I know we’re going to get to
this in a minute.

139
00:08:11,560 –> 00:08:15,320
But like, I did appreciate them
pointing out her like, own

140
00:08:15,320 –> 00:08:19,600
personal arc and her own
personal story of like where she

141
00:08:19,600 –> 00:08:22,360
began and where she ended up.
And it’s like, that’s

142
00:08:22,480 –> 00:08:27,440
fascinating and definitely
deserves to be looked at a bit

143
00:08:27,440 –> 00:08:30,760
more, just broader overall in
culture.

144
00:08:30,840 –> 00:08:34,000
Yeah, I would watch a full
documentary about Brittany

145
00:08:34,000 –> 00:08:36,200
Kaiser.
Like, once I heard she was

146
00:08:36,200 –> 00:08:39,240
working on political campaigns
since she was 14.

147
00:08:39,400 –> 00:08:43,240
It’s like, why?
Like, how did that happen? What was she doing

148
00:08:43,240 –> 00:08:46,480
before that?
Yeah, like who recognized

149
00:08:46,480 –> 00:08:51,080
something in you as a 14 year
old where they were like, we got

150
00:08:51,080 –> 00:08:55,040
to get her, because she was like
running Obama’s Facebook page in

151
00:08:55,040 –> 00:08:57,240
yes, ’08, when he ran.
Yes.

152
00:08:57,680 –> 00:09:03,080
Like, she’s not that old now.
So like, how old was she in 08?

153
00:09:03,080 –> 00:09:06,280
Can we look up her age?
I bet we can, Internet.

154
00:09:06,800 –> 00:09:10,760
How crazy would it be what I was
doing at 12 online?

155
00:09:11,080 –> 00:09:15,840
That’s scary to me, what she was
doing at 12 online because I was

156
00:09:15,840 –> 00:09:19,720
like hacking shit, like like it
was nothing too, like for fun.

157
00:09:19,920 –> 00:09:26,280
And I’m not smart.
She was 22 when she was running

158
00:09:26,360 –> 00:09:29,880
Obama’s Facebook page, which
that sounds about right.

159
00:09:30,360 –> 00:09:33,480
Maybe that’s a lot of
responsibility to give a 22 year

160
00:09:33,480 –> 00:09:34,880
old.
I don’t see that happening now.

161
00:09:35,000 –> 00:09:38,720
But also in two... No, I don’t.
I don’t see it happening now.

162
00:09:38,800 –> 00:09:43,760
But in 2008, Facebook was still
pretty new, so she would have

163
00:09:43,760 –> 00:09:48,120
had, could have had, a way better
grasp. Like, there was

164
00:09:48,120 –> 00:09:51,280
this movie with Vince Vaughn and
Owen Wilson where they played

165
00:09:51,280 –> 00:09:53,400
like older
people going back into the

166
00:09:53,400 –> 00:09:56,640
workforce and there were all
these scenes where they were

167
00:09:56,640 –> 00:09:59,480
pretending like they didn’t know
how the Internet worked.

168
00:09:59,480 –> 00:10:02,600
And it’s like, motherfucker, by
the timeline in this movie, you

169
00:10:02,600 –> 00:10:05,800
were in your 20s when the
Internet really blew up and now

170
00:10:05,800 –> 00:10:08,520
you’re in your 40s.
You’ve had 20 fucking years of

171
00:10:08,520 –> 00:10:10,640
experience.
Don’t tell me you don’t know how

172
00:10:10,680 –> 00:10:14,200
the Internet works.
So like now I could see someone

173
00:10:14,200 –> 00:10:18,320
older being the person who runs
a presidential Facebook page.

174
00:10:18,320 –> 00:10:21,240
But in 2008, it would have been
a young.

175
00:10:21,480 –> 00:10:25,520
Yeah, in 2007.
I remember they just rolled out

176
00:10:25,520 –> 00:10:29,040
in like 2006 that you could join
Facebook in high school.

177
00:10:29,320 –> 00:10:32,480
Like they were just rolling that
out and you needed like an

178
00:10:32,480 –> 00:10:36,800
invite to join.
And I did.

179
00:10:36,800 –> 00:10:39,520
And so it’s like people by and
large, like, didn’t have

180
00:10:39,520 –> 00:10:41,040
Facebook.
Like, my parents didn’t get

181
00:10:41,040 –> 00:10:47,200
Facebook until I was in college.
Yeah, and that was five years

182
00:10:47,920 –> 00:10:49,240
I’d been on the platform
already.

183
00:10:49,240 –> 00:10:52,320
So.
You know, it it made sense that,

184
00:10:52,320 –> 00:10:53,800
like, she was so young and doing
it.

185
00:10:53,800 –> 00:10:57,560
She was an intern, too, right?
Can we talk about for one second

186
00:10:57,680 –> 00:11:00,760
that weird chunk of time where
only people who had been invited

187
00:11:00,760 –> 00:11:03,080
to Facebook had been posting on
it for years?

188
00:11:03,080 –> 00:11:06,240
And it was like, we were like
criminals on there with how much

189
00:11:06,240 –> 00:11:08,520
we were like, underage partying
and all that shit.

190
00:11:08,520 –> 00:11:10,920
And then all of a sudden our
parents could be on there and we

191
00:11:10,920 –> 00:11:14,080
were all, like, in trouble, but
like, not, ’cause we were

192
00:11:14,080 –> 00:11:17,400
adults.
It was like this weird backwards

193
00:11:17,400 –> 00:11:20,600
chunk of Internet where they
were like, we’re mad about this

194
00:11:20,600 –> 00:11:22,120
and it’s like you weren’t
invited.

195
00:11:22,480 –> 00:11:25,320
Yeah, that happens.
Or at least that happened a lot

196
00:11:25,480 –> 00:11:28,080
in the early days of the
Internet where people were like,

197
00:11:28,080 –> 00:11:31,840
oh Facebook, this is like my own
personal thing, no one’s looking

198
00:11:31,840 –> 00:11:34,680
at this right?
And it’s like everyone is

199
00:11:34,680 –> 00:11:37,960
looking at it.
Especially employers, people

200
00:11:37,960 –> 00:11:39,720
looking to employ you.
All that shit.

201
00:11:40,280 –> 00:11:42,600
Well, and now it’s so
interesting because it’s like, I

202
00:11:42,600 –> 00:11:46,400
see a lot of like Gen
Z folks like not really caring,

203
00:11:46,400 –> 00:11:48,560
like whatever they put online,
they’re like, whatever.

204
00:11:48,560 –> 00:11:49,880
This won’t have any
consequences.

205
00:11:49,880 –> 00:11:54,200
Meanwhile, like millennials are
so fucking traumatized from like

206
00:11:54,400 –> 00:11:59,080
employers will see everything
and it still remains to be true

207
00:11:59,080 –> 00:12:02,920
that some employers really
fucking care and some employers.

208
00:12:03,080 –> 00:12:04,320
Really.
Fucking don’t.

209
00:12:04,320 –> 00:12:07,040
You know, they’re like whatever
you do on your own time is your

210
00:12:07,040 –> 00:12:11,080
own thing.
But also like, don’t make these

211
00:12:11,280 –> 00:12:14,680
enormous political statements
and don’t like, you know, do

212
00:12:14,680 –> 00:12:17,480
crystal meth and post videos on
TikTok.

213
00:12:18,800 –> 00:12:22,040
Yeah, although don’t do that.
Yeah, please don’t do meth.

214
00:12:22,440 –> 00:12:25,880
David Carroll.
He’s another of the main players

215
00:12:26,120 –> 00:12:28,640
in this documentary.
I just found him so purely

216
00:12:28,640 –> 00:12:30,720
annoying.
Yeah, I didn’t.

217
00:12:30,720 –> 00:12:34,400
I didn’t really have
an opinion on him.

218
00:12:34,400 –> 00:12:37,160
I don’t think he’s too excited
about all of it.

219
00:12:37,160 –> 00:12:42,600
And I was like, ew, stop.
Yeah, but he’s also like someone

220
00:12:42,600 –> 00:12:45,080
needed to file that lawsuit that
he filed.

221
00:12:45,080 –> 00:12:46,920
Like, no, I mean, I’m proud of
him or whatever.

222
00:12:46,920 –> 00:12:48,960
I just think he’s annoying,
like, ultimately.

223
00:12:49,200 –> 00:12:52,840
Yeah, he’s probably a pain to
hang out with, but he’s

224
00:12:52,840 –> 00:12:55,920
definitely not like one of the
villains here.

225
00:12:56,000 –> 00:12:57,520
And I’m not saying you’re
calling him a villain,

226
00:12:57,800 –> 00:12:59,960
obviously.
Oh, he’s a villain.

227
00:13:00,760 –> 00:13:04,360
Probably in several other ways.
He’s an associate professor at

228
00:13:04,360 –> 00:13:09,720
Parsons School of Design.
And so of course, of course

229
00:13:09,720 –> 00:13:13,120
that’s who’s gonna be leading
the campaign for data

230
00:13:13,120 –> 00:13:14,520
rights.
Seems to

231
00:13:14,560 –> 00:13:16,400
work.
Yeah, weird choice.

232
00:13:16,600 –> 00:13:18,400
These guys out here making
jeans.

233
00:13:18,880 –> 00:13:21,040
And calling out social media
networks.

234
00:13:21,240 –> 00:13:23,160
Yeah.
One of the things he brings up

235
00:13:23,360 –> 00:13:28,920
while he’s addressing his class
is that thing where you talk

236
00:13:28,920 –> 00:13:33,160
about something and next thing
you know you go on the gram.

237
00:13:33,560 –> 00:13:36,400
And you see an ad for that thing
you were just talking about.

238
00:13:36,600 –> 00:13:39,800
And it’s like, is my phone
listening to me?

239
00:13:39,960 –> 00:13:44,400
And the explanation is always
no, it’s just that the data

240
00:13:44,440 –> 00:13:48,200
that’s out there about you is so
good at predicting your next

241
00:13:48,200 –> 00:13:52,000
move that it just like knows
what to show you.

242
00:13:52,240 –> 00:13:54,560
And I don’t think this is any
longer true.

243
00:13:54,680 –> 00:13:58,200
That feels like a roundabout
answer to me, because that feels

244
00:13:58,200 –> 00:14:02,400
like we can circle back to OK,
but is my phone listening to me?

245
00:14:02,880 –> 00:14:06,400
Like, is it? That part of it made
me think that he’s annoying.

246
00:14:06,560 –> 00:14:10,760
I was like, OK, dude because I
felt like I already knew all of

247
00:14:10,760 –> 00:14:12,760
this and I feel like most
people, it’s common knowledge

248
00:14:12,760 –> 00:14:15,600
and like, his explanation of it
felt condescending and wrong.

249
00:14:15,600 –> 00:14:19,120
I was like, OK.
Look, in his defense, this

250
00:14:19,280 –> 00:14:21,160
documentary did come out in
2019.

251
00:14:21,160 –> 00:14:25,120
So it was four years ago.
So it’s like now everyone’s

252
00:14:25,120 –> 00:14:27,800
like, oh, it’s not... that’s
what’s happening.

253
00:14:27,800 –> 00:14:29,600
And it’s like, but in 2019 this
was.

254
00:14:29,960 –> 00:14:33,520
This was new information, you
know, And I think, like for me,

255
00:14:33,800 –> 00:14:37,160
I just always look at my ads
that I like, and I always give

256
00:14:37,160 –> 00:14:40,800
it a side eye because it’s very
bad at predicting what I want.

257
00:14:41,120 –> 00:14:45,840
And I think, I think if I can
give any advice on this podcast, it’s:

258
00:14:46,080 –> 00:14:50,320
have such disparate interests
that the Internet literally

259
00:14:50,320 –> 00:14:54,600
cannot paint a picture of who
you are. Be ungovernable to

260
00:14:54,640 –> 00:14:58,640
advertise to.
Yeah, basically be ungovernable.

261
00:14:59,040 –> 00:15:01,680
Just the things that the
Internet feels it needs to show

262
00:15:01,680 –> 00:15:03,320
me.
I’m like, no, I’m not interested

263
00:15:03,320 –> 00:15:04,200
in that.
Go away.

264
00:15:05,360 –> 00:15:08,240
Your ads mean nothing to me.
Take your algorithm on a hike,

265
00:15:08,240 –> 00:15:10,760
buddy.
Yeah, and see, I’m the exact

266
00:15:10,760 –> 00:15:12,960
opposite.
Instagram knows what I fuck

267
00:15:12,960 –> 00:15:15,520
with, that’s for sure.
They just show me like baseball

268
00:15:15,520 –> 00:15:18,120
cards and shoes and I’m like,
Yep, I want that.

269
00:15:18,440 –> 00:15:20,080
You know where I got this hiss
T-shirt?

270
00:15:20,240 –> 00:15:23,160
Instagram.
I do too many social experiments

271
00:15:23,160 –> 00:15:25,440
online for it to predict what
I’m doing.

272
00:15:25,960 –> 00:15:28,960
And I also will do shit like
talk about stuff just to fuck

273
00:15:28,960 –> 00:15:31,120
with my ads.
Because I’m like, I’ve proved

274
00:15:31,120 –> 00:15:33,600
this wrong.
Because I I will just talk about

275
00:15:33,600 –> 00:15:36,000
shit that I don’t put into my
phone at all.

276
00:15:36,360 –> 00:15:39,000
And see if I start getting ads
for it and it works.

277
00:15:39,200 –> 00:15:42,560
And see, I just kind of use
Instagram for shopping at this

278
00:15:42,600 –> 00:15:44,160
point.
I mean, that’s what they want it

279
00:15:44,160 –> 00:15:47,160
to be so.
It does keep showing me

280
00:15:47,160 –> 00:15:50,720
advertisements for like, a
litter genie, but I’m just like,

281
00:15:50,720 –> 00:15:52,320
yeah, because you know, I have
cats.

282
00:15:52,320 –> 00:15:55,440
Like, that’s of course any cat
owner wants a fucking litter

283
00:15:55,440 –> 00:15:56,960
genie.
Yeah, I don’t want to scoop

284
00:15:56,960 –> 00:16:00,160
litter boxes, duh.
But with

285
00:16:00,160 –> 00:16:01,720
what money am I buying this
litter genie?

286
00:16:01,720 –> 00:16:04,000
Yeah.
The Lazy.

287
00:16:04,040 –> 00:16:08,640
No, I’m broke, social media.
The laziest version of it, and I

288
00:16:08,640 –> 00:16:11,720
think we still see it all the
time, is when you buy something.

289
00:16:12,160 –> 00:16:14,880
And then you start getting ads
for that thing all over the

290
00:16:14,880 –> 00:16:17,560
Internet.
It’s like, like, what kind of

291
00:16:17,560 –> 00:16:21,520
research are you doing to decide
that this was a good way to

292
00:16:21,520 –> 00:16:25,040
spend your money?
Anyway, let’s get back to David

293
00:16:25,040 –> 00:16:28,680
Carroll.
He again says that, yeah, that’s

294
00:16:28,760 –> 00:16:30,240
your phone’s not listening to
you.

295
00:16:30,240 –> 00:16:33,080
It’s just like, it’s really good
at knowing what you’re going to

296
00:16:33,080 –> 00:16:34,680
do.
And it’s like, that’s kind of

297
00:16:34,680 –> 00:16:38,240
scarier.
If I’m being honest, yeah, I’d

298
00:16:38,240 –> 00:16:40,400
like.
I’d like rather just know that

299
00:16:40,400 –> 00:16:45,960
my phone was listening to me.
And so David Carroll says that

300
00:16:46,040 –> 00:16:50,400
after the 2016 election, he
realized that these same methods

301
00:16:50,560 –> 00:16:53,680
that advertisers are using to
sell you shit on the Internet

302
00:16:53,680 –> 00:16:57,120
could also probably be used for
election purposes.

303
00:16:57,320 –> 00:17:00,480
And I feel like a lot of people
were having that discussion

304
00:17:00,480 –> 00:17:04,040
already by then.
But David Carroll landed on that

305
00:17:04,040 –> 00:17:08,280
after the election, and so he
started looking into it.

306
00:17:08,400 –> 00:17:12,000
One of the groups that we meet
is Project Alamo.

307
00:17:12,240 –> 00:17:17,119
They were a pro Trump group who
spent $1 million a day on

308
00:17:17,119 –> 00:17:20,400
Facebook ads.
Now here’s a question.

309
00:17:20,520 –> 00:17:24,599
They show us a statistic at the
end of this documentary jumping

310
00:17:24,599 –> 00:17:29,040
way, way ahead here that says
Trump ran something like 5

311
00:17:29,040 –> 00:17:35,280
million Facebook ads and Hillary
Clinton ran 66,000.

312
00:17:35,440 –> 00:17:39,280
At which point I think you have
to ask the question.

313
00:17:39,720 –> 00:17:43,440
Is that really election
interference or is that Hillary

314
00:17:43,440 –> 00:17:47,800
Clinton not taking the Internet
into account enough?

315
00:17:48,000 –> 00:17:52,040
Because I do think Hillary
Clinton went into 2016 with kind

316
00:17:52,040 –> 00:17:57,240
of this air about her, where she
just felt like she was kind of

317
00:17:57,240 –> 00:18:01,480
entitled to be president next
and that there was no way Trump

318
00:18:01,640 –> 00:18:05,840
was going to beat her.
And why didn’t she run 5 million

319
00:18:05,840 –> 00:18:09,320
Facebook ads, you know?
You know, I think that it kind

320
00:18:09,320 –> 00:18:13,680
of reflects like, you know, that
a lot of people on the left and

321
00:18:13,680 –> 00:18:16,920
Democrats were like saying,
like, oh, well, it’s ridiculous.

322
00:18:16,920 –> 00:18:19,520
Like Trump can’t win.
Like, he’s not going to win.

323
00:18:19,520 –> 00:18:24,640
And that was like, us not
realizing that there were, you

324
00:18:24,640 –> 00:18:27,200
know, because they do talk
about, like, how, you know,

325
00:18:27,200 –> 00:18:31,440
there’s a lot of, like,
influenceable people online.

326
00:18:31,440 –> 00:18:34,880
I forgot what the word they used
for them, but they like but that

327
00:18:34,880 –> 00:18:36,680
like we just.
I don’t know.

328
00:18:36,680 –> 00:18:40,400
There was this true
underestimation of how badly

329
00:18:40,400 –> 00:18:44,520
people could be manipulated on
Facebook and on social media.

330
00:18:44,520 –> 00:18:47,360
And I think that’s like, I don’t
think it was necessarily like an

331
00:18:47,360 –> 00:18:49,960
air of, like, she deserved it, but
it was an air of just, like,

332
00:18:50,160 –> 00:18:53,360
really y’all think that this
motherfucker is going to win

333
00:18:53,360 –> 00:18:55,920
like he’s in a in a traditional
election.

334
00:18:55,920 –> 00:18:59,440
Like, had that been like a 2004
election?

335
00:18:59,800 –> 00:19:02,320
Like she would she would have
won.

336
00:19:02,320 –> 00:19:04,760
Like they’re because we wouldn’t
have had all of this

337
00:19:04,760 –> 00:19:06,880
disinformation.
And so I think, I think that her

338
00:19:06,880 –> 00:19:10,560
attitude going into it was
fair in my opinion.

339
00:19:10,640 –> 00:19:14,120
But because of all of the ads
that the Trump campaign ran in

340
00:19:14,120 –> 00:19:17,640
Project Alamo that like you
know, they they pushed it and

341
00:19:17,640 –> 00:19:20,040
they realized that that was how
they could win.

342
00:19:20,040 –> 00:19:22,880
And that was a tactic that I
just don’t think Democrats

343
00:19:23,120 –> 00:19:26,160
counted on.
Yeah, I think that it’s like,

344
00:19:26,160 –> 00:19:29,280
hard to even pinpoint.
Like, I think that her, probably

345
00:19:29,280 –> 00:19:34,080
her whole team was out of touch
with what was, like, going on

346
00:19:34,080 –> 00:19:37,080
and how influential the Internet
could become.

347
00:19:37,320 –> 00:19:40,600
And so, I mean, entitlement,
sure, on one hand.

348
00:19:40,600 –> 00:19:43,320
But on the other hand, they just
truly didn’t do any of the

349
00:19:43,320 –> 00:19:47,600
research that Trump was prepared
to do and like, had a team that

350
00:19:47,600 –> 00:19:49,880
was willing to fucking spend
money.

351
00:19:49,880 –> 00:19:53,520
They didn’t have to make
something happen that shouldn’t

352
00:19:53,520 –> 00:19:54,680
have.
Period.

353
00:19:54,880 –> 00:20:00,520
Yeah, I mean, it definitely to
me seems like it, it’s at least

354
00:20:00,520 –> 00:20:03,480
a little bit of both like.
Yeah, no, for sure I.

355
00:20:03,680 –> 00:20:07,200
Think definitely a perfect
storm, so to speak, of hell.

356
00:20:07,200 –> 00:20:11,560
Yeah, like, Trump definitely had
people working on his behalf.

357
00:20:11,560 –> 00:20:14,520
But I just feel like Hillary
Clinton could have, too.

358
00:20:14,720 –> 00:20:16,520
That’s what I mean, too.
Like she dropped the ball.

359
00:20:16,520 –> 00:20:18,840
Like she should have had a team
that knew what the fuck they

360
00:20:18,840 –> 00:20:21,840
were up against as far as what
Facebook was potentially going

361
00:20:21,840 –> 00:20:23,840
to give, and she just didn’t.
Yeah, I.

362
00:20:24,000 –> 00:20:29,360
Wouldn’t be surprised if even a
lot of the stuff about well,

363
00:20:29,680 –> 00:20:32,320
Trump’s never going to win.
Like we don’t actually have to

364
00:20:32,320 –> 00:20:35,040
worry about this.
I wouldn’t be surprised if some

365
00:20:35,040 –> 00:20:37,160
of that came from the Trump
campaign.

366
00:20:37,560 –> 00:20:42,360
Also because that was as
effective as anything else in,

367
00:20:42,480 –> 00:20:45,440
Oh yeah, I mean, if you create
opposition because no one was

368
00:20:45,440 –> 00:20:47,320
even like entertaining the idea
of him.

369
00:20:47,320 –> 00:20:50,400
So there was nothing to bat up
against really to create

370
00:20:50,400 –> 00:20:52,440
controversy.
So I’m sure that they did that.

371
00:20:52,440 –> 00:20:56,280
That would only make sense.
Yeah, and I mean, it worked.

372
00:20:56,440 –> 00:20:57,640
We.
Everyone.

373
00:20:57,640 –> 00:21:01,040
I mean, I wrote an article in
2015 about how Trump could

374
00:21:01,040 –> 00:21:05,680
totally win, and boy, did people
call me crazy for that.

375
00:21:06,080 –> 00:21:08,400
Oof.
I don’t think I ever did.

376
00:21:08,400 –> 00:21:12,160
I was like, Yep.
And then everything about me.

377
00:21:12,400 –> 00:21:15,760
My dad worked for Donald Trump
at one point when he was running

378
00:21:15,760 –> 00:21:19,240
for governor of Illinois.
And I don’t think that anyone

379
00:21:19,240 –> 00:21:23,640
knew about that besides my
family, because he was like in

380
00:21:23,640 –> 00:21:26,920
the campaign and my dad is a
fucking criminal.

381
00:21:28,200 –> 00:21:30,920
That’s who Trump likes.
He loves a fucking criminal.

382
00:21:31,920 –> 00:21:36,800
So yeah, 2016, that didn’t go
the way we wanted it to go.

383
00:21:37,640 –> 00:21:42,800
So back to Cambridge Analytica.
David Carroll has this notion

384
00:21:42,800 –> 00:21:47,320
that voter data was used by
Cambridge Analytica to influence

385
00:21:47,320 –> 00:21:51,160
the election, and he ends up
hiring a lawyer in Great

386
00:21:51,160 –> 00:21:55,640
Britain, a guy named Ravi Naik,
to basically sue Cambridge

387
00:21:55,640 –> 00:21:59,680
Analytica for that data.
And that becomes one of the

388
00:21:59,760 –> 00:22:03,200
running plot points
throughout this whether he’s

389
00:22:03,200 –> 00:22:06,280
going to get that data and see
what it was.

390
00:22:06,480 –> 00:22:10,440
And spoiler, he doesn’t get it.
Not only does he not get it, but

391
00:22:10,440 –> 00:22:14,640
at one point Cambridge Analytica
is like, threatened with legal

392
00:22:14,640 –> 00:22:16,080
action.
They’re like, well, you’ll be

393
00:22:16,080 –> 00:22:19,480
charged with crimes and they’re
like, all right, charge us with

394
00:22:19,480 –> 00:22:22,560
crimes, that’s fine.
And they get charged with crimes

395
00:22:22,560 –> 00:22:24,800
and they just like, pay a
fucking fine or something.

396
00:22:24,920 –> 00:22:28,280
And they never have to give up
that data, which that’s kind of

397
00:22:28,280 –> 00:22:31,680
horrifying.
Honestly, it’s classic.

398
00:22:31,720 –> 00:22:34,680
I feel like companies like
corporations do that every day,

399
00:22:34,680 –> 00:22:37,160
all day.
They just are like, oh, laws.

400
00:22:37,720 –> 00:22:41,240
We have money to deal with laws.
Yeah, yeah, they’ll pay the

401
00:22:41,240 –> 00:22:43,120
fine.
Because even if the fine is like

402
00:22:43,120 –> 00:22:45,960
$2 billion, they’re like, all
right, drop in the bucket, you

403
00:22:45,960 –> 00:22:47,960
know, class action lawsuit, who
cares?

404
00:22:47,960 –> 00:22:50,880
Like, who cares if you have
family members that are horribly

405
00:22:50,880 –> 00:22:54,720
disfigured by our product, like
and their lives are irreparably

406
00:22:54,720 –> 00:22:58,960
changed, But like, you know.
$200 will do it, right?

407
00:22:59,160 –> 00:23:02,320
I did not know that Cambridge
Analytica started their

408
00:23:02,320 –> 00:23:07,120
political influencer work by
working on the Ted Cruz

409
00:23:07,120 –> 00:23:10,520
campaign.
That has, that has to be such a

410
00:23:10,520 –> 00:23:14,000
powerful selling point to be
able to walk in and be like we

411
00:23:14,000 –> 00:23:17,240
got this fuck face elected, all
right?

412
00:23:17,480 –> 00:23:18,760
Truly.
Truly.

413
00:23:19,040 –> 00:23:23,760
And it it really is, That’s so
scary to be like this guy sucks

414
00:23:23,760 –> 00:23:25,640
so much and look what we did for
him.

415
00:23:25,960 –> 00:23:30,200
And it makes Ted Cruz make
perfect sense now, ’cause I

416
00:23:30,520 –> 00:23:34,080
like, I was always like how who
fucking voted for this guy and

417
00:23:34,080 –> 00:23:36,000
why?
And it’s, oh, it was a

418
00:23:36,000 –> 00:23:39,040
psychological operation that was
conducted on the people of

419
00:23:39,040 –> 00:23:41,400
Texas.
Oh duh.

420
00:23:41,720 –> 00:23:44,080
It’s.
Just that, that’s all.

421
00:23:44,280 –> 00:23:45,720
And you hate it when that
happens.

422
00:23:47,880 –> 00:23:50,000
But that does make perfect
sense.

423
00:23:50,000 –> 00:23:53,960
Like, that’s where you start.
You start with the hardest

424
00:23:54,080 –> 00:23:57,080
possible option.
Like, how do you get someone

425
00:23:57,080 –> 00:24:00,000
like Ted Cruz elected?
Like, once you do that, like,

426
00:24:00,000 –> 00:24:02,520
Trump’s going to be easy.
That motherfucker had a TV show

427
00:24:02,520 –> 00:24:04,520
for years.
He’s got fans.

428
00:24:04,760 –> 00:24:08,040
Ted Cruz’s dad killed JFK.
That’s probably not true.

429
00:24:08,920 –> 00:24:10,800
Yeah, and he was the Zodiac
Killer.

430
00:24:12,680 –> 00:24:16,200
That too.
What did we think of Alexander

431
00:24:16,200 –> 00:24:19,000
Nix, the CEO of Cambridge
Analytica?

432
00:24:19,000 –> 00:24:21,840
He looked like he should be on a
Yeah, he looked like he should

433
00:24:21,840 –> 00:24:25,240
be on a British sketch comedy
show or something.

434
00:24:25,240 –> 00:24:28,320
As opposed to being the guy
actively trying to ruin the

435
00:24:28,320 –> 00:24:29,320
world.
Yeah.

436
00:24:29,320 –> 00:24:29,840
What?
Who?

437
00:24:30,000 –> 00:24:31,720
What’s the guy that he does look
like?

438
00:24:31,720 –> 00:24:35,040
That is.
The guy with the fucking glasses

439
00:24:35,120 –> 00:24:38,920
in the in it.
In it, Yes, yes, yes.

440
00:24:38,920 –> 00:24:41,800
And like, not Ricky Gervais, his
friend.

441
00:24:41,880 –> 00:24:43,720
Yeah.
Oh, Stephen Merchant.

442
00:24:43,840 –> 00:24:48,720
Yeah, yes.
He’s like a low budget Stephen

443
00:24:48,720 –> 00:24:50,800
Merchant.
Yes, yes, yes.

444
00:24:50,800 –> 00:24:53,360
Thank you for knowing his name.
I could not get it out of my

445
00:24:53,360 –> 00:24:55,680
brain.
That that Stephen Merchant is a

446
00:24:55,840 –> 00:25:00,000
tall, gangly motherfucker like
that man is a giant.

447
00:25:00,200 –> 00:25:02,040
Yeah, but yeah, seems like a
noodle.

448
00:25:02,040 –> 00:25:04,880
Yeah, he’s like, he’s like 6
foot eight.

449
00:25:04,880 –> 00:25:08,760
He’s genuinely enormous, yeah.
I wish everyone could have seen

450
00:25:08,920 –> 00:25:11,640
Jen’s hand movements when she
said he looks like a noodle.

451
00:25:11,840 –> 00:25:14,400
Great for podcasts.
Inspiring stuff.

452
00:25:15,400 –> 00:25:18,640
So yeah, Ted Cruz is where they
get their start and then they go

453
00:25:18,640 –> 00:25:20,840
to Trump and they’re like,
listen, if we got that

454
00:25:20,840 –> 00:25:23,880
motherfucker elected, we can
obviously get you elected.

455
00:25:24,120 –> 00:25:27,840
And they talk a little bit about
their methods.

456
00:25:28,120 –> 00:25:32,280
And The thing is, there’s
science going way, way back on

457
00:25:32,280 –> 00:25:36,320
the type of people who are prone
to vote for someone like Trump.

458
00:25:36,440 –> 00:25:38,560
There’s a book I talk about all
the time.

459
00:25:38,760 –> 00:25:40,960
You can still find a free copy
online.

460
00:25:40,960 –> 00:25:43,160
The book is called The
Authoritarians.

461
00:25:43,320 –> 00:25:47,880
But they also wrote a new
version of it when Trump was

462
00:25:47,880 –> 00:25:51,160
campaigning called Authoritarian
Nightmare and.

463
00:25:53,080 –> 00:25:58,560
It’s written by this guy who was
a professor at a College in

464
00:25:58,600 –> 00:26:00,680
Canada for years and years and
years.

465
00:26:00,680 –> 00:26:04,480
He was a psychology professor,
and he gave incoming freshmen

466
00:26:04,480 –> 00:26:10,200
this survey that was meant to
determine how likely you are to

467
00:26:10,200 –> 00:26:13,200
vote for an authoritarian
leader.

468
00:26:13,360 –> 00:26:18,680
And what he found is that there
is just a segment of the

469
00:26:18,680 –> 00:26:20,760
population.
What population?

470
00:26:21,080 –> 00:26:24,280
All of it.
Whole fucking world that just

471
00:26:24,280 –> 00:26:30,720
craves a really strong leader
who isn’t going to make them

472
00:26:30,720 –> 00:26:36,320
have to think like just ARF.
Keep me safe and keep the

473
00:26:36,320 –> 00:26:41,280
fucking trains running, so to
speak, and we will follow you to

474
00:26:41,280 –> 00:26:45,040
the end of the earth.
And it seems like Cambridge

475
00:26:45,040 –> 00:26:50,440
Analytica probably incorporated
some of that into their

476
00:26:50,440 –> 00:26:54,000
research.
Because if you can combine that

477
00:26:54,000 –> 00:26:57,800
with social media and someone
like Trump, that’s easy

478
00:26:57,800 –> 00:27:00,200
math.
Like that’s super easy.

479
00:27:00,360 –> 00:27:04,360
And it seems like they did.
They did DNA Sciences on

480
00:27:04,760 –> 00:27:07,560
Facebook.
They just selected for certain

481
00:27:07,560 –> 00:27:11,200
things within people and then
they essentially bred the

482
00:27:11,200 –> 00:27:14,400
algorithm to find them for them.
It’s pretty easy stuff if you

483
00:27:14,400 –> 00:27:17,720
like, even know a little bit
about selecting for genetics,

484
00:27:17,720 –> 00:27:19,800
which is what the Internet is
even based on.

485
00:27:20,040 –> 00:27:22,080
So they really just did
Internet.

486
00:27:22,680 –> 00:27:24,520
Kinda, yeah.
Horrifying.

487
00:27:24,720 –> 00:27:29,000
So we also meet a woman named
Carole Cadwalladr.

488
00:27:29,080 –> 00:27:32,040
That last name threw me off.
There is a jarring lack of

489
00:27:32,040 –> 00:27:35,800
vowels, but Carole seemed like
a good egg.

490
00:27:35,960 –> 00:27:38,920
She was also one of the people
investigating Cambridge

491
00:27:38,920 –> 00:27:44,600
Analytica, and she got treated
like a Russian asset and a

492
00:27:44,600 –> 00:27:48,080
conspiracy theorist over it.
Which that sucks.

493
00:27:48,080 –> 00:27:52,000
But also, when you’re defending
yourself against those claims in

494
00:27:52,000 –> 00:27:53,880
the documentary, don’t do it in
front of a.

495
00:27:53,960 –> 00:27:56,840
Poster with a bunch of fucking
Russian writing on it.

496
00:27:57,840 –> 00:28:02,920
Yeah, I saw that.
I was like, girl, you’re never

497
00:28:02,920 –> 00:28:08,640
going to beat the allegations
of this poster. Yes, I know

498
00:28:08,640 –> 00:28:11,320
exactly what it says.
Crazy.

499
00:28:12,400 –> 00:28:15,880
What did we think of Christopher
Wylie, the guy with the pink

500
00:28:15,880 –> 00:28:18,320
hair?
Kind of Go ahead.

501
00:28:18,600 –> 00:28:20,240
I was going to say I didn’t
think he was in

502
00:28:20,800 –> 00:28:23,440
this documentary enough for
being a whistleblower, like he

503
00:28:24,240 –> 00:28:27,360
was barely in this documentary. I
was like, if you had such a big

504
00:28:27,360 –> 00:28:31,120
part in this, like, why didn’t
we hear more from you?

505
00:28:31,120 –> 00:28:34,600
I don’t know, he was probably
too eloquent on the subject.

506
00:28:35,240 –> 00:28:38,680
Probably could be, yeah, not
messy enough.

507
00:28:38,960 –> 00:28:41,000
He’s another.
They also kind of painted him to

508
00:28:41,000 –> 00:28:43,000
be a narc.
And I was like, OK, chill.

509
00:28:43,200 –> 00:28:48,800
Well, they let Cambridge
Analytica kind of kneecap his

510
00:28:48,800 –> 00:28:50,920
point of view in this because
they’re talking.

511
00:28:50,920 –> 00:28:54,120
I don’t remember which Cambridge
Analytica fuck face it is, but

512
00:28:54,120 –> 00:28:56,680
the guy’s like, yeah, he’s
commenting on a bunch of stuff

513
00:28:56,680 –> 00:28:58,480
that happened when he didn’t
work here.

514
00:28:58,640 –> 00:29:02,560
And also, he pitched the Trump
campaign first.

515
00:29:03,000 –> 00:29:06,520
And we just won.
And I mean, if that’s true, it’s

516
00:29:06,520 –> 00:29:09,680
kind of like, oh dude, I like
your hair, but fuck you too.

517
00:29:09,840 –> 00:29:13,920
Like, yeah, like, don’t, don’t
go pitching the Trump campaign,

518
00:29:13,920 –> 00:29:15,920
motherfucker.
Then like, is that a

519
00:29:15,920 –> 00:29:18,720
whistleblower or is that a narc
if you’re telling on yourself?

520
00:29:19,120 –> 00:29:20,960
Yeah, that’s the thing.
Like he’s a.

521
00:29:21,520 –> 00:29:23,960
Whistleblower.
But to what end?

522
00:29:24,040 –> 00:29:28,120
Like, is it just because he’s
mad he didn’t get Trump elected?

523
00:29:28,400 –> 00:29:32,320
Like, I don’t know.
But I do find it interesting

524
00:29:32,320 –> 00:29:35,560
that he describes Cambridge
Analytica as a full service

525
00:29:35,560 –> 00:29:38,520
propaganda machine.
Which Oof.

526
00:29:38,960 –> 00:29:41,080
Brutal.
How do you turn that off once

527
00:29:41,080 –> 00:29:44,040
you’ve turned it on?
Because like Cambridge Analytica

528
00:29:44,040 –> 00:29:47,280
was just a company that was
doing this, but it’s not like

529
00:29:47,280 –> 00:29:50,120
the mechanisms for doing it have
gone away.

530
00:29:50,320 –> 00:29:52,600
That data is all still.
Still out there?

531
00:29:52,920 –> 00:29:57,400
And now there’s more companies
that are doing what Cambridge

532
00:29:57,400 –> 00:30:00,480
Analytica did, and you can’t
stop it now.

533
00:30:00,720 –> 00:30:06,720
And I truly cannot express how
angry this documentary made me.

534
00:30:08,600 –> 00:30:12,040
Yeah, I would go as far to say
the Internet as a whole at this

535
00:30:12,040 –> 00:30:15,280
point is a full service
propaganda machine and that we

536
00:30:15,280 –> 00:30:18,040
need to scrub the whole thing
and start over.

537
00:30:18,520 –> 00:30:21,440
I would be all for it too, baby.
Let’s go.

538
00:30:21,440 –> 00:30:24,200
Yeah, burn it down, Start over.
Guess what?

539
00:30:24,400 –> 00:30:27,080
No addictive algorithms.
Rule 1.

540
00:30:27,840 –> 00:30:30,360
Yeah, there’s a lot of points in
history where we should have

541
00:30:30,360 –> 00:30:33,280
just stopped.
And with the Internet, it’s like

542
00:30:33,480 –> 00:30:37,880
Yahoo? There. We’re past there.
No, like Yahoo Groups is where

543
00:30:37,880 –> 00:30:42,080
we should have stopped.
Like, no social media, none of

544
00:30:42,080 –> 00:30:45,080
that shit.
Like, we, like the Internet,

545
00:30:45,240 –> 00:30:51,200
should have just united us in
that we all now could go find

546
00:30:51,200 –> 00:30:54,080
our people and fucking hang with
them.

547
00:30:54,200 –> 00:30:55,960
And that’s how it was for a long
time.

548
00:30:57,120 –> 00:31:01,120
And then it turned into, Oh no,
why don’t we all fucking hang

549
00:31:01,120 –> 00:31:03,680
out together?
And that’s when shit went

550
00:31:03,840 –> 00:31:06,920
downhill.
Like I didn’t ever need to

551
00:31:06,920 –> 00:31:09,240
interact with my old high school
friends.

552
00:31:09,680 –> 00:31:13,400
Ever again, as it turns out.
Like my life would have been

553
00:31:13,520 –> 00:31:18,040
just as rich if I did not know
that my best friend Larry is an

554
00:31:18,080 –> 00:31:21,400
avid Trump supporter now.
Like, I would love not knowing

555
00:31:21,400 –> 00:31:23,600
that, but hey.
Yeah, well, I would say the

556
00:31:23,600 –> 00:31:25,240
Internet has even gone one step
further.

557
00:31:25,240 –> 00:31:26,920
Instead of like, what if we all
hung out together?

558
00:31:26,920 –> 00:31:30,120
Now it’s just like, what if I
only showed you takes and

559
00:31:30,120 –> 00:31:33,040
opinions that you vehemently
disagree with?

560
00:31:33,160 –> 00:31:37,160
And it’s like, no, I that’s
actually the opposite of what I

561
00:31:37,160 –> 00:31:39,120
want.
I just want to hang out.

562
00:31:39,480 –> 00:31:43,200
With the people that I like,
that I agree with because we

563
00:31:43,240 –> 00:31:46,240
enjoy things and if we have a
disagreement we can talk about

564
00:31:46,240 –> 00:31:51,200
it like humans and not like be
brigaded by people who have, you

565
00:31:51,200 –> 00:31:55,600
know, stronger opinions and want
to throw words around.

566
00:31:55,920 –> 00:32:00,840
And I got y’all I really I’m
like, I feel like I’m so blind

567
00:32:00,840 –> 00:32:03,480
in this conversation, just with
how angry I am.

568
00:32:03,680 –> 00:32:06,200
I’m saying this all while
smiling, because if I don’t, I’m

569
00:32:06,200 –> 00:32:08,280
going to scream.
I just feel like the Internet

570
00:32:08,280 –> 00:32:10,720
stopped being, we’re all here
together, and

571
00:32:10,720 –> 00:32:12,480
started being, we’re all here
together,

572
00:32:12,480 –> 00:32:14,280
so let’s fight.
Yeah.

573
00:32:14,480 –> 00:32:17,400
Kind of, yeah.
I forgot they mentioned in this

574
00:32:17,720 –> 00:32:21,000
a detail I had forgotten, which
is that Cambridge Analytica was

575
00:32:21,000 –> 00:32:23,640
in part founded by Steve Bannon,
which.

576
00:32:23,640 –> 00:32:25,720
Thanks, I hate it.
That’s cool.

577
00:32:26,320 –> 00:32:27,920
Somewhat.
There’s a quote where someone

578
00:32:27,920 –> 00:32:32,120
says this is the weapon Steve
Bannon wanted to build to fight

579
00:32:32,120 –> 00:32:35,320
his culture war.
And great, I’m sure he’s given

580
00:32:35,320 –> 00:32:38,640
up on that since then.
I’m sure we have nothing to

581
00:32:38,640 –> 00:32:41,440
worry about there.
So David Carroll, he files his

582
00:32:41,440 –> 00:32:44,320
lawsuit.
Again, that doesn’t fucking go

583
00:32:44,320 –> 00:32:46,360
anywhere.
This becomes enough of the story

584
00:32:46,360 –> 00:32:49,920
that there’s a hearing about it.
And at one point Christopher

585
00:32:49,920 –> 00:32:52,400
Wylie is like, well, the person
you should be talking to is

586
00:32:52,400 –> 00:32:54,960
Brittany Kaiser.
And they’re like, who the fuck

587
00:32:55,480 –> 00:32:59,000
is Brittany Kaiser?
It’s like how it seems like she

588
00:32:59,000 –> 00:33:01,160
was just there the whole time.
Like, how did you?

589
00:33:01,640 –> 00:33:05,160
Well, I guess he came forward.
The funky hats, dude.

590
00:33:05,200 –> 00:33:07,600
She got the funky hat.
Nobody sees her.

591
00:33:09,240 –> 00:33:12,520
Yeah, I guess he came forward as
a whistleblower, so they

592
00:33:12,520 –> 00:33:13,840
probably wouldn’t have known
or cared.

593
00:33:14,160 –> 00:33:16,360
She had a funky hat.
They don’t see that.

594
00:33:16,840 –> 00:33:18,880
Exactly.
The first time they interview

595
00:33:18,880 –> 00:33:22,360
her, she’s somewhere in Thailand
and she’s like the other thing

596
00:33:22,360 –> 00:33:26,840
is maybe she ran the fuck away.
Yeah, I mean, it seems like she

597
00:33:26,840 –> 00:33:31,000
has reason to take up residence
in a foreign country as opposed

598
00:33:31,000 –> 00:33:34,280
to the United States.
I would say so.

599
00:33:34,520 –> 00:33:38,080
I did get that vibe, kind of,
’cause Brittany Kaiser, she

600
00:33:38,080 –> 00:33:40,560
played a huge role in all of
this.

601
00:33:40,760 –> 00:33:46,480
And she does come forward as
sort of a whistleblower, but

602
00:33:46,480 –> 00:33:51,160
she’s a whistleblower in the
same way the first people to

603
00:33:51,160 –> 00:33:54,760
make it out of Jonestown were
whistleblowers.

604
00:33:54,960 –> 00:33:57,320
Because if you look into that
like.

605
00:33:57,760 –> 00:34:00,320
Deborah Layton was one of the
first people to come out of

606
00:34:00,320 –> 00:34:01,960
Jonestown and be like, whoa,
what the fuck?

607
00:34:01,960 –> 00:34:03,640
You’re not going to believe
what’s happening there.

608
00:34:03,840 –> 00:34:07,080
Deborah Layton was, like, in
charge at Jonestown.

609
00:34:07,080 –> 00:34:10,560
She was like the number two.
And then she comes out and gets

610
00:34:10,560 –> 00:34:13,560
to like, build the narrative of
what was happening in there and

611
00:34:13,560 –> 00:34:16,280
be like, oh, I was a victim to
this fucking cult leader, am I

612
00:34:16,280 –> 00:34:18,400
right?
And it’s like, Nah, you built

613
00:34:18,400 –> 00:34:20,679
that with him, like you were
doing it.

614
00:34:20,679 –> 00:34:23,960
You weren’t one of the people
being held as a slave at

615
00:34:24,440 –> 00:34:26,840
Jonestown. You were the slave
master.

616
00:34:27,040 –> 00:34:30,400
And then you come out like, Oh
my God, Can you believe it?

617
00:34:30,600 –> 00:34:34,280
Brittany Kaiser.
I kind of get that sense too,

618
00:34:34,280 –> 00:34:37,440
where she’s like, oh fuck, Can
you believe what these people

619
00:34:37,440 –> 00:34:40,480
were doing?
And some of it I think it’s just

620
00:34:40,480 –> 00:34:42,960
to protect her.
And I think...

621
00:34:43,159 –> 00:34:45,800
Well, I think some of that is

622
00:34:45,800 –> 00:34:50,520
also her trying to, like
mentally and emotionally protect

623
00:34:50,520 –> 00:34:53,000
herself.
And like, I’d be interested to

624
00:34:53,000 –> 00:34:56,000
know more of, like, what the
conversations with her family

625
00:34:56,000 –> 00:34:58,320
was like.
I mean, we get a phone.

626
00:34:58,320 –> 00:35:03,120
We overhear a phone conversation
between her and her mom and her

627
00:35:03,120 –> 00:35:06,720
mom saying like, oh, you know,
talking about like, having to.

628
00:35:07,280 –> 00:35:09,200
Pay the bills or something.
And she’s like, I don’t have

629
00:35:09,200 –> 00:35:14,640
$1000 to pay all of the bills.
And I’m like, I’m fascinated to

630
00:35:14,640 –> 00:35:19,160
find out like, what this
conversation is because, like,

631
00:35:19,160 –> 00:35:23,800
to me that so clearly sounds
like she was working on all of

632
00:35:23,800 –> 00:35:26,400
these, like.
Campaigns that have put her

633
00:35:26,400 –> 00:35:29,960
mother on her, you know, on her
heels and like trying to figure

634
00:35:29,960 –> 00:35:33,000
this out and she’s like, you
know, I can, I can help you out.

635
00:35:33,000 –> 00:35:36,200
And it’s just like, it’s like,
girl, you’re part of the

636
00:35:36,200 –> 00:35:38,920
problem.
Like you did this to your mom.

637
00:35:38,920 –> 00:35:41,600
Like this is this is what
happened.

638
00:35:41,760 –> 00:35:45,560
And you were part of this and
you directly influenced this in

639
00:35:45,560 –> 00:35:48,880
hurting your own fucking family
and yourself.

640
00:35:48,960 –> 00:35:51,600
And I think that part of it is
like she’s just trying to

641
00:35:51,600 –> 00:35:54,120
mentally protect herself.
And I think that’s you know

642
00:35:54,120 –> 00:35:55,520
what?
You know, Deborah Layton did

643
00:35:55,520 –> 00:35:57,600
too.
You know, leaving these cults

644
00:35:57,600 –> 00:36:00,120
and it’s like, no, you were
in charge.

645
00:36:00,120 –> 00:36:04,040
Like, you need to, like you need
to go to therapy and talk that

646
00:36:04,040 –> 00:36:07,480
out with somebody and like, come
to terms with what you did and

647
00:36:07,480 –> 00:36:11,560
accept that that’s what you did.
I think about the, Oh my God,

648
00:36:11,560 –> 00:36:14,640
I’m not going to remember his
name, but the there’s a guy that

649
00:36:14,640 –> 00:36:16,600
left the Moonies that was
actually, like, really

650
00:36:16,600 –> 00:36:19,640
influential in the Moonies too.
And he brought in a lot of

651
00:36:19,640 –> 00:36:21,840
people from the Moonies.
To the Moonies.

652
00:36:21,840 –> 00:36:26,120
And he’s, like, now out and
speaking on it. Or, really,

653
00:36:26,120 –> 00:36:29,360
like Mike Rinder, you know, for
Scientology, like coming out and

654
00:36:29,360 –> 00:36:33,480
like, he’s like, I did all this
bad stuff and I am now paying

655
00:36:33,480 –> 00:36:36,920
the price for it.
I am also having to sit in front

656
00:36:36,920 –> 00:36:39,640
of people who are telling me
this was your fault.

657
00:36:39,640 –> 00:36:42,920
You did this to me specifically.
Like you specifically did this

658
00:36:42,920 –> 00:36:45,200
to me specifically.
And he’s having to live with

659
00:36:45,200 –> 00:36:47,200
that.
And he even like cries in

660
00:36:47,200 –> 00:36:50,200
interviews sometimes because he
is just so distraught of like

661
00:36:50,200 –> 00:36:52,440
how he hurt
so many people, and he’s

662
00:36:52,480 –> 00:36:54,880
obviously in therapy or like,
he’s not in therapy about it,

663
00:36:54,880 –> 00:36:56,240
’cause they’re all still weird
about therapy.

664
00:36:56,240 –> 00:36:58,440
But like, he’s working through
it, you know?

665
00:36:58,440 –> 00:37:00,800
And it’s like, but it’s like
Brittany still needs to work

666
00:37:00,800 –> 00:37:02,680
through these things.
I’m just like, you did this to

667
00:37:02,720 –> 00:37:05,120
people.
I didn’t see real regret from

668
00:37:05,120 –> 00:37:07,720
her.
I didn’t see any accountability

669
00:37:07,720 –> 00:37:10,160
from her.
She’s just like, she’ll be like

670
00:37:10,160 –> 00:37:12,960
sitting watching Zuckerberg and
be like, OK, yeah, Zuck or

671
00:37:12,960 –> 00:37:14,800
whatever the fuck.
And it’s like you don’t get to

672
00:37:14,800 –> 00:37:16,320
be like that.
We get to be like that.

673
00:37:16,360 –> 00:37:18,360
You don’t get to fucking be like
that.

674
00:37:18,680 –> 00:37:22,640
And also like, no tears.
She’s really just like fucking

675
00:37:23,040 –> 00:37:25,600
just.
I think that she really did all

676
00:37:25,600 –> 00:37:30,960
of this so that she could see
herself as a human again

677
00:37:31,200 –> 00:37:34,600
and like to protect, like you’re
saying her emotional well-being

678
00:37:34,760 –> 00:37:37,520
instead of just actually
emotionally taking

679
00:37:37,520 –> 00:37:41,400
responsibility for the fact that
you just fucking decided this

680
00:37:41,400 –> 00:37:43,960
would be a fun fucking thing for
you to do.

681
00:37:44,240 –> 00:37:47,800
Little Miss Campaign.
Well, not just that, but there’s

682
00:37:47,800 –> 00:37:52,480
a thing she mentions in this.
Where she points out that her

683
00:37:52,480 –> 00:37:59,360
family lost a lot of money
during the 2008, 2009 financial

684
00:37:59,480 –> 00:38:03,920
collapse and.
I can’t help but wonder if that

685
00:38:03,920 –> 00:38:09,120
was maybe the trigger that made
her go from Democrat who runs

686
00:38:09,160 –> 00:38:14,800
Obama’s Facebook page to human
rights campaigner to OK, well

687
00:38:14,800 –> 00:38:17,880
now I’m going to fucking help
Trump and the worst version of

688
00:38:17,880 –> 00:38:22,440
Republicans take over the world.
Like, I wonder if she blamed her

689
00:38:22,440 –> 00:38:28,360
family’s financial demise on
Obama and the Democrats a little

690
00:38:28,360 –> 00:38:32,680
bit and was like, there’s that
whole bootstraps mentality that

691
00:38:32,680 –> 00:38:35,240
is like, well, if your family’s
suffering, you should just take

692
00:38:35,240 –> 00:38:38,560
any job that’s given to you.
Yeah, well, I wonder if it was

693
00:38:38,560 –> 00:38:42,000
that or was it like a revenge
thing where she was like, oh, I

694
00:38:42,000 –> 00:38:44,160
saw your thing.
She’s like, they did this to my

695
00:38:44,160 –> 00:38:48,520
family.
Kind of like that’s nuts,

696
00:38:49,000 –> 00:38:54,880
because she does make a hard
fucking pivot from human rights

697
00:38:55,080 –> 00:38:56,560
to.
All right, well, now I’m going

698
00:38:56,560 –> 00:39:00,000
to join the NRA and find out how
this side

699
00:39:00,480 –> 00:39:03,680
of the world does things. Like, I
thought that was such an

700
00:39:03,680 –> 00:39:06,200
interesting part when she was
talking about that, where she

701
00:39:06,200 –> 00:39:09,160
was like, I don’t like guns and
it’s like, she’s like, I thought

702
00:39:09,160 –> 00:39:11,800
I like was interested in the
human design of people who like

703
00:39:11,800 –> 00:39:12,840
guns.
And I’m like, yeah, that’s

704
00:39:12,840 –> 00:39:15,840
fucking creepier, bitch.
Yeah, that’s a little weirder.

705
00:39:17,360 –> 00:39:20,760
Yeah, I just, I yeah, I’m
wondering, I’m with you, Adam.

706
00:39:20,760 –> 00:39:24,520
I’m wondering if this is like
misplaced blame.

707
00:39:24,920 –> 00:39:30,160
Like, you know, because I know a
lot of people blamed Obama for

708
00:39:30,160 –> 00:39:32,640
the financial collapse and was
like, no, no, no, no.

709
00:39:32,640 –> 00:39:35,040
He just took office.
Guess who set all that up?

710
00:39:35,040 –> 00:39:38,040
Like, you know, it’s like we’re
kind of going through a similar

711
00:39:38,040 –> 00:39:42,360
thing with like, you know, it’s
like Bob Iger had made like a

712
00:39:42,360 –> 00:39:47,880
bunch of not great choices in
his first time as CEO of Disney.

713
00:39:48,200 –> 00:39:49,960
And then Chapek was also not
very good.

714
00:39:49,960 –> 00:39:53,080
But he then had to deal with
what Iger

715
00:39:53,400 –> 00:39:56,680
had, like, set up for him and go
through all of that.

716
00:39:56,680 –> 00:40:00,000
And then now that Iger’s back,
some of his own chickens have

717
00:40:00,000 –> 00:40:02,280
come home to roost that he set
up that we’re going to be

718
00:40:02,280 –> 00:40:05,680
somebody else’s problem.
So like it’s, you know, put

719
00:40:05,680 –> 00:40:08,120
it’s, you know, kicking, you
know, kicking the ball down the

720
00:40:08,120 –> 00:40:09,920
field for somebody else to deal
with.

721
00:40:10,160 –> 00:40:15,640
And it’s exhausting.
But yeah, I can see.

722
00:40:15,640 –> 00:40:18,360
I could see her.
Especially being like, you know,

723
00:40:18,360 –> 00:40:24,480
I remember when I was like, you
know, 23, 24 and, like, the things

724
00:40:24,480 –> 00:40:29,520
that I thought then are a lot of
them are not what I think now.

725
00:40:30,320 –> 00:40:33,720
And a lot of it is just like, Oh
no, you actually are required to

726
00:40:33,720 –> 00:40:37,960
hold a lot of nuance in your
head and like find comfort in

727
00:40:37,960 –> 00:40:40,920
the gray area.
Whereas, like, it’s easier to

728
00:40:40,920 –> 00:40:43,160
blame somebody and be like,
they’re the reason for all my

729
00:40:43,160 –> 00:40:44,640
problems.
I mean, like, they’re not.

730
00:40:44,640 –> 00:40:45,960
There’s actually a lot of
reasons.

731
00:40:45,960 –> 00:40:50,400
But, you know, for a 22 year old
who just came off of a campaign

732
00:40:50,400 –> 00:40:53,600
and then being like, oh wow,
this president personally harmed

733
00:40:53,600 –> 00:40:57,800
me by, you know, ruining my
family’s financial situation.

734
00:40:57,800 –> 00:41:00,800
Like I could see, I could see
the logic of like, well, I’m

735
00:41:00,800 –> 00:41:05,120
gonna go work for this data
company and then like slowly

736
00:41:05,640 –> 00:41:08,720
trickle into right wing
nonsense.

737
00:41:09,080 –> 00:41:11,800
Yeah.
She talks about how they did it,

738
00:41:11,920 –> 00:41:15,640
and it’s so simple.
Like they just targeted the

739
00:41:16,080 –> 00:41:19,880
people they wanted to target and
just flooded those people’s

740
00:41:19,880 –> 00:41:25,960
pages and news feeds with anti
Hillary Clinton content.

741
00:41:26,160 –> 00:41:29,640
And they did it in a way that
made it seem like it was just

742
00:41:29,640 –> 00:41:34,120
organic and coming from like
grassroots political groups and

743
00:41:34,120 –> 00:41:37,600
shit, because you could just
make up a news outlet.

744
00:41:38,160 –> 00:41:42,080
And like, post content to
Facebook from that news outlet

745
00:41:42,080 –> 00:41:45,560
when you’re really like that
happened during the 2016

746
00:41:45,560 –> 00:41:47,560
election.
There were all these like what

747
00:41:47,560 –> 00:41:51,480
seemed like quaint little local
news outlets that were just

748
00:41:51,680 –> 00:41:55,960
propaganda and they were set up
solely to disseminate that

749
00:41:55,960 –> 00:41:58,200
propaganda.
And the difficult thing is that,

750
00:41:58,200 –> 00:42:02,200
like Americans are not educated
well enough on discerning

751
00:42:02,200 –> 00:42:06,800
propaganda and media literacy.
And so, like, we’re, you know,

752
00:42:06,800 –> 00:42:09,440
it’s like people that are are
like, no, these are fake fucking

753
00:42:09,440 –> 00:42:11,600
websites.
Like these are fake news

754
00:42:11,600 –> 00:42:14,400
sources.
And but people don’t want to

755
00:42:14,400 –> 00:42:16,600
investigate it any further.
They just see a thing.

756
00:42:16,600 –> 00:42:19,120
It confirms their bias.
They don’t go any further into

757
00:42:19,160 –> 00:42:20,520
it.
And it’s like, OK, but what is

758
00:42:20,520 –> 00:42:22,760
that?
What is that source right now?

759
00:42:23,080 –> 00:42:26,120
Like, what do you like?
Who’s actually behind that

760
00:42:26,120 –> 00:42:28,280
source?
And it’s like if you don’t dig

761
00:42:28,280 –> 00:42:32,600
deep enough into it then and
they’re they’re just getting

762
00:42:32,680 –> 00:42:36,760
more sophisticated and hiding
what a source is and who’s

763
00:42:36,760 –> 00:42:39,480
behind it, and it’s... I
literally did it myself.

764
00:42:39,480 –> 00:42:43,360
Like recently.
Like for myself, I was trying to

765
00:42:43,960 –> 00:42:46,080
get.
Verification.

766
00:42:46,280 –> 00:42:49,360
I don’t even remember on which
outlet, but I had heard that if

767
00:42:49,360 –> 00:42:52,320
you have publications written
about you that you could get

768
00:42:52,320 –> 00:42:55,560
verified that way.
So I made an AI writer.

769
00:42:55,560 –> 00:42:57,960
I had AI write the article about
me.

770
00:42:57,960 –> 00:43:02,120
I posted it to Medium as that
fake AI writer and then posted

771
00:43:02,120 –> 00:43:03,480
it.
And people were like, wow,

772
00:43:03,480 –> 00:43:07,320
that’s so cool all day.
It took me 20 minutes.

773
00:43:07,640 –> 00:43:11,840
Yeah, this is not hard stuff.
And also that’s me.

774
00:43:11,840 –> 00:43:14,440
Think about what people are
doing to you all the time.

775
00:43:14,480 –> 00:43:17,160
That’s me being silly.
That was a goof.

776
00:43:17,440 –> 00:43:20,520
Yeah, this is a company that was
spending $1,000,000 a day.

777
00:43:22,400 –> 00:43:25,880
So like, imagine the way they
can target it, how easy it was.

778
00:43:25,880 –> 00:43:29,560
It’s fucked up.
Yeah, and so Cambridge Analytica

779
00:43:29,560 –> 00:43:33,360
eventually, as this stuff starts
coming out, goes on this media

780
00:43:33,360 –> 00:43:35,960
campaign trying to make
themselves seem like.

781
00:43:36,320 –> 00:43:40,160
Less of a villain than they are.
This is the point where they

782
00:43:40,480 –> 00:43:45,040
kind of paint Chris Wylie as
like a jealous ex employee.

783
00:43:45,160 –> 00:43:48,080
And it’s like, OK, sure, but is
what he’s saying true?

784
00:43:48,240 –> 00:43:53,160
Because if so, I don’t care if
he’s a jealous ex employee.

785
00:43:53,440 –> 00:43:56,760
Sometimes jealous ex employees
say shit like this.

786
00:43:57,280 –> 00:43:58,800
And
why are we caring about

787
00:43:58,840 –> 00:44:02,800
who’s jealous here?
Yeah, like, is what he’s saying

788
00:44:02,880 –> 00:44:05,240
true?
That’s the fucking question.

789
00:44:05,400 –> 00:44:09,960
When they bring up Brittany
Kaiser, though, that same guy is

790
00:44:09,960 –> 00:44:12,320
a little less adamant about
things.

791
00:44:12,320 –> 00:44:16,160
He’s like, well, I thought she
was a friend, so I’m like, OK,

792
00:44:16,160 –> 00:44:19,600
well, everything Brittany Kaiser’s
saying is true, obviously.

793
00:44:20,200 –> 00:44:24,440
Like if that’s his reaction, I
do like the part where she finds

794
00:44:24,440 –> 00:44:28,000
all this like old.
Like her old calendar and all

795
00:44:28,000 –> 00:44:31,200
these emails, because Cambridge
Analytica was.

796
00:44:31,280 –> 00:44:35,000
Wild.
They were trying to deny that

797
00:44:35,120 –> 00:44:38,280
they used this Facebook data
after a certain point.

798
00:44:38,400 –> 00:44:42,040
And that’s one of the myths
about the Internet is people are

799
00:44:42,040 –> 00:44:45,000
like, oh, once it’s on the
Internet, it’s there forever.

800
00:44:45,520 –> 00:44:47,600
Like, motherfucker, Not if
someone stops paying their

801
00:44:47,600 –> 00:44:50,320
hosting bill. Like, shit, ask
me about Myspace.

802
00:44:50,600 –> 00:44:52,440
What?
Yeah, shit disappears from the

803
00:44:52,440 –> 00:44:56,160
Internet all the time.
And Cambridge Analytica had

804
00:44:56,160 –> 00:45:00,840
successfully eliminated any
proof that they were using this

805
00:45:00,840 –> 00:45:05,560
Facebook data from the Internet.
But then Brittany Kaiser finds

806
00:45:05,600 –> 00:45:08,600
this e-mail on her old work
computer.

807
00:45:09,040 –> 00:45:13,280
That clearly says, well, we’re
using these, this Facebook data

808
00:45:13,560 –> 00:45:18,000
for 30 million people and it’s
like got a date on it that is

809
00:45:18,200 –> 00:45:22,080
clearly well after when they
said they had deleted all this

810
00:45:22,080 –> 00:45:24,040
stuff.
So they couldn’t really refute

811
00:45:24,040 –> 00:45:27,840
what she was saying, Like she
had a laptop full of

812
00:45:27,840 –> 00:45:30,880
information, but still nothing.
Shocking.

813
00:45:30,880 –> 00:45:33,520
She’s alive, actually.
Kinda, yeah.

814
00:45:33,720 –> 00:45:35,560
Yeah.
I mean, that’s probably why she

815
00:45:35,560 –> 00:45:37,680
lives in Thailand now.
I mean, I’m assuming she lives

816
00:45:37,680 –> 00:45:40,320
in Thailand.
Yeah, seems like we don’t know

817
00:45:40,320 –> 00:45:41,840
if she does.
Assuming that.

818
00:45:42,120 –> 00:45:44,160
And so where?
She was like, I guess no going

819
00:45:44,160 –> 00:45:47,360
outside right now, I wonder.
Girl, I guess.

820
00:45:47,960 –> 00:45:49,040
Yeah.
What

821
00:45:49,600 –> 00:45:52,680
were you wondering?
I said I I was gonna say I was

822
00:45:52,680 –> 00:45:55,840
wondering if like, they have,
like, extraditions.

823
00:45:56,040 –> 00:45:57,960
Is that the word?
Oh, yeah.

824
00:45:57,960 –> 00:45:59,320
I don’t know.
But she’s not.

825
00:45:59,640 –> 00:46:03,240
Yeah, but I don’t think she’s a
criminal to the US.

826
00:46:03,240 –> 00:46:07,880
She’s just somebody that would
probably have a hit taken out on

827
00:46:07,880 –> 00:46:10,360
them.
Hide somebody else.

828
00:46:10,680 –> 00:46:13,800
There is that thing that comes
up later about her meeting with

829
00:46:14,080 –> 00:46:18,200
Julian Assange, about the 2016
election.

830
00:46:18,200 –> 00:46:20,880
Or was it might have been the
2020 election, I don’t remember.

831
00:46:20,880 –> 00:46:23,800
But she was like, oh, we didn’t
talk about the election.

832
00:46:23,840 –> 00:46:25,600
It’s like OK.
Sure, girl.

833
00:46:25,960 –> 00:46:28,920
Sure
you didn’t. Like, in that case, I

834
00:46:28,920 –> 00:46:32,200
could see the US being like,
yeah, we’d like to have a word,

835
00:46:32,360 –> 00:46:36,920
but who knows the the point
where I really started to kind

836
00:46:36,920 –> 00:46:40,040
of turn.
On Brittany Kaiser, and I don’t

837
00:46:40,040 –> 00:46:44,800
know if she was there for this
part of what Cambridge Analytica

838
00:46:44,800 –> 00:46:48,720
did, but they didn’t really
start with Ted Cruz like Ted

839
00:46:48,720 –> 00:46:53,080
Cruz was their selling point,
but where they started was Third

840
00:46:53,080 –> 00:46:55,600
World
countries and, like, Eastern

841
00:46:55,680 –> 00:46:59,840
European countries like
Lithuania, Romania, Kenya,

842
00:46:59,880 –> 00:47:04,680
Ghana, Malaysia, they interfered
in all of those elections way

843
00:47:04,680 –> 00:47:09,360
before they decided to upgrade
and start fucking with the US

844
00:47:09,400 –> 00:47:11,920
and the UK.
Also, we haven’t mentioned, but

845
00:47:12,120 –> 00:47:16,720
Cambridge Analytica did the same
thing with Brexit and they were

846
00:47:16,720 –> 00:47:19,320
instrumental in making Brexit
happen.

847
00:47:19,480 –> 00:47:24,680
But before that they were in all
of these tiny countries just

848
00:47:24,680 –> 00:47:29,080
helping probably authoritarian
regimes take over.

849
00:47:29,400 –> 00:47:32,280
That’s one of the things that
struck me.

850
00:47:32,280 –> 00:47:35,600
There’s a Ted talk at the end of
this where Carol, the

851
00:47:35,600 –> 00:47:38,480
journalist, is talking to
Silicon Valley.

852
00:47:38,880 –> 00:47:40,760
And she’s like, is this really
what you want?

853
00:47:40,960 –> 00:47:44,080
Do you want history to remember
you as the handmaidens of

854
00:47:44,080 –> 00:47:47,600
authoritarianism?
And it’s like, I think the scary

855
00:47:47,600 –> 00:47:52,080
answer is yes, like, I don’t.
I don’t think these companies

856
00:47:52,080 –> 00:47:56,560
are opposed to authoritarianism,
especially when you consider

857
00:47:56,560 –> 00:48:00,840
that if they are the ones who
make it happen, then they’re

858
00:48:00,840 –> 00:48:04,240
probably going to be set up
pretty well in terms of life

859
00:48:04,240 –> 00:48:07,760
under that authoritarian regime
like Google has

860
00:48:08,480 –> 00:48:12,640
had no problems kowtowing to
China when they need to.

861
00:48:12,920 –> 00:48:15,200
I don’t think any of these
companies would have a single

862
00:48:15,200 –> 00:48:18,760
fucking problem with being
remembered as the ones who

863
00:48:18,760 –> 00:48:21,040
ushered in these authoritarian
regimes.

864
00:48:21,040 –> 00:48:22,640
They’re like, yeah, baby, that’s
what we do.

865
00:48:22,920 –> 00:48:24,440
As long as they get paid, they
don’t care.

866
00:48:24,720 –> 00:48:28,880
Like you know, as long as we
live under a capitalistic

867
00:48:28,880 –> 00:48:32,840
society and money is king, they
don’t care.

868
00:48:32,920 –> 00:48:35,760
And that’s kind of what the
government cares about, too,

869
00:48:35,840 –> 00:48:38,760
like the American government.
This is what I mean this.

870
00:48:39,000 –> 00:48:44,040
I’m so angry, like I just, I
know we’re heading toward it.

871
00:48:44,040 –> 00:48:47,240
So I’m just going to go to it.
They talk about working on this

872
00:48:47,240 –> 00:48:51,440
campaign in Trinidad and Tobago
of this do.

873
00:48:51,640 –> 00:48:55,760
The Do So campaign, Do so
campaign. Which is, they

874
00:48:55,760 –> 00:48:59,000
wanted to put in because there’s
apparently like a, you know, a

875
00:48:59,000 –> 00:49:03,000
big Afro Latino community and
then like a very large Indian

876
00:49:03,000 –> 00:49:06,160
community, which I didn’t know
about Trinidad and Tobago so.

877
00:49:06,320 –> 00:49:07,800
Yeah, fun.
I didn’t either fun fact to

878
00:49:07,800 –> 00:49:12,920
learn but that they wanted to
put in an Indian leader and they

879
00:49:12,920 –> 00:49:17,680
were working for the the Indian
community there and so they made

880
00:49:17,680 –> 00:49:22,840
this whole campaign called Do So
by like you know I forget what

881
00:49:22,840 –> 00:49:24,920
the like what the two.
Well, they were.

882
00:49:25,120 –> 00:49:30,040
What they basically did was they
encouraged the Do So campaign,

883
00:49:30,280 –> 00:49:35,080
encouraged young people to not
vote because like on the grounds

884
00:49:35,080 –> 00:49:36,920
that they’re not getting the
choices they want, the

885
00:49:36,920 –> 00:49:38,960
government’s not working for
you, whatever.

886
00:49:39,120 –> 00:49:43,920
But the whole point was to get
people to rally around the Do So

887
00:49:43,920 –> 00:49:48,560
campaign, knowing that the
people on the black side would

888
00:49:48,560 –> 00:49:51,400
just not vote and the people on
the Indian side.

889
00:49:51,760 –> 00:49:54,240
They might get on social media
and be like, yeah, I’m not

890
00:49:54,240 –> 00:49:56,720
voting.
But they knew once their parents

891
00:49:56,720 –> 00:49:59,360
were like, no, you’re voting
like you’re coming with me and

892
00:49:59,360 –> 00:50:01,280
you’re voting.
That side was still going to

893
00:50:01,280 –> 00:50:03,760
vote.
And like, they knew all of that

894
00:50:03,760 –> 00:50:05,480
ahead of time and it fucking
worked.

895
00:50:05,640 –> 00:50:08,880
Yeah, it’s bad times.
It’s bad times, man.

896
00:50:08,880 –> 00:50:11,240
Well, especially with the Do So
campaign, because they even talk

897
00:50:11,240 –> 00:50:14,760
about how because there’s like
recording of them in this

898
00:50:14,760 –> 00:50:16,880
meeting.
Like talking about how, like,

899
00:50:17,320 –> 00:50:21,200
you know, they realize like, oh,
well, you know, this group of

900
00:50:21,200 –> 00:50:24,600
people like what’s the easiest
thing is to not vote.

901
00:50:24,640 –> 00:50:28,000
And so it’s actually easier to
convince them to not do anything

902
00:50:28,160 –> 00:50:31,760
than to do anything.
So what if we just convinced

903
00:50:31,760 –> 00:50:34,160
them to not vote?
And so we did that and we

904
00:50:34,160 –> 00:50:36,680
started this whole campaign.
And I mean, The thing is, is

905
00:50:36,680 –> 00:50:39,400
that like, we, we’ve seen that
work here.

906
00:50:39,400 –> 00:50:42,560
Like, that has absolutely
worked, you know, and it’s

907
00:50:42,560 –> 00:50:46,720
worked over many years of people
being convinced that they’re

908
00:50:46,720 –> 00:50:50,680
like, oh, well, if I don’t vote,
like, it’s whatever, my vote

909
00:50:50,680 –> 00:50:54,640
doesn’t matter anyway.
And having to do real campaigns

910
00:50:54,640 –> 00:50:59,280
to be like your votes matter
especially in local elections.

911
00:50:59,280 –> 00:51:04,200
Like they’re trying to convince
younger voters who tend to vote

912
00:51:04,200 –> 00:51:11,720
Democrat and vote liberally to
not vote because it it locks in

913
00:51:12,160 –> 00:51:16,240
authoritarian and oppressive and
regressive regimes.

914
00:51:16,600 –> 00:51:19,760
And like if you tell them to not
vote they’re not going to go

915
00:51:19,760 –> 00:51:20,760
vote.
So you have to.

916
00:51:20,760 –> 00:51:25,400
So making So it’s like the fact
that like these these regimes

917
00:51:25,400 –> 00:51:29,160
are also so anti like making
voting easier and that they

918
00:51:29,160 –> 00:51:32,840
hated voting from home.
They hated vote by mail.

919
00:51:32,840 –> 00:51:34,920
They still hate it because
they’re like, well, we don’t

920
00:51:34,920 –> 00:51:37,840
want to make this easier for
people because the whole point

921
00:51:38,000 –> 00:51:42,400
is to make it hard for people so
that they give up because they

922
00:51:42,400 –> 00:51:44,360
think that it’s too much and
their vote doesn’t matter.

923
00:51:44,360 –> 00:51:48,000
And we can’t have them know that
it actually does matter and that

924
00:51:48,000 –> 00:51:50,560
it actually is pretty easy to
vote and we could make it.

925
00:51:50,640 –> 00:51:52,720
Easier.
Literally every government

926
00:51:52,720 –> 00:51:55,800
program makes it as hard as
possible to do it so that you

927
00:51:55,800 –> 00:51:58,720
won’t and you won’t reap the
benefits of the government that

928
00:51:58,720 –> 00:52:00,800
are built in for you.
It’s the same with like

929
00:52:00,800 –> 00:52:02,880
unemployment.
It’s the same with like going to

930
00:52:02,880 –> 00:52:05,560
the DMV.
It all encourages you having

931
00:52:05,560 –> 00:52:10,760
then a criminal behavior or like
other behavior essentially that

932
00:52:10,760 –> 00:52:13,800
then they can hold against you
and then charge you for because

933
00:52:13,800 –> 00:52:17,560
it’s all just beating the
capitalism you suck energy.

934
00:52:18,200 –> 00:52:22,000
Yeah, at least we make it super
easy to vote here in the United

935
00:52:22,000 –> 00:52:25,800
States.
That was one of the things I

936
00:52:25,800 –> 00:52:28,880
hated about the end of this
documentary.

937
00:52:29,040 –> 00:52:33,560
I hated when Carol the
journalist started going into

938
00:52:33,560 –> 00:52:37,040
all the stuff about how, listen,
this is what Russia does, this

939
00:52:37,040 –> 00:52:41,640
is Russia fucking being Russia
and just meddling in our

940
00:52:41,640 –> 00:52:43,360
affairs.
And it’s like.

941
00:52:43,840 –> 00:52:48,680
If you think the CIA didn’t see
this and go, oh shit, I know

942
00:52:48,680 –> 00:52:52,120
what we’re working on now like
you’re out of your fucking mind.

943
00:52:52,160 –> 00:52:57,600
Like we do this, like for all
the talk of Oh well, Russia

944
00:52:57,600 –> 00:53:01,880
bought all these Facebook pages.
It’s like motherfucker, the UK

945
00:53:01,880 –> 00:53:07,840
did this, like our ally is the
like it was a company in the UK

946
00:53:08,000 –> 00:53:10,920
who did this.
And we’re like Russia.

947
00:53:10,960 –> 00:53:12,400
It’s like, no, we do that shit
too.

948
00:53:12,640 –> 00:53:15,480
Yeah, anybody who can do this
will do this.

949
00:53:15,520 –> 00:53:17,200
Yeah.
It’s ultimate control over

950
00:53:17,200 –> 00:53:20,640
millions of people.
Anyone who can do this will do

951
00:53:20,640 –> 00:53:23,440
this, Yeah.
The call is coming from inside

952
00:53:23,440 –> 00:53:26,600
the house.
Yeah, we need to start over.

953
00:53:26,600 –> 00:53:30,560
But Internet bad.
Yeah, it’s, I mean it’s like

954
00:53:30,600 –> 00:53:33,000
it’s like this started in the
UK.

955
00:53:33,000 –> 00:53:36,560
It started like, you know, it’s
like they were using it here and

956
00:53:36,560 –> 00:53:40,960
it’s like and Russia just jumped
on the bandwagon like it’s you

957
00:53:40,960 –> 00:53:43,160
can’t say that they’re the
drivers of this.

958
00:53:43,160 –> 00:53:47,480
It’s really easy to paint Russia
as like the big bad and that

959
00:53:47,480 –> 00:53:51,240
they’re the reason and that
they’re the only bad actor in

960
00:53:51,240 –> 00:53:55,440
this being like, I’m pretty sure
Brittany Kaiser’s an American.

961
00:53:55,440 –> 00:53:57,760
I’m pretty sure that Christopher
Wylie’s an American.

962
00:53:57,960 –> 00:54:00,240
I’m pretty sure that there were
a lot of Americans that worked

963
00:54:00,240 –> 00:54:01,520
on this.
I’m pretty sure there’s a lot of

964
00:54:01,520 –> 00:54:04,440
Americans that hired Cambridge
Analytica.

965
00:54:04,640 –> 00:54:07,200
Like, it’s not.
It’s like that’s it’s not all

966
00:54:07,200 –> 00:54:09,560
Russia, my friend.
It’s like anybody who wants to

967
00:54:09,560 –> 00:54:12,400
take control knows who to
contact, who’s going to put them

968
00:54:12,400 –> 00:54:15,080
in that place?
You’re telling me the people

969
00:54:15,080 –> 00:54:18,360
that did colonizing colonized the
Internet?

970
00:54:18,360 –> 00:54:22,760
What?
That seems so weird of them.

971
00:54:23,120 –> 00:54:27,480
So out of character, so out of
character of the UK.

972
00:54:28,160 –> 00:54:31,960
So I think that’s, I think
that’s our episode, right?

973
00:54:32,440 –> 00:54:34,240
People should watch this.
I think it it’s a very

974
00:54:34,240 –> 00:54:37,880
interesting documentary.
Form your own opinions about

975
00:54:37,880 –> 00:54:39,920
Brittany Kaiser.
I’m not crazy about her.

976
00:54:39,920 –> 00:54:41,520
I don’t trust her.
Oh yeah.

977
00:54:41,520 –> 00:54:45,200
I mean, I never, I never trust
somebody that accessorizes that

978
00:54:45,200 –> 00:54:49,640
poorly.
Or who works for the government?

979
00:54:50,120 –> 00:54:53,560
No, that’s allowed, it’s the
accessories.

980
00:54:55,320 –> 00:55:00,120
Yeah, I it’s I it’s just like
one of these documentaries that

981
00:55:00,120 –> 00:55:04,440
like I’ve really like stepped
away from being because I used

982
00:55:04,440 –> 00:55:08,800
to be really politically active
when I was younger and as the

983
00:55:08,800 –> 00:55:12,960
lines got further and further
divided and I felt like I was

984
00:55:12,960 –> 00:55:17,040
seeing more propaganda on my
timeline and from my friends.

985
00:55:17,040 –> 00:55:20,960
And that’s still kind of
happening now with certain

986
00:55:20,960 –> 00:55:24,160
things.
And I am just getting

987
00:55:24,160 –> 00:55:29,400
increasingly worried about like
where even my friends are

988
00:55:29,400 –> 00:55:31,080
getting their news sources.
I’m like, you guys are better

989
00:55:31,080 –> 00:55:32,400
than this, like what is
happening?

990
00:55:32,400 –> 00:55:35,240
And so it’s like I’ve just had
to step away from a lot of it

991
00:55:35,240 –> 00:55:39,200
because it’s like, you know,
people the, the sources that

992
00:55:39,200 –> 00:55:42,360
they’re trusting are also being
infiltrated.

993
00:55:42,400 –> 00:55:45,040
And so it’s like because they
know that they know that people

994
00:55:45,040 –> 00:55:48,280
trust them.
And so it’s like you have to

995
00:55:48,360 –> 00:55:51,920
still continue to be vigilant.
And so it’s like and I’m just

996
00:55:51,920 –> 00:55:56,520
seeing the the divides happening
and it’s just and it is just so

997
00:55:56,720 –> 00:56:00,240
it is so scary.
And especially like right now in

998
00:56:00,240 –> 00:56:04,600
this moment as a Jewish trans
person, it is very difficult to

999
00:56:04,600 –> 00:56:08,680
exist in this world.
And so it is.

1000
00:56:08,840 –> 00:56:12,000
It’s just seeing a lot of stuff
happening.

1001
00:56:12,000 –> 00:56:15,960
And I’m just like this, like
some of these opinions did not

1002
00:56:15,960 –> 00:56:19,920
exist even even five years ago.
Like when I started coming out

1003
00:56:19,920 –> 00:56:24,920
as trans in 2015, a lot of these
opinions did not exist.

1004
00:56:24,960 –> 00:56:29,000
And to me, it’s like people
really didn’t give a shit about

1005
00:56:29,000 –> 00:56:30,360
trans people.
Like they cared.

1006
00:56:30,360 –> 00:56:35,400
But like, not to this level.
And so, like, to me it’s just

1007
00:56:35,400 –> 00:56:37,560
like, well, yeah, it’s it’s
propaganda.

1008
00:56:37,640 –> 00:56:40,640
It’s propaganda that’s that’s
helping push this along.

1009
00:56:40,640 –> 00:56:44,280
And so it’s like, it’s just,
it’s just getting increasingly

1010
00:56:44,280 –> 00:56:51,120
scary and like and and shit like
Cambridge Analytica driving this

1011
00:56:51,200 –> 00:56:55,680
and now there’s other actors in
there driving this is just.

1012
00:56:55,880 –> 00:56:59,920
I mean, it is, it is just, it’s
just made being online and being

1013
00:57:00,200 –> 00:57:07,400
a caring and empathetic person,
really difficult because you

1014
00:57:07,400 –> 00:57:10,520
want to care about stuff, but
you don’t want to fall victim to

1015
00:57:10,520 –> 00:57:13,320
propaganda and you don’t want
to.

1016
00:57:14,080 –> 00:57:17,920
I don’t know have this consume
your entire brain because the

1017
00:57:17,920 –> 00:57:21,200
algorithm is feeding it and
making you upset and anxious.

1018
00:57:21,440 –> 00:57:25,240
And it’s it’s hard because it’s
the main reason I want to like

1019
00:57:25,320 –> 00:57:27,480
step away from social media and
I just want to like delete all

1020
00:57:27,480 –> 00:57:30,760
my accounts.
But I also will miss like having

1021
00:57:30,760 –> 00:57:34,240
jokes with friends and like you
know seeing what my friends are

1022
00:57:34,240 –> 00:57:38,600
up to in their day-to-day stuff.
And it’s it’s I think that

1023
00:57:38,600 –> 00:57:40,760
that’s part of the problem is
that people don’t want to give

1024
00:57:40,760 –> 00:57:44,400
that up and so they’re stuck
being fed propaganda in between,

1025
00:57:44,760 –> 00:57:47,640
you know, their friends,
vacation posts and posts of

1026
00:57:47,640 –> 00:57:50,880
their kids and families and pets
and.

1027
00:57:51,360 –> 00:57:53,520
You got to get on, you got to
get on Blue Sky.

1028
00:57:53,520 –> 00:57:57,000
There’s nothing I am on.
I am on Blue Sky.

1029
00:57:57,000 –> 00:57:59,880
There is nothing on my Blue Sky
because I don’t understand how

1030
00:57:59,920 –> 00:58:01,600
it works.
So if you could show me.

1031
00:58:01,920 –> 00:58:04,560
It’s like Twitter, but with
fewer Nazis.

1032
00:58:05,880 –> 00:58:10,480
I have to be online.
It is my job to be online and it

1033
00:58:10,480 –> 00:58:14,080
sucks all day, every day.
But I think that everyone could

1034
00:58:14,240 –> 00:58:19,960
benefit from my rule of if you
see something that is trying to

1035
00:58:19,960 –> 00:58:23,520
get you to take a side that is
not the side of human rights,

1036
00:58:23,880 –> 00:58:25,600
maybe look at what you’re
reading again.

1037
00:58:25,840 –> 00:58:29,720
Yeah, that is a solid rule.
And I don’t know, it just seems

1038
00:58:31,080 –> 00:58:35,480
like the more we create discourse
that is against each other, the

1039
00:58:35,480 –> 00:58:38,960
farther we are away from taking
down the powers that control us.

1040
00:58:39,360 –> 00:58:43,480
Yeah, the blue versus red thing
was such a brilliant move on

1041
00:58:43,480 –> 00:58:49,400
this government’s part.
Like we are forever doomed to

1042
00:58:50,360 –> 00:58:54,280
this existence where no matter
what the government suggests,

1043
00:58:54,440 –> 00:58:57,320
half the country’s going to be
for it and the other half is

1044
00:58:57,320 –> 00:59:00,680
going to be against it.
Even if it’s a thing that hurts

1045
00:59:00,680 –> 00:59:04,120
us all, Even if it’s a thing
that hurts people, you know,

1046
00:59:04,360 –> 00:59:07,520
even if it’s a thing that
fucking would help us all, it’s

1047
00:59:07,560 –> 00:59:11,320
always going to be that thing
where, well, that’s that side’s

1048
00:59:11,320 –> 00:59:14,960
idea, so it’s bad.
The thing I always reference is

1049
00:59:14,960 –> 00:59:19,680
national Internet, like every
administration since like Bush.

1050
00:59:20,240 –> 00:59:23,200
Has been pushing the idea that
hey, we should have like, just

1051
00:59:23,200 –> 00:59:26,280
free Internet for everybody.
Like everyone should have the

1052
00:59:26,280 –> 00:59:29,840
same access to this technology.
Because there are still a

1053
00:59:29,840 –> 00:59:33,080
shockingly high number of people
in this country who just have no

1054
00:59:33,080 –> 00:59:36,800
access to the Internet.
But every time it gets proposed,

1055
00:59:37,000 –> 00:59:40,400
the other side is like, I don’t
want this authoritarian fuck to

1056
00:59:40,400 –> 00:59:44,480
be the one who does it.
And like it happened with Obama

1057
00:59:44,480 –> 00:59:46,760
he was like, hey, free Internet,
what do you think?

1058
00:59:46,760 –> 00:59:49,760
And Republicans were like, fuck
you, you socialist scum.

1059
00:59:50,040 –> 00:59:52,240
And.
Then Trump pushed the same idea

1060
00:59:52,240 –> 00:59:55,560
and people on the left were
like, Oh no, I’m not letting you

1061
00:59:55,560 –> 00:59:59,720
Nazi fuck spy on me like that.
And it’s like, God damn it, we

1062
00:59:59,720 –> 01:00:01,480
need national Internet.
Stop.

1063
01:00:01,480 –> 01:00:05,160
It yeah, it needs to be a
utility like, you know, like

1064
01:00:05,160 –> 01:00:09,520
anything else, because of just
how connected our world is now

1065
01:00:09,520 –> 01:00:13,240
and how desperately you need the
Internet for everything.

1066
01:00:13,480 –> 01:00:17,920
But it’s yeah it’s and and then
but that also goes into

1067
01:00:18,080 –> 01:00:25,320
lobbyists and that goes into you
know once again these companies

1068
01:00:25,320 –> 01:00:30,160
and capitalism feeding our
representatives too and also

1069
01:00:30,160 –> 01:00:35,160
starting their own like
propaganda campaigns of like Oh

1070
01:00:35,160 –> 01:00:42,080
no, you don’t want nationalized
Internet because we, Spectrum, are a

1071
01:00:42,080 –> 01:00:46,000
piece of shit connection.
We want you to keep paying

1072
01:00:46,000 –> 01:00:52,120
$80 a month for our terrible
Internet because we need you and

1073
01:00:52,120 –> 01:00:56,640
we need you to make us exist and
make us rich even though we’re

1074
01:00:56,640 –> 01:00:58,240
not going to make your service
any better.

1075
01:00:59,040 –> 01:01:03,560
Like I I don’t think there is a
government version of the

1076
01:01:03,560 –> 01:01:07,440
Internet that would be worse
than the Spectrum Internet I

1077
01:01:07,440 –> 01:01:08,400
have now.
It’s

1078
01:01:08,840 –> 01:01:11,280
so bad.
It’s the only available thing in

1079
01:01:11,280 –> 01:01:12,880
my neighborhood.
Same.

1080
01:01:13,080 –> 01:01:15,080
And that’s the thing.
And that’s monopolization.

1081
01:01:15,080 –> 01:01:20,520
That’s like, it’s like there’s
so much connected to all of

1082
01:01:20,520 –> 01:01:24,640
these policy decisions.
And and instead of like really

1083
01:01:24,640 –> 01:01:27,360
peeling back the curtain and
being like, no, it’s companies

1084
01:01:27,360 –> 01:01:32,000
that are that are invested in
you and invested in your money

1085
01:01:32,000 –> 01:01:37,080
and what they want from you and
they’re helping this.

1086
01:01:37,080 –> 01:01:39,920
And then you have people coming
in of like, well, I agree with

1087
01:01:39,920 –> 01:01:41,480
that and they’re going to make
me rich.

1088
01:01:41,480 –> 01:01:44,120
So I’m going to make these
propaganda campaigns to make

1089
01:01:44,120 –> 01:01:47,560
sure that that stays in power.
And so we’re also going to help

1090
01:01:47,560 –> 01:01:51,120
these people be elected that
will also put that in power.

1091
01:01:51,120 –> 01:01:56,480
And it’s just, it is, it is just
so I have said for literally

1092
01:01:56,880 –> 01:01:59,400
years now, I’ve been saying it
since I was in high school that

1093
01:01:59,400 –> 01:02:04,880
I fucking hate this country for
the fact that it just refuses to

1094
01:02:05,040 –> 01:02:08,880
protect its citizens from forces
like this and that.

1095
01:02:09,160 –> 01:02:12,480
Like, Oh yeah, there just there
just doesn’t seem to be a path

1096
01:02:12,680 –> 01:02:16,480
that it ever will.
And it and it is, and it is so

1097
01:02:16,480 –> 01:02:20,600
disheartening and it is so
difficult to grapple with that

1098
01:02:20,720 –> 01:02:25,280
and like, but also like I can’t
really go anywhere else and so

1099
01:02:25,280 –> 01:02:26,960
it’s like.
We can’t afford to leave.

1100
01:02:26,960 –> 01:02:30,600
We’re trapped here, baby.
Also, we should eat them.

1101
01:02:30,600 –> 01:02:33,760
We should eat them all.
Yes, let’s eat them.

1102
01:02:34,400 –> 01:02:36,840
Let’s eat them.
All right.

1103
01:02:36,840 –> 01:02:39,120
I think that’s our episode.
Thank you both.

1104
01:02:40,040 –> 01:02:41,400
I think this episode is sad
enough.

1105
01:02:42,480 –> 01:02:44,760
We did it.
We made this documentary.

1106
01:02:44,840 –> 01:02:48,040
Government officials.
We made this documentary even

1107
01:02:48,040 –> 01:02:52,760
sadder than it is.
Jack, Jen, Thank you both for

1108
01:02:52,760 –> 01:02:55,640
doing the pod.
This feels like an appropriate

1109
01:02:55,640 –> 01:02:58,200
time for plugs.
I got to show November 10th at

1110
01:02:58,200 –> 01:03:02,200
the Sardine in San Pedro.
You can come out and see me tell

1111
01:03:02,200 –> 01:03:04,520
jokes.
And I’m also doing Samantha

1112
01:03:04,520 –> 01:03:08,920
Jane’s show Comedy Go Go at the
El Cid November 17th.

1113
01:03:08,920 –> 01:03:11,520
The El Cid. Nope.
I fucked that up last week too.

1114
01:03:12,160 –> 01:03:18,600
Yeah, that El already means ‘the.’
It’s not ‘the the.’ And my, my sub

1115
01:03:18,600 –> 01:03:20,720
stack, adamtoddbrown.substack.com.

1116
01:03:20,880 –> 01:03:25,400
I just put a new video up where
I’m opening shit, opened a

1117
01:03:25,400 –> 01:03:29,240
sealed copy of Fleetwood Mac’s
1979 album Tusk.

1118
01:03:30,440 –> 01:03:33,880
Unboxing, and I ate a bowl of
Carmela Creeper cereal.

1119
01:03:33,880 –> 01:03:38,400
Spoiler it fucking sucks.
It is a It is a blight on the

1120
01:03:38,400 –> 01:03:42,240
Monster cereal franchise.
Embarrassing, right?

1121
01:03:42,240 –> 01:03:44,800
Yikes.
What else do we got to plug?

1122
01:03:44,800 –> 01:03:48,560
Jack, what do you got?
I don’t have anything, but you

1123
01:03:48,560 –> 01:03:51,560
can follow me on social media,
the hellscape that we just

1124
01:03:51,560 –> 01:03:56,480
talked about.
I mostly post cat photos and I

1125
01:03:56,480 –> 01:04:00,600
just recently posted a video of
me going rock climbing to

1126
01:04:01,120 –> 01:04:04,360
lighten the mood.
I guess you can follow me on

1127
01:04:04,400 –> 01:04:07,080
Instagram and Twitter.
Jack loves TV.

1128
01:04:07,400 –> 01:04:10,600
Jen, what do you got?
Who, me?

1129
01:04:10,600 –> 01:04:13,840
Yeah, I got a show.
I’m gonna be on a live stream

1130
01:04:13,840 –> 01:04:20,160
show.
On Twitch on November 9th and it

1131
01:04:20,160 –> 01:04:21,920
I will post more about it
because I don’t remember the

1132
01:04:21,920 –> 01:04:25,480
name of it.
I am on Daddy’s Favorite at the

1133
01:04:25,480 –> 01:04:31,240
Goldfish in LA Thursday the 16th
of November and please subscribe

1134
01:04:31,240 –> 01:04:32,800
to all my stuff.
Follow me at Meet Jen

1135
01:04:32,800 –> 01:04:34,680
everywhere.
I said I was quitting only fans

1136
01:04:34,680 –> 01:04:37,440
but I fucking lied.
So if you want to see me naked,

1137
01:04:37,440 –> 01:04:40,520
fucking do that.
She lied.

1138
01:04:40,800 –> 01:04:42,920
All right, let’s get the fuck
out of here.

1139
01:04:43,080 –> 01:04:44,120
Jack.
Say goodbye.

1140
01:04:44,440 –> 01:04:46,720
Bye.
Jen, say goodbye.

1141
01:04:46,920 –> 01:04:48,960
Bye, bye.
Goodbye everybody.

1142
01:04:48,960 –> 01:04:57,480
We love you, people of Earth.
Your planet just has to be

1143
01:04:57,480 –> 01:04:58,160
destroyed.
Your planet.

1144
01:05:09,080 –> 01:05:09,680
Has to be destroyed.
