[image credit: kc green | Gunshow Comic]
This week, it’s all Mighty Mouth, all the time. In other words, no guest, just yours truly.
I’m asking you to settle in, grab a cup of coffee - or maybe a big ol’ slug of bourbon for this one - and listen as I spin a tale of lies, damned lies, statistics, and treachery, with the main bad actor of the piece being that thing that over two billion humans across the globe have become addicted to: Facebook.
Why is a snarky healthcare podcast talking about Facebook? Welp, because the Book of Face has become the place where people facing health challenges go to find peer support, resources, research opportunities, and community while they face everything from cancer to neurological conditions to medical error to joint replacement. By the way, I’m not linking to any groups in the show notes. You’ll figure out why as I go along here.
Got your coffee? Your bourbon? Your coffee with bourbon in it? OK, here we go.
First, have you heard the phrase “Cambridge Analytica”? In case you’re a visitor from another planet, or were holed up in a cave for the last few years, Cambridge Analytica was a consulting firm based in London whose business model was “us[ing] data to change audience behavior” - which in plain English means using people’s online behavior as weaponized data.
Social engineering is defined as “the use of deception to manipulate individuals into divulging confidential or personal information.” In Cambridge Analytica’s case, they used a Facebook quiz called “This Is Your Digital Life” to hijack the data of everyone who took the quiz - along with the data of all their Facebook friends, who had NOT given consent to having their data jacked by that Digital Life quiz. 87 million Facebook users’ data got slurped up in the process.
Using this massive data slurp, Cambridge Analytica worked to shift an American presidential election’s results in 2016 - they warmed up on the 2014 midterms, influencing the outcome of a couple of Senate races - and also worked to influence the outcome of the Brexit referendum in the UK.
All of this data slurping was kicked off by an app created by a Cambridge University psychology researcher, Aleksandr Kogan, who was also a Facebook consultant until the lid blew off this story in 2018.
So: Facebook, social engineering, data slurping. Those are the three issues on deck here, and I’ll be working to connect each of them to the risk to people’s health, safety, and human rights.
Let’s tackle data slurping first. Data is what drives most of the global economy. Since tech served up the opportunity to slice and dice human behavior and activity into stuff called “predictive analytics” and “business intelligence,” everyone from General Motors to your local insurance broker to major philanthropic foundations to the big hospital in your city has been creating, buying, and selling information - data sets - on who’s doing what, where, and what happens as a result.
The goal is knowing stuff. Ideally, knowing stuff that the one who “knows” can turn into an economic opportunity. This drive to “know stuff” is not a bad thing. We all like to know stuff. It’s why education is so dang popular! But when a big enterprise wants to know stuff, and builds giant vacuums to suck up the most granular and intimate details of people’s lives … does that sound creepy to you? I mean, what does YOUR porn viewing history look like? Google knows - they have that data. Unless you’ve turned off all tracking in your Google settings, which you can do, but most people have not. And what’s Google doing with all that stuff they know about you? They’re selling access to it - to any business looking to target ads and search results at you.
Even more creep-factor on this data slurping/know-stuff slippery slope shows up in how closely we can all be tracked, physically, by the devices we use to go about our digital lives. There was a New York Times piece in December headlined “Your Apps Know Where You Were Last Night, and They’re Not Keeping It Secret” that showed, in graphic detail, just how closely monitored we all are by the tech we use every day. One example included someone’s visit to a Planned Parenthood clinic, where they were for two hours, and then went home. Another was the ad campaigns that target phones in hospital emergency rooms with ads for personal injury attorneys.
Your personal alarm bells still not ringing?
How about this one - “Health Insurers Are Vacuuming Up Details About You — And It Could Raise Your Rates”? Here’s the lede on that story, verbatim: “Without any public scrutiny, insurers and data brokers are predicting your health costs based on data about things like race, marital status, how much TV you watch, whether you pay your bills on time or even buy plus-size clothing.”
Data slurping means that, in service of knowing stuff about you, pretty much any jackass on the internet with a few bucks can suck in all sorts of highly personal information about you - porn history and reddit comments, clothing sizes and restaurant preferences, doctor’s visits and time spent in saloons. And if that jackass on the internet is a big corporation with a massive budget for “business intelligence”? They’ll be buying up information on millions of us at a time, on the daily.
There’s even a whole industry sector devoted to managing this data marketplace, and getting its cut - the data brokering industry. Check out a piece on VICE’s Motherboard, “What Are 'Data Brokers,' and Why Are They Scooping Up Information About You?” that talks about the industry and its three segments: background-check level stuff, called “people search”; datasets for use by marketers to target advertising based on your interests and behavior; and risk mitigation data mining, used to detect fraud. Of course, all these datasets floating around all over ever’where mean that identity theft is a higher risk than it used to be - any jackass on the internet who wants to steal the identity of another jackass on the internet only needs a few data points to create a whole new jackass-on-the-internet profile, then head out to apply for credit cards which they take on shopping sprees.
Your alarm bells ringing yet? No? Let’s talk social engineering.
Social engineering, in case you’ve already forgotten what I said a few minutes ago, is “the use of deception to manipulate individuals into divulging confidential or personal information” - in simpler terms, it’s being a lying sack of shit so you can trick some poor jackass on the internet into giving you something like their email password, or login info on their bank account, or even just to get them to believe some wild story and take action based on your outrage over it.
If “lying sack of shit” is too strong for you, how about “fake social media profile” or “Russian troll farm employee”? That work? ‘Cause those are prime infection vectors of social engineering online, particularly on social media. Like, say, FACEBOOK. Although Twitter, Instagram, even LinkedIn have plenty of lying sacks of … uh, fakers on their platforms.
Social engineering examples include all phishing attacks, where somebody spoofs a trusted contact like Bank of America or the IRS, and gets you to click on something that lets them drain your bank account. It also includes tin-foil-hat jackasses on the internet like Alex Jones and his InfoWars shit show, and any other conspiracy-theory pimp who’s firing up outrage in service of sketchy commercial, or political, ends. Sometimes both at the same time - I’m looking at you, Cambridge Analytica. Social engineering is all over social media - wotta surprise! Fake profiles on Facebook and Twitter! Who knew?
OK. I’ve covered data slurping - more politely termed “data brokering” by those who are minting coin off it. And social engineering - the modern version of the 19th century snake oil salesman, still hawking snake oil, now via the Facebook page he built to run ads and suck up user data through the Facebook Ads and Facebook Developer platforms.
So I guess that means we’re gonna talk about Facebook!
It should be somewhat clear by now in this thing that Facebook is not a good guy in this story. Despite founder Mark Zuckerberg’s droning on about wanting to “make the world more open and connected,” the origin story of Facebook held clear warning signs of what was to come.
Facebook was started by a 19-year-old Harvard sophomore - the aforementioned Zuckerberg - as a hot-or-not comparison tool called Facemash, where Harvard students could pass judgment on the looks of Harvard’s female undergrads. The images were scraped from Harvard’s online House “facebooks,” which had digital profile info on students in the undergraduate dorms, where most Harvard undergrads live. The site was super popular … for a few days, until Harvard’s administration found out about it and shut it down. Zuckerberg was charged with copyright violations, breach of security, and privacy violation. This was in 2003, sixteen years ago.
In early 2004, ol’ Marky Z was back at it, once again writing code for a “facebook” for all of Harvard. When he launched it, some fellow students threatened to sue him for stealing the idea from them. He also wound up in hot water for hacking his facebook’s user data to get the email info of Harvard Crimson reporters who were investigating the idea-thieving thing. The students who accused him of stealing the idea later sued, winning an undisclosed settlement. But the genie was out of the bottle, and Facebook was alive.
In 2004, as Facebook was becoming a living, breathing infant threat to humanity, ol’ Marky Z had this exchange via text with a buddy:
Zuck: Yeah so if you ever need info about anyone at Harvard [j]ust ask. I have over 4,000 emails, pictures, addresses, SNS
[Redacted Friend's Name]: What? How'd you manage that one?
Zuck: People just submitted it. I don't know why. They "trust me" Dumb fucks
By the end of 2005, Facebook had 6 million users at universities around the world. In September 2006, they opened up Facebook membership to anyone over 13 years old who had an email address, anywhere in the world. The site now has over 2.3 billion active users globally. And there are thought to be around 83 million fake profiles in that “active users” count.
OK, let’s bring the three threads together, and weave a blanket of damning evidence out of ‘em, shall we?
The best example of how these three topic areas - data slurping, social engineering, and Facebook - intersect and become even more dangerous is … Facebook itself. All three are in its own origin story, since Marky Z used them all to build and expand the infant version of Facebook at Harvard over a decade ago. In the years since, as Facebook became one of the most valuable companies on Earth - current market cap on their shares is $483.49 billion as I record this - they’ve followed Marky’s mantra of “move fast and break things” to the degree that they’ve moved global democracy and community fast toward being broken beyond repair.
And in case you think that maybe Facebook was just clueless, and wasn’t really aware of what they were doing - slurping user data, figuring out ways to package and sell it, in the process ignoring any sense of ethics when it came to the health and safety of its users - look no further than their now-infamous emotional contagion experiment. That story broke in 2014, when the Proceedings of the National Academy of Sciences published “Experimental evidence of massive-scale emotional contagion through social networks,” with the lead author being a Facebook data science dude, Adam Kramer. The journal published an editorial in the same issue, “Editorial Expression of Concern: Experimental evidence of massive-scale emotional contagion through social networks” - “expression of concern” being academic-ese for WHAT THE FUCK ARE YOU EVEN DOING, DUDES?
What Facebook did in the “study” was, without the knowledge or consent of the study’s participants - nearly 700,000 Facebook users - test whether the feelings expressed in posts those users saw, negative or positive, were contagious across their Facebook networks. Quoting from a Sage Journals piece on this little experiment: “They tested this by manipulating the newsfeed users saw showing them either more negative posts from their friends or more positive ones. The effect was significant, albeit small; altered newsfeeds demonstrated a slight emotional contagion effect.”
That’s just one example of Facebook’s habit of turning their users into lab rats in weird science experiments. How about the story that just broke this week about Project Atlas, where they talked teens into downloading a virtual private network tool - something that’s supposed to make internet use safer and more private - THAT SPIED ON THEM. “Facebook pays teens to install VPN that spies on them” landed on the Techcrunch site on Tuesday. Gnashing of teeth and rending of garments continues over that one.
Facebook has become the de facto community building platform for people dealing with illnesses and conditions of all kinds, too, via Facebook Groups. People facing a diagnosis, or seeking one, look for other people to talk to about their issue, looking for peer support and resources. If they find a Closed Group, they think that “closed” means “private.” Au contraire, mon frère - “closed” just means that data slurpers have to work a little harder, make a few extra clicks, to suck up the user info for every single person in that group, including their entire friends network and all of THEIR user info. All they need to do is install a browser extension, and they can grab it all.
Communities impacted by this data-for-sale mindset include global #MeToo and sexual assault survivors, people with hereditary cancer risks like BRCA mutation, HIV+ patient communities, the list is almost endless. And it’s impossible to know if your Closed Group’s information was slurped, since Facebook doesn’t track who grabs the data. It’s open season. On you, me, and everyone else on the Book of Face.
Some folks are starting to wake up to the risk. One buddy of mine in the healthcare human rights game is e-Patient Dave deBronkart. His profile is still visible, but the banner image on his profile says “I have left Facebook.” with a link to a post on his site saying why he did it. Strongly recommend reading that for additional context on the risk to human health that Facebook has become.
My point? Take a look at your own Facebook activity, and consider whether it’s working for you, or if you’re just a slave to Facebook’s drive for piles of cold, hard cash in the guise of “connection.” Facebook is engineered to be addictive - I know, I’ve been on the site since early 2007. That little dopamine hit you get every time Facebook tells you somebody liked your post, or commented on something you said, smacks you right in the most basic part of your feels. And has you craving that hit again, so you keep posting, and liking, and commenting, and sharing, and feeding Facebook’s bottom line. All while you’re being manipulated at the most basic level to keep doing it.
I still have a profile on Facebook. I’m weighing whether it matters if I delete it, since all the data I created, and which got slurped away by who knows who, is already in datasets on servers all across the globe. By the way, Facebook is far from the only big tech player doing the data slurping thing. Google, Amazon, Netflix, Apple - they all do it. Apple tries to front that they’re all about “privacy” and that they don’t sell off their datasets, but if you think they’re not targeting the hell out of you using the data they’ve got on you? Dream on, Spanky.
This is on my mind - and on the podcast - today because tonight, Thursday, January 31, at 7pm Eastern Time US, I’m one of the schmoes leading a “do we stay or do we go now?” discussion about Facebook and patient communities on Twitter. The tweetchat hashtag is #WEGOhealthchat. I talked to WEGO’s Chief Strategy Officer David Goldsmith last month, and will note that I’m sure WEGO’s business model includes data aggregation and sale on their network, since it’s 2019 and they’re a digital health company. I’m not anti commerce, but with the rise of the data economy, and the simultaneous rise of what’s now being called “surveillance capitalism,” caveat emptor (buyer beware) must become caveat usor (user beware) - we’re all “users” in the digital economy. We need to start staking a claim to our creator status in the digital-data marketplace, or we’ll wind up serfs growing food in the (data) fields at the foot of the castle walls. We’ll be given gruel and small shelters, and told to be grateful for Big Tech’s beneficence. 1100 AD, with smartphones.
Oh, and in case you think *I’ve* joined the tin-foil hat brigade when it comes to Facebook - that I’m seeing monsters under the bed, and need to take a chill, and maybe a heavy dose of Xanax - consider this: one of the first investors in Facebook, Roger McNamee, a Silicon Valley dude who became a mentor to ol’ Marky Z (known to his friends as “Zuck”), just released a book, “Zucked: Waking Up to the Facebook Catastrophe,” calling out the utter lack of ethics, or morals, at the top of the Facebook food chain.
So give some thought to your presence, and activity, on Facebook. Being there, doing that, just might be hazardous to your health.
Links of interest, maybe:
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power - Shoshana Zuboff
Facebook Special Issue - Research Ethics Volume 12 Issue 1, January 2016, Sage Journals
Here’s a Long List of Data Broker Sites and How to Opt-Out of Them - VICE Motherboard
How To Spot a Social Engineering Profile on Social Media - ZeroFox
Danny van Leeuwen, also known as Health Hats - with his diverse and prolific health experience, Danny uses his multiple hats to empower people as they travel toward their best health. To join Danny on that best health journey visit his blog.
David C. Norris
Janice Lynch Schuster
Wanna be on this list? Become a Patron!
Movin’ On Up by Podington Bear
Podcast distribution rights:
Creative Commons 3.0 license