My Monday, Oct. 5 newsletter reviewing “The Social Dilemma,” a Netflix documentary, generated numerous responses. The one below is from a Facebook employee and, more than that, a guy as close to family to me as you can get. Chris Hawkins is my daughter’s longest-standing and (IMHO) best boyfriend ever. They live together in San Francisco and both work for Facebook. My thoughts on Chris’s opinions, and ideas on how to address some of the issues raised in the film, are at the end. I think it best for you to read Chris’s words first, with no influence from me. Here they are:
I resisted watching this movie because I had read a lot of reviews that led me to believe I’d find it quite aggravating, but I wanted to make sure this was an informed response, so I checked it out this morning.
The Social Dilemma actually raises some good and important points, but unfortunately it does so with all the precision of a jackhammer. In order to make its content accessible, it loses a lot of subtlety along the way, and in order to elicit an emotional response it uses pretty heavy-handed filmmaking techniques. It’s ironic that a movie about manipulation relies on so many manipulation techniques itself.
Casey Newton at The Verge is a social media analyst I have a lot of respect for. Despite being quite antagonistic toward tech companies (he recently published leaked Q&As from Facebook in their entirety), he posted an excellent overview of the documentary. There are a lot of links in this email, but if you only read one, it should be that.
While the film is generally harmless, and may even bring attention to real issues around addiction and the incredibly concerning era of wide-scale disinformation we’re moving into, for people who are genuinely interested in these giant social issues I don’t think it’s a stand-in for understanding the nuances.
Some resources I would recommend that go a little deeper are Casey Newton, whom I already mentioned, as well as Azeem Azhar over at the Exponential View, Ben Thompson at Stratechery, and our ex-CSO Alex Stamos (now at Stanford), whose most insightful publicly readable contributions unfortunately exist only as threads of tweets on Twitter.
The Social Dilemma film itself contains many inaccuracies and exaggerations.
One of my biggest frustrations with the movie is the way it portrays content recommendation and ad targeting as the roots of misinformation. While content recommendation can certainly help to spread misinformation to some extent, very little of this happens through paid ad targeting (for example: Russia spent under $200k in the 2016 election, two orders of magnitude less than the $22 million spent by Hillary and the $48 million spent by Trump). Indeed, even content recommendation is not the whole story. QAnon, the conspiracy theory referenced in the film, started on the fringe website 4chan, a community that has no content recommendation algorithms and no paid ad targeting. It then largely spread on Reddit, a network that also has no content recommendation or ad targeting. In Facebook’s own ecosystem, COVID misinformation has been spreading virally on WhatsApp, a text messaging service with no recommendation system and no ads of any kind, where you need someone’s phone number in order to message them.
Another issue I take with The Social Dilemma is that it doesn’t provide any realistic solutions. As one of the interview subjects says, the genie is out of the bottle. Suggestions put forth by the film include taxing data and legally limiting content recommendation algorithms, but as I’ve shown above, these don’t tackle the root of the problem. In fact, having data is an important part of being able to create a safer environment online — Telegram, an encrypted messaging app that technically cannot read the contents of messages and stores no user data, is frequently used to transmit child pornography (we don’t really know how frequently, because there’s no interpretable data, and therefore it is impossible to measure).
The film also discounts the work of thousands of people working inside these companies on online safety. I may be slightly biased in this regard, but I daresay that Facebook and Google have many thousands more people working on online safety than big oil companies have working on climate change mitigation, or big tobacco has working on lung cancer research. At the heart of it, these companies are actually less profit driven than their predecessors and largely filled with socially conscious employees (who, yes, tend to be politically on the left). If you ever ask me why I find Mark Zuckerberg a scary person, it is not because of any profit motive. Rather, it’s his messianic complex about connecting the world that concerns me.
There are also a range of smaller inaccuracies that bother me: for example, the dramatized portion where an ad is sold to a weapons manufacturer. I do not know of any mainstream social network (certainly not Facebook) that allows ads for firearms. I won’t enumerate all of these here, as this is already getting rather long.
I don’t mean any of the information here to take away from the real issues. There are problems and consequences in designing these networks that now connect people around the globe. As the movie correctly observes, humans evolved to live in small tribes, not global, interconnected communities where communication is instantaneous. But there is no way back — instead we need to create smart regulation and hold companies accountable without heavy-handed fear-mongering.
My response: Chris clearly loves his job, respects his employer and believes in the goals they are working to achieve. Does that blind him to what is going on? I don’t think so, although it does influence his thinking — and why wouldn’t it? There is nothing wrong with that. It shouldn’t be any other way.
No disagreement with Chris on the movie taking a sensationalist approach to make some of its points. I mentioned that issue in my blog post, but that comment somehow did not make it into the version I emailed out. The sensationalism does not bother me. During the past twenty years, massive amounts of highly credible information on the dangers of climate change have been ignored because they were boring. Brilliant scientists delivering critically important messages about the sustainability of human life on this planet were never given airtime because they were not smooth talkers and could not compete with the sensationalism of what currently passes for the nightly news.
Chris points to the leaked Facebook Q&As and a review of the documentary by Casey Newton at the Verge, and I read them, along with the rest of the links he included. This is all interesting and useful background to more fully understand the complexity of the issues this movie raises.
Chris takes issue with the movie’s portrayal of content recommendations and ad targeting as a root of misinformation. I shared his frustration. When I defended Net Perceptions’ ad-targeting technology against critics years ago, I felt the same way he does. Watching the Ed Sullivan Show in the 1960s, everyone sat through untargeted commercials for Geritol vitamin elixirs, diapers and Ford trucks. Wouldn’t it be better to see ads for products you might have at least a passing interest in purchasing? Chris is also right that the movie provided few ideas on how these companies could remedy the issues it raises. While it offers some suggestions for users (such as turning off notifications, never clicking videos you didn’t specifically search for, and limiting the time you and your kids spend on these platforms), there were few ideas on how to fix the underlying issues. Several things occur to me, and I’ll cover those before I finish here.
My biggest concern and fear is the degree to which all political discourse is dominated by a minority of extreme and often half-baked views from the far left and far right. In some Olympic sports, the highest and lowest scores are thrown out, with the final score an average of the scores from the judges in the middle of the two extremes. Could something like this be implemented on Facebook? Is there a way to have disgusting rhetoric, negative attacks and hyperbole from the far right or far left be muted somehow?
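To make the Olympic-scoring analogy concrete, here is a minimal sketch of that idea applied to posts. Everything here is hypothetical illustration — the function name and the notion of several independent "extremeness" ratings per post are my assumptions, not anything any platform actually does:

```python
def trimmed_score(ratings, trim=1):
    """Olympic-style trimmed mean: drop the `trim` highest and
    `trim` lowest ratings, then average whatever remains."""
    if len(ratings) <= 2 * trim:
        raise ValueError("not enough ratings to trim")
    ranked = sorted(ratings)
    kept = ranked[trim:len(ranked) - trim]
    return sum(kept) / len(kept)

# A post rated by five hypothetical raters/classifiers: one extreme
# low outlier (1) and the top score (10) are both discarded.
print(trimmed_score([10, 9, 9, 9, 1]))  # -> 9.0
```

The appeal of trimming is exactly the one the judges’ scoring has: a single extreme voice, whether inflammatory or dismissive, cannot drag the final number very far on its own.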
After watching this movie and carefully reading Chris’s response and his links to others who have studied the topic, it becomes clear the answers aren’t easy. Sincere, smart and well-intentioned people have thought about and studied this at great length and come to different ideas on how best to move forward. What disturbs me the most is the increasing disdain for experts and the willingness of many to jump to shorthand slogans as if they had any genuine chance of being right. Whether it’s “drain the swamp” or “restore the soul of America,” we face issues of frightening complexity that require us all to work together if we want to have any hope of finding a lasting solution. Calling those who don’t align perfectly with your opinion idiots or unpatriotic lowlifes is not helping.
Chris makes the point that Facebook and Google have thousands of people working on online safety — far more than big oil companies have working on climate change mitigation, or big tobacco has working on lung cancer research. While Chris may be right, I will guess that neither Facebook nor Google is researching the question of how much screen time is too much. Is it possible to break the core belief that more engagement is always better? If it were judged, for instance, that 15-year-olds should be limited to, say, 45 minutes per day of screen time for their optimal health and well-being, how hard would these companies work to block a particular user from going over that amount? Could algorithms be created that, after those 45 minutes, fill screens with boring content designed to end engagement? What financial incentives could be put in place to reward that sort of outcome? Would Facebook or others fund research to find whether there is an “ideal” amount of time for 11-year-old girls, 22-year-old college students, or 70-year-old grandparents to spend on their platforms? Could performance be hobbled after X minutes, so the devices, while usable, would provide an unsatisfactory experience? I could go further, but I think I’ve made my point.
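The mechanical part of a hard daily cap like the 45-minute example above is actually trivial; the hard part is the business incentive. As a sketch, assuming nothing more than per-user usage tracking (the class and names are mine, purely for illustration):

```python
from datetime import date

class DailyLimiter:
    """Hypothetical per-user daily screen-time budget. Once the
    budget is spent, sessions are flagged so a feed could switch
    to deliberately low-engagement content."""

    def __init__(self, limit_minutes):
        self.limit = limit_minutes
        self.used = {}  # (user_id, date) -> minutes consumed so far

    def record(self, user_id, minutes):
        key = (user_id, date.today())
        self.used[key] = self.used.get(key, 0) + minutes

    def over_limit(self, user_id):
        return self.used.get((user_id, date.today()), 0) >= self.limit

limiter = DailyLimiter(limit_minutes=45)
limiter.record("teen_user", 30)
print(limiter.over_limit("teen_user"))  # False: 30 of 45 minutes used
limiter.record("teen_user", 20)
print(limiter.over_limit("teen_user"))  # True: 50 minutes exceeds the cap
```

Keying the tally on the calendar date means the budget resets at midnight with no extra bookkeeping. The point of the sketch is only that none of my questions above founder on technical difficulty.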
For all those who’ve read this far, I commend and thank you. This is an important issue needing all of our attention and thought. But complex problems are difficult: learning an issue’s backstory and history takes time and effort to uncover and comprehend. And with history always written by the victors, issues like “social good,” “shareholder value” or even “social well-being” are highly susceptible to being oversimplified, especially in films like this.