The Social Dilemma: You write, I read

My Monday, Oct. 5 newsletter reviewing “The Social Dilemma,” a Netflix documentary, generated numerous responses. The one below is from a Facebook employee and, more than that, a guy as close to family as you can get. Chris Hawkins is my daughter’s longest and (IMHO) best boyfriend ever. They live together in San Francisco and both work for Facebook. My thoughts on Chris’s opinions, and his ideas on how to address some of the issues raised in the film, are at the end. I think it best for you to read Chris’s words first, with no influence from me. Here they are:

Hey Steve!

I resisted watching this movie, as I had read a lot of reviews that led me to believe I’d find it quite aggravating, but I wanted to make sure this was an informed response, so I checked it out this morning.

The Social Dilemma actually raises some good and important points, but unfortunately it does so with all the precision of a jackhammer. In order to make its content accessible, it loses a lot of subtlety along the way, and in order to elicit an emotional response it uses some pretty heavy-handed filmmaking techniques. It’s ironic that a movie about manipulation relies on so many manipulation techniques itself.

Casey Newton at The Verge is a social media analyst I have a lot of respect for. Despite being quite antagonistic toward tech companies (he recently published leaked Q&As from Facebook in their entirety), he posted an excellent overview of the documentary over at The Verge. There are a lot of links in this email, but if you only read one, it should be that.

While in general the film is harmless, and may even bring attention to real issues around addiction and the incredibly concerning era of wide-scale disinformation we’re moving into, for people who are genuinely interested in these giant social issues I don’t think it’s a stand-in for understanding the nuances.

Some resources I would recommend that go a little deeper here are Casey Newton, whom I already mentioned, as well as Azeem Azhar over at the Exponential View, Ben Thompson at Stratechery, and our ex-CSO Alex Stamos (now at Stanford), whose most insightful publicly readable contributions unfortunately only exist as threads of tweets on Twitter.

The Social Dilemma film itself contains many inaccuracies and exaggerations.

One of my biggest frustrations with the movie is the way it portrays content recommendation and ad targeting as the roots of misinformation. While content recommendation can certainly help spread misinformation to some extent, very little of this happens through paid ad targeting (for example: Russia spent under $200k in the 2016 election, orders of magnitude less than the $22 million spent by Hillary and $48 million spent by Trump). Indeed, even content recommendation is not the whole story. QAnon, the conspiracy theory referenced in the film, started on the fringe website 4chan, a community that has no content recommendation algorithms and no paid ad targeting. It then largely spread on Reddit, a network that also has no content recommendation or ad targeting. In Facebook’s own ecosystem, COVID misinformation has been spreading virally on WhatsApp, a text messaging service with no recommendation system and no ads of any kind, where you need someone’s phone number in order to message them.

Another issue I take with The Social Dilemma is that it doesn’t provide any realistic solutions. As one of the interview subjects says, the genie is out of the bottle. Suggestions put forth by the film include taxing data and legally limiting content recommendation algorithms, but as I’ve shown above, this doesn’t tackle the root of the problem. In fact, having data is an important part of being able to create a safer environment online — Telegram, an encrypted messaging app that technically cannot read the contents of messages and stores no user data, is frequently used to transmit child pornography (we don’t really know how frequently, because there’s no interpretable data and therefore it is impossible to measure).

The film also discounts the work of thousands of people working inside these companies on online safety. I may be slightly biased in this regard, but I daresay that Facebook and Google have many thousands more people working on online safety than big oil companies have working on climate change mitigation, or big tobacco has working on lung cancer research. At the heart of it, these companies are actually less profit driven than their predecessors and largely filled with socially conscious employees (who, yes, tend to be politically on the left). If you ever ask me why I find Mark Zuckerberg a scary person, it is not because of any profit motive. Rather, it’s his messianic complex about connecting the world that concerns me.

There are also a range of smaller inaccuracies that bother me: for example, the dramatized portion where an ad is sold to a weapons manufacturer. I do not know of any mainstream social network (certainly not Facebook) that allows ads for firearms. I won’t enumerate all of these here, as this is already getting rather long.

I don’t mean any of the information here to take away from the real issues. There are problems and consequences to designing these networks that now connect people around the globe. As the movie correctly observes, humans evolved to live in small tribes, not global, interconnected communities where communication is instantaneous. But there is no way back — instead we need to create smart regulation and hold companies accountable, without heavy-handed fear-mongering.

Cheers,

Chris

My response: Chris clearly loves his job, respects his employer and believes in the goals they are working to achieve. Does that blind him to what is going on? I don’t think so, although it does influence his thinking, and why wouldn’t it? There is nothing wrong with that. It shouldn’t be any other way.

No disagreement with Chris on the movie taking a sensationalist approach to make some of its points. I mentioned that issue in my blog post, but that comment somehow did not make it into the version I emailed out. The sensationalism does not bother me. During the past twenty years, massive amounts of highly credible information on the dangers of climate change have been ignored because it was boring. Brilliant scientists delivering critically important messages about the sustainability of human life on this planet were never given airtime because they were not smooth talkers and could not compete with the sensationalism of what currently passes for the nightly news.

Chris points to the leaked Facebook Q&As and a review of the documentary by Casey Newton at the Verge, and I read them, along with the rest of the links he included.  This is all interesting and useful background to more fully understand the complexity of the issues this movie raises.

Chris takes issue with the movie’s portrayal of content recommendations and ad targeting as a root of misinformation. I share his frustration. When I defended Net Perceptions’ ad targeting technology against critics years ago, I felt the same way he does. Watching the Ed Sullivan show in the 1960s, everyone sat through untargeted commercials for Geritol vitamin elixirs, diapers and Ford trucks. Wouldn’t it be better to see ads for products you might have at least a passing interest in purchasing? Chris is right that the movie provided few ideas on how these companies could remedy the issues it raises. While it offers some suggestions for users (such as turning off notifications, never clicking videos you didn’t specifically search for, and limiting the time you and your kids spend on these platforms), there were few ideas on how to fix the underlying issues. Several things occur to me, and I’ll cover those before I finish here.

My biggest concern and fear is the degree to which all political discourse is dominated by a minority of extreme and often half-baked views from the far left and far right. In some Olympic sports, the highest and lowest scores are thrown out, with the final score an average of the remaining middle scores. Could something like this be implemented on Facebook? Is there a way to have disgusting rhetoric, negative attacks and hyperbole from the far right or far left be muted somehow?
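For the technically curious, the Olympic-scoring idea has a name: statisticians call it a “trimmed mean.” Drop the most extreme values, then average what remains. Here is a minimal sketch in Python; the ratings and the -5 to +5 civility scale are my own invention, purely to illustrate the mechanic:

```python
def trimmed_mean(scores, trim=1):
    """Average the scores after dropping the `trim` highest
    and `trim` lowest values, Olympic-judging style."""
    if len(scores) <= 2 * trim:
        raise ValueError("need more scores than values trimmed")
    middle = sorted(scores)[trim:len(scores) - trim]
    return sum(middle) / len(middle)

# Seven hypothetical user ratings of a post on a -5 (vile) to +5 (civil) scale.
# The extreme votes (-5 and +5) are discarded before averaging.
ratings = [-5, -1, 0, 1, 1, 2, 5]
print(trimmed_mean(ratings))  # 0.6
```

Whether anything this simple could survive contact with coordinated voting campaigns is another question, but it shows the principle: the loudest voices on either end simply stop counting.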

After watching this movie and carefully reading Chris’s response and his links to others who have studied the topic, it becomes clear the answers aren’t easy.  Sincere, smart and well-intentioned people have thought about and studied this at great length and come to different ideas on how best to move forward.  What disturbs me the most is the increasing disdain for experts and the willingness of many to jump to short-hand slogans as if they had any genuine chance of being right. Whether it’s “drain the swamp,” or “restore the soul of America,” we face issues of frightening complexity that require us all to work together, if we want to have any hope of finding a lasting solution.  Calling those who don’t align perfectly with your opinion idiots or unpatriotic low-lifes is not helping.

Chris makes the point that Facebook and Google have many thousands of people working on online safety, likely far more than big oil companies have working on climate change mitigation or big tobacco has working on lung cancer research. While Chris may be right, I will guess that neither Facebook nor Google is researching the question of how much screen time is too much. Is it possible to break the core belief that the more engagement they have, the better? If it were judged, for instance, that 15-year-olds should be limited to, say, 45 minutes per day of screen time for their optimal health and well-being, how hard would these companies work to block a particular user from going over that amount? Could algorithms be created to fill screens, after that 45 minutes, with boring content designed to end engagement? What financial incentives could be put into place that would reward that sort of outcome? Would Facebook or others fund research to find whether there is an “ideal” amount of time for 11-year-old girls, 22-year-old college students, or 70-year-old grandparents to spend on their platforms? Could performance be hobbled after X minutes, so the devices, while usable, would provide an unsatisfactory experience? I could go further, but I think I’ve made my point.
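To make the idea concrete, here is a toy sketch of a per-user daily budget that degrades the experience rather than cutting it off. Every number and name here (the age brackets, the minute limits, the 1.5x grace window) is invented by me for illustration; no platform works this way as far as I know:

```python
from dataclasses import dataclass

# Hypothetical daily limits in minutes, by age group -- invented numbers.
DAILY_LIMIT_MINUTES = {"under_13": 30, "teen": 45, "adult": 120}

@dataclass
class Session:
    age_group: str
    minutes_used: float = 0.0

    def feed_mode(self):
        """Decide how the feed should behave given time already spent today."""
        limit = DAILY_LIMIT_MINUTES[self.age_group]
        if self.minutes_used < limit:
            return "normal"               # full, engaging feed
        if self.minutes_used < limit * 1.5:
            return "deliberately_boring"  # filler content designed to end engagement
        return "throttled"                # usable, but slow and unsatisfying

# A 15-year-old who has already spent 50 minutes today gets the boring feed.
print(Session("teen", minutes_used=50).feed_mode())  # deliberately_boring
```

The hard part, of course, is not the code; it is giving a company whose revenue depends on engagement any incentive to run it.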

For all those who’ve read this far, I commend and thank you. This is an important issue needing all of our attention and thought. But complex problems are difficult: learning an issue’s backstory and history takes time and effort to uncover and comprehend. And with history always written by the victors, issues like “social good,” “shareholder value” or even “social well-being” are highly susceptible to being over-simplified, especially in films like this.

Analysis of movie: The Social Dilemma

Most readers know my newsletter is not a platform for political, religious or social commentary. This one may be a small exception. Although not political or religious, there is a bit of social commentary here. The documentary film The Social Dilemma is eye-opening, explaining how Facebook, Google, Instagram and other social media platforms work. I encourage you to watch it, become a bit more informed and draw your own conclusions. My thoughts are below.

Official Trailer for The Social Dilemma

Without pointing fingers or demonizing anyone, The Social Dilemma explains how the algorithms underlying social media platforms like Instagram and Facebook function, how they evolved and some of the impacts they have. The tech world has been figuring out and fine-tuning this software for quite a while.

Rob Kost, one of my best friends and Prodigy colleagues, reacted to my recommendation to watch this film by saying: “All the more interesting coming from the guy who ran one of the first social networks.” Rob recalls correctly. In the early 1990s I ran the communication products for Prodigy Services Company, which covered Bulletin Boards, Chat and E-mail — one of the first large commercial social networks, along with competitors like AOL, CompuServe and GEnie. We all learned a great many things creating and operating these forums, including the importance of not allowing anonymity — requiring people to own their words. Prodigy also took too long to discover the futility of attempting to censor what people are allowed to say.

The incident which prompted the historic court case, Stratton Oakmont, Inc. v. Prodigy Services Company, happened on my watch: a user on our Money Talk bulletin board created a post essentially calling a Long Island securities firm “a bunch of crooks.” That case led to Section 230 of the 1996 Communications Decency Act. I was involved in lobbying Congress regarding the Telecommunications Act, which passed in 1996. It codified the principle that online services were like telephone companies and could not be held responsible for what someone posted on one of their bulletin boards or chat systems, in the same way that the phone company could not be found legally responsible for bank robbers planning a crime over the telephone. (Stratton Oakmont’s antics, incidentally, were the story behind the 2013 Leonardo DiCaprio movie The Wolf of Wall Street. But I digress.) Through all of this we learned, and solved, a lot of problems, but for the most part we never anticipated what is happening today.

Later on, I leveraged my hard-won online and Internet experience to become a founder of early-stage Internet and tech companies. One of those companies, Net Perceptions, productized a set of collaborative filtering algorithms in the late 1990s into something we called a “Personalization Engine.” The software worked so well at predicting what people would like and buy that Net Perceptions found itself going public (NETP) and reaching a rather extraordinary market value in just over five years. I say this to acknowledge I’m not entirely innocent in all of this, but I can assure you, our goals were not evil. We just wanted to help clients sell more stuff by making their advertising messages more relevant and directed to those most receptive. But the seeds planted with this and other sophisticated software products have grown into something far scarier.
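For readers curious what “collaborative filtering” actually does: it predicts what you will like from the ratings of people whose tastes resemble yours. Below is a deliberately tiny, user-based sketch. The data is invented and the real Net Perceptions engine was vastly more sophisticated, but the core idea is the same:

```python
import math

# Invented ratings: user -> {item: score from 1 to 5}.
ratings = {
    "alice": {"book_a": 5, "book_b": 3, "book_c": 4},
    "bob":   {"book_a": 4, "book_b": 3, "book_c": 5, "book_d": 4},
    "carol": {"book_a": 1, "book_b": 5, "book_d": 2},
}

def similarity(u, v):
    """Cosine similarity over the items two users have both rated."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in common)
    nu = math.sqrt(sum(ratings[u][i] ** 2 for i in common))
    nv = math.sqrt(sum(ratings[v][i] ** 2 for i in common))
    return dot / (nu * nv)

def predict(user, item):
    """Similarity-weighted average of other users' ratings for `item`."""
    others = [(similarity(user, v), ratings[v][item])
              for v in ratings if v != user and item in ratings[v]]
    total = sum(w for w, _ in others)
    return sum(w * r for w, r in others) / total if total else None

# Would alice like book_d? Bob (similar tastes) rated it 4; carol
# (dissimilar tastes) rated it 2. The prediction leans toward bob's 4.
print(round(predict("alice", "book_d"), 2))  # 3.19
```

Scale that up to millions of users and items, and you have the engine behind “people who bought this also bought…” — and, eventually, behind the feeds this film is about.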

Our country and the world face real and significant problems. The solutions require wisdom, collaboration, fair-mindedness, and informed, humble people. But social media platforms, the way they operate today, are contributing to, perhaps even causing, the erosion of all of these things. We are inadvertently being manipulated by a financial model that depends on our viewing, clicking, linking, tweeting and buying, and is turning us into something none of us wants to be.

The process works by de-integrating, slicing, dicing, grouping and dividing people into smaller and smaller groups, referred to as cohorts, silos and bubbles. We are continually fed precise and unique bits of information that harden our prejudices, leading us into an “us vs. them” frame of mind and causing us to doubt the motives, goodness and patriotism of our fellow citizens. These feeds reinforce our belief that we are “right most of the time.” The result is a very poor understanding of the ideas and concerns of people with whom we may disagree. Losing the feeling that “we’re all in this together” is a very bad thing. Imagine, in contrast, if the COVID-19 threat had been handled with an attitude of “something is attacking us; we need to band together, find the best and most effective strategies, and implement them together to ensure the fewest number of people are affected until we find a vaccine or cure,” instead of being turned into a political, “us vs. them” fight.

Social networks work because every click and every page we look at (and for how long) is recorded, tabulated, analyzed and organized into a profile of us that allows us to be exploited. By exploited I mean specific advertising messages served to us, along with additional content options that cause us to stay longer on one page or in one area. Another impact is the way search results are tailored to take advantage of our limitations, biases, weaknesses and proclivities. For instance, the film points out that if you enter the term “global warming” into Google, the results you see are determined by where you live, your profile and your projected political ideas. Results could be along the lines of “Global Warming Hoax” for one person and “Scientists Warn of Global Warming’s Looming Catastrophe” for another. It just “depends.” People end up with their very own unique and individual versions of “the truth,” at the loss of commonly accepted and acknowledged truths.
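That “it depends” mechanism can be pictured as a ranking function scoring candidate results against a stored profile. This is a deliberately crude sketch with invented profiles and tags, not how Google actually ranks, but it shows why two people can get opposite results for the same query:

```python
# Invented profiles: each user's predicted leaning on a 0.0 to 1.0 scale.
profiles = {
    "user_1": {"climate_skeptic": 0.9},
    "user_2": {"climate_skeptic": 0.1},
}

# Candidate results for the query "global warming", tagged by slant.
results = [
    ("Global Warming Hoax Exposed", {"climate_skeptic": 1.0}),
    ("Scientists Warn of Looming Climate Catastrophe", {"climate_skeptic": 0.0}),
]

def rank(user, candidates):
    """Order results by how closely their slant matches the user's profile."""
    profile = profiles[user]
    def fit(tags):
        # Higher (closer to zero) when the result's slant matches the leaning.
        return -sum(abs(profile.get(k, 0.5) - v) for k, v in tags.items())
    return sorted(candidates, key=lambda c: fit(c[1]), reverse=True)

print(rank("user_1", results)[0][0])  # the skeptic-slanted headline comes first
print(rank("user_2", results)[0][0])  # the catastrophe headline comes first
```

Two users, one query, two different “truths” — served not out of malice, but because matching the profile is what keeps each of them clicking.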

Lastly, and perhaps most disturbing, is how unprepared and outmatched we individuals are in dealing with this level of manipulation. These algorithms, built on our behavior, are highly addictive, and seamlessly tap into our fears, phobias, anxieties, traumas, uncertainties, emotions, intellect, perceptions, vanities and desires. When publicly challenged on the loss of privacy due to the Net Perceptions technology, I often countered with, “Hey, all they want to do is sell you more stuff. Are you telling me you don’t have the will to stand up to advertising that does not interest you?” Watching this movie, I realized the reality and implications have gone far beyond just showing ads.

Perhaps the most powerful impact of this film is the way it demonstrates the ease with which humans can be manipulated by bad people and those with heinous motives. These methods are not secret, difficult to understand or hard to use. These incredibly powerful tools, when aimed at our democratic institutions, have the ability to cause the greatest imaginable harm. If you, for instance, wished to destroy the United States, our freedoms and democracy, the film lays bare precisely how easily it could be done. Equally scary to me, as a man with two awesome granddaughters, is the potential harm it poses to teenage girls. After watching the movie, I think I would sooner let them carry around live grenades and open packs of cigarettes than allow them more than a few supervised minutes a day with a smartphone.

Turn your smartphone off, shut down your iPad, watch the movie. Let me know what you think.

P.S. In October of 1997 I gave a speech at the Camden Conference in Maine. It outlines, in more detail, my history and perspective on online communities. The speech was titled “Electronically, We’re All Neighbors: A Perspective on Community.” It can be a bit cringe-worthy in places, but that is what happens when looking at things from twenty or more years ago. If you’re family, you’ll enjoy it.

Five things Social Media does, like it or not

Official Trailer for The Social Dilemma

Yesterday my friend Frank Del Monte recommended a new documentary film called The Social Dilemma.  Maggie and I watched it last night and were stunned.  If you’ve ever used Facebook, Google, Instagram, etc. it explains how these platforms work and why you see what you see.  It is an exceptionally well done film and does not talk down to anyone.

This morning I’m alerting my friends to find this film and watch it as soon as they can. It explains a lot. Once my “alerting” is completed, I plan to watch it once more and then do a longer write-up here, detailing why these findings are so incredibly important and the impact I think these platforms are having on all of our lives – and not all for the good. But for right now, please make time to watch this movie. That way, my analysis will have more impact, as you’ll have seen what I’m talking about. Let me predict something: after you watch this, you will be doing what I am doing, telling your friends and family to watch it, too. What actions you decide to take are your own.