Review: The Social Dilemma (2020)
I’d been hearing about this documentary, but to be honest it didn’t present any arguments I hadn’t already heard, even back in 2020 when it was released. It’s fine for someone young who grew up with social media, but as a millennial who watched social media grow, along with the various conversations and discussions around it, I wasn’t surprised. The documentary did present more facts and interviews than those conversations, but it didn’t really interest or shock me. Which is ironic to say, because one of its big arguments was that social media is designed to grab our attention, and I’m rating this documentary as just okay because it didn’t grab my attention.
Spoilers?
Narrative and Format
The documentary followed two main paths. One was interviews with various people who had worked in the tech industry. The other was a dramatization of a family’s life and how social media negatively affected them.
Here are some of the arguments that were made during the documentary:
- Users are the product. Not just our data, but the way social media can modify our behaviours to do what advertisers want. Advertisers are the true customers of technology companies.
- Social media was intentionally designed to be addictive. In fact, many tech executives took psychology classes to study tactics to get users’ attention.
- Social media significantly hurts young people who have grown up on the internet. They seek validation from thousands or millions of people, and when they don’t receive it, they’re more prone to hurting themselves.
- Social media is a powerful tool that can be used to sow discord, or to sow misinformation to influence thinking.
- There is not enough regulation in the area of social media.
I’ll discuss these in more detail in the themes section below. But as you can see, none of these arguments are new if you’ve been on the internet long enough.
I did initially agree with one of the interviewees who said that no one person is the evil villain. We didn’t know the power that social media would have over our lives. But Tristan Harris noted that he started to notice the addictive aspects of technology while he was working in the industry. Once that became apparent, it became the tech industry’s responsibility to control social media so that it couldn’t be used to hurt people.
Production
Now that I’m thinking back on the documentary, while the narrative was loosely cohesive, the interviews didn’t feel as cohesive, in the sense that interviewees were only brought in for specific topics. That’s fine; it was more of a stylistic distraction than a narrative one. But perhaps I would have been more drawn in if there were fewer interviewees (fewer people to keep track of).
The plot line with the family was mildly interesting, but they definitely could have gone harder with it. They used the family to show issues, but only on a surface level. For instance, the younger daughter went so far as to smash open the safe to retrieve her phone. What did she say when her dad confronted her? What does she think about it? I also understand the intention to keep the politics really vague, but it kind of distracted from it all. There was also a subplot related to the family in which three guys in a control center worked to grab the son’s attention through social media.
Content
So as I mentioned, I felt that there were maybe too many interviewees. Some were also only tangentially related to the area of social media. For instance, Jaron Lanier had some good discussions, but he was presented as coming from a background in virtual reality; I only found out from Wikipedia that he’d written a book on deleting social media. There were a lot of people to keep track of, and the only one I remember is Tristan Harris because we saw more of him. Otherwise, the other interviewees felt a bit like throw-ins. Sure, it’s good to have experts talk on the subject for credibility, but without an overarching narrator, the call to action was left to the interviewees, and it came across a little vague.
Themes
Behaviour
If you aren’t paying for the product, you are the product. This has been touted often enough in the modern age, and it was quoted in the movie. On a surface level, we believe that our data is the product. But what do companies do with the data? They use it to analyze our behaviour. The trio at the control center were the literal representation of the tech company trying to gain the son’s attention. Whenever he wasn’t looking at his phone, they’d find ways to get him looking. When he was about to end his usual scroll, they’d put something on screen to keep him looking. And when he’d been looking long enough, they’d slot in an advertisement.
Jaron Lanier explained that if tech companies could modify our behaviour even just a little bit, that could still translate into a lot of money for advertisers. If the platforms could keep our attention long enough to see a certain ad, then maybe that advertiser would gain even just a little bit in sales. And the more data the tech companies receive, the better they get at predicting users’ behaviour. That is the power of big tech.
Addiction
Initially, I believed that it was an accident. Even Tristan Harris noted that he’d surprised himself, while working at a tech company, with how addicted he was to his email. But the moment they saw it was addictive, there should have been strategies put in place to prevent the internet from becoming more addictive. Instead, tech companies went full steam ahead in studying how to exploit that addictiveness. Many people who worked in tech had taken psychology courses to study how best to capture their users’ attention. One line in the documentary noted that the only two industries that call their customers “users” are illegal drugs and technology, insinuating that both are purposely addictive.
Former executives even recalled experiences where they’d go home and find themselves addicted to social media even with their families right beside them. They designed the addiction, and yet they still fell for it.
Young people
Some stats were presented showing how self-harm and suicide rates rose dramatically among young people (especially girls) after social media became mainstream. This is another conversation, but I remember seeing a documentary that said young girls are the most attuned to their communities; they’re the most motivated to fit in with them. But social media is not a normal community. Social media is thousands, millions of people. Our brains were not wired to connect with that many people, or to handle attuning ourselves to that many people. If even 5% of 100,000 people told someone that they hated how they looked, that’s still 5,000 people. Compare that to the average person, who probably knows maybe 100 people personally at most. If just 5 out of those 100 people commented negatively on you, that’s probably something the average person could handle.
This was exemplified in the story by the younger daughter, who painstakingly took the best picture and photoshopped it appropriately. She was elated when the praise came through, but even just the one negative comment on her ears deflated her mood considerably.
Discord and misinformation
This topic was demonstrated through examples of conspiracy theories, as well as political discord.
Social media was designed to lead you down rabbit holes. If you clicked on a certain video, it would show you more videos like it. Some rabbit holes were more dangerous than others, the dangerous ones being conspiracy theories or hate groups. One of the examples shown was Kyrie Irving going down the flat earth rabbit hole; he’d probably clicked on videos that slowly led him there. And there are more dangerous rabbit holes still, like ones claiming COVID is a hoax and convincing people not to practise safety precautions.
Political unrest was also attributed to social media in some countries. An example was the hate crimes against Rohingya Muslims in Myanmar. Facebook was widely used in the country, and it was a way for discriminatory ideology to spread quickly.
This topic turned more serious as the documentary closed with a warning that the tech industry had basically created a tool that could be used by anybody to control a large group of people. Can you trust that such a tool will only fall into the right hands? No, you cannot. There was also a quick detour discussing artificial intelligence and how it is already widely used in our lives. However, artificial intelligence/machine learning does not have a conscience. It does not know what is right or wrong. It is a statistical model. It doesn’t differentiate between a harmless rabbit hole and a dangerous one; it only knows what is more popular.
Regulation
I guess the call to action was the need for regulation. Many people at the end said that it was unrealistic to wipe social media from our lives. It’s out of the bag now, and we know that social media can be a useful tool as well. So the solution is regulation: government regulation, as well as regulating children’s usage.
Children’s TV was regulated, and interviewees noted that something similar should be done for social media. Some families already do this by limiting their children’s usage. But if it could be controlled at the source, that would be immensely helpful as well; it would put responsibility on the tech companies to be less predatory and take some of the power away from them.
Overall
So as I mentioned, this documentary didn’t really show me anything I hadn’t seen before. But it’s still good to discuss, I suppose.