The key that opened the door for Facebook is, apparently, dopamine.
To keep people on Facebook for a long time, you have to generate dopamine hits.
These hits are ephemeral instants of happiness, and they go hand in hand with LIKES and SHARES.
Facebook's modus operandi is based on the explicit “exploitation of a vulnerability in human psychology,” confesses Sean Parker, one of the creators of the social network.
“The inventors, me, Mark Zuckerberg, Kevin Systrom of Instagram and all those people, we knew. Despite this, we did it.”
Parker confessed his disenchantment with Facebook in a recent public appearance and declared himself a conscientious objector to social networks. He ended his speech with a disturbing phrase:
“Only God knows what is being done with the brains of children.”
There was a time when anyone who DISLIKED these platforms was branded, by default, as resistant to change, as old.
That time passed. A real storm is being unleashed around the role played by social networks in our society.
Facebook and Twitter are accused of having become spaces that inflame the debate and contaminate it with false information.
The platforms are built so that you live in them and spend as much time as possible in them; combined with mobile devices, the belief is spreading that they have become addictive inventions, the new tobacco, a public health problem.
Facebook and other social networks are a democratic health problem
On December 12, 2017, Chamath Palihapitiya, a former vice president of Facebook, said that the networks are “tearing apart” the social fabric.
“The short-term, dopamine-driven feedback loops that we have created are destroying how society works,” he said in a forum at the Stanford Graduate School of Business.
On January 23, Tim Cook, CEO of the all-powerful Apple, said he did not want his 12-year-old nephew to have access to social media.
On February 7, actor Jim Carrey sold his shares in the platform and encouraged a boycott of Facebook over its passivity in the face of alleged Russian interference in the elections.
The perception we have of networks has mutated. They were born as an instrument to connect with friends and share ideas.
Fears about the isolation supposedly generated by the Internet faded. The networks became a democratizing force in the heat of the Arab Spring. They seemed like a perfect tool for social change; they empowered the citizen.
“They gave voice to those who had no voice,” says Emily Taylor, an executive at Oxford Information Labs who works on Internet governance.
“In just seven years, everything has changed. Now political advertising campaigns are aimed at altering electoral processes.”
The Brexit referendum and the election of Donald Trump are two of the phenomena that pushed everyone to ask: how did nobody see it coming? The answer, in part, was sought and found in the networks.
Facebook was summoned in October by committees of the US Congress to explain its role in the alleged Russian interference in the 2016 US elections.
It admitted that 126 million people may have seen content generated by alleged Russian agents (the Internet Research Agency), who also posted about a thousand videos on YouTube and 131,000 messages on Twitter.
Among all those messages were stories claiming that Hillary Clinton had sold arms to ISIS during her tenure while Barack Obama was president of the United States.
A Pew Research study published in October 2016 found that 49% of American users consider political conversations on social networks angrier than those in real life.
Facebook and other social media contribute to the tension
“On Twitter,” says researcher Mari Luz Congosto, “the tone is very harsh. The sour tone has increased; before it was more jocular. The messages have become harder.”
But this has not been the only controversy.
The networks have also come under scrutiny for influencers buying fake followers, and for the public lynching of people who are denounced on the networks and ostracized without trial.
In Myanmar, Facebook experienced one of its worst episodes: last year it was accused of becoming the fundamental vector of propaganda against the Rohingya minority, victims of genocide.
An investigative report published last week by Wired magazine highlights the internal hell the organization has lived through over the last two years.
Tension over what to do with what had become a reality, its status as a global information vehicle, disputes over how to deal with the avalanche of fake news, and the rancor flooding its pages have mowed down the optimism that once reigned, including Zuckerberg's own.
Facebook has been the leading platform for steering readers toward content since mid-2015, when it surpassed Google.
More than 2.13 billion people are part of its community; Twitter has 332 million. Two thirds of American adults (67%) say they get news via social networks, according to an August 2017 study by the Pew Research Center.
Facebook does not create content, but it filters it, hides it, censors it and occasionally organizes it. At first it did editorial work with a team of journalists who chose the most popular news. Then, after several scandals during the campaign, it bet on algorithms, but the decision backfired.
The problem, Emily Taylor points out, is the business model: the user agrees to hand over data in exchange for a free service.
The algorithms use that information to determine the user's interests. Advertising firms pay for it.
“Not only are data extracted from what is publicly posted,” says Taylor, “but also from location data and private messages.”
The more time we spend on the platform, the more data can be extracted.
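The data-to-advertising loop Taylor describes can be sketched in a few lines. This is a hypothetical toy, not Facebook's actual system: the topic labels, interaction log and bid values below are invented for illustration, assuming only that interactions are tallied per topic and that advertisers bid on the inferred interests.

```python
from collections import Counter

def infer_interests(events, top_n=3):
    """Tally a user's interactions per topic; the most frequent topics
    become the inferred interest profile."""
    counts = Counter(topic for topic, _action in events)
    return [topic for topic, _ in counts.most_common(top_n)]

def sell_ad_slots(interests, bids):
    """Advertisers bid per topic; slots matching the inferred interests
    are sold to the highest bidders first."""
    relevant = {t: bids[t] for t in interests if t in bids}
    return sorted(relevant.items(), key=lambda kv: kv[1], reverse=True)

# Invented interaction log: (topic, action) pairs harvested from likes and shares.
events = [("politics", "like"), ("politics", "share"), ("sports", "like"),
          ("politics", "comment"), ("cooking", "like"), ("sports", "share")]
bids = {"politics": 0.40, "sports": 0.25, "travel": 0.10}  # price per impression

profile = infer_interests(events)      # ['politics', 'sports', 'cooking']
slots = sell_ad_slots(profile, bids)   # [('politics', 0.4), ('sports', 0.25)]
```

The point of the sketch is the incentive it makes visible: every extra minute on the platform lengthens `events`, which sharpens `profile`, which makes the ad slots worth more.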
A shocking, sensational, even improbable news item invites reading more than a calm, balanced analysis, a drift that affects both the networks and traditional media.
“Facebook shows you what the algorithm wants, we do not know with what objective, whether perverse or not,” says Mari Luz Congosto, network expert and researcher at the Telematics Group.
“You lose a part of your freedom and the platform does business with that. It manipulates what people read; it marks the way.”
And the problem is that the algorithm is increasingly in command. We have gone from an Internet accessed through computers, in which one searched and explored, to one reached through applications installed on the mobile phone. This happens, above all, with a whole generation of young people who live inside their phones.
“The Internet comes to you through an algorithm, it is not you who are going to look for something on the Internet,” said Guatemalan lawyer and digital activist Renata Ávila.
“Before, we moved through the street, the world was ours, we entered and left buildings. Now we are locked in a mall with strict rules that seek only to maximize the business model.”
For Ávila, the problem is not unique to Facebook, far from it. All the platforms work the same way: “The problem is the architecture of the mobile, of the apps. The business model.”
The Facebook Bubble Effect
Users read what their friends and ideologically like-minded contacts share with them, concluded a study published in the American scientific journal PNAS that analyzed 376 million interactions among Facebook users. It also found that people tend to seek out information aligned with their political and ideological views.
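The bubble effect the PNAS study describes can be illustrated with a toy ranking function. This is purely a hypothetical sketch, not any platform's real algorithm: it assumes each post and each user carries an invented ideological "leaning" score in [-1, 1], and weights raw engagement by alignment with the reader.

```python
def rank_feed(posts, user_leaning):
    """Score each post by ideological proximity to the user and by engagement,
    so aligned content floats to the top of the feed."""
    def score(post):
        # alignment is 1.0 for identical leanings, 0.0 for opposite extremes
        alignment = 1.0 - abs(post["leaning"] - user_leaning) / 2.0
        return alignment * post["engagement"]
    return sorted(posts, key=score, reverse=True)

# Invented posts: leaning in [-1, 1], engagement = raw interaction count.
posts = [
    {"id": "a", "leaning": -0.8, "engagement": 100},
    {"id": "b", "leaning": 0.7,  "engagement": 90},
    {"id": "c", "leaning": 0.0,  "engagement": 120},
]
feed = rank_feed(posts, user_leaning=0.8)
# The aligned post "b" ranks first despite having the lowest raw engagement,
# so the reader's next interactions skew further toward content like it.
```

Even this crude rule reproduces the study's finding in miniature: alignment beats popularity, and each click feeds the next round of ranking.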
Although Zuckerberg says he is willing to rein in news, brands and memes, and even to tweak the algorithm so that there is less information and more interaction between users, he will not want to lose the advertising revenue that comes in according to the time spent on his network.
Jonathan Taplin, an entrepreneur who published the book Move Fast And Break Things: How Facebook, Google And Amazon Cornered Culture And Undermined Democracy, has placed all his hopes in the EU.
“Europe is leading the world in this,” he states.
“We should be thankful, for example, that Google was fined 2.4 billion euros for abuse of a dominant position.”
“You have to regulate,” says Taplin. “We need laws. It is not the market that is going to solve the problem.”
Taplin advocates shrinking these empires by law: forcing Google to sell YouTube and Facebook to shed Instagram and WhatsApp; applying competition laws; resizing them.
The Silicon Valley giants, meanwhile, have sent an army of lobbyists to Washington. They fear that what happened to Microsoft will happen to them: being condemned for abusive monopoly practices.
Some voices demand that platforms answer for what is published on them, a ridiculous proposal, since social media are mere channels.
Will anyone propose that mainstream media face a court for knowingly lying to the public to advance their interests and make a profit?
Others demand that educational programs include practical elements that teach young people to manage the addictive component of the networks.
Some say, finally, in a clear display of anthropological optimism, that people will progressively get on with their lives without them, just as many people gave up junk food, and that they will choose to devote their reading time to more select fare.