Observatorio (OBS*)

On-line version ISSN 1646-5954

OBS* vol. 13 no. 3, Lisbon, Aug. 2019

https://doi.org/10.15847/obsOBS13320191388 

Paving the way for regulation: how the case against Facebook stacked up

 

Diogo Queiroz Andrade*

*Researcher at ICNOVA; PhD candidate in Digital Media at Universidade Nova de Lisboa (FCSH NOVA), Portugal

 

ABSTRACT

The case against Facebook has been stacking up for some time, and in the last two years it has gained further consistency. The Brexit referendum and the American presidential election exposed the dominant social network as a threat to democratic processes and liberal societies, with the Cambridge Analytica scandal further undermining Facebook's credibility. Now that the first official reports have been published and a growing body of literature and scientific analysis is available, it is time for a systematized review both of Facebook's repercussions on democracy and public discourse and of the proposed ways to tame it. I start by framing the problem using the first official reports on the use of Facebook as a tool to undermine democratic processes; I then explain how this social network evolved from a champion of liberation across the Middle East into a weapon of mass interference and manipulation wielded by foreign governments and local agencies alike; finally, I trace possible paths forward in light of the referenced literature. The point made in this essay is that it has become inevitable for liberal democracies to regulate big tech, starting with social media. Facebook is the case in point, owing to its massive scale and the seriousness of its several cases of misconduct – but the whole movement should lead to a new take on the role of technology in society.

Keywords: Facebook; Disinformation; Manipulation; Polarization; Freedom; Democracy

 

Introduction

The Final Report published by the House of Commons on Disinformation and Fake News is straightforward: "the dominance of a handful of powerful tech companies has resulted in their behaving as if they were monopolies in their specific area (…) Facebook, in particular, is unwilling to be accountable to regulators around the world." (House of Commons, 2019). This is the first document from an official entity of a liberal democracy that clearly points to a pattern of manipulation and disinformation endangering the current political status quo in many parts of the world. The final text builds on the Interim Report, which had already pointed the finger at social media platforms in general and Facebook in particular. The text is damning for the company: "Tech companies (…) reward what is most engaging, because engagement is part of their business model and their growth strategy. (…) This manipulation of the sites by tech companies must be made more transparent. Facebook has all of the information. Those outside of the company have none of it, unless Facebook chooses to release it." (House of Commons, 2018). In the United States, several House Committees are also looking into foreign meddling in elections through Facebook's tools, in line with reports from international agencies such as the United Nations and the European Union.

A thorough study published by the University of Oxford has found evidence of "formally organized social media manipulation campaigns in 48 countries" with impact on elections and referendums. "In emerging and Western democracies, sophisticated data analytics and political bots are being used to poison the information environment, promote skepticism and distrust, polarize voting constituencies, and undermine the integrity of democratic processes." (Bradshaw & Howard, 2018). And a report published by the non-profit New Knowledge [1] detailed the actions of the Internet Research Agency, the entity presiding over Russian interference in the American presidential election, with disturbing conclusions regarding the targeting of minorities, voter suppression operations, the sowing of divisions and the specific targeting of prominent figures: "it was absolutely intended to reinforce tribalism, to polarize and divide, and to normalize points of view strategically advantageous to the Russian government on everything from social issues to political candidates." For the sake of brevity, we'll adopt the operational concept of democracy advanced by Larry Diamond in a talk [2]: "a system of government with four key elements: a political system for choosing and replacing the government through free and fair elections; the active participation of the people, as citizens, in politics and civic life; protection of the human rights of all citizens; a rule of law, in which the laws and procedures apply equally to all citizens." It is this political system, which stands as the norm in developed countries, that is endangered by behaviors enabled by Facebook.

A lot has been reported by traditional media outlets, with credit due to the extended Guardian/Observer investigation [3] exposing the Cambridge Analytica scandal and its aftermath. But it was not the only one: Wired explained [4] that Facebook executives had "plenty of reasons to keep its head in the sand", not only because of the legal consequences but also because it is "not easy to recognize that the machine you've built to bring people together is being used to tear them apart"; Bloomberg claimed [5] that "Facebook's political unit enables the dark art of digital propaganda". At the same time, many industry insiders and political scientists have been warning against Facebook, mostly through the media. In Prospect magazine, Sameer Rahim claimed [6] that "Mark Zuckerberg is a bigger threat to democracy than Donald Trump"; Niall Ferguson wrote in the Globe and Mail [7] that "social networks are creating a global crisis of democracy"; Alexis Madrigal, writing for The Atlantic [8], explained: "The dominant social network had altered the information and persuasion environment of the election beyond recognition while taking a very big chunk of the estimated $1.4 billion worth of digital advertising purchased during the election. There were hundreds of millions of dollars of dark ads doing their work. Fake news all over the place."; the examples could go on and on, from dozens of credible media outlets all over the world. All this has been reinforced by various works detailing the derailing of democracy (Sunstein, 2017), the damage to the public sphere (Wachter-Boettcher, 2017), the death of expertise (Nichols, 2017), the promotion of lies (Rabin-Havt, 2016), the existential threat of big tech (Foer, 2017), the social-wrecking-ball character of Silicon Valley (Cohen, 2017), the general undermining of democracy (Taplin, 2017) and civic society (Vaidhyanathan, 2018), and the hindering of the economy and of innovation in technology (McNamee, 2019). These early historians of the social age are technologists, expert journalists, and scholars leading the critical analysis of this phenomenon, and they point to the same conclusion: Facebook poses a threat to democracy and something needs to be done. This essay will investigate what needs to be done and by whom, starting by framing the topic and understanding its roots.

Facebook and Democracy: it's complicated

Early in this decade, social media seemed to be a tool for good. Many pundits and commentators reveled in the openness of the platforms, which allowed suppressed voices to be heard and civil discussions to be had: "Today, it occurs to me that Facebook, Twitter, and YouTube may be the Gutenberg press of the Middle East, tools like this that enable people to speak, share, and gather." (Jarvis, 2011). The internet was going to give a voice to everybody, thankfully undermining traditional media: "It isn't just that our communications tools are cheaper; they are also better. In particular, they are more favorable to innovative uses (…) The new model assumes that the devices themselves are smart; this in turn means that one may propose and explore new models of communication and coordination without needing to get anyone's permission first (to the horror of many traditional media firms)." (Shirky, 2008)

 

The sophistication of communication

The so-called Arab Spring, the movement that began a drive towards democracy in some North African and Middle Eastern countries, became a badge of honor. Social media was widely credited with the events that took place, first in Tunisia, then in Egypt, and later in Morocco, Libya, Syria, and Jordan (albeit on much smaller scales). The "social media revolutions" framing was everywhere: the media played a real part in building this fantasy of young people leading revolutions from behind their screens and infecting whole countries with democratic ideals. As always, the truth is much more nuanced. Technology did play a role, but the major credit has to go to the device rather than to any specific platform: "Mobile phones and new technology have played an inspiring and remarkable task in reaching and pursuing people internationally. Although talking about abrupt social change might be too optimistic and quite unworkable, it should be acknowledged that mobile telephony and the spread of the use of social media in socio-political related issues is giving the source for steady and long-term social changes. Already people feel increasingly comfortable in taking action because of the power embedded in these tools. Distances have been reduced and times have been restrained." (Alhindi, Talha, & Sulong, 2012). The evolution was also notable in terms of hardware: "Simple cell phones have now evolved into sophisticated smartphones. The latter have more to offer than the former in terms of access of internet, use of apps, better-integrated interfaces and high quality display touchscreens to name a few." [9].

Smartphones allowed common citizens to take pictures and record videos and share them instantly within their own networks. That easily widened the scope of social movements, and it was badly understood by the powers that ruled these countries. "The Arab Spring gave dictators such as Mubarak and Ben Ali a hard lesson; that if you stop one method of communication, we may find other methods also. For example, Mubarak's interfering of the mobile phone networks in the preliminary stages of the public revolution in Egypt meant that he was unintentionally sending out two signals. The first was that the mobile networks were important in the uprising and that SMS played an important tool in social mobilizing Egyptians. Secondly he also gave the green signal to demonstrators that electronic forms of communication were a key driver in mobilizing the masses. In doing so, the public were simply going to react to these measures by using Twitter and Facebook. Despite them being less pervasive in Egypt than mobiles, these two alternative tools were fundamental in mobilizing the masses" (Alhindi et al., 2012). Wael Ghonim, the man behind the Facebook pages used to schedule the pivotal protests of the Egyptian revolution, wrote a book about the social movement and named it "Revolution 2.0" (Ghonim, 2012). The entire book is a claim that Facebook is an immensely useful tool for bringing people together and spreading ideas. And it speaks to Facebook's credit that, as soon as the protests became national, the Egyptian government decided to shut down access to Facebook and Twitter (a practice still happening in some countries today).

All this emboldened Facebook and Silicon Valley. Those were the days when Mark Zuckerberg enjoyed taking credit for social change. In his letter to investors in 2012 [10], right after the immensely successful IPO, he wrote: "By giving people the power to share, we are starting to see people make their voices heard on a different scale from what has historically been possible. These voices will increase in number and volume. They cannot be ignored. Over time, we expect governments will become more responsive to issues and concerns raised directly by all their people rather than through intermediaries controlled by a select few. Through this process, we believe that leaders will emerge across all countries who are pro-internet and fight for the rights of their people, including the right to share what they want and the right to access all information that people want to share with them."

 

The business model as the strategy

Wael Ghonim would rethink his vision a few years later [11]: "I once said, 'If you want to liberate a society, all you need is the Internet.' I was wrong. I said those words back in 2011, when a Facebook page I anonymously created helped spark the Egyptian revolution. The Arab Spring revealed social media's greatest potential, but it also exposed its greatest shortcomings. The same tool that united us to topple dictators eventually tore us apart."

The academic Zeynep Tufekci, who works on the political implications of emerging technologies, dedicated a prescient book to the coordination of political movements in digital times: "Digital technologies are so integral to today's social movements that many protests are referred to by their hashtags (…) Activists can act as their own media, conduct publicity campaigns, circumvent censorship, and coordinate nimbly." (Tufekci, 2017)

What Ghonim and Tufekci remind us is that the spread of a message is only as powerful as the message itself – and as the platform's own strategy for specific content. All these "social media revolutions" happened just as Facebook became a giant in online advertising, thanks to its advanced user targeting and perfect timing. When the whole internet turned to a click-based business model, Facebook and Google followed a simple line: the more clicks you get, the more you earn, and the more engaged the user is with the message. These lessons were well learned by two different movements, one chasing profit and the other political upheaval. Both came together in hurting democratic open societies, with the active help of Facebook, and it became very clear that "serving what people want" was the surest way to reach a wide audience.

 

Letting the business model run amok

Five years ago, online ads such as banners and interstitials were still a relevant source of profit for whoever commanded a wide audience, and as traditional (and not so traditional) media tried to grapple with this novel reality, new actors stood ready to please audiences never satisfied with the content that made its way into their "filter bubble" (Pariser, 2011).

That gave way to a new class of digital entrepreneurs: producers of fake content designed to please the masses and to pander to the confirmation biases eased in by the superficial experience of social media platforms. A thorough investigation published in Wired magazine explained clearly how the business of fake content worked for years: by showing what happened in a small village in Macedonia, a Wired reporter exposed [12] a known-but-downplayed way of life for thousands of people around the world. At the same time, ideological forces were exploiting Facebook's ability to micro-target audiences in order to feast on their prejudices and preconceptions. It happened during the Brexit referendum and, on a much wider scale, during the 2016 American presidential elections. Donald Trump's victory was powered by a well-oiled machine that kept feeding audiences fake stories, properly targeted to specific consumers courtesy of Facebook's advertising engine. One of the world's top experts on fake news quickly found [13] that "viral fake election news stories outperformed real news on Facebook", and that the most engaged "news" during the election cycle was fake: "Of the 20 top-performing false election stories identified in the analysis, all but three were overtly pro-Donald Trump or anti-Hillary Clinton". This was not happening by chance. It was a well-thought-out strategy put into place by Trump's social media team, led by its ideologue and chief manipulator Steve Bannon. The use of Facebook's tools and the amount of money invested were enough for Trump to receive a call from Zuckerberg himself [14], congratulating the candidate on a "successful campaign".

All this was known, but the scale and meticulousness of the work done remained a mystery until the Cambridge Analytica scandal broke. Thanks to the work originally published in the Guardian/Observer, we now know that a private company funded by a Trump supporter and responsible for managing his digital campaign had "harvested millions of Facebook profiles of US voters, in one of the tech giant's biggest ever data breaches, and used them to build a powerful software program to predict and influence choices at the ballot box." [15]. The investigation is still unfolding, with ramifications exposing the manipulation of millions of voters in both Britain and the United States, with active interference from foreign actors – namely Russian operatives behind what appears to be the biggest manipulation of a democratic process in a free society to date: so far, 3,000 ads have been linked to Russia, and 470 Facebook pages (which resulted in hundreds of millions of shared posts) were run by Russian operatives [16].

Some commentators fear the forces operating at this scale are even bigger than that. "What happened in 2016 was much more than just a Kremlin 'black op' that exceeded expectations. It was a direct result of the profound change in the public sphere brought about by the advent and spectacular growth of the online network platforms. In many ways, the obsessive focus of the American political class on the Russian sub-plot is a distraction from the alarming reality that – as the European competition commissioner Margrethe Vestager argued – the big tech companies, and the way their services are used by ordinary people, pose a much bigger threat to democracy." [17]

 

The unavoidable clash with political powers

As the Cambridge Analytica scandal unfolded into a much larger problem, more accusations came against Facebook. The United Nations expert investigating the genocide in Myanmar said that the platform "played a role in spreading hate speech there (…) As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media" [18]. Several media reports confirm the assertion: The Washington Post reported [19] that "For Buddhists in Burma, even a quick scroll through Facebook's news feed provides fuel for hatred and nationalistic fervor" and The Guardian quoted a researcher specializing in Facebook trends: "hate speech exploded on Facebook at the start of the Rohingya crisis in Myanmar last year, analysis has revealed, with experts blaming the social network for creating 'chaos' in the country."

In Sri Lanka, the same pattern of events took place more recently: "(…) Facebook's newsfeed played a central role in nearly every step from rumor to killing. Facebook officials, they say, ignored repeated warnings of the potential for violence, resisting pressure to hire moderators or establish emergency points of contact." [20]. The same has happened – and been thoroughly reported – in South Sudan, leading to a very descriptive title on Buzzfeed [21]: "How To Use Facebook And Fake News To Get People To Murder Each Other". The same pattern was detected in Ukraine, India, the Philippines and Brazil. None of these reports led to a change in Facebook policy. If anything, the problem widened, with disinformation techniques spreading to other platforms such as WhatsApp [22], which, not by chance, is owned by Facebook.

It took the Cambridge Analytica scandal and disinformation campaigns targeting political events such as Brexit and the 2016 U.S. presidential elections for Mr. Zuckerberg to worry. In January 2018, in the middle of the investigation into Russian interference in the American elections, a burst of three blog posts from Facebook [23] tried to put the company back on track by admitting that social media can sometimes be bad for democracy: "Facebook was originally designed to connect friends and family — and it has excelled at that. But as unprecedented numbers of people channel their political energy through this medium, it's being used in unforeseen ways with societal repercussions that were never anticipated. In 2016, we at Facebook were far too slow to recognize how bad actors were abusing our platform. We're working diligently to neutralize these risks now. We can't do this alone, which is why we want to initiate an open conversation on the hard questions this work raises."

It clearly wasn't enough to tame the growing concerns about Facebook's role in the public sphere. Facebook was now a target of open research on hate speech: "we find that anti-refugee hate crimes increase disproportionally in areas with higher Facebook usage during periods of high anti-refugee sentiment online. (…) Taken at face value, this suggests a role for social media in the transmission of Germany-wide anti-refugee sentiment." (Muller & Schwarz, 2018).

The burden of proof against the American platform kept growing, and six short years after lauding Facebook as a tool to create more responsive governments, Mark Zuckerberg embarked on an apology tour: several testimonies were given, and three historic hearings took place before the legislative powers of the United States and the European Union. He apologized profusely, took the blame for some errors [24] and asked for forgiveness [25]. At every public appearance, the same problems were discussed: Facebook actively helped disseminate false information, allowed foreign meddling in elections, promoted hate speech and broke every conceivable expectation of data privacy. The truth is that no apology ever seemed to make any difference, as detailed in "Why Zuckerberg's 14-Year Apology Tour Hasn't Fixed Facebook" [26].

 

Three arguments against Facebook

The case against Facebook seems straightforward. The argument that emerges from the works that have recently reflected on the challenges such a powerful social network poses to democracy is that Facebook can't correct its mishaps because they are ingrained in its own DNA. In short, the problem with Facebook is Facebook itself. As it stands, three key arguments are being made about Facebook's interference with democracy. They are the core of the following discussion.

 

Polarization

Let's start with the most widespread idea: the weight of filter bubbles or echo chambers and their consequences for polarization. This is how Facebook "reward[s] users with more of what they tell the companies they want, thus narrowing fields of vision and potentially creating echo chambers of reinforced belief. (…) it gives you more of the stuff with which you would engage and less of the stuff with which you would not. Facebook does this with predictive scoring of each item, whether posted by a friend or purchased as an advertisement. Your preferences become clear to Facebook over time." (Vaidhyanathan, 2018). And the algorithms excel at predicting engagement and at maximizing time spent.
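
To make the mechanism concrete, the following is a minimal sketch of engagement-based feed ranking. It illustrates the narrowing dynamic Vaidhyanathan describes, not Facebook's actual system; the features and weights are invented for the example.

```python
# Illustrative only: a toy engagement-based ranker, not Facebook's actual code.
# The features and weights below are invented assumptions.
from dataclasses import dataclass

@dataclass
class Item:
    text: str
    interest_match: float    # 0..1: similarity to what the user engaged with before
    author_affinity: float   # 0..1: how often the user interacts with this author
    is_paid: bool            # posted by a friend vs. purchased as an advertisement

def predicted_engagement(item: Item) -> float:
    """Predictive score per item: content closest to past engagement wins."""
    score = 0.6 * item.interest_match + 0.3 * item.author_affinity
    if item.is_paid:
        score += 0.1  # paid distribution gets a small boost
    return score

def rank_feed(items: list[Item]) -> list[Item]:
    # Sorting by predicted engagement serves "more of the same":
    # the feedback loop that narrows fields of vision over time.
    return sorted(items, key=predicted_engagement, reverse=True)
```

Even in this toy form, the design choice is visible: nothing in the objective rewards diversity or accuracy, only the probability of another interaction.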

Several authors have pointed out inconsistencies in the filter bubble hypothesis, although findings sometimes contradict one another. Fletcher and Nielsen found "little support for the idea that search engine use leads to echo chambers and filter bubbles. To the contrary, using search engines for news is associated with more diverse and more balanced news consumption, as search drives what we call 'automated serendipity' and leads people to sources they would not have used otherwise." (Fletcher & Nielsen, 2018b). Another study reached two conclusions: one, "that while social media and search do appear to contribute to segregation, the lack of within user variation seems to be driven primarily by direct browsing"; the other, that "the outlets that dominate partisan news coverage are still relatively mainstream, ranging from the New York Times on the left to Fox News on the right." (Flaxman, Goel, & Rao, 2016). The reference to Fox News is meaningful because it is a textbook example of an echo chamber aligned with one side of the political spectrum, disregarding any concern for the truth, repeating falsehoods and ignoring relevant facts altogether. Considering that TV is a more passive mode of media consumption than those provided by the internet, this aligns with the findings of yet another study, which claims that "analytic thinking plays an important role in people's self-inoculation against political disinformation" and that "our evidence indicates that people fall for fake news because they fail to think; not because they think in a motivated or identity-protective way." (Pennycook & Rand, 2019). And, if one can reasonably argue that the internet does not promote filter bubbles, it is also safe to argue that social media does rely on echo chambers to keep users engaged.

It might not be obvious why this is a problem, but it is, on both an individual and a global level. "If likeminded people stir one another to greater levels of anger, the consequences can be dangerous. Terrorism is, in large part, a problem of hearts and minds, and violent extremists are entirely aware of that fact. They use social media to recruit people, hoping to increase their numbers or inspire 'lone wolves' to engage in murderous acts. (…) More broadly, echo chambers create far greater problems for actual governance (…) they can lead to terrible policies or a dramatically decreased ability to converge on good ones." (Sunstein, 2017).

Several experiments have been conducted to expose this, such as the "Blue Feed, Red Feed" comparison developed by The Wall Street Journal [27]. And the way this engagement is achieved is also under tight scrutiny, because there is a component of serial gamification bordering on substance abuse: "Tech humanists say this business model is both unhealthy and inhumane – that it damages our psychological well-being and conditions us to behave in ways that diminish our humanity." (Tarnoff & Weigel, 2018).

One could argue that echo chambers are good for the individual user: I know what I like, and I don't need to search for it if it is already at my disposal. Even that is debatable, because we are arguably better off when free to choose among different ideas. But, apart from Facebook officials, virtually no one thinks this is good for democracy in general. The closed-mindedness it enforces denies the basic parameters of a free society, trapping each citizen in their own preconceived ideas and limitations. And in most cases these filter bubbles block people from learning the truth about a false rumor or a falsehood: if a user engages with a lie, the system is not going to expose them to a contradiction, because it cannot afford the risk of losing them. The algorithms Facebook engineers wrote are designed to keep a steady, restricted and repetitive diet, ensuring that the amount of time spent is maximized.

 

Business model as identity

The reason the algorithms are designed and constantly tuned this way is the second argument against the platform. Facebook is a huge advertising machine that needs to keep users engaged – to serve more ads, but also "better ads", meaning ads more closely tuned to the user's preferences. All this feeds a gigantic machine that typifies users, predicts behaviors and conditions responses. Or so Facebook sells it to prospective advertisers, be they commercial or political [28] – the company even goes so far as to embed its own staff in bigger political campaigns, helping candidates maximize their efforts on the platform (and thus spend more). The incentive is right there, because advertising is the only way Facebook monetizes its business [29].

In the hands of experts, access to so many data points about a specific user is a powerful tool. Brad Parscale, Trump's digital director, claimed to have fine-tuned the same messages into fifty or sixty thousand variations [30], targeting not only supporters but also their adversaries, with messages intended to alienate prospective voters [31].
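
As a rough illustration of how a handful of interchangeable elements multiply into tens of thousands of micro-targeted variants, consider the hypothetical sketch below. The fragment counts and segment names are invented; this is not a reconstruction of the campaign's actual tooling.

```python
# Hypothetical sketch: how a few interchangeable ad elements combine
# combinatorially into tens of thousands of variants. All values invented.
from itertools import product

headlines = [f"headline_{i}" for i in range(10)]
images = [f"image_{i}.jpg" for i in range(12)]
calls_to_action = [f"cta_{i}" for i in range(8)]
audience_segments = [f"segment_{i}" for i in range(60)]  # micro-targeted groups

variants = [
    {"headline": h, "image": img, "cta": c, "audience": a}
    for h, img, c, a in product(headlines, images, calls_to_action, audience_segments)
]

print(len(variants))  # 10 * 12 * 8 * 60 = 57,600 distinct ad/audience pairings
```

The arithmetic is the point: ten headlines, twelve images, eight calls to action and sixty audience segments already yield 57,600 pairings, the order of magnitude Parscale claimed.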

The system has been in place for quite some time, also enabling discriminatory practices through racial profiling: in the fall of 2016, journalists at ProPublica found that "Facebook was allowing advertisers to target customers according to their race, even when they were advertising housing — something that's been blatantly illegal since the federal Fair Housing Act of 1968. To test the system, ProPublica posted an ad with a $50 budget, and chose to target users who were tagged as 'likely to move' (…) while excluding users who were African American, Asian American, and Hispanic. The ad was approved right away." (Wachter-Boettcher, 2017).

Then comes the discussion on privacy: who owns the rights to digital information, and how should it be regulated? While discussing a mild effort to legislate some digital privacy practices, a citizen discovered that [32] "… the United States, unlike some countries, has no single, comprehensive law regulating the collection and use of personal data. The rules that did exist were largely established by the very companies that most relied on your data, in privacy policies and end-user agreements most people never actually read". It is more than a right to privacy in the sense of controlling what one reveals; it is a right to own one's own image: "What it means is that Facebook, once again, controls how its users represent themselves online—preventing people from choosing to identify themselves the way they'd like, while enabling advertisers to make assumptions. And because all this is happening via proxy data, it's obscured from view — so most of us never even realize it's happening." (Wachter-Boettcher, 2017).

 

Content moderation and user exploitation

The third axis of the case against Facebook is the fact that it has become, in practice, an intolerance machine promoting censorship and welcoming hate speech. Facebook keeps promoting its love of getting people together and building communities, but that is the least of its worries: "Facebook is a carefully managed top-down system (…) It mimics some of the patterns of conversation, but (…) is a tangle of rules and procedures for sorting information (…) is always surveilling users, always auditing them, using them as lab rats in its behavioral experiments. While it creates the impression that it offers choice, Facebook paternalistically nudges users in the direction it deems best for them, which also happens to be the direction that thoroughly addicts them." (Foer, 2017).

Zuckerberg's company keeps censoring content according to loose criteria. There is a widely known "nipple ban" as well as a nudity prohibition, but it has been enforced unevenly: "According to Facebook, its policies on nudity have become 'more nuanced over time,' and the company now recognizes that the naked female body can signify more than sexual titillation; it can also be used to protest, raise awareness, or educate. 'While we restrict some images of female breasts that include the nipple, we allow other images, including those depicting acts of protest, women actively engaged in breastfeeding, and photos of post-mastectomy scarring,' the guidelines read. All other 'uncovered' female nipples need not apply" [33]. Another problem with the loose criteria arose when Facebook censored the famous image of the Vietnam war known as 'Napalm Girl', triggering an interesting reaction [34]: "Norway's largest newspaper has published a front-page open letter to Facebook CEO Mark Zuckerberg, lambasting the company's decision to censor a historic photograph of the Vietnam war and calling on Zuckerberg to recognize and live up to his role as 'the world's most powerful editor'". As all this happens, the tools to "get people together" keep being used to promote fear, hate and even death. Facebook Live, a tool for broadcasting personal events in real time, has been a favorite venue for suicides [35] and violent acts (Morse, 2017).

An internal memo authored by Andrew Bosworth was leaked by Buzzfeed [36], showing how aware of the problem the company is: "We connect people. That can be good if they make it positive. (…) That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good. It is perhaps the only area where the metrics do tell the true story as far as we are concerned. That isn't something we are doing for ourselves. Or for our stock price (ha!). It is literally just what we do. We connect people. Period.".

In a recent interview [37], Mark Zuckerberg himself tried to explain why he was more worried about nipples than about lies, conceding that he allows Holocaust deniers to keep their pages: "I find that deeply offensive. But at the end of the day, I don't believe that our platform should take that down because I think there are things that different people get wrong. I don't think that they're intentionally getting it wrong (…) It's hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly (…) I just don't think that it is the right thing to say, 'We're going to take someone off the platform if they get things wrong, even multiple times.'" In the same interview he explained why he allowed a notorious extreme right-wing activist, Alex Jones, to keep posting lies on Facebook: "There are really two core principles at play here. There's giving people a voice (…) Then, there's keeping the community safe (…) We're not gonna let people plan violence or attack each other or do bad things. Within this, those principles have real trade-offs and real tug on each other. In this case, we feel like our responsibility is to prevent hoaxes from going viral and being widely distributed. The approach that we've taken to false news is not to say, you can't say something wrong on the internet. I think that that would be too extreme (…) at the same time, I think that we have a responsibility to (…) make sure that those aren't hoaxes and blatant misinformation". A few days later, Facebook suspended and then banned Alex Jones and his Infowars pages. The criteria for action against a post are murky at best and put a lot of pressure on the moderators, often hired as external contractors without access to the main company [38].

So the platform built around Facebook helps promote hate speech and disinformation, because doing so engages users and drives high returns. The fact remains that Facebook will do almost anything to keep spinning ads to users, whether with true or fake content. The issue arose over Holocaust deniers, but it could just as well be about flat-earthers, vaccine deniers or any other conspiracy one can think of: it all reigns supreme in the Facebook kingdom of likes and engagement metrics. With this, Facebook became a threat to democracy and open society: "So while Mark Zuckerberg thought he was forging a social network to connect people, by encouraging us to share content from other sites so easily he actually divided us by connecting us. (…) over time, as Facebook structures our feeds to reward those who interact most frequently with us, our tribes solidify. Because we yearn for those small bolts of affirmation (…) we habitually post items that have generated the most response. (…) Facebook also rewards us for that and pushes that rewarding content out farther, faster, and more frequently. If the item is false, hateful, or completely absurd, it hardly matters to the community. In fact, highly disputable, divisive, or disreputable content can become even more valuable as a signal of identity." (Vaidhyanathan, 2018). The pattern is incessant. Silicon Valley in general and Facebook in particular enforce an ideology on most of the world. As Zuckerberg's creation grew, it needed to justify its new scale to its investors, to its users, and to the world. According to Dustin Moskovitz, who cofounded the company with Zuckerberg at Harvard, "It was always very important for our brand to get away from the image of frivolity it had, especially in Silicon Valley." (Kirkpatrick, 2011). The drive to make individuals transparent – in order to sell their data to advertisers – has always been the ultimate goal.

With the threat of impending regulation, Mark Zuckerberg is trying to save his business model by forcing a new path. His new plan consists of deeper integration between the company's biggest services (Facebook, Messenger, Instagram and WhatsApp), under the claim of protecting users' privacy [39], because privacy "gives people the freedom to be themselves and connect more naturally, which is why we build social networks." The monumental shift is intended to shield the business from regulation and to offer fresh assurances that, this time, there will be no more mishaps. But it is probably too late for that – and it should be, as I explain below.

 

Paths going forward

Given the severity of the accusations against Facebook, the discussion must move on to how to curb its power. The European Union has adopted a threefold approach that aims to tax, control and split the company. That fight has been led by the company's most formidable adversary: Margrethe Vestager, the European Commissioner for Competition. She has made it very clear that the EU will not allow Facebook free rein to assault privacy, disrupt democracy and promote hate speech, and it is therefore enacting legislation to constrain the company's operations in Europe. The recent fines for Facebook's illegal activity are one side of this action; the other is legislation such as the GDPR and the copyright reform directive about to be implemented – and more regulation is to be expected to prevent the bundling of data across the different social networks Facebook owns. In an interesting interview [40], Vestager declined to say whether Facebook has too much power, replying that "It is difficult to answer in general because you would need first to have a theory of harm". It is precisely that theory that seems to have been built over the last two years, forming the basis for a new approach in the EU: a wide initiative to combat voter manipulation through social media, a code of practice on disinformation and support for the creation of an independent network of fact-checkers. That has been followed by calls for "regulators to work together" [41] to toughen the stance against the tech giants. The fact remains that, Facebook being an American company, there is only so much the European Union can do.

In the United States the discussion about the need to break up Facebook has also advanced, and the company is now seen as a monopoly facing no competitors. Like the phone companies in the late twentieth century, Facebook could be forced to split into two or three parts, allowing for increased competition in the social media business. Another commonly quoted reference is Microsoft, which was forced to stop its monopolistic practices, ending its dominance over the software business. Some authors have already pointed in this direction [42]: "Rather than trying to humanize technology, then, we should be trying to democratize it. (…) First, it requires limiting and eroding Silicon Valley's power. Antitrust laws and tax policy offer useful ways to claw back the fortunes Big Tech has built on common resources. After all, Silicon Valley wouldn't exist without billions of dollars of public funding, not to mention the vast quantities of information that we all provide for free. In addition to taxing and shrinking tech firms, democratic governments should be making rules about how those firms are allowed to behave – rules that restrict how they can collect and use our personal data, for instance, like the General Data Protection Regulation coming into effect in the European Union later this month. But a more robust regulation of Silicon Valley isn't enough. We also need to pry the ownership of our digital infrastructure away from private firms." Several other authors have weighed in: "Strong antitrust intervention would be the best way to address the concentration of power that is Facebook. The United States should break up Facebook. It should sever WhatsApp, Instagram, Oculus Rift, and even Facebook Messenger from the core Facebook application and company. Each of these parts should exist separately, competing against each other for labor, capital, users, data, and advertisers. Future mergers and acquisitions should raise serious questions about the potential power of Facebook (or Instagram or Oculus Rift) with even more user data to exploit." (Vaidhyanathan, 2018).

The discussion is framed around big tech, not only Facebook. But the social media brand led by Mark Zuckerberg owns a great part of the blame for putting these companies and their business practices under the spotlight. The idea finally became mainstream when an American senator and Democratic candidate for the presidency of the United States presented a bold plan to "break up tech". Elizabeth Warren unveiled her plan with three goals in mind: protecting small businesses, restoring competitiveness and innovation to the tech sector, and protecting the future of the internet. Although it is a one-candidate proposal, it has gathered wide support from both sides of the aisle, showing that major lobbying efforts have been unable to stem the dissatisfaction with big technology companies. The plan is a lot more explicit than the ones currently under discussion in Brussels but follows some of the guidelines of the document presented by the bipartisan commission of the British House of Commons.

The one major criticism put forward by some techno-evangelists is that a political decision like this would place America behind its rival, China. But questioning American dominance is precisely what many in Europe and the UK want, since a strain on big tech companies might stimulate new forms of innovation in other parts of the world. Even India, an aspiring technology superpower, is discussing how to curb Google, Facebook and Amazon and replace them with home-grown companies. In the United States, the discussion will be framed by the Senate and presidential races in 2020. But the tide has turned, and regulation is coming hard for the megacompanies – with Facebook the easiest, biggest target around. Furthermore, the social media platform has so far acted only when its business model was threatened: when Germany approved a law forcing companies to take responsibility for fake content on their platforms, Facebook acted, and it is now believed that 15% of the company's fact-checking workforce works in Germany (despite the country accounting for only 2% of its users).

Part of the British report points to the need for digital literacy, arguing that it should become "a fourth pillar of education, alongside reading, writing and maths." The idea is solid and needs to be applied urgently: the speed of change is accelerating, and the integration of Artificial Intelligence, Robotics, Mixed Reality and Big Data will fundamentally change our societal ecosystem. Citizens need tools to navigate new public spaces – and education must be a stepping stone toward that. Several authors have weighed in, questioning the role society assigns to technology: "The problem (…) lies in the irrational ways we think about science and technology. When we make a cult of technology and welcome its immediate rewards and conveniences into our lives without consideration of the long-term costs, we make fools of ourselves. When we ignore or dismiss unapplied science, the search for knowledge for its own sake, we also make fools of ourselves. Somehow — in the United States, at least — we have managed to do both of these things." (Vaidhyanathan, 2018).

This seems to be the right path forward. Curbing Facebook's power will reduce its influence on democratic processes, allow users to regain some degree of privacy and encourage new competition in the social media arena. For that to happen, authorities in both the US and the EU must address important questions that will test their long-term perspectives on capitalism, regulation and citizens' rights. As the discussion evolves, so will the actions. And they will frame our understanding of the present.

 

Conclusion

This essay has discussed the reasons for reducing the power of Facebook by curbing its monopolistic actions and regulating its activity. Furthermore, it has laid the ground for the arguments in favor of regulation by major powers, most notably the American federal government and the European Union.

This essay has shown that Facebook has repeatedly acted, both directly and indirectly, to subvert democratic processes, restrain innovation and degrade speech in the public arena. It has demonstrated that Facebook is indeed a media company, whose active choices lead to a curation of content individually tailored to each user – and that all this happens because it is part of the social media company's business model. The pattern is clear: repeated actions followed by excuses, without any committed change. Moreover, this essay has shown that a consensus is forming against big tech in general and Facebook in particular, paving the way for regulation that allows an effective system of checks and balances. The arguments laid out in this essay indicate not only that regulation is desirable, but also that it is necessary.

All the major solutions presented so far by the British government, the EU and presidential candidates in the US point to the same legalistic approach, based on reliable data and historical examples. Effective regulation should therefore focus on constraining Facebook's business model, forcing it to take responsibility for its role in the public arena, and giving competitors the freedom to grow.

The study did not consider broad regulation covering all major digital platforms, although the solutions presented can certainly be configured to apply to similar business models and practices. Notwithstanding these limitations, the essay suggests that the sheer scale of the harm Facebook is causing demands swift and decisive action.

For future analysis, it would be interesting to assess the effects of the regulation efforts within the EU, most notably the GDPR and the copyright directive, but also of the new laws concerning speech and falsehoods in Germany and France. At the same time, we will witness a race between Facebook's integration of its products and the implementation of regulation forcing its dissolution – the winner of this race will lay the ground for whatever comes next in terms of the social media company's power.

 

References

Alhindi, W. A., Talha, M., & Sulong, G. B. (2012). The Role of Modern Technology in Arab Spring. Archives Des Sciences, 65(8), 12.

Bradshaw, S., & Howard, P. N. (2018). Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation (p. 26). Oxford: University of Oxford.

Cohen, N. (2017). The Know-it-Alls: The Rise of Silicon Valley as a Political Powerhouse and Social Wrecking Ball. New York: The New Press.

Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter Bubbles, Echo Chambers and Online News Consumption. Public Opinion Quarterly, 80(Special Issue), 298–320. https://doi.org/10.1093/poq/nfw006

Fletcher, R., Cornia, A., Graves, L., & Nielsen, R. K. (2018). Measuring the reach of "fake news" and online disinformation in Europe [Factsheet]. Reuters Institute for the Study of Journalism. Retrieved from https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2018-02/Measuring%20the%20reach%20of%20fake%20news%20and%20online%20distribution%20in%20Europe%20CORRECT%20FLAG.pdf

Fletcher, R., & Nielsen, R. K. (2018a). Are people incidentally exposed to news on social media? A comparative analysis. New Media & Society, 20(7), 2450–2468. https://doi.org/10.1177/1461444817724170

Fletcher, R., & Nielsen, R. K. (2018b). Automated Serendipity: The effect of using search engines on news repertoire balance and diversity. Digital Journalism, 6(8), 976–989. https://doi.org/10.1080/21670811.2018.1502045

Foer, F. (2017). World Without Mind. New York: Penguin Press.

Ghonim, W. (2012). Revolution 2.0: The power of the people is greater than the people in power: A memoir. Boston: Houghton Mifflin Harcourt.

House of Commons. (2018). Disinformation and "fake news": Interim Report. Retrieved from https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/363/363.pdf

House of Commons. (2019). Disinformation and "fake news": Final Report. Retrieved from https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1791/1791.pdf

Jarvis, J. (2011). Public Parts. New York: Simon & Schuster.

Kirkpatrick, D. (2011). The Facebook effect: The inside story of the company that is connecting the world. New York: Simon & Schuster Paperbacks.

McNamee, R. (2019). Zucked: Waking up to the Facebook catastrophe. New York: Penguin Press.

Muller, K., & Schwarz, C. (2018). Fanning the Flames of Hate: Social Media and Hate Crime. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3082972

Nichols, T. M. (2017). The death of expertise: The campaign against established knowledge and why it matters. New York: Oxford University Press.

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. New York: Penguin Press.

Pennycook, G., & Rand, D. G. (2019). Lazy, not biased. Cognition, 188, 39–50. https://doi.org/10.1016/j.cognition.2018.06.011

Rabin-Havt, A. (2016). Lies, Incorporated: The World of Post-Truth Politics. New York: Anchor Books.

Shirky, C. (2008). Here comes everybody: The power of organizing without organizations. New York: Penguin Press.

Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton: Princeton University Press.

Taplin, J. (2017). Move fast and break things. New York: Little, Brown and Company.

Tufekci, Z. (2017). Twitter and tear gas: The power and fragility of networked protest. New Haven: Yale University Press.

Vaidhyanathan, S. (2018). Anti-Social Media. Oxford: Oxford University Press.

Wachter-Boettcher, S. (2017). Technically Wrong. New York: W. W. Norton.

 

 

Submitted: 26th August 2018

Accepted: 30th May 2019

 

How to quote this article:

Andrade, D. Q. (2019). Paving the way for regulation: how the case against Facebook stacked up. Observatorio, 13(3), 113-128.

 

 

Notes

[1] New Knowledge. (2018). The Tactics and Tropes of the Internet Research Agency. Retrieved in 2018, July 18, from https://disinformationreport.blob.core.windows.net/disinformation-report/NewKnowledge-Disinformation-Report-Whitepaper.pdf

[2] Diamond, L. (2004, January). What is Democracy [pdf]. Retrieved in 2018, July 12, from https://web.stanford.edu/~ldiamond/iraq/WhaIsDemocracy012004.htm

[3] Cadwalladr, C., & Graham-Harrison, E. (2018, March 17). "Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach", The Guardian. http://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election

[4] Vogelstein, F., & Thompson, N. (2018, February 12). "Inside Facebook's Two Years of Hell", Wired. https://www.wired.com/story/inside-facebook-mark-zuckerberg-2-years-of-hell/

[5] Etter, L., Silver, V., & Frier, S. (2017, December 21). "The Facebook Team Helping Regimes That Fight Their Opposition", Bloomberg. https://www.bloomberg.com/news/features/2017-12-21/inside-the-facebook-team-helping-regimes-that-reach-out-and-crack-down

[6] Rahim, S. (2018, May 14). "Why Mark Zuckerberg is a bigger threat to democracy than Donald Trump", Prospect. https://www.prospectmagazine.co.uk/magazine/why-mark-zuckerberg-is-a-bigger-threat-to-democracy-than-donald-trump

[7] Ferguson, N. (2018, January 19). "Social networks are creating a global crisis of democracy", Globe and Mail. https://www.theglobeandmail.com/opinion/niall-ferguson-social-networks-and-the-global-crisis-of-democracy/article37665172/

[8] Madrigal, A. C. (2017, October 12). "What Facebook Did to American Democracy", The Atlantic. https://www.theatlantic.com/technology/archive/2017/10/what-facebook-did/542502/

[9] Sarosh, I. (2016, February 29). "Role of Cell Phones in Arab Spring". Revoevoref: https://revoevoref.wordpress.com/2016/02/29/role-of-cell-phones-in-arab-spring/

[10] Zuckerberg, M. (2012). Founder's Letter, 2012 [Facebook update]. Retrieved from https://www.facebook.com/notes/mark-zuckerberg/founders-letter/10154500412571634

[11] Ghonim, W. (2015). "Let's design social media that drives real change" [Video File]. Retrieved from https://www.ted.com/talks/wael_ghonim_let_s_design_social_media_that_drives_real_change

[12] Subramanian, S. (2017, February 15). "Inside the Macedonian Fake-News complex", Wired. https://www.wired.com/2017/02/veles-macedonia-fake-news/

[13] Silverman, C. (2016, November 16). "This Analysis Shows How Viral Fake Election News Stories Outperformed Real News On Facebook", BuzzFeed News. https://www.buzzfeednews.com/article/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook

[14] Warzel, C., & Mac, R. (2018, July 19). "Congratulations, Mr. President: Zuckerberg Secretly Called Trump After The Election", BuzzFeed News: https://www.buzzfeednews.com/article/ryanmac/congratulations-zuckerberg-call-trump-election-2016

[15] Cadwalladr, C., & Graham-Harrison, E. (2018, March 17). "Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach", The Guardian. http://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election

[16] Madrigal, A. C. (2017, October 12). "What Facebook Did to American Democracy", The Atlantic. https://www.theatlantic.com/technology/archive/2017/10/what-facebook-did/542502/

[17] Ferguson, N. (2018, January 19). "Social networks are creating a global crisis of democracy", Globe and Mail. https://www.theglobeandmail.com/opinion/niall-ferguson-social-networks-and-the-global-crisis-of-democracy/article37665172/

[18] Miles, T. (2018, March 12). "U.N. investigators cite Facebook role in Myanmar crisis", Reuters. https://www.reuters.com/article/us-myanmar-rohingya-facebook/u-n-investigators-cite-facebook-role-in-myanmar-crisis-idUSKCN1GO2PN

[19] Gowen, A., & Bearak, M. (2017, December 8). "Fake news on Facebook fans the flames of hate against the Rohingya in Burma", Washington Post. https://www.washingtonpost.com/world/asia_pacific/fake-news-on-facebook-fans-the-flames-of-hate-against-the-rohingya-in-burma/2017/12/07/2c1fe830-ca1f-11e7-b506-8a10ed11ecf5_story.html

[20] Taub, A., & Fisher, M. (2018, April 21). "Where Countries Are Tinderboxes and Facebook Is a Match", New York Times. https://www.nytimes.com/2018/04/21/world/asia/facebook-sri-lanka-riots.html

[21] Patinkin, J. (2017, January 16). "How To Use Facebook And Fake News To Get People To Murder Each Other". BuzzFeed News: https://www.buzzfeednews.com/article/jasonpatinkin/how-to-get-people-to-murder-each-other-through-fake-news-and

[22] Goel, V., Raj, S., & Ravichandran, P. (2018, July 18). "How WhatsApp Leads Mobs to Murder in India", The New York Times. https://www.nytimes.com/interactive/2018/07/18/technology/whatsapp-india-killings.html

[23] Chakrabarti, S. (2018, January 22). Hard Questions: What Effect Does Social Media Have on Democracy? [Facebook update]. Retrieved from https://newsroom.fb.com/news/2018/01/effect-social-media-democracy/

[24] Waterson, J. (2018, May 22). "Five things we learned from Mark Zuckerberg's European parliament appearance", The Guardian. http://www.theguardian.com/technology/2018/may/22/five-things-we-learned-from-mark-zuckerbergs-european-parliament-appearance

[25] Hern, A. (2018, April 11). "Five things we learned from Mark Zuckerberg's Facebook hearing", The Guardian. http://www.theguardian.com/technology/2018/apr/11/mark-zuckerbergs-facebook-hearing-five-things-we-learned

[26] Tufekci, Z. (2018, April 6). "Why Zuckerberg's 14-Year Apology Tour Hasn't Fixed Facebook", Wired. https://www.wired.com/story/why-zuckerberg-15-year-apology-tour-hasnt-fixed-facebook/

[27] Keegan, J. (n.d.). "Blue Feed, Red Feed", Wall Street Journal. http://graphics.wsj.com/blue-feed-red-feed/ (Retrieved August 9, 2018)

[28] Field, M. (2018, February 6). "Facebook deletes claims it helped influence 2015 general election for Scottish National Party", The Telegraph. https://www.telegraph.co.uk/technology/2018/02/06/facebook-deletes-claims-helped-influence-2015-general-election/

[29] Facebook. (2018). "Reports Fourth Quarter and Full Year 2017 Results". Retrieved from https://investor.fb.com/investor-news/press-release-details/2018/Facebook-Reports-Fourth-Quarter-and-Full-Year-2017-Results/default.aspx

[30] Beckett, L. (2017, October 9). "Trump digital director says Facebook helped win the White House", The Guardian. https://www.theguardian.com/technology/2017/oct/08/trump-digital-director-brad-parscale-facebook-advertising

[31] Wong, J. C. (2018, March 19). "It might work too well: the dark art of political advertising online", The Guardian https://www.theguardian.com/technology/2018/mar/19/facebook-political-ads-social-media-history-online-democracy

[32] Confessore, N. (2018, August 14). "The Unlikely Activists Who Took On Silicon Valley — and Won", New York Times. https://www.nytimes.com/2018/08/14/magazine/facebook-google-privacy-data.html

[33] Farokhmanesh, M. (2018, April 24). "Facebook sure has been thinking a lot about nipples", The Verge. https://www.theverge.com/2018/4/24/17275114/facebook-community-guidelines-nipples-nudity

[34] Wong, J. C. (2016, September 9). "Mark Zuckerberg accused of abusing power after Facebook deletes 'napalm girl' post", The Guardian. https://www.theguardian.com/technology/2016/sep/08/facebook-mark-zuckerberg-napalm-girl-photo-vietnam-war

[35] Guynn, J. (2017, April 27). "Facebook Live is scene of another suicide; police say 'I hope this isn't a trend'", USA Today: https://www.usatoday.com/story/tech/news/2017/04/26/facebook-live-another-suicide/100941914/

[36] Mac, R., Warzel, C., & Kantrowitz, A. (2018, March 29). "Facebook Executive In 2016: 'Maybe Someone Dies In A Terrorist Attack Coordinated On Our Tools'", BuzzFeed. https://www.buzzfeednews.com/article/ryanmac/growth-at-any-cost-top-facebook-executive-defended-data

[37] Swisher, K. (Author) (2018, July 18). "Zuckerberg: The Recode interview" [Audio Podcast]. Retrieved from https://www.recode.net/2018/7/18/17575156/mark-zuckerberg-interview-facebook-recode-kara-swisher

[38] Newton, C. (2019, February 25). "The secret lives of Facebook moderators in America", The Verge. https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

[39] Mark Zuckerberg. (2019, March 6). A Privacy-Focused Vision for Social Networking [Facebook update]. Retrieved from https://www.facebook.com/notes/mark-zuckerberg/a-privacy-focused-vision-for-social-networking/10156700570096634/

[40] Rankin, J. (2018, June 8). "EU tech czar Margrethe Vestager: 'Social media could deactivate democracy'", The Guardian. https://www.theguardian.com/world/2018/jun/08/margrethe-vestager-eu-tech-regulator-i-fear-social-media-will-deactivate-democracy

[41] Boffey, D. (2018, August 3). "Brussels in EU-wide drive to combat voter manipulation online", The Guardian. http://www.theguardian.com/world/2018/aug/03/brussels-in-eu-wide-drive-to-combat-voter-manipulation-online

[42] Tarnoff, B., & Weigel, M. (2018, May 3). "Why Silicon Valley can't fix itself", The Guardian. http://www.theguardian.com/news/2018/may/03/why-silicon-valley-cant-fix-itself-tech-humanism
