Al Masalla-News- Official Tourism Travel Portal News At Middle East

THE DANGER OF FACEBOOK DEMOCRACY

By: Sir James R. Mancham, KBE
Founding President of the Republic of Seychelles

aTP- Arab tourism portal News- Over recent times I have been giving much thought to the influence Facebook has had in shaping public opinion with respect to different aspects of governance and the profiles of personalities, as the Facebook political chess game is played. For example, somebody who actively plays the Facebook game can easily destroy the reputation of others who make it a point not to get involved in it.


The reputation of a respected personality in society can be destroyed on Facebook without the victim even knowing about it, if he is one who shuns and avoids the Facebook game. There is a well-recognised legal maxim to the effect that what is not denied is deemed admitted.


Well, if you are not engaged with Facebook, you are not aware of what is being written about you, and therefore you do not proceed to deny what is being said. Hence, someone who is seen as a “Statesman” in the eyes of the general public could go down in the eyes of a certain section of that same society as a traitor.

I have been particularly worried about the Facebook game following the Presidential Election in Seychelles in December 2015 and now, of course, after the Clinton/Trump campaign, which has resulted in Trump’s victory.

On my way back from Paris to Seychelles two days ago, I was struck by an article by Nathan Heller in a recent issue of The New Yorker, which I feel should be read by all people in Seychelles who are questioning the state of democracy today and in the future.

Mr Editor, please allow me to quote the article in its entirety. Attempting a précis of it would risk diminishing the seriousness of the article, appropriately entitled “The Failure of Facebook Democracy”.


“In December of 2007, the legal theorist Cass R. Sunstein wrote in The Chronicle of Higher Education about the filtering effects that frequently attend the spread of information on the Web. “As a result of the Internet, we live increasingly in an era of enclaves and niches—much of it voluntary, much of it produced by those who think they know, and often do know, what we’re likely to like,” Sunstein noted. In the piece, “The Polarization of Extremes,” Sunstein argued that the trend promised ill effects for the direction—or, more precisely, the misdirection—of public opinion. “If people are sorted into enclaves and niches, what will happen to their views?” he wondered. “What are the eventual effects on democracy?”
This month has provided a jarring answer.

The unexpected election of Donald Trump is said to owe debts to both niche extremism and rampant misinformation. Facebook, the most pervasive of the social networks, has received much scrutiny and blame. During the final weeks of the campaigns, it grew apparent that the site’s “news” algorithm—a mechanism that trawls posts from one’s online friends and rank-displays those deemed of interest—was not distinguishing between real news and false information: the sort of tall tales, groundless conspiracy theories, and oppositional propaganda that, in the Cenozoic era, circulated mainly via forwarded e-mails. (In the run-up to the election, widely shared false stories included reports that Pope Francis had endorsed Donald Trump and that Hillary Clinton had commissioned murders.) On Thursday, the Washington Post published an interview with what it called an “impresario of a Facebook fake-news empire.” He took responsibility. “I think Trump is in the White House because of me,” he said. “His followers don’t fact-check anything—they’ll post everything, believe anything.”

Facebook is not the only network to have trafficked in phony news, but its numbers have been striking. A much-cited Pew survey, released in May, suggested that forty-four per cent of the general population used Facebook as a news source, a figure unrivalled by other social networks.

An analysis this week by Craig Silverman, of BuzzFeed, found that the twenty top-performing fake-news stories on the network outperformed the twenty top real-news stories during the final three months before the election—and that seventeen of those fakes favored the Trump campaign. Trump’s exponents, including the candidate himself, routinely cited fake information on camera. In the eyes of critics, Facebook’s news feed has become a distribution channel for propagandistic misinformation.

“As long as it’s on Facebook and people can see it . . . people start believing it,” President Obama said right before the election. “It creates this dust cloud of nonsense.”

The criticism has been hard to shake. Mark Zuckerberg, Facebook’s founder and C.E.O., dismissed complaints at a conference late last week and again in a lengthy post over the weekend. “The hoaxes that do exist are not limited to one partisan view, or even to politics. Over all, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other,” he wrote. “I believe we must proceed very carefully though. Identifying the ‘truth’ is complicated.” Few members of the public were appeased (not least because Facebook’s advertising strategy is premised on the idea that it can move the needle of public opinion), and even some Facebook employees were uneasy.

On Monday, BuzzFeed’s Sheera Frenkel reported on an anonymous cabal of “renegade Facebook employees” who found Zuckerberg’s claims dishonest. They were working to develop formal recommendations for change. “You don’t have to believe Facebook got Trump elected to be a little chilled by its current estrangement from fact,” Brian Phillips observed in a cutting piece on MTV.com. “One of the conditions of democratic resistance is having an accurate picture of what to resist.”

The democratic effects of widespread misinformation were Sunstein’s preoccupation when he wrote about “self-sorting” in 2007. He cited an experiment previously run in Colorado. The study used liberal subjects from Boulder and conservative subjects from Colorado Springs. Participants had been divided into groups and instructed to discuss controversial issues: same-sex unions, global warming, affirmative action. Researchers recorded individual opinions before and after fifteen minutes of discussion. Trends emerged.

When participants spoke with politically like-minded people, their opinions usually became more extreme. Liberals grew more liberal in their thinking on a given issue; conservatives, more conservative. The range of opinion narrowed, too. Like-minded participants drifted toward consensus.

Sunstein projected that a similar drift would occur online, where information in support of preëxisting views was readily available (and even hard to avoid, due to the way Internet browsing works). He called the polarization that it produced “enclave extremism.” One contributing factor, he contended, was the social flow of information: people who hung out with people of a similar view were apt to encounter a disproportionate amount of information in support of that view, intensifying their support.

He thought more purely social effects were involved, too: “People want to be perceived favorably by other group members.” Most citizens, on most issues, don’t know precisely what they think, and are susceptible to minor suasion. Enclave opinion, which builds confidence in one’s views, allows general thoughts to sharpen and intensify. The risk was that bad ideas could gain wide adherence if the self-sorting worked out right.

Sunstein did not account for Facebook algorithms or the spread of demonstrably false information. The first factor amplifies the enclave effect he described; the second nurtures confident extremism. Even when information is accurate, enclave extremism helps explain how those who trade in fact, such as journalists, could manage to get the big-picture stuff, such as the electoral mood of the country, completely wrong. In the days after Trump’s election, many bemused coastal pundits lamented what the writer Eli Pariser has called a “filter bubble”: an echo chamber of information and opinion which, in this case, led those writing the news to be disproportionately exposed to information in line with their existing theories. The more we rely on the digital sphere as our window onto the world, the more vulnerable to its weaknesses we are.

A couple of years ago, reporting from San Francisco, I noted an erosion of public meaning which seemed to be getting in the way of civic progress. A key cause, I suggested at the time, was technology’s filtering effects—the way that, as we lived more of our lives in a personal bespoke, we lost touch with the common ground, and the common language, that made meaningful public work possible. Perhaps filtering effects are at play, but nothing I’ve seen since has changed my mind. The most dangerous intellectual spectre today seems not to be lack of information but the absence of a common information sphere in which to share it across boundaries of belief.

Pauline Kael, a New Yorker film critic for many years, once famously quipped, in a speech, “I live in a rather special world. I only know one person who voted for Nixon.” Enclave extremism isn’t new, in other words. What may be fresher is our oblivion of the moments when we’re living in its thrall. If a majority of Americans are getting their news from Facebook, then Facebook surely has a civic obligation to insure the information it disseminates is sound. The long-term effects of enclave extremism, Sunstein observed, can be bad news for democracy: “Those who flock together, on the Internet or elsewhere, will end up both confident and wrong, simply because they have not been sufficiently exposed to counter-arguments. They may even think of their fellow citizens as opponents or adversaries in some kind of ‘war.’” A Presidential Administration with that outlook is dangerous. But a confident, misinformed public is much worse.”

Well, Mr Editor, I believe it would be a good thing for your readers in Seychelles and abroad to read this Opinion Page a second time, to better appreciate the serious danger of Facebook Democracy today and in the future. It could represent the end of democracy as we have so far known it.
