

Zuckerberg denies fake news on Facebook swayed US election

© Josh Edelson, AFP | Anna Klaile does a cartwheel in front of the Facebook sign and logo in Menlo Park, California on November 4, 2016

Video by FRANCE 24

Text by FRANCE 24

Latest update: 2016-11-11

Facebook chief Mark Zuckerberg on Thursday rejected the idea that bogus stories shared on the social network paved a path to victory for President-elect Donald Trump.

"The idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way I think is a pretty crazy idea," Zuckerberg said during an on-stage chat at a Technonomy technology trends conference in California.

The Facebook co-founder rhetorically asked why people would think there would be hoax stories about one candidate but not the other.

He also dismissed worries about Facebook users existing in "bubbles" where they only see news or perspectives echoing their viewpoints.

"Voters make decisions based on their lived experience," Zuckerberg said.

"You don't generally go wrong when you trust that people understand what they care about and what's important to them and you build systems that reflect that."

He added that research gathered at Facebook suggests news-filter bubbles are not a problem.

The social network found that while people may have a lot of friends very much like themselves, almost everyone on Facebook has someone in their mix who breaks the mold in some way, such as religion, ethnicity or background.

However, Zuckerberg added, Facebook has also found that people are less inclined to click on links or otherwise check out shared stories that don't line up with their views.

"We just tune out," Zuckerberg said of the pattern. "I don't know what to do about that."

While acknowledging the importance of the election, he advised maintaining faith that most progress in innovation is made by private citizens, typically without help from the government.

"These elections make a real difference in the world, but it would not be right to suggest that it changes the fundamental arc of technology or progress over time," Zuckerberg said.

Evolution of Facebook

Facebook's News Feed has evolved from its early days of sharing personal tidbits with friends and family into a platform for important news.

Facebook continues to adapt to that shift, modifying the way it ranks stories as well as the community guidelines regarding what might be offensive, according to Zuckerberg.

"Facebook definitely has filters that allow people to see more of the content they agree with than the content they do not agree with,” Michael Brand, a professor of data science at Australia’s Monash University, told FRANCE 24 on Friday.


"Facebook has a financial interest in giving news that is confirmatory and optimistic because that is what makes people stay on the page and continue clicking. That's how Facebook makes its money," Brand said. "But in terms of how much effect that causes, well, there have been studies, including studies by Facebook, that show that the effect exists and is even substantial."

The professor said false stories are six times more likely than true ones to be shared.

"People like to hear these things. People like to share them with their friends because they confirm their beliefs," Brand explained to FRANCE 24. "Now, algorithms are obviously picking up on that and are amplifying that effect because Facebook wants to show you what you'd like to hear. But I think the blame here should be put on how we consume our news, and there should be thoughts regarding whether Facebook is a news outlet, in which case it should be regulated as a news outlet," he added.

Zuckerberg, for his part, conceded there was a lot that could be improved and that the process was ongoing, but contended it was misguided to blame Facebook.

"I do think there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw some fake news," Zuckerberg said.

"If you believe that, then I don't think you have internalized the message that Trump supporters are trying to send in this election."

(FRANCE 24 with AFP)

Date created: 2016-11-11
