The world's largest social network, with a population greater than that of any country on Earth, by default won't consider facts, honesty or professionalism when judging news organizations.
Instead, Zuckerberg and his team are going to survey random people, maybe some of your friends, maybe not, who'll decide what publications are most trustworthy. Whatever Facebook learns from us (a Facebook spokesman told me it won't make any of those details public) will filter down into how often you see my stories in your feed.
Yes, your ranting Uncle Ed may help determine whether you see the next big scoop from The New York Times or Wall Street Journal or CNN or Fox News.
"People who use Facebook have made clear that they want to see accurate, informative and relevant news on Facebook, as well as news from sources they trust," a Facebook spokesman told me. "The question was how to measure that. We could try to make that decision ourselves, but that's not something we were comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask the community, and have their feedback determine the ranking."
So, he added, "We decided that having the community determine which sources are broadly trusted would be most objective."
Why does all this matter? More people than ever get their news from social media sites, with Facebook taking the top ranking in a Pew survey.
So it's probably no surprise that reaction to Zuckerberg's decision by media experts and those who follow tech closely has been largely negative, given that Facebook seems to be abdicating its responsibility as a news distribution service by not vetting the pieces people share.
This isn't that different from how Facebook has acted before. The social network has been criticized for allowing Russian agents, white supremacists and other propagandists to use Facebook to fool readers with genuinely fake news stories, and for creating filter bubbles, through which Facebook's mysterious algorithms show you only stories that reinforce a point of view.
This latest move to crowdsource credibility seems like a logical extension of that, said Michael Kearney, an assistant professor at the University of Missouri School of Journalism. "It feels like Facebook is taking the easy route to please people now," he said. That stands in stark contrast to the work that fact-checker websites often have to do.
What's odd is that Facebook employs some of the smartest engineers on the planet, all working to "bring the world closer together." Why isn't it smart enough to figure out how to clean up its propaganda and fake news problems?
Some people believe Zuckerberg instead may just be playing us. Emily Bell, a professor at Columbia's Graduate School of Journalism, is going with the "playing us" theory, given that Facebook makes its money (more than $10 billion in 2016) from letting advertisers target users.
"If Facebook wants to recognize 'trusted' publishers then it should pay those publishers a carriage fee similar to the model adopted by cable companies," she wrote.
There's one more theory to consider. Andrew Keen, a tech critic and author, thinks maybe Zuckerberg is responding to souring attitudes. People, he told the tech news site Recode, are realizing that the way social networks operate is "not in their best interests."
"Mark Zuckerberg has been rearranging the deck chairs on the Titanic with these latest reforms," Keen said. "I'd like to see him really acknowledge the problem and deal with it directly and come up with radical solutions."