Facebook is now rating users on how trustworthy they are in identifying and reporting real and false news. The ratings system was first reported by The Washington Post. A Post reporter interviewed Facebook product manager Tessa Lyons, who leads the company's efforts to fight misinformation. Lyons told the newspaper the system was developed and put into place over the past year.

In January, Facebook announced a similar system to produce ratings for the trustworthiness of news sources. At that time, Facebook founder Mark Zuckerberg said those ratings would be based on information provided by Facebook users. Zuckerberg said news sources receiving higher trustworthy ratings from the community would be prioritized in the social media service's News Feed.

The new ratings system is designed to predict how effective a user is at identifying and reporting false news stories. Every user is given a rating between zero and one, the Post reports. Lyons told the newspaper such a system is necessary because some users are incorrectly reporting whether a story is true or false. It is "not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they're intentionally trying to target a particular publisher," Lyons said.

As an example, she said a user's trustworthiness rating would go up if they previously reported a news story as false, and the story was later confirmed as false by an independent fact checker.

In addition to information from users, Facebook uses machine learning systems to choose stories to be checked for truthfulness. The company has a partnership with several major news and fact-checking organizations that examine news stories reported as possibly being false.

Lyons noted that the numbered rating is not the single measure Facebook uses to judge a user's overall trustworthiness. She said the company also uses other "signals" to rate users, but did not provide further details.

In addition to taking steps to fight misinformation, Facebook has also sought to limit efforts by foreign organizations to influence the U.S. political process. Facebook previously found evidence that false accounts created in Russia and other nations were used to try to influence American voters in the 2016 election.

On Tuesday, Facebook said it had identified and removed hundreds of accounts linked to Russia and Iran. It said the accounts were part of separate disinformation campaigns on Facebook. In announcing the findings, Facebook chief Zuckerberg said there was still a lot the company does not know about the operations. However, he described the campaigns as "sophisticated" and well-financed efforts that are likely to continue.

"You're going to see people try to abuse the services in every way possible... including now nation states," Zuckerberg said.

Spokesmen for both Iran and Russia denied any state involvement in the activities described by Facebook.

This week, American software maker Microsoft reported it had taken control of several websites created by hackers linked to Russia's government. The company said the websites were made to look like they belonged to the U.S. Senate and conservative research groups. But they were actually false websites created in an effort to gather personal details of users.

Microsoft warned the hacking incidents were further evidence that Russia is expanding its attacks before U.S. congressional elections in November.