One of the most pernicious means underway to crush independent news sites is the release of software tools that brand them as unreliable. That means hidden developers, and the parties that fed them information, would serve as censors while remaining beyond any accountability.
Last week, the Financial Times described efforts to use software to designate certain sites as suspect:
Concern over the impact on voters of soaring amounts of fake news during the US election has sparked a hackathon where the technology industry and the media’s top thinkers are seeking to find new ways to prioritise the truth.

If you read the article in full, you’ll see it depicts Wikipedia as a gold standard. As Gary Null discussed yesterday in his Progressive Commentary Hour show (we were a guest; the archived interview should be up later today), it is in fact very difficult to get corrections made to Wikipedia entries. Similarly, on certain topics, such as economics, Wikipedia minimizes or excludes non-mainstream views even when they have solid empirical underpinnings and have been given a hearing in academic journals and the press.

Back to the Financial Times story:
A community has gathered to share ideas around a 58-page Google document started by Eli Pariser, the author of a best-selling critique of social media, The Filter Bubble: What the Internet is hiding from you. A professor has circulated a spreadsheet of reliable and less reliable news sources for comment, while hackathons at Princeton and in the Bay Area have produced prototype products that Facebook could copy…
A team of students won a prize sponsored by Google at a Princeton hackathon last week by creating a quick and dirty prototype of a product that does just that: showing Facebook users a “trust rating” for stories they see, based on an online safety rating provided by “Web of Trust”.
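To make concrete what such a prototype would actually be doing, here is a minimal sketch of a domain-level “trust rating” lookup. The article does not describe the prototype’s internals or the Web of Trust API, so the endpoint URL, the 0–100 score scale, and the cutoff thresholds below are all assumptions for illustration only.

import json
import urllib.parse
import urllib.request

# Hypothetical reputation endpoint -- a stand-in, since neither the
# hackathon prototype nor the Web of Trust API is specified in the article.
REPUTATION_API = "https://reputation.example.com/v1/score"

def domain_of(url: str) -> str:
    """Extract the bare hostname that a reputation service would key on."""
    return urllib.parse.urlparse(url).netloc.lower()

def trust_rating(story_url: str) -> str:
    """Map a story's domain to a coarse trust label via a reputation lookup."""
    host = domain_of(story_url)
    query = urllib.parse.urlencode({"host": host})
    with urllib.request.urlopen(f"{REPUTATION_API}?{query}") as resp:
        score = json.load(resp).get("score", 0)  # assumed 0-100 scale
    # Assumed cutoffs; any real tool would pick these behind closed doors.
    if score >= 80:
        return "trusted"
    if score >= 40:
        return "mixed"
    return "suspect"

# e.g. trust_rating("https://example-news-site.com/some-story")

Note the design implication: the lookup is keyed on the domain, not the individual article, so every story a site publishes inherits the same rating. That is exactly what makes tools of this kind function as site-level blacklists rather than fact-checks of particular claims.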