At beaches and pools you'll
sometimes see a notice that reads "Lifeguard On Duty," or more often,
"No Lifeguard On Duty—Swim At Your Own Risk." Recently Microsoft, originator of the Edge
mobile browser, started including a feature in it called NewsGuard. The user must activate it, but once he or she
does, every news site that's been rated by NewsGuard (about 2000 so far) gets
either a green checkmark or a red exclamation point. Green means the site has passed enough of the
nine criteria NewsGuard uses to assess credibility and transparency to earn its approval. And of course, red
means the site flunked. The example
NewsGuard uses of a site that flunks is RT.com, which is operated by Russia but
doesn't make that fact exactly obvious.
The fact that such an
influential organization as Microsoft thought it was a good idea to include
this third-party app (NewsGuard is an independent operation based in New York
City) says something about the anxiety that tech and social media companies
feel concerning the issues of fake news, divisiveness, and related
matters.
Reasons for this are not
hard to find. As we learned how Russia
tried to influence the 2016 elections with fake social media accounts, we were
bombarded with tweets from the Oval Office saying all sorts of things, some of
which were actually true. When Facebook
founder Mark Zuckerberg was called before Congress last spring concerning
misuse of Facebook data by the political consulting firm Cambridge Analytica, he appeared out
of his depth when he was asked about the finer points of free speech and what
his firm's responsibilities were with regard to spreading disinformation and
falsehoods, as well as selling information on users that could be used in
politically suspect ways.
On its own website,
NewsGuard boasts that it employs "professional journalists," not
algorithms, to evaluate news sites.
These journalists presumably sit around a table and debate whether a
given site is hiding its true source of financing, for example (not always an
easy thing to determine), or whether the news that shows up on it can be
verified by independent and multiple sources.
This is nothing more than good journalism, or what used to be called
good journalism. In an era when going "viral" counts as a compliment, "good" news often
just means "popular" news, but there's a big difference.
Here's where the
philosopher's distinction between "objective" and "subjective"
comes in handy. We have a sense that
objective news is better than subjective news, but there's a problem with
that. As the late Mortimer Adler wrote,
"We call something objective when it is the same for me, for you, and for
anyone else. We call something
subjective when it differs from one individual to another and when it is
exclusively the possession of one individual and of no one else." By that criterion, there aren't that many
objective news reports anywhere.
Pictures of a solar eclipse, maybe—obituaries, at least with regard to
the facts about a death. But maybe the
late So-and-So was a nice person to you and a real SOB to others. Was he a nice guy or not? That's subjective, as is most of the news
reported by even the most sober and responsible journalists, unless it's
C-SPAN-type relaying of an event without any selection, editing, or other
intervention by a third party.
So, saying some news sites
are objective and others are subjective wouldn't get us very far. Instead, NewsGuard falls back on the
distinction between truth and falsehood, and relies on sources other than the
site itself to reveal falsehood. But of
course, those sources may not get it right either, whatever "right"
means. The upshot of all this is that if
you, as a NewsGuard evaluator of a website, find that most people and
institutions you trust say that a thing is false or misleading, you're going to
decide it's false or misleading, and you'll give that site a red "do not
trust" rating.
The fear in some circles is
that a liberal or other systematic bias may reveal itself in the ways that
NewsGuard rates sites. And I'm sure that
something like this will happen. Already
RT.com has run a story saying that NewsGuard is "controversial." It's understandable that the very site
NewsGuard uses on its own website as an example of a red-rated source
would complain about that rating.
The deeper question is
whether the NewsGuard feature will make any difference to users. The hope is that the hapless passive consumer
of news, who formerly was suckered into believing all kinds of claptrap, will
now see the red rating on his favorite sites and will turn over a new leaf,
avoiding places like Breitbart and the Drudge Report and becoming a more
enlightened and useful citizen and voter.
To some, that's a hope. To others, it's a fear, which is why many
news sources that are hard to characterize collectively, but may generally
be classed as conservative (with exceptions), have expressed concern that the
wide availability of NewsGuard will lead to some sort of discrimination
against them.
If it's a problem, it's not
one that I would personally spend a lot of sleepless nights over. For one thing, NewsGuard doesn't keep you
from viewing a site. It just tells you
that there may be problems with it, and details the problems. In that sense, it's just a kind of
fact-checker or background-provider, and I see no particular harm in that.
As long as using NewsGuard
is voluntary, and as long as its ratings, or something similar, don't acquire
the force of compulsion or law and succeed in banning sites altogether, it
seems to me that the app can do more good than harm. Of course, I haven't bothered to check
whether they're rating my site, but I doubt that it's one of the top 2000 news
sources that NewsGuard has inspected. We
try to tell the truth here, but most readers know this blog mixes opinion with facts. For those who can't tell
the difference, maybe NewsGuard will help.