By EMERSON LYNN

As the investigation of Russia’s interference in the 2016 election continues, what we are learning is that there are a lot of bad actors and that false news spreads faster and carries further than news that is accurate.

Here is the studied observation from Sinan Aral of the Massachusetts Institute of Technology: “Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information.”

The MIT report examined 126,000 stories tweeted by roughly three million people more than 4.5 million times. That’s a sizable database. What the report found was that stories that were not true were 70 percent more likely to be pushed forward than were stories that were true.

The researchers had no firm answer as to why people would push something that was not true over something that was, except that people have a tendency to push something that’s “novel,” something that sits on sensationalism’s western slope. It appeals, evidently, to our more salacious side.

The Russians know this. So do the Chinese. So do special interest groups within our own country. They all know that we’re not motivated to do the hard work of verifying the accuracy of “news” pushed our way by others.

The Chinese government is handling this by making sure no one else is in the social media or informational space. They have cameras on every corner. They monitor every call, every post. Any false news would be rooted out before it took hold. Such is the authoritarian’s way.

The Russians aren’t far behind. Or North Korea. Or Iran. Technology has become the weapon of the world’s strongmen. For all of its obvious advantages, it has marginalized the importance of democracy.

So how does a nation like ours combat such rampant levels of disinformation?

Here’s what doesn’t work: Any sort of organized campaign against technology itself. We can’t pretend that the technology doesn’t exist. We can’t ask people to switch to flip phones. We can’t divorce ourselves from the technology that allows us to share information.

What we can do is hold the technology companies involved to a higher standard and, ultimately, depending on the damage done or the threats faced, regulate them to the point of being responsible.

We need the same sort of behavior from the social media giants that, for example, exists with newspapers. Newspapers can be sued for defamation; tech giants cannot. Newspapers have editors, more or less the sieve that separates the chaff from the grain. Tech giants do not, particularly at the granular level required.

That lack of regulation, or editorial oversight, began because the tech businesses were organized around the need to scale their operations. Regulation is an impediment to scale. There was also great appeal in having something be completely open to all users, with no restrictions.

But the lack of restrictions and the unrelenting goal of size and profitability have given us what we have, which is a tiny handful of companies that control a huge percentage of the audience and what they read, see and hear.

How’s that working?

Marvelously, if the goal is an easy way to share pictures of your children with the rest of the family. Miserably, if the goal is to tell people the truth about what is happening around them.

What needs to be required of the social media giants is a means by which their users are identified as reputable or not. That can be done with the algorithms that already exist. The companies can easily identify the information that is disputable; what they need to do is identify those who have a tendency to share that information with others.

One idea being pushed is labeling those who share information with an easily recognized code. You get the number one if you don’t share disputed information. You get the number two if you do so occasionally. And you get a three if you push all disputed information to all your friends all the time. Instead of numbers, it could be colors. Or a good emoji, a sorta good emoji, or something that looks like the Tasmanian devil.
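To make the idea concrete, here is a minimal sketch, in Python, of how such a one-two-three code might be computed from a user’s sharing history. Everything in it is an illustrative assumption, not anything the companies actually run: the function name, the thresholds, and the premise that the platform has already flagged each shared item as disputed or not.

```python
# A minimal sketch of the one-two-three sharing code described above.
# Assumptions (not from any real platform): each shared item has already
# been flagged as disputed (True) or not (False), and the cutoffs below
# are arbitrary illustrative choices.

def reputation_code(shared_items):
    """Return 1, 2 or 3 based on how often a user shares disputed items."""
    if not shared_items:
        return 1  # no sharing history, so nothing disputed has been spread
    disputed_fraction = sum(shared_items) / len(shared_items)
    if disputed_fraction == 0:
        return 1   # never shares disputed information
    if disputed_fraction < 0.5:
        return 2   # shares it occasionally
    return 3       # shares disputed information most of the time

# Example: a user whose last ten shares include six flagged items
print(reputation_code([True] * 6 + [False] * 4))  # prints 3
```

The same tally could just as easily be mapped to colors or emojis; the point is only that the raw material for such a label already exists in the sharing data the platforms keep.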

Pick your poison. But when we know that it’s more appealing to spread disinformation than information that’s true, it behooves us to do what’s necessary to restore truth’s primacy. That outweighs, one would hope, a marketplace that places a value on profitability alone.

Emerson Lynn is co-publisher of The Essex Reporter.