Living in a “post-truth” reality is truly disturbing.
"At the moment we have no way of understanding whether what one sees online is organic or part of a campaign, a human or a bot; who has created certain pieces of content and to what end; why an algorithm shows us one piece of content over another; which bits of our own data are being used to target us with certain messages.
In practical terms, media needs to create content that forces politicians to talk about constructive and practical solutions to problems, and then develop technologies and methodologies to hold them to account going forward, testing their policies and promises."
The media lacks the wherewithal and the discipline to even dream of accomplishing such a thing. I don’t see it as any more attainable than the vacuous concept of making America great again. I think we’ve passed the point of no return on both.
While I appreciate your cynicism in this regard, I can’t disagree more.
The specter of fake news has simply increased the importance of the mainstream media (MSM). The Washington Post, the New York Times, the BBC, the CBC–these are where I turn for news, and I’m mostly happy with their accuracy.
They aren’t perfect, but no human endeavor is. I don’t expect perfection from NASA, my parents, or the Pope. Why should I expect it from a news outlet? I track what they say and then watch what actually happens, and mainly I’m looking at accuracy.
Of course, I know the difference between the news side of a news outlet and the opinion side. A lot of people can’t wrap their heads around that.
I’m always amused by people who deride MSM–and then quote WP or NYT to prove their point!
Sometimes news outlets get things spectacularly wrong, just like surgeons do, or NASA–or merchant marine officers. Dan Rather, Brian Williams, Judith Miller are examples of journalists who lost their jobs because of inaccurate reporting. But the point is, they lost their jobs. Their news outlets have standards. Not all of them do. What if we held politicians to the same standards of accuracy?
When someone in power says, “Everybody is lying to you except me,” know with certainty that you’re listening to a demagogue at best and a tyrant at worst.
As for the prosperity of the USA, I am encouraged by something I read by James Michener in Iberia, his non-fiction book about Spain, written in 1970. In it he writes about worries of the same magnitude as those present-day Americans have about their country’s future. He writes about a generation of young people strung out on LSD and pot, about political polarization and violence, and about racial unrest. Instead of climate change, the threat then was nuclear annihilation. To read him, the America of 1970 was on the verge of collapse. Or not.
Since 1970 the USA has only become more powerful and prosperous. It has a new set of challenges. Michener said then that every nation reinvents itself in a cycle that lasts about 18 years, and that the people brought up in the old cycle are not inclined to be rosy about their cycle coming to an end and a new one starting up. 1945–1963–1981–1998–2016…
Of course not every journalist is so biased as to be untrustworthy, but no human being or news outlet is completely neutral. You’re right about the way to get to the truth, or at least as close to it as we can get, which is to find the middle ground between various interpretations.
When I refer to the general media, I mean the news sources with an obvious agenda and their effect on people who listen only to reports that confirm their existing beliefs and discard everything else as fake news.
I can only hope you’re right about the cycles and that the general media will regain some sense of responsibility. Time will tell but I don’t expect it to happen in my lifetime, keeping in mind that I’m approaching the end of the trail.
As to making America great again, for those whose interpretation is that we can turn back the clock, that ship has sailed.
A very interesting look at the ongoing changes in public discourse. This quote puts its finger right on what’s going on in the political landscape:
Information, in this worldview, precedes essence. First, you have an information warfare goal and then you create an ideology to fit it. Whether the ideology is right or wrong is irrelevant; it just needs to serve a tactical function.
The article concerns itself mostly with things that have already happened, but I believe there are dramatic changes afoot, especially with regards to this:
On a more insidious level, the logic of a propaganda that targets people with unique messaging based on the data they leave
I think it’s hard to fully appreciate how much the advent of high-level neural AI will change this. So far, the process has been largely driven by manpower aided by primitive bots. The recently presented GPT-2 is particularly frightening, as it is able to produce text with stunning eloquence and coherence. As one pundit put it, “I know journalists who don’t write this well.” Once this technology fully matures, crafting targeted propaganda will no longer be manpower-bound, but a simple matter of computing power. Essentially, it is to information warfare what nuclear weapons are to conventional warfare, and we have only just begun to see the outline of the changes it will bring.
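To give a sense of the mechanics, here is a toy word-level Markov chain. To be clear, GPT-2 is a large transformer network, not a Markov chain; this sketch (corpus, function names, and all) is my own invention and only illustrates the basic idea of generating text by sampling from observed statistics.

```python
import random
from collections import defaultdict

def train(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    model = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        model[prev].append(nxt)
    return model

def generate(model, start, length=10, seed=0):
    """Walk the chain, picking a random observed successor each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = model.get(out[-1])
        if not successors:
            break
        out.append(rng.choice(successors))
    return " ".join(out)

# A tiny made-up corpus; real models train on billions of words.
corpus = ("the media reports the news and the public reads "
          "the news and the public shares the news")
model = train(corpus)
print(generate(model, "the"))
```

Even this trivial model produces locally plausible word sequences; GPT-2 does the same with vastly richer context, which is exactly what makes targeted text cheap to scale.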
I think I posted this video in another thread, but it bears re-posting in this context:
Another worrying development in the same genre is the emergence of deepfakes. High-fidelity moving images have long enjoyed a position as irrefutable proof, because they have been so costly to falsify that it was essentially impossible without an AAA Hollywood budget. Now it is rapidly becoming a trivial matter. When people have already started losing interest in the truth because it is hard to discern, what will happen when the gold standard of irrefutable proof is destroyed?
“Objectivity is a myth imposed upon us,” says Dmitry Kiselev.
In the context of this article, it makes sense to refute that claim. However, one should bear in mind that any philosopher worth his salt would be able to defend it as objectively true. Truth is not absolute, but merely a projection of reality that depends entirely on your point of view.
Pomerantsev doesn’t get carried away with technical details, and rightfully so, seeing as the article is already long enough. However, some technological context is needed, for example for this statement:
Implementing a regulatory approach that focuses on behavior and mechanisms of information production means redesigning user interfaces to make these mechanisms of information production interpretable.
[Corrections below] That is simply not feasible on a technical level, because the information isn’t there. The sort of neural algorithms Google et al. use for content targeting have no human-readable data at the intermediate levels. Even Google’s own analysts can’t really tell why YouTube chose to suggest that particular cat video, since the logic is too fuzzy to make sense of. They could tell you which of a million neurons fired at what level to float that video to the top of the pile, but that doesn’t tell you very much at all. This is one of the great challenges in neural AI development, and no amount of regulation can hope to hasten a solution.
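To make the opacity concrete, here is a minimal sketch that assumes nothing about Google’s actual systems: a tiny randomly weighted network whose every activation we can inspect, yet none of which reads as a human explanation. All names and numbers are invented for illustration.

```python
import numpy as np

# A tiny two-layer network standing in for the millions-of-neurons
# ranking models discussed above. Weights are random, purely for show.
rng = np.random.default_rng(42)
W1 = rng.normal(size=(8, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 1))   # hidden -> output weights

def score_video(features):
    """Score one 'video' from an 8-number feature vector."""
    hidden = np.maximum(0, features @ W1)   # ReLU hidden layer
    return hidden @ W2, hidden

features = rng.normal(size=8)   # some hypothetical video's features
score, hidden = score_video(features)

# We have full transparency into which "neurons fired at what level"...
print("hidden activations:", hidden.round(2))
print("score:", float(score[0]))
# ...but nothing here maps to a reason like "the user likes cats".
```

Interpretability research tries to bridge exactly this gap between visible activations and human-level reasons, and it is an open problem, not a user-interface option a regulator can simply mandate.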
The Russians may have invented information warfare in its modern form, so it’s only right that they get the glory, but at least the West isn’t far behind:
Yes, but is that what is being said? If there is information and the source cannot be verified, isn’t the meta-information that it is unverified sufficient?
Implementing a regulatory approach that focuses on behavior and mechanisms of information production means redesigning user interfaces to make these mechanisms of information production interpretable. It requires public oversight of algorithms, easily searchable databases of commercial and issue based ads and campaigns (with information about who they were created by and targeted at), the ability to research how different forces try to game algorithms, and so on. It is by no means a blanket ban on anonymity, which is often important for safety reasons, but accounts should state that they are avatars rather than deceiving people into thinking they are something or someone they are not.
For example, if I read something in my local newspaper, online or in print, I can call or email the paper or the reporter with questions or comments, because it’s a known source.
I wish them luck in their future battlefield encounters with China’s PLA Unit 61398. They’ve been at it for a while.
Lots of useful links in this Wikipedia article’s list of references:
You’re right, I read that a bit quickly. My point was that much of the metadata available so far is not very useful. Of course, a handy database of information origin would be very helpful, but I still fail to see how it could be built.
Very difficult to reverse engineer, but my take was that making the data available for a public database should be a regulatory requirement for social media companies, analogous to the requirement that companies selling processed food put the ingredients on the label.
Here’s a good analysis of a state-sponsored information op:
One thing to remember is that the huge majority of people are too busy just trying to survive to delve into whether the latest news is fake or real. With the ubiquitous cell phone, social media and the bots that determine what one is interested in can quickly link one to BS. Facebook is notorious for this. Google’s search engine likewise zeros in on one’s latest interest: start looking up Jews, Holocaust, power, and next thing you know you may get links or social suggestions to extremist views that seem legitimate.

History has shown that monopolies must be controlled lest they control the population. Social media is one that needs to be controlled. Giving people information they ask for is one thing, but automatically suggesting some BS related to a search is both irresponsible and dangerous. But social media and so-called news organisations will fight that. Why? They make lots of money off of every click.
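The reinforcement loop described above can be caricatured in a few lines. This is my own toy model, not any real platform’s algorithm; the catalog and topic labels are invented for illustration.

```python
from collections import Counter

# Invented catalog; no resemblance to any real recommender's data.
CATALOG = {
    "mainstream": ["evening news clip", "history documentary"],
    "extremist":  ["conspiracy channel", "hate-group stream"],
}

def recommend(click_history):
    """Suggest items from whichever topic the user has clicked most."""
    if not click_history:
        return CATALOG["mainstream"]          # neutral default
    top_topic, _ = Counter(click_history).most_common(1)[0]
    return CATALOG[top_topic]                 # only ever more of the same

print(recommend([]))                          # neutral suggestions
print(recommend(["extremist"]))               # one click closes the loop
```

The danger is not that any single suggestion is wrong, but that a ranking driven purely by past clicks has no notion of “this topic is harmful”, so each click narrows what the user is shown next.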