After two years of investigation and constant media coverage, the Mueller Report is finally finished. No one outside the Justice Department has yet read the full report, but Attorney General William Barr has released a summary. The so-called Russiagate story is not yet over, however, as there are now calls for the entire Mueller Report to be made public. Exactly what the Russiagate story is and how it started is expertly told by Matt Taibbi in his piece “It’s official: Russiagate is this generation’s WMD.” What I’m more interested in is how this whole story is indicative of human nature.
Social media, and Twitter in particular, has recently become central to the conversation about freedom of speech. At issue is the claim that Twitter punishes people for posting right-wing and conservative ideas more often than it punishes people on the left. Alex Jones being banned and Kathy Griffin not being banned are two exemplary cases.
The fear here is that Twitter is policing people for wrongthink. Only left-wing and liberal ideas are allowed, and with Twitter being a primary hub for communication, this threatens to silence right-wing and conservative views from the public conversation. This would give left-wing and liberal ideas de facto hegemony in western culture. This has prompted people to call for Twitter usage to be treated like a utility or even a human right, in the sense that humans have a right to free speech.
I think this is a misguided way of thinking about Twitter. Being banned from Twitter does not infringe on a person’s right to free speech. It only infringes on their ability to have that speech heard by a larger audience. This brings up the questions: do humans have a fundamental right to be heard? Is being heard a part of our right to free speech?
In this day and age, with social media and politically biased news media, it seems that truth isn’t truth. Thus it has become popular for people to speak of ‘my truth’ when they talk about their opinion. It’s interesting that people can believe wildly different versions of the same events. From Russiagate to Uranium One, it all seems to depend on what ‘our truth’ is for any particular group.
The alternative, unfortunately, is to have some version of a single, orthodox ‘right Truth’ that we all must agree on, possibly enforced by a Ministry of Truth or a Truth Force.
I’m okay with a US Space Force. But what we need most is a Truth Force — one that defends against all enemies of accurate information, both foreign & domestic.
— Neil deGrasse Tyson (@neiltyson) August 20, 2018
The problem with this is obvious: who gets to say what the Right Truth is? What are their motivations for saying that A is the Right Truth but B is not the Right Truth? Is it ever possible for one source to give the truth, the whole truth, and nothing but the truth?
There is a truism that in everyone’s life, there will come a time when youth/pop culture doesn’t make sense anymore. You’ll hear the music kids these days are listening to or see the shows they’re watching and you’ll just think “I don’t get it – how do people enjoy this?” The irony is that, when you’re a pre-teen, or a teenager, or a person in your early-to-mid twenties, you think this will never happen to you. You’ll always be hip and with it. And then it does happen to you. But you don’t think it’s that you aren’t hip and with it anymore; it’s just that youth/pop culture became shallow and derivative. It’s them that sucks, not you.
This is all still very true. I’ll call this the gradual stratification theory. In this view, there is a more-or-less homogeneous culture spread throughout a particular age group, often categorized by the Strauss-Howe generational theory – the one we use to sort people into baby boomers, Generation X, millennials, and so forth. Sure, there is variation in interests and across regions, but for the most part, those things which are massively popular are either consumed by a large number of people or are at least known about by the majority of people. This is a result of there being very little choice about what to consume. Before cable in the eighties, there were three channels on TV, only a handful of movies came out each year, watching movies at home was at best a luxury, and there were no video games. And, of course, there was no internet.
It’s obvious that the internet changed these paradigms in a fundamental way. But I think it’s important to really appreciate how much has changed. What it has changed into is what I call the granular bubble theory of culture. One of the biggest changes that has come about, mostly (but not entirely) because of the internet, is on-demand content. No longer do people need to just turn on the TV and watch whatever is on those three channels at the time slot the network decided on. Now people seek out content – it’s all demand based rather than supply based. But that part is obvious. What’s more, though, is that people find those things that suit their taste – news sites, blogs, podcasts, forums, Twitter users, Facebook users, Youtube channels – and settle on those places. Why would people need to go outside of those places that supply them with the content they want to consume?
This creates both a granular culture and a culture bubble. It is granular because that near homogeneity seen in the purely gradual stratification theory is mostly gone. I consume the content I want and you consume the content you want, and there does not need to be much overlap. Sure, we might have the same taste in books, but then I can go immerse myself in a world of my taste in music and you in yours, and never do those worlds have to cross paths. But it also becomes a bubble culture, because I ONLY pay attention to the things I enjoy, and culture in the other granular bubbles evolves on its own without my even being aware. And then those bubbles perpetuate themselves, because all the advertisements, all the external links, all the searches – everything in that bubble – only point you to other areas of that bubble, using algorithms that figure out which granular areas you are interested in. You can watch shows you like on Hulu, and when the advertisements come on, it will literally ask you whether that advertisement is relevant to you – you are not just choosing your content, you are tailoring what is marketed to you.
Back in the 1950s, people could go into work with the implicit assumption that three quarters of the people there had watched “Leave It to Beaver” the day before, and the other quarter at least knew what the show was. It was part of the small amount of content available to consume on one of the only means of consuming it. In the 1960s, the culture changed as something else supplanted the popular culture of the 1950s, but it changed rather homogeneously. Same into the 1970s. And then in the 1980s, cable TV, VCRs, and video games became a big(ger) thing. Now I can watch TV channels that play content I enjoy all day, and I can buy movies and video games with content I want to watch or play. This only grew in the 1990s, when more channels, more gaming systems, compact discs, and even computers started becoming a thing. Then it really blew up in the 2000s, when the internet came into its own, when more and more of that content became available to us everywhere on our cell/smart phones, and user-generated content in the form of blogs, video hosting channels, self-published books, and social media became popular. Things have only become more granular and more enshrined in their bubbles in the twenty-teens (2010 to present) as on-demand has picked up and solidified – user-generated content is quickly surpassing professional content in popularity. And yet, someone who is incredibly famous on the internet can probably go home at Thanksgiving and find that nobody in their family knows or understands what they do.
I’m not judging whether any of this is a good thing or a bad thing, merely noting that it’s a different thing. And it’s probably something we’ll all have to get used to, because as of right now, I don’t see anything coming that will shift us back – or, stranger yet, shift us in some completely new direction.