Making Sense of the Noise

As someone who has had an excess of free time on my hands over the last two years, I’ve put a fair bit of effort into keeping track of what’s happening in American and global politics.  It started back in 2016 during Trump’s campaign.  Before that, I made a point of avoiding politics and the news.  During the election, my two primary news feeds were the Google app and Facebook.  It was really something to watch Facebook become as political and disinformation-heavy as it did.  I got into a variety of debates there before chalking it up as a lost cause.  Even the Google feed wasn’t great.  I had to find a better way of staying informed.

I guess you could say I took an interdisciplinary approach to figuring out what was going on.  I read philosophy books to get a better understanding of fundamental concepts like democracy and morality.  I read an astrophysics book to try to connect the small to the big.  I read psychology books to better understand why people do what they do.  I even read a neuroscience book to understand how and why people think the way they think.  And all the while, I was consuming about 10-20 news articles a day.  That was on top of another 10-20 search queries a day, just on random things I wanted to know.

It was a lot of information, and it didn’t happen quickly, but I eventually started to piece it all together.  There are certainly gaps in my understanding of what’s happening, since I’m missing key pieces of information.  But for what’s on the public record, I have a pretty good handle on what’s going on.  My friends have actually started to tease me for it, but they also use me for updates, so I think my efforts are appreciated.

One of my friends asked me the other day what my process is for making sense of all this noise.  She wanted to know what sources she could trust, where she should be looking, or anything else to help make things a bit clearer.  I’ve given that some thought and tried to isolate the algorithm my brain uses to go from data to information to knowledge.

Step 1: Find a platform that effectively aggregates news sources and stories according to your preferences.  Personally, I’ve found Reddit to be the most effective.  Its open platform means that voices of all shapes and sizes are on there.  I can check in on what the far right thinks about a major event, and then, just as easily, check in on what the far left is saying.  I’ve found that these communities, when not censored, provide some great insight into what people are thinking and feeling.  While far from perfect, there’s also a degree of accountability within the community, and that makes for some solid fact checking.  After curating my home feed, I receive an effectively unlimited stream of interesting data.  It’s extremely easy to navigate as well, since it just shows up as a scrolling feed of headlines, pictures, and videos.

Step 2: Scan the headlines.  I’ve found that most news reports these days contain one new, relevant data point surrounded by filler.  The filler is usually a summary of previous reports leading up to the current one, and it’s often laced with biased commentary.  And for each new data point, dozens of publications will write a story on it.  Being able to scan headlines for key info not only keeps you from reading redundant material, it also helps you stay focused on the facts.

Step 3: Does the article provide a question or an answer?  I’ve found that a common approach to filling the news cycle is asking the question we’re all thinking, then trying to answer it with no new information.  Something to the effect of, “Robert Mueller Probe to End Soon?”  We’d all like to know the answer, and the headline suggests the outlet knows something we don’t.  It never does.  The reality is that if any reporter or news outlet had factual information indicating when the Mueller probe would end, that would be a story in itself and breaking news.  I tend to filter out the questions, unless it’s one I haven’t asked myself before.  What I’m really looking for while scanning these headlines is answers.

Step 4: Verify new information.  The first thing I look for is who’s reporting on it.  If the same event is being reported by all major outlets, there’s a good chance it happened.  The next step is jumping into one of the articles and looking for the quoted material and who it’s sourced from.  Almost every article will reference the same quote, so you don’t really have to worry about the bias of the reporter unless you start reading their commentary.  Once you know what was said and who said it, you see how that fits into your larger understanding of the situation.  If it fits in neatly, in it goes, and your understanding of what’s happening has grown.  If it doesn’t fit in neatly, it’s time to figure out why.

Step 5: Analysis.  When a headline or quote fits neatly into what I already know, it’s like adding a collectible to my collection.  In most cases, I knew what I was looking for, and I knew exactly where it would go once I found it.  But now and then, a new piece of information doesn’t fit neatly into what I already know, and I have to figure out why.  Sometimes the information is incorrect, sometimes my understanding of the situation is incorrect, and sometimes it’s somewhere in between.  Regardless, the process is always the same: dig in until I understand what’s going on.  More often than one might expect, it leads me down some rather deep rabbit holes.  These journeys can take me through a variety of sources, including news articles, Wikipedia articles, scientific studies, historical texts, and plain old books.  Until I understand it, it doesn’t get added to my understanding.  I’d like to think that one of my most valuable skills in this process is being okay with leaving things in the maybe pile and not jumping to conclusions.

Step 6: Adoption.  Once a new data point is verified and it clicks into place, I move on.  I don’t mind having to go back and undo some of that work when something isn’t what I thought it was.  I’ve found that the conspiracy theory crowd has a hard time seeing the bigger picture because they get stuck on the validity of key details.  It’s like scientific consensus: you don’t need to be 100% certain to move on, just 99%.  If someone presents new evidence that doesn’t align with what you previously thought?  Sweet.  Time to go figure out why and learn something new.

 

It would seem that the filtering process I’m applying to my data feed is an elaborate IF statement.  If the new data passes my verification process, I add it to my programming.  If it doesn’t, I analyze further.  If the problem is that the data is corrupt, I flag it as such and set it aside.  If the data proves to be valid, I perform a self-diagnostic to understand why it conflicts with my existing programming.  More often than not, it’s my existing programming that needs to be updated to accommodate the new data.  Sometimes the self-diagnostic can’t find anything wrong with the data or the programming, in which case it’s flagged as such and set aside for further review.  Every once in a while, I’ll get an update that helps me make sense of something that was set aside, and even that piece of information gets tucked away neatly where it belongs.  Maybe a bit methodical, but I gotta say… seems like a good way to go about things.
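
If I were to write that IF statement out as actual code, it might look something like the sketch below.  This is just a rough Python analogy of the process I described, not a real system: the Worldview class, the verified() check, and the toy string conventions are all hypothetical placeholders for the mental steps above.

```python
from dataclasses import dataclass, field


@dataclass
class Worldview:
    """A stand-in for what I already believe (purely illustrative)."""
    accepted: list = field(default_factory=list)
    maybe_pile: list = field(default_factory=list)

    def fits(self, data: str) -> bool:
        # Placeholder conflict check: does the new data contradict anything accepted?
        return not any(data == f"not {known}" for known in self.accepted)


def verified(data: str) -> bool:
    # Placeholder for Step 4: same event across major outlets, same sourced quote, etc.
    return not data.startswith("rumor:")


def process(data: str, worldview: Worldview) -> str:
    """The elaborate IF statement, roughly as described above."""
    if not verified(data):
        return "discarded"                   # corrupt data: flag it and set it aside
    if worldview.fits(data):
        worldview.accepted.append(data)      # clicks into place: adopt it and move on
        return "adopted"
    # Valid data that conflicts with my existing programming: self-diagnostic time.
    # In practice the usual outcome is updating the old belief; when I can't tell
    # yet which side is wrong, it goes in the maybe pile for further review.
    worldview.maybe_pile.append(data)
    return "set aside for further review"


# Example run
wv = Worldview()
print(process("the probe is ongoing", wv))            # adopted
print(process("rumor: the probe ends tomorrow", wv))  # discarded
print(process("not the probe is ongoing", wv))        # set aside for further review
```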