This week I had the good fortune of chairing a panel at the Society for Applied Anthropology in Salt Lake City. My fellow panelists and I decided to stream it live on YouTube, where it will live for people to rewatch. The title of our panel was Virtual Communities and Imaginary Worlds. The panel was a lot of fun, and it was an honor to share it with two brilliant researchers.
You can watch the video here. You will find descriptions of each panelist's talk and the timestamps for their presentations on the YouTube page. The last 20 minutes are Q&A.
In the last few years, numerous films, books, research papers, and podcasts have been published on the problems with social media. I address some of this in my previous blog, Why Social Media Can Be Such a Dumpster Fire, but that post only highlights some of the social difficulties of navigating platforms as an individual and why there is so much conflict.
Social media is creating and/or reinforcing massive problems in our society. So here I wanted to write something short and open a forum for discussion of how we could possibly fix these systems and make them work for us, rather than the other way around.
My background is as an anthropologist, but more specifically, my area of research during graduate school focused heavily on media systems. I read quite a bit of the literature and conducted some research of my own (you can find the documentary version here) on the relationship between culture and media. However, I would hesitate to call myself an expert, especially on social media. So what I am going to do here is outline what I know and a few ideas I have, and then, hopefully, in the comments below people can engage in some serious discussion or propose ideas. My ultimate goal is to gather some of the best ideas and create an open letter for public consideration. Perhaps we could send these letters to some of the heads of these platforms, or to state or federal representatives, and propose action. There is no doubt that we must address the problems on social media, because we now know pretty clearly that some elements of these platforms are making our world a worse place to live. Like any good tool, we must learn to wield them wisely.
This is a bottom-up approach, something anthropologists call collaborative methods: sourcing the wider community to formulate questions and answers that people may not have considered.
How do you understand media systems?
The most helpful part of a graduate education that included a large component of media research was learning how to ask better questions about media. Generally, there are three major sites of research in media (of course there are more, but these three cover a lot). These sites have their own power dynamics and politics, and they are areas that shift culture and representation. The three areas are Production, Distribution, and Consumption.
Production – The site of production is where the media or platform is created. Who is making it? For what purpose? The entire production and post-production side of media systems falls under this heading. Social media is less concerned with the production side of things, since most of the content isn't produced on these platforms. That doesn't mean how the platforms themselves are built isn't a concern, though.
Distribution – A large portion of the issue with social media platforms is distribution. There is some overlap here with production via algorithms, but algorithms are largely centered on the distribution side of things. You may have heard already how YouTube's algorithms shifted a lot of people toward wild conspiracy theories and helped spread false information like wildfire. They have corrected some of this but have a long way to go.
Consumption – This is the site of consuming the media. What are people watching it on? How long are they watching? What habits do people have surrounding consumption? Do certain algorithms change consumption habits or not? Do people watch this content individually, or do they share it with others? What makes something go viral? How do things like confirmation bias prevent people from consuming different viewpoints?
You can probably tell that these three sites of research have a lot of overlap, because creation of any medium is a cycle and creates all sorts of feedback loops, both positive and negative. Take algorithms, for example. While algorithms determine what gets distributed where (distribution), someone makes those algorithms (production), and then the viewers' behavior (consumption) changes how the production and distribution cycle works. Over time these algorithms are refined to work better in both distribution and consumption. Better, of course, is a relative term here. If your goal is more revenue, that can create all kinds of problematic algorithms, and it is why YouTube accidentally promoted massive amounts of conspiracy theory videos that had no basis in fact.
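To make that feedback loop concrete, here is a toy sketch in Python with completely made-up numbers: a ranker that optimizes only for engagement will drift toward whatever gets clicked most, regardless of accuracy. Nothing here reflects any real platform's code; it just illustrates the cycle of distribution, consumption, and refinement described above.

```python
# A toy engagement feedback loop. The item names and appeal numbers are
# invented for illustration; the point is the shape of the loop, not the data.

# Each item's click-through "appeal" in this toy model. Sensational content
# gets clicked more, so the loop ends up amplifying it.
items = {
    "measured report": 0.3,
    "wild conspiracy": 0.7,
}

# The "algorithm's" learned weights, which decide what gets distributed.
weights = {title: 1.0 for title in items}

for _ in range(10):  # each round: distribute, observe clicks, update weights
    for title, appeal in items.items():
        clicks = weights[title] * appeal  # consumption responds to distribution
        weights[title] += clicks          # distribution responds back to consumption

# Rank items by learned weight: the sensational item ends up on top.
ranking = sorted(weights, key=weights.get, reverse=True)
```

After ten rounds the conspiracy item dominates the ranking, not because anyone coded "promote conspiracies," but because engagement was the only goal the loop knew about.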
So why break it down? Because rather than tackling the whole system at once, it's important to break down each site of research and understand the dynamics of each layer, so that when they are viewed as a whole they make more sense. It's similar to how we learned how the human body works: we took it apart, analyzed it, and over time built our knowledge of how each organ functions in relation to the others.
So, I leave you with an approach (it may or may not be the best one) for thinking about these issues. I will make a few suggestions of my own below for things I think would help move social media platforms in a new and better direction, but the point here is to source ideas from a variety of people with different backgrounds and disciplines so that we can come up with something collaboratively to solve this wicked problem.
My Ideas on Solutions:
Reinstitute the fairness doctrine and expand it to social media. The fairness doctrine was designed to make sure that all media exposed its viewers to a diversity of viewpoints on any topic. Air time had to be given in equal measure to conservative and progressive viewpoints (that is a bit of a simplification, but go to the Wikipedia link above for more), forcing the viewer to consider several sides of each story. How could we do this on social media, though? I think perhaps any article that is shared on social media could have a required and connected article attached to it from an opposing viewpoint. Is the article from Fox News? Create a way so that an article on the exact same topic from MSNBC sits side by side with it, contrasting both headlines. This way the user and anyone who sees the shared post is required to see the different headlines from the different services. It can be incredibly illuminating to see two very different headlines on the same topic or event sitting next to each other, and it encourages critical thinking. Will many people dismiss the counter article? Sure, but what it does do is help those of us living in our bubbles realize there is a completely different way of thinking about the exact same set of facts. Facts are facts, but facts can also be understood from different viewpoints in quite a few cases. Showing different interpretations can at least give us pause and make us consider someone else's viewpoint, even if we don't agree with it.
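As a rough illustration, the pairing mechanism could look something like the sketch below. The outlets, headlines, and lookup tables are all hypothetical placeholders; a real system would need actual topic matching and a carefully curated map of outlet perspectives.

```python
# Toy sketch of the "paired headlines" idea. All data here is invented.

# Hypothetical map pairing each outlet with one of a contrasting perspective.
OPPOSING_OUTLET = {
    "Fox News": "MSNBC",
    "MSNBC": "Fox News",
}

# Hypothetical index of articles keyed by (outlet, topic).
ARTICLE_INDEX = {
    ("Fox News", "tax-bill"): "Tax Bill Promises Relief for Families",
    ("MSNBC", "tax-bill"): "Tax Bill Critics Warn of Deficit Spike",
}

def paired_headlines(outlet, topic):
    """Return the shared article's headline alongside one from an opposing
    outlet on the same topic, or None when no counterpart exists."""
    counterpart = OPPOSING_OUTLET.get(outlet)
    original = ARTICLE_INDEX.get((outlet, topic))
    opposing = ARTICLE_INDEX.get((counterpart, topic))
    if original is None or opposing is None:
        return None
    return (original, opposing)
```

So sharing the hypothetical Fox News tax-bill article would automatically surface the MSNBC headline on the same story right next to it, which is the whole point of the exercise.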
Label everything. I think a label could be created to sit at the top of every single news article regardless of the source. Blue could represent something that has good factual information; yellow, something that is opinion and needs fact checking; red, something that is clear and blatant misinformation. There should also be labels for satire (maybe purple?) as well as for blogs like this one (perhaps gray), which are not as easy to evaluate because they don't have a wide reach. Things that are difficult to evaluate could carry a note saying "not vetted or evaluated for facts" until they are vetted. Labeling things might become annoying for a while, but labels could help us understand what is good and bad information. Yes, some people will ignore them, but people might be more hesitant to share misinformation if it is clearly labeled as misinformation.
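Here is a minimal sketch of that label scheme as data. The categories and colors come straight from the idea above; the function name and banner format are my own invention, and the actual classification step, the hard human work of fact-checking, is deliberately left out.

```python
# Sketch of the proposed color-label scheme for news articles.
from enum import Enum

class Label(Enum):
    FACTUAL = "blue"          # good factual information
    OPINION = "yellow"        # opinion, needs fact checking
    MISINFORMATION = "red"    # clear and blatant misinformation
    SATIRE = "purple"         # satire
    SMALL_BLOG = "gray"       # small-reach blogs, hard to evaluate

# Default notice for anything that has not yet been vetted.
UNVETTED_NOTICE = "Not vetted or evaluated for facts."

def banner(label=None):
    """Return the banner text to sit atop an article; unvetted if no label."""
    if label is None:
        return UNVETTED_NOTICE
    return f"[{label.value.upper()}] {label.name.replace('_', ' ').title()}"
```

Whether the categories, colors, or who does the vetting are the right choices is exactly the kind of thing I hope gets debated in the comments.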
Open the code for algorithms so people can see how they work and what they are doing. This one is tricky because a lot of these platforms are not interested in sharing their code. There are a lot of trade secrets, and any attempt to force them will very likely be met with a lot of pushback. That doesn't mean these platforms don't need to be reined in and held accountable. I don't believe that any of these platforms set out to do harm, but there does need to be social responsibility and accountability built into these systems so they don't hurt society. Why should we do this? The book Weapons of Math Destruction really helped me understand how dangerous it is to keep code closed. We need to be able to refine and correct these algorithms so they don't cause unintended harm.
Those are my three ideas for improving social media. Would they work? What do you think? The point here isn't to throw up roadblocks and say, "Well, that would never get passed in Congress," or something similar. The point is to brainstorm solutions. What solutions do you think would work? Let's collaborate and see if we can take our ideas and make the world a little bit better. Even if we only improved things a little, it could have a massive impact on our culture and society. If we want things to get better, we have to try to tackle them. We can't wait for the people in charge to do something; we must find our own ways to source ideas and act. Be the change and all that, right?
Please keep your comments and opinions respectful. I will moderate these comments. Differing viewpoints are very welcome (and frankly needed), but I will be sure to keep this a place of respect.