Notes From C. Thi Nguyen Interview About Games and Society

Link: https://www.preposterousuniverse.com/podcast/2021/10/18/169-c-thi-nguyen-on-games-art-values-and-agency/

C. Thi Nguyen received his Ph.D. in philosophy from the University of California, Los Angeles. He is currently an associate professor of philosophy at the University of Utah. He has written public philosophy for venues such as Aeon and The New York Times, and is an editor of the aesthetics blog Aesthetics for Birds. He was the recipient of the 2020 Article Prize from the American Philosophical Association. His recent book is Games: Agency as Art.

I only excerpted topics of particular interest to me, so you should listen to the podcast for a fuller understanding. Anything in bold is my emphasis.


Intro by host Sean Carroll

One of the good things about the artificiality of games is that you know when you’ve won or lost, unlike in life. But…that clarity of knowing when you’ve won or lost is very seductive outside the context of a formal game. It’s very seductive in life.

Thi has developed this understanding to study things like echo chambers and cult leaders. A cult is in many ways like an echo chamber. In both cases, it’s not just a filter bubble where you prevent information you don’t want from getting in, but it’s like a strategy for preemptively undermining claims from outsiders that the cult leader or the echo chamber doesn’t want you to believe in, right? You give people ways to discount outside information.

One of the reasons why cults and echo chambers are so seductive is that they bring clarity to values and moral reasoning, maybe a little bit too much clarity: They make it too easy, they make things cut and dried in a way that the world itself is often not so cut and dried. So he has a whole understanding of why we’re so seduced by conspiracy theories, by cult leaders, by echo chambers, and how it relates to this seductiveness of clarity that we get from thinking about games and points. We get points, we get likes on our tweets, we get steps on our Fitbit. This engages our brains for interesting evolutionary reasons, and that feature of human psychology can be gamed, if you like, by the leaders of cults or echo chambers.

Games

What is a game?

The notion of a game is really disputed in philosophy. It’s very storied. In philosophy, it’s a particularly famous concept, because when Wittgenstein was like, “No, you can’t define concepts,” his example of an undefinable concept was a game.

There’s an amazing book by Bernard Suits called The Grasshopper, which is an attempt to define games. It actually takes itself as a response to Wittgenstein, and it is also secretly about the meaning of life and the relationship between games and the meaning of life.

The short version is, “To play a game is to voluntarily take on unnecessary obstacles for the sake of making possible the experience of struggling against them.”

  • Part of the idea of the game is that the goal of the game is partially constituted by obedience to certain restrictions:

    If you’re trying to get to the top of a mountain to get some rare drug that’s only there, you’re not playing a game. You’re just trying to get to the top of the mountain. If you’re trying to climb the mountain as a mountain climber, then certain restrictions are part of what you’re doing. So the medicine seeker is not a game player, and the mountain climber is. And one way you can tell is if someone goes by in a helicopter and says, “Hey, you want a ride?” The medicine seeker will say, “Of course, get me the cancer drug.” And the mountain climber is like, “Of course not. What do you think I’m doing?”

  • Autotelic: It’s worth engaging in the activity for the sake of the engagement and the doing rather than the product.

The charm or allure of games

  • Struggle

    The aesthetic experience of struggle can be accessed in a safe way by making the difficulty manageable.

Some struggles are beautiful, some struggles are satisfying. And what games do is, they let you tweak the activity to maximize that satisfaction…Most of life’s challenges are too big or too boring and little for us. In games, we get to modify the world of the game and the abilities we’re allowed in the game until they fit just right.

Example:

I feel like things like chess are kind of tuned to maximize that moment; you get more and more of those moments. In my normal life moving around the world, I get to feel graceful once a week, but rock climbing tunes you into the part of the activity that has that feeling. It’s built to constantly call out of you that incredible experience of delicate, graceful, perfect motion.

Manageable because of clarification

Games are these circumscribed spaces where the actions in the space have often been leaned down and clarified, so your actions can fit. They’re clarified not just because the actions you can perform have been clarified, but because your values have been clarified.

Bounded and limited beings make things that make them feel temporarily okay, like spaces where we don’t feel too little for this vast world. The real world is this existential hell-scape of too many values, and games are like this temporary balm, where the world makes sense for a little bit.

  • Games as art

    Games are sculpted experiences of practicality. I think what’s interesting in games is, if games are sculpted practicality, then the beauty emerges in the practical action. So in other words, when you play a game, it’s not the game that’s beautiful, it’s you that’s beautiful.

    I think a lot of the literature about games has been going around looking for qualities that are in the game, like, Oh, the graphics are beautiful, the sound is beautiful, the story is beautiful. And they’re not looking at how radically different games are. And I think there are other things like this that are also mostly neglected, but the thing that makes games unique is that they’re sculpted action.

The danger in having your values clarified

  • Gamification

    Game values are hyper crisped-up, and that’s fine if you put away those values at the end. But when you gamify something like education or communication, then you’re forcing a singular clarified value system on a real-world activity. Think of how Fitbits or Twitter engagement can orient your goals towards local maxima. Twitter’s gamification squashes individuals’ pluralistic values and gets everyone, insofar as they’re motivated, to be motivated in the same direction.

    • Education examples:
      • Sean Carroll: I used to be at the University of Chicago, which obviously has always been academically super-duper strong, but back in the day, it wasn’t a place people applied to the way they did to Harvard or Stanford or Princeton; it was less well known. So there was a strategy that they undertook at the University of Chicago, because they were being hurt in the US News rankings, and they were being hurt because the only people who applied to the University of Chicago were the ones who really wanted to go there. You are rewarded in the US News rankings by having high selectivity, by rejecting most of the people who apply, so they intentionally encouraged people to apply knowing they would reject them, ’cause it increased their selectivity, and they leapt up in the rankings. That’s an example of maybe the goal perverting the original aspiration (see the sketch after these examples for the arithmetic).
      • Law School culture
        A study of law school culture from when US News and World Report started ranking law schools charts a bunch of stuff like what you’re talking about, about people gaming the rankings. One of the things they point out is that different law schools used to follow different missions before the rankings, but if your mission skews from the ranking at all, then you drop in the rankings, so it’s forced everyone to pursue the same values.

        Before the rankings, prospective law students used to talk about what different law schools valued, talk about their own values, and decide what their values were in order to pick which school to go to. Now, say, 99% of the students just assume their goal is to get into the best school, where the best school is set by the ranking. So they don’t go through the process of value self-deliberation. You end up outsourcing your values; you end up letting somebody else perform value deliberation for you, and what goes into those values is often very much based on what’s in the interest of large-scale institutions and the kind of information management systems those institutions run on.
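
To make the rankings arithmetic concrete, here is a minimal sketch (with hypothetical numbers, not from the podcast) of how a selectivity metric rewards soliciting applications you intend to reject:

```python
# Selectivity as rankings reward it: the share of applicants rejected.
def selectivity(applicants: int, admitted: int) -> float:
    return 1 - admitted / applicants

CLASS_SIZE = 2_000  # the admitted class is identical in both scenarios

# Before: only people who really want to attend apply.
before = selectivity(applicants=8_000, admitted=CLASS_SIZE)

# After: marketing drums up 8,000 extra applications that will be rejected.
after = selectivity(applicants=16_000, admitted=CLASS_SIZE)

print(f"before: {before:.1%}, after: {after:.1%}")
# before: 75.0%, after: 87.5%
# The school looks far more "selective," yet nothing about the education
# or the admitted class has changed; only the metric moved.
```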

People worry a lot about games creating violence, and there’s actually a lot of data that mostly they don’t. And I think part of that is that the violence in games is fictional, and we have a lot of information that people are mostly capable of screening off fictions. The thing that I’m really worried about is people becoming used to the idea that the goal is some simple, quantified thing that people share, and that what we’re supposed to do is do everything in our power to up that simple measure. And one thing to note: that’s not fictional.

We shouldn’t worry about games creating serial killers, we should worry about them creating Wall Street bankers.

[Obligatory reference to James C. Scott’s Seeing Like a State: “Look, what you should think is that large-scale institutions generally see the parts of the world that are processable by large-scale bureaucratic machines, which are quantified data; they can’t register the rest.” So, think about this: a large-scale school district or an educational bureaucracy can’t register individual, qualitative student evaluations; they can only register aggregatable data like GPA. So, says Scott, large-scale institutions have reason to remake the world along lines that are more regular, so that the world is legible to the institution and actable on by the institution.]

  • The satisfying aspect of games common to conspiracy theories

The satisfaction we crave in the clarity and simplification of games is why we are drawn to conspiracy theories. Conspiracy theories are tuned to give you the exact same pleasure [as games]. Someone has changed the nature of the world, apparently, to make it tractable.

How the argument works:

    1. There’s a source of anxiety: The world is complex

      Scientists are hyper-specialized, no one understands everything, and at some point you realize that you have to just trust tons of stuff that you have no ability to grapple with.

    2. Conspiracy theories circumvent our need to trust

      They say “Don’t be sheep. Don’t trust other people. Here is a vision of the world, where you can contain the world in you. You can explain all of it with this one powerful explanation.” It is a game-like pleasure, but exported to a place where it’s dangerous.

      • Refers to a book by Elijah Millgram called The Great Endarkenment. It talks about how knowledge cannot be an individual quest, given that the world is so hyper-specialized that no one can know more than a tiny amount of it. [This stands in contrast to the ideal of intellectual autonomy that drove the Enlightenment. But that ideal doomed itself, because it created all the science that made it impossible to be intellectually autonomous. If you still hold to the old ideal of intellectual autonomy, where everyone can understand everything, what you get is people being driven to anti-vaxxing and various conspiracy theories in which they reject trust in the sciences.]

    3. Conspiracy theories hack our desire for clarity

      It’s not that clarity is always bad. When you have intellectual success, you do have this feeling of clarity. My worry is that the feeling of clarity can come apart from real understanding, and that outside actors can game it.

      For example, in the course of evolution, it made perfect sense for us to pursue sugar and fat, because calories were scarce and it was hard to get enough fat and so on. Then the world changes, and industrial forces figure out that they can maximize the feeling of sugar and fat separate from any nutritive qualities. And if you’re still stuck on that old heuristic, chasing sugar and fat, then you’re screwed. Clarity can be like cognitive sugar. Someone can aim to max out the feeling of clarity, and what that looks like is a conspiracy theory.

Trust’s role in conspiracy theories

1. Trust is tricky because there’s a lot you cannot trust, yet being able to trust is not optional

If you devote your entire life to it, you can understand one one-millionth of the human landscape of knowledge. [Even worse] not only can you not understand everything else, you don’t even have the capacity in yourself to pick the right experts to trust.

I have a PhD in philosophy; I have a lot of education. But if you gave me a real statistician and a fake one, I don’t have the mathematical skills to tell the difference between a good statistical paper and one that gives a bad result. I don’t have that in myself. So what you get is actually this incredibly iterated and very fractal chain of trust.

[An example of why we have no choice but to trust in many cases:

There was one proof of a really famous theorem that literally only one guy understood, and he’s getting old, so a whole bunch of people launched a project to rewrite the proof in a way that other people could understand. And that whole process of dealing with the fact that you need to trust some things, while you shouldn’t trust everything, is a tricky thing that is probably under-theorized.]

Politically, conspiracy theories tend to be associated with the right, but the left’s appeals to science fall flat because they don’t appreciate the role of trust:

      • The false argument they spout:

        “Oh, these fucking anti-vaxxers, anti-maskers. Think for yourself, look at the science, evaluate the science.”

        When the truth is “I can’t evaluate that science.”

      • The obstacle is not whether people trust Science with a capital “S”; it’s whether they trust its messengers

        Look at the people who are legitimated in a certain institutional structure, which involves a background trust in those institutions. And I think there’s this vision where, for a lot of us, when you look at anti-vaxxing, anti-masking, the climate change denialist space, we want to say something like, “Oh, those people are totally irrational.” But I think what you have to think instead is, they have an entirely different basic framework of trust, for a different set of institutions. The degree of rationality there depends on the degree to which we can justify our trust in our institutions. And that’s a really, really complicated matter, and it’s not like the authoritative institutions are always right. There are plenty of historical cases where they are corrupted, right?

        It’s hard to be against conspiracy theories as a blanket statement, ’cause sometimes there are conspiracies. We have plenty of historical examples where all the institutions in a particular country have become corrupted, have taken over the news media, and are issuing false statements. That’s a real thing that happens.

2. Echo chambers are not about ignorance, they’re about trust

An echo chamber is a system in which people have been taught to systematically distrust people on the other side. The book Echo Chamber doesn’t quite say the world around Rush Limbaugh is a cult, but they basically almost say it. The book is an empirical analysis of the world around Rush Limbaugh: his people just systematically distrust and dismiss people on the other side.

This is different from not hearing them at all (which is the filter bubble argument). The problem goes back to trust, not irrationality or disbelief in science.

A large segment of the population has had their trust subverted and undermined and directed toward what we think of as like the wrong institutions.

The way back is not to wave the evidence in people’s faces. I think people want to be like, “Oh, climate change denialists, just look at the evidence, here is the evidence,” but of course they’re not showing the evidence; they’re pointing to scientists they trust who have processed the evidence, ’cause not even a particular climate change scientist has looked at all of the evidence for climate… It’s all processed.

**Someone whose trust has been systematically undermined in that set of institutions will not trust evidence from sources they distrust.** And that is rational.

This is a complicated problem without easy solutions. But the first step is recognizing that the story is not that the other side doesn’t hear us; **it’s that their trust has been undermined.**

A lot of public policy figures are fixated on the filter effect, thinking we just need to create these public spaces where people can meet each other and talk to each other. That’s a standard view in a lot of political philosophy and public policy. And I think that’s not going to work if trust has already been systematically undermined. It doesn’t matter if you meet and hear the other side if you already have a prevailing story that says they’re malicious, manipulative, evil people.

The grand takeaway

The need for clarity, as evidenced by our love of both games and conspiracy theories, can cause us to overoptimize and to seek safety in echo chambers.

What do we do about it?

While there are no definitive answers, there are some hints that can point in helpful directions:

  1. Transition between perspectives

    Playfulness is the ability to transition between different world perspectives easily and lightly, to hold your perspective lightly and slip between different ones. This can be done by literally traveling, and by playing games in which you are forced to slip between different value perspectives.

    It’s hard to do; it’s hard to put on different worldviews like different outfits. This is a resonant argument for reading and learning widely:

    This is going to become the world’s oldest chestnut, but sometimes I think, this is what the fucking humanities are for. Read some art, read some novels, motherfucker. And if you want the background paranoid view, it’s something like: the world has very good reasons to get us to onboard super simple, clear targets. And when I look at universities cutting humanities programs in favor of business schools and STEM, because those lead to higher-earning jobs or to more clearly measurable productivity, I’m like, of course. Reading weird subtle art, experiencing weird subtle art, including games, but also novels and music and all this other stuff, is the stuff that might have clued you in to value perspectives other than make a lot of money and get a good job. And of course, that is going to get cut out in a world dominated by hyper-simplified institutionalized values.

    One of the suspicions I have is that certain domains, especially the domains that science has a lot of success with, are the kinds of domains that admit of extraordinary clarity. And other domains, like the domain of life values and the domains of personal health and fitness and aesthetic joy, are not domains that admit of the same systemic clarity. And when we demand it, we start hitting simplified targets.

    There’s a difference between qualitative and quantitative data. Qualitative data is really rich and nuanced and subtle, but it’s really context-sensitive. It doesn’t aggregate, it doesn’t travel well.

    Not everything valuable can be measured, and not everything we measure is valuable.

  2. Be aware that there are forces that are trying to manipulate us

Like the sugar analogy: is this moral view or worldview too yummy? Is it just too satisfying? Did someone make this just for me and people in my cohort to swallow down?

This is definitely not a blanket rule. The thing is, you also get clarity and pleasure from really getting at truths. You can’t throw all that stuff away; you just have to realize that the signal is amenable to perversion and misuse.
