An interesting phenomenon is that the better someone is at Magic, the less likely they are to be an asshole. This isn’t an accident, and it isn’t because getting better at Magic makes people kinder or wiser. It’s because people who cling to toxic mindsets and faulty frameworks invariably wind up stuck at the MCQ and GP level.
Personally, my biggest period of growth in Magic came when I stopped playing in person and started playing exclusively on Magic Online. On Magic Online, there was no one to impress, no big fish in small ponds I felt pressured to agree with. There was just me, the screen, and my Event Ticket balance. In the end, I qualified for my first three Pro Tours via Magic Online MCQs.
Magic content these days focuses too much on details and not enough on attitudes, perspectives, and processes. In this article, I’d like to introduce some ideas and frameworks that have served me well.
Alternatives, Uncertainties, and Outcomes
A good decision can have a bad outcome, and a bad decision can have a good outcome. Sometimes you play around their 4-of and lose to their 1-of, and sometimes you beat their 1-of by playing into their 4-of. In the abstract, you can be fully confident that the first decision was good and the second decision was bad regardless of what happens.
In practice, Magic is messier. But that only makes the process more important. To make a good decision, you need to clearly distinguish between your alternatives, uncertainties, and outcomes. And to evaluate a bad decision, you need to determine which axis you failed on.
In the toy example, the alternatives were clear: you could play into the 1-of and beat the 4-of, or you could play into the 4-of and beat the 1-of. But fully identifying your alternatives in a normal game of Magic is an overwhelming task. My friend Jacob Nagro wrote a 2000-word article just on when and whether to play a land each turn. When you don’t make the best play because you simply didn’t consider it, your failure was a failure of identification. This is the mistake I’m personally most vulnerable to: defaulting to autopilot and overlooking the best lines.
- Always Tank Before Playing a Land!
- Jacob Nagro
Now, what if your opponent had indicated earlier in the game that they had their 1-of and not their 4-of, and you simply hadn’t registered that implication? What if you could have inferred their holdings from their body language, but didn’t? These are failures to correctly ascertain the uncertainty you’re facing, i.e., the true probabilities that your opponent has their 1-of versus their 4-of. By default, your opponent is four times as likely to have their 4-of as their 1-of, and more likely to have nothing than either card. However, the information you gather over the course of the game can and should influence your beliefs.
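That default prior can be sketched with a quick hypergeometric calculation. This is only an illustration with assumed numbers: a 60-card deck and a hypothetical count of cards the opponent has seen.

```python
from math import comb

def p_drawn(copies, seen, deck=60):
    """Probability that at least one of `copies` copies of a card
    appears among the `seen` cards the opponent has drawn."""
    return 1 - comb(deck - copies, seen) / comb(deck, seen)

# Assume the opponent has seen 10 cards (opening hand plus a few draws):
p_one = p_drawn(1, 10)   # the 1-of: ~16.7%
p_four = p_drawn(4, 10)  # the 4-of: ~52.8%
print(p_four / p_one)    # ~3.2x
```

The naive 4x ratio is exact only for the very first card seen; as the opponent draws more cards, the ratio shrinks somewhat, but the 4-of remains far more likely than the 1-of.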
The key distinction here, however, is that when you correctly identify the best alternative given an incorrect understanding of the world, your failure was one of perception, not action. When you play into their 1-of and lose, it’s certainly possible you made a bad decision. But concluding that you made a bad decision means convincing yourself that it was truly more likely your opponent had their 1-of than their 4-of.
Lastly, we assumed for simplicity that when your opponent had nothing, you would win 100% of the time. But say your opponent was 95% to have nothing, 4% to have their 4-of, and 1% to have their 1-of. If the line that played around their 4-of lost even 5% of the time when they had nothing, it would still be correct to play into the 4-of, even though they were four times more likely to have it. In this case, your mistake was that you failed to properly assess the outcomes of your alternatives.
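The arithmetic here is easy to check directly. A minimal sketch using the probabilities from the example:

```python
p_nothing, p_four, p_one = 0.95, 0.04, 0.01

# Line A: play around the 4-of, but lose 5% of the games where they have nothing.
win_around = p_nothing * 0.95 + p_four * 1.0 + p_one * 0.0

# Line B: play into the 4-of; wins every game where they have nothing or the 1-of.
win_into = p_nothing * 1.0 + p_four * 0.0 + p_one * 1.0

print(win_around, win_into)  # 0.9425 vs 0.96: playing into the 4-of wins more
```

Even a small loss rate in the most common case (they have nothing) outweighs the 4-to-1 likelihood ratio between the two threats.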
There are a lot of ways to make a bad decision in Magic, and correctly identifying what mistake you made is key to avoiding the same mistake in the future.
Que Sera Sera
A natural advantage of approaching decisions this way is that it isolates the factors you can’t control – the uncertainties. As a player, you don’t have any control over whether you’ll ever draw your third land after keeping a 2-lander. You can’t control how many times you’ll mulligan in a tournament, or how many die rolls you’ll win. You need to correctly characterize the uncertainties relevant to your problem, but you can’t influence their realization. And any time and energy you spend stressing about the things you can’t control is time and energy you’re not spending on the things you can.
My personal pet peeve at the moment is hyperbole. As a Magic player, I’m sure you’re familiar: “deck A crushes deck B”, “I’ve never lost this matchup in my life”, “deck C is unplayable”, etc. These statements are basically meaningless. Sample sizes in Magic are almost universally too small for any individual’s experience to be statistically significant. And with the power level of modern Magic designs amplifying the impact of mana problems, almost no matchups are more lopsided than 65-35 when both pilots play well (and few are even that extreme).
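To see how little a personal sample proves, consider how often a truly even matchup produces a lopsided record. The numbers below are illustrative, not from the article:

```python
from math import comb

def p_at_least(wins, games, p=0.5):
    """Probability of at least `wins` wins in `games` games at true win rate p."""
    return sum(comb(games, k) * p**k * (1 - p)**(games - k)
               for k in range(wins, games + 1))

# Chance of a 13-7 (65%) or better record in a genuinely 50-50 matchup:
print(round(p_at_least(13, 20), 3))  # ~0.132
```

Roughly one time in eight, 20 games of a coin-flip matchup will look like a 65-35 blowout, which is why “I’ve never lost this matchup” tells you almost nothing.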
Instead of leaning on exaggerations and anecdotes, I’ve found it far more valuable, both personally and for the people I’m working with, to try to articulate exactly why I think one deck is favored against another and to use numbers to more precisely state my position. I’ve had multiple frustrating conversations where a friend claimed that deck A destroyed deck B, only to discover after a long discussion that they meant they thought the matchup was 55-45. Being clear is harder, certainly. But that’s no excuse.
Thinking Fast and Slow
In Nobel laureate Daniel Kahneman’s book Thinking, Fast and Slow, he describes the two systems humans use to think.
The first system is fast, automatic, and largely rooted in the parts of our brains that predate full cognition. It’s our instincts. System 1 is responsible for making snap judgments, for solving problems we’re familiar with, and for keeping us breathing.
System 2, by contrast, is slow, conscious, and deliberate. It’s based in the frontal cortex, and it’s the distinguishing factor between humans and animals. When we refer to “thinking” in conversation, we’re typically referring to system 2.
Both systems have strengths and weaknesses. If we had to fully analyze every single one of our decisions, to the extent I described above, we’d go crazy. We make a decision with system 2 once, and default to our conclusion with system 1 thereafter. Luis Scott-Vargas has written about how he vastly improved after learning to trust his instincts.
System 1 also processes information we’d consciously dismiss or ignore. In Blink, Malcolm Gladwell tells the story of how art historians and museum curators knew a statue was forged, but none could articulate why. Despite extensive technical analysis, the statue’s origin remains inconclusive, though curators eventually discovered that its provenance, at least, was fabricated. Gladwell describes a number of other system 1 miracles, spanning from military training to predicting divorce.
However, because system 1 is rooted in our oldest cognitive systems, it can be systematically wrong. For example, we’re all instinctively afraid of bears, sharks, and spiders, because these animals represented a threat to us thousands of years ago. But these days, spiders are more likely to be benign or actively helpful than dangerous, hunting pests like mosquitoes, flies and cockroaches. And sharks keep away from coasts, warded off by boats and pollution. These fears are deeply instilled in us, but they’re not rational. Sometimes we need system 2 to override our antiquated instincts so we can make the right decision, like taking spiders outside or leaving them alone instead of killing them.
Conversely, using system 2 is sluggish and stressful, but we have full control of it. When we make a decision using system 2, we can act with clarity and without regrets.
The key to thinking clearly isn’t using one system to the exclusion of the other, but consciously recognizing which system to use in each situation and being aware of that system’s shortcomings. When we experience negative emotions in response to a bad outcome or are tempted to act rashly, we need to recognize system 1 for what it is. And when we feel overwhelmed by a complicated board state but feel inexplicably drawn to a line of play, we should accept that system 2 can’t solve every problem.
Ronald Wright proposed that, “Socialism never took root in America because the poor see themselves not as an exploited proletariat but as temporarily embarrassed millionaires.” It’s not a direct parallel, but I imagine that MCQ grifters see themselves as temporarily embarrassed MC champions. Every Magic article invariably has someone railing in the comments about how the author is wrong, and it’s easy to imagine them adding rares to their draft deck at FNM because they’re not winning as much as they think they deserve to.
The single quality that has best served me through my tenure as a competitive Magic player has been humility. I’m as vulnerable to arrogance and self-importance as anyone, but my progress unfailingly stalls as my mind closes. I don’t agree with every piece of Magic content I read, and blind acceptance isn’t a better policy than blind rejection. But I find I’m right more often when I allow for the possibility that I might be wrong.
One of my fundamental principles is that beliefs are only valuable insofar as they’re useful. The ideas in this article have all been useful for me, and I hope they’ll be useful for you too. But they need not be.
Thanks for reading. Until next time.
Allen Wu (Twitter)