RPG Design and Ludonarrative Dissonance

Analog role-playing games distinguish themselves from other games* by their inclusion of a fiction layer: imagined events the players devise, share, and engage with as an integral part of play. Designing the interface between mechanics and fiction is thus the central challenge in creating an RPG, and often acts as the point of divergence between different philosophies of design and play preferences. For example, hardcore character immersionists (HCI) prefer that wherever possible, the player’s input into the fiction should be limited to the reach and will of a single character in the narrative. Mechanics that allow a player to take on a broader authorial role, editing the environment or dictating the actions of other characters, run counter to HCI play preferences.

When setting out to design an RPG, then, it helps to know what design patterns already exist for navigating the mechanics/fiction interface, and what pitfalls those design patterns sometimes hold for players’ engagement and enjoyment. This post is more of a braindump of things I’ve run into over the years than a comprehensive thesis, but I hope it will provoke some thought!

To start, the RPG chicken/egg question: which comes first, fiction or mechanics?

Mechanics first: A player makes decisions grounded in mechanical systems, and engaging with those systems helps generate fictional content. In Dungeons & Dragons 5th Edition, I expend my action for the turn and a spell slot of 3rd level, choose a target location, and roll eight six-sided dice. That then translates into a fictional event where an explosion of magical flame bursts forth, scorching enemies and setting scenery on fire!

Fiction first: A player narrates fictional action, the content of which activates mechanical systems. In Dungeon World, I describe my character darting across a stone bridge over which “The Pit and the Pendulum”-style blades swing. That triggers the “Defy Danger” move, and I must roll two six-sided dice, adding my DEX modifier, to see if the character makes the crossing without mishap.
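The two dice procedures above can be sketched as a quick simulation to make the contrast concrete. (Function names and structure are mine, not from either rulebook; the numbers follow the rules as described above.)

```python
import random


def fireball_damage(slot_level=3):
    """Mechanics-first example: D&D5 Fireball deals 8d6 fire damage
    when cast with a 3rd-level slot, plus 1d6 per slot level above 3rd."""
    dice = 8 + (slot_level - 3)
    return sum(random.randint(1, 6) for _ in range(dice))


def defy_danger(stat_mod):
    """Fiction-first example: Dungeon World's Defy Danger rolls
    2d6 + a stat modifier. 10+ is a full success, 7-9 a partial
    success with complications, and 6 or less is a miss."""
    total = random.randint(1, 6) + random.randint(1, 6) + stat_mod
    if total >= 10:
        return "success"
    elif total >= 7:
        return "partial"
    return "miss"
```

Note how the fireball function is invoked by a mechanical decision (spend a slot, pick a target) and its output then gets narrated, while Defy Danger only fires at all because the narration triggered it.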

Note that a game will almost inevitably feature both modes, as mechanics and fiction move one another forward in a cycle. But considering where a player’s decision-making process is likely to start, or which of the two directions of flow has greater emphasis, can help inform your game’s core priorities.

If an RPG features constant mechanical engagement, but fictional content is optional, thin, or an afterthought, that produces a problem we might call boardgame regression. The design risks sliding out of RPG territory by neglecting the fiction layer altogether, resembling a board game (albeit perhaps a flavor-rich one, something like Dead of Winter). This is the most common criticism I hear leveled against the fourth edition of Dungeons & Dragons. It was all too easy to play the game as a series of boardgame skirmishes, with little to no reference to characterization, plot, or even an imagined scene beyond the positions of figurines and their associated fluctuating numbers. To avoid this pitfall, ask yourself: is it possible for players to disengage from the fiction and still play? How does fictional content help drive mechanical decisions?

When an RPG features narrated fiction that only rarely or tenuously grounds itself in mechanical systems, I call this trouble slipperiness. Players are uncertain if their narrative contributions make any difference to the game state, or lack trust that the rules will back them up in the event of dispute or ambiguity. I run into this most frequently with games that give one player authority to secretly edit or override (“fudge”) game mechanics, the “Rule Zero” espoused in texts like Exalted. To guard against slipperiness, consider: how does my game differ from a minimalist collaborative storytelling setup, where players volunteer bits of story to be adopted by consensus? How do mechanical decisions and outcomes generate fiction? Is a situation where players break or drift the rules distinguishable from one where they play by them as written?

Friction in the mechanics/fiction interface needn’t be so pervasive as the above, however! It can occur sporadically in play, within specific rules or procedures. Even an overall functional D&D4 game sometimes hits moments where a mechanical outcome has occurred, but it’s difficult to picture what happened fictionally. “I use Arterial Slice on the skeleton! It’s now bleeding for 5 ongoing damage.” “Wait, what? Skeletons don’t have any arteries to bleed from.” In a Wicked Age features mechanics that only activate for physical conflict, so if a player narrates a character intimidating, bribing, or otherwise attempting to persuade another, the rules cannot help determine if their ploy is successful–a moment of slipperiness in an otherwise grounded game. The general term for these jolts is “ludonarrative dissonance” (hat tip to Kevin Weiser for that!), a place where game and fiction aren’t quite harmonious.

Ludonarrative dissonance can also arise within a game’s reward cycles. Mechanics might encourage an action that doesn’t make fictional sense. In Burning Wheel, given a minor expense a group of player characters would like to pay for, it is often in the group’s mechanical best interest to have the poorest character make the purchase with the help of wealthier characters, rather than having the wealthiest character make the purchase alone. But coming up with a justification for that approach from the characters’ perspective tends to be tortured at best! Or an action that flows naturally from narrative and characterization could prove a terrible choice mechanically. In a recent D&D5 game, a player attempted to win over a villain driven by anger and despair, putting the spell “Beacon of Hope” on him to instill a sense of optimism and possibility. Reasonable, yes? But all that really accomplished, game-wise, was to make said villain more resistant to the heroes’ magic in the ensuing battle scene.

There’s one last rules/fiction pattern I’d like to call attention to, as it’s one I’ve struggled with in recent memory. I’ll call it the justification veto. In a justification veto setup, a player has access to certain mechanical resources–skills, character traits, or what have you–that need to be brought into the fiction in a meaningful way for them to grant bonuses. A classic example is Aspects in FATE: freeform descriptions of a character’s tropes that, if I can explain how one helps with the task at hand, let me spend a Fate point for a reroll or a boost to the dice result. That “if I can explain how” is the rub, though. If I’m a couple points away from succeeding on a roll, the rules urge me to find a way to bring one of my Aspects into the scene. The success or failure of that effort, however, rides on my ability to narrate that Aspect in a convincing manner for the context. If the other players (particularly the “GM” whose word on such matters is final) feel it’s too much of a stretch, the use is vetoed: neither the proposed narration nor the bonus takes effect.

Justification vetoes are a very natural pattern to draw upon, helping ensure that mechanical bonuses are grounded in coherent fiction and vice versa. I’ve used them myself, in my game Blazing Rose! But the experience of pausing game flow for a “Mother may I” petition can frustrate players, especially those with different levels of skill in navigating mechanical systems vs. weaving persuasive narrative-grounded arguments. (I would not be at all surprised if neuroscience revealed these skill sets operate in disparate regions of the brain.) If that’s an experience you’d like to avoid in your design, put this pattern in a “use with caution” column.

A few games work around the justification veto’s drawbacks in clever fashion. In Chuubo’s Marvelous Wish-Granting Engine, applicability is not a binary “yes, you may” / “no, you may not”; a player may use any skill on their character sheet for literally any purpose. (An example in the book describes using a Cooking skill for the intended action “I blow up the Earth with my mind.”) Rather, the GM’s assessment of how much the action stretches the skill diminishes its effectiveness, making favorable results more costly to obtain. That still encourages matching mechanical elements to appropriate fiction, without the inherent frustration of shutting down a player’s contribution outright!
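The design difference can be made concrete with a toy contrast. This is not the actual arithmetic of either game, just a sketch of the two shapes of mechanic: a binary veto that zeroes out the contribution entirely, versus a sliding penalty that degrades it.

```python
def veto_style_bonus(bonus, justification_accepted):
    """Justification veto (Fate-style shape): the bonus applies only if
    the table accepts the narration; otherwise the attempt yields nothing."""
    return bonus if justification_accepted else 0


def sliding_style_bonus(skill_rating, stretch_penalty):
    """Sliding applicability (Chuubo's-style shape): any skill may always
    be used, but the further it stretches, the less it contributes.
    The contribution degrades gracefully instead of vanishing."""
    return max(0, skill_rating - stretch_penalty)
```

The key property of the sliding version is that a marginal justification still moves the game state forward, so the player’s narrative contribution is never simply erased.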

What pitfalls have you encountered in the interface between rules and fiction? What design patterns or play behaviors help avoid them? What other insights have you gleaned from the matter of clouds and boxes?

* “What’s a game, then?” Well, hypothetical wiseass, I don’t have an essentialist definition for you that would reliably include all games and exclude all non-games. As a usually-useful approximation, though: a game is a rules-structured, temporally bounded activity sustained by one or more behavioral reward cycles.

Gender of Choice

Some years back, I heard an NPR segment about students defying gender norms, including such odd approaches as insisting that one’s gender was “truck” and should thus be referred to with pronouns like “it.” I made a few faltering starts at writing a blog post about my thoughts on it, but never quite finished. The topic came back to mind with March 31st’s Transgender Day of Visibility and this delightful little comic by @papayakitty on Twitter.

What’s my gender?

I mean, I’m a guy, sure; biologically male, wear masculine clothing more often than gender-neutral clothing, and feminine clothing only when cosplaying, etc. But I do rather delight in “crossplay” when the (uncommon) opportunity comes up. I’ve roleplayed female characters with increasing frequency since I was maybe seven or eight years old, and while it’s been a more or less novel thing as time’s gone on, it’s never felt awkward or wrong. When the Internet came into flower and I established online identities on services like AOL, IRC, GameSpy Arcade, and later Furcadia, I frequently presented myself as a girl. People tended not to realize I was playing cross-gender unless the point was specifically mentioned out of character. (I even wrote a poem about the ugly reactions people had to the disconnect when revealed; it reads pretty clearly as an adolescent transgender lament.) I went to an all-boys high school, but I tended to disdain the connotations thereof, amending statements of my gender identity with such qualifiers as “male, low testosterone.” I still feel that having Ranma Saotome’s curse would be pretty awesome. I’ve had people ask me if I’m gay due to my love of romance themes in my entertainment. My all-time favorite movies (The Princess Bride, Magnolia, and 500 Days of Summer) might be called “chick flicks”… I could go on.

Thing is, I don’t think it makes sense to consider me “transgender” in the sense most commonly meant by that. I don’t experience gender dysphoria when looking at myself or presenting as male. I have enjoyed every privilege inherent in cis white maleness, and feel it would be disrespectful to those less privileged to insist otherwise. “Thinking it would be cool to be a woman” is a far cry from even what little I’ve glimpsed into the life experiences of my transgender friends.

Then again. What even is gender?

Wracking my brain for anything that would qualify as essential to the genders or even the biological sexes, I don’t come up with a lot. It sort of makes sense to have some outward signifiers of “bearing male gametes” in a world where that’s of practical concern on a day-to-day basis, and where scientific understanding and interpersonal communication are weak enough that you couldn’t just have the conversation, “Can you have children with me, and do you want to?” But we don’t live in such a backward world by now, thank the Primes, and for someone like me who isn’t interested in children in the first place, it’s all rather unnecessary. Everything else we associate with the genders or sexes is contingent, mere statistical truth at best. We can say “as a species, homo sapiens features sexual dimorphism, with such-and-so genital structures and secondary sexual characteristics,” but individuals’ physical characteristics can and do diverge wildly from those baselines. And the various personality traits and aesthetic choices associated with either gender are even fuzzier, ranging from laughably arbitrary (pink used to be a masculine color and blue feminine) to equal parts harmful, offensive, and untrue (“men tend to be physically violent”).

People operate under schemes of categorization for cognitive ease, though, so it’s psychologically practical to think of someone as based on a template with variations. “He’s very much a bro,” “she’s a tomboy,” “he’s a guy but likes sewing,” or whatever. They also help with personal identity; group membership is a powerful human need, and resonance or solidarity with fellow “men” and “women” is of great use and comfort. These labels become problematic, though, when they influence our behavior in discriminatory ways, lead us to jump to unfounded conclusions, or perpetuate stereotypes that shore up unjust systems of power. And when it comes to gender, it’s difficult to use the categories without falling into any of those traps.

Labels like “agender,” “demigirl” etc., as mentioned in the above-linked comic, then serve a dual purpose: they defy standard assumptions about gender while still providing the psychic value of a group identity to belong to. They seem pretty darn cool to me! Of the ones I’ve poked at, “demiboy” (or “demiguy,” which doesn’t have as nice a sound to it) feels most in tune with my own experiences. If I were to embrace that label, what would it suggest? A greater freedom of choice in fashion and affect, I suppose… I have often envied women their lovely options in clothing.

And/or I could develop a female tulpa to the point where I could switch her into the dominant consciousness… hah!

Gamers, Pure and Special Just the Way They Are

There’s a persistent thread in children’s entertainments that goes, in various forms, “you’re beautiful just the way you are.” It’s a sentiment meant to guard against bullying, especially on the basis of factors beyond one’s control: appearance, family background, etc. But I wonder if some folks, exemplified by recent hate movements like GamerGate, have taken this message to heart with respect to things that are under one’s control.

“I’m special just the way I am,” if taken at face value, can be used as an out from any need to change or moderate one’s behavior. In fact, calls to behave differently or better are seen as part of a system of shame and bullying. If one’s personality is simply the way one is, part of an immutable identity, then criticism of one’s behavior is inherently pointless and unjustified. “I’m perfect just the way I am! How dare you ask me to change?” So, for instance, the stereotypical image of the gamer, with its crude, obsessive, poorly groomed basement dweller, insofar as it is an accurate picture of an individual, is a thing to be embraced. Discarding personal hygiene in favor of more gameplaying time is the way I roll! Anyone who thinks I should change my ways is just a bully.

You can see this belief surface in other ways, too. For instance, there is a tendency to drag up many-years-old comments by an individual that have some hateful component to them, and hold them up as representative of that person’s true self. After all, if someone acted in a certain way at one point in time, and personality or behavior is a fixed part of one’s identity, then any change should be treated as suspect. Apologies for such past behavior are disingenuous, capitulation to outside pressure at best. Jim Sterling and Ian Miles Cheong have received a great deal of this treatment.

Of course, there’s hypocrisy and a double standard here too. For instance, if Breitbart columnist Milo rescinds his past disparaging remarks about the gamer community, that’s accepted and praised. Apparently, the hardcore gamer identity is the true one, and movements in its direction can be genuine. So long as it’s unsullied by disagreement with the gamer core, at least: people who don’t toe the party line, such as Anita Sarkeesian, continue to be treated as posers even if they begin to play games in the hardcore fashion. One can always rationalize a belief like “we’re special just the way we are” in a way that stays in harmony with one’s political agenda.

We should thus be on guard against the tendency to absorb messages that reinforce our entrenched sense of self and render us defensive against change. There are plenty of messages in children’s media and elsewhere that teach moral growth and abandonment of problematic behaviors, but if we cherry-pick those messages that say we don’t need to change, the rest fades into the background. I don’t know how to bring a greater self-awareness to those who have chosen this entrenched identity mantra, but I can at least celebrate counterpoints. And I can resist the little cultural memes that reinforce this idea, such as saying “that’s just the way he is” in response to someone’s bad behavior. That’s the way he is, but it’s never just the way he is. People can change for the better. I must always believe that, to have any hope for the world.

The Patriarchy and Other Conspiracies

Kali Ranya:

When I made it to high school English and thus graduated from analysis of story structure like exposition, climax, and resolution to the exegesis properly called “literary criticism,” I found it a wondrous experience. Here were these stories I already enjoyed reading made into a whole new sort of game, going between the lines to guess at the author’s hidden meanings! It was like playing at spies with Shakespeare across the centuries, he penning his poetry with a wink, I winking back as I set to the task of decoding it.

In college I was in turn introduced to feminism, and the lens of feminist criticism. Here I encountered discussion of the patriarchy: how so many cultural mores, laws, and artistic themes were instruments of oppression, means to keep women “in their place” and men in positions of power over them. I got the idea well enough to score good grades via the approach, but the rhetoric of it always struck me as rather strange. It wasn’t like a bunch of villainous dudes sat down in a boardroom discussing how best to put one over on the wimminz, and came to the conclusion that images of underwear-clad female bodies with the heads cropped out of the picture would be an excellent stratagem. But that was the conspiratorial scenario that the instruments-of-oppression discussion seemed to convey.

It wasn’t until recently that I realized how these two things are related.

The author is dead. It is a truth of human existence, a fact of human nature, that we can never truly know another person’s intentions. We can only see the effects of what they do, and if we are so inclined, guess at what thoughts led to those actions. Perhaps the person speaks up about what they meant in saying or doing what they did, but we can only take what they say as fact insofar as we trust them. Shakespeare is a beloved icon of Western culture, and I dearly wanted the sort of intellect and refinement associated with being conversant with him; so it was simple to believe that the Bard had with skill and intent buried themes in his work for generations to unravel and discuss. I can only imagine that peers of mine who thought Freshman English a waste of time likely believed the unpacking of deep textual meanings to be so much teacherly sleight of hand. I, being in a place of privilege myself, found it a stretch to ascribe malice to men simply looking to make a buck or raise a family in the same traditions as they grew up. If I were marginalized and frustrated by constant belittling of my gender, race, or orientation, I would not have the energy or inclination to give the benefit of the doubt to those perpetuating the system.

This is, I think, the deep source of many conflicts: religious, political, geek-tribal, etc. Or if not the source itself, then at least a cause of the constant talking past one another we do, the bizarre and frustrating sense that the folks on the other side of whatever divide are speaking a different language. A devout Catholic believer, feeling well loved by the Church, having a rapport with its representatives, remembering many occasions of support and comfort from it, might be shocked and dismayed to hear of clerical abuses; but in the end will accept the clergy’s remorse and reassurances at face value. Someone with less deep-seated an investment in the Church’s authority, someone who perhaps feels disconnected from their fellows there, or who has had experiences of their worries and complaints falling on deaf ears within the hierarchy, or someone not a believer at all, will be much more inclined to see cover-up, hypocrisy, and emptiness in the same ostensibly reassuring words. (This is not to deny or make light of the possibility that such a startling event could break even the deepest-held trust; I’m talking about trends and tendencies here.) Whose interpretation is more correct? It’s hard if not impossible to know, because we cannot look behind the mask of the sermonizing priest’s face to lay bare his thoughts.

I’m not sure where to go with any of that, really, save to recognize it when I see it, especially in myself. There are conspiracies in the world, and there is malice, but the places we see them often say as much about us as they do about the people we perceive to bear those ill intents.


Kithia Verdon:

Every so often, an article comes out talking about how people’s use of the Internet is wrecking their brains, impairing their ability to focus, a sort of induced attention deficit disorder. Normally, I’d be skeptical of such hand-wringing. It has an old-fogey kids-these-days feel to it. But my own experience lately makes me feel it’s plausible.

Sometime over the last few years, I’ve become a compulsive skimmer. It’s become more and more difficult for me to really concentrate on what I’m reading. If I’m cruising the Internet, the urge to spin off another tab and jump to some other thing, be it Facebook or email or Twitter, keeps distracting me. If I’m reading a blog post, I quickly tire of working through the author’s thoughts and arguments, and skip to the end to see their conclusions or summary. Even reading something in print, my eyes constantly twitch to the paragraphs ahead, peeking at what’s next with a manic impatience.

That scares me some, because I wasn’t always that way. I majored in Philosophy, for crying out loud, and I didn’t cut corners on the reading, at least not often. I find it hard to imagine my today-self, who has to double back to a sentence sometimes three and four times before it sinks in, getting through one of those assignments and coming to class ready to discuss it the next day. I’m only 30! It’s not right for my mind to have decayed that far.

Maybe it’s not the Internet’s fault. It could be diet, or stress, or something in the water. But it makes sense, at least, that it could be my computer routine. Habit is a powerful thing, the pathways of the brain used for a particular behavior getting strengthened and comfortably worn until that behavior and its mindset become the path of least resistance among the synapses. And what I’m experiencing seems like optimized blog-cruising behavior. If you’re following several high-traffic blogs with hundreds of posts in the queue, each of them on a particular topic so there’s a degree of similarity among the posts within one, there’s simply not enough time in the day to open up every article and read it front to back. You have to skip around, skimming headlines, popping open the ones that seem interesting, and moving on from those that don’t hold your interest. Unfortunately, an ingrained habit like that can bleed out to places where it’s not as appropriate, like trying to enjoy a novel or familiarize oneself with the mechanics of a role-playing game.

More importantly for my purposes here, though, habits can be changed, with similar effort made to build up an alternative behavior and let the old one lapse. So I’m pondering what I might do to reverse the trend. I’ve got these three ideas so far. What else can you think of?

Luddite Saturdays. Based on Weekend Luddite, the idea is to regularly switch off all staring-at-a-screen type devices. No PC, no tablet, no Xbox or Playstation, no DS or PSP, not even any gaming on the cell phone. (I’ll permit myself DVDs/Netflix if the girlfriend wants to watch something together.) I’ve chosen Saturday, being a day when I have no computer-necessary obligations and which has the most idle time to waste. Having to find non-computery things to do with myself all day will help reintroduce me to the wonderful world outside the boxes, making it feel more natural to seek out entertainments requiring focus and motivation.

Bloggy Sunday. In something of the inverse of the above, I’m thinking I’ll do all my ADD-style Internetting on one day of the week. Catching up on blogs, forums, Facebook, Twitter, G+, and the like–anything carved up into lots of little pieces to jump around between–will take place on Sunday and only on Sunday. This will help establish a specific context where scatterbrain behavior is appropriate, keeping it contained.

Mindfulness meditation. The ancient practice of quieting the “monkey mind” and centering oneself in the present moment has been shown to alter brain activity for the better. If I can get a meditative mindset rolling, it’ll become easier to notice when I’m getting flaky and bring myself back to the task at hand. I’ve been wanting to do this for a while, actually, even before I noticed myself becoming computer-brained, so the trick will be finding a good time and quiet space to sit and meditate regularly. Maybe first thing in the morning on workdays?