Traditional methods of player input in videogames have involved mapping button presses, analogue stick direction and pressure, mouse movements, and now gestures to in-game actions.
These established conventions of accepting input are excellent proxies for determining player intent, but they fail to indicate another crucial dimension of player input—player sentiment.
With very few exceptions, games have ignored a player’s emotional response when tailoring gameplay experiences. This omission has been tied to the inability of traditional controllers – mouse and keyboard, gamepad, etc. – to adequately measure physiological signals.
However, the technology to measure and quantify these signals is now more readily available and reliable, and the opportunity exists for videogames to incorporate emotion as an additional axis of player input.
Very interesting so far!
With the addition of emotional input, gameplay can be tailored to the individual emotions of the player.
For example, difficulty could be dynamically adjusted dependent upon the player’s current frustration level. Or, indices of emotional arousal may be utilized as inputs to the game. One can imagine a sniper receiving an accuracy boost in-game if their heart rate remains low but suffering a penalty when it rises above a pre-determined threshold.
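A mechanic like the sniper example could be as simple as a threshold rule. The sketch below is purely illustrative – the function name, threshold, and multipliers are assumptions, not anything from an actual game:

```python
def accuracy_modifier(heart_rate_bpm, threshold_bpm=100,
                      boost=1.25, penalty=0.75):
    """Return a multiplier applied to the sniper's base accuracy.

    Hypothetical rule: a calm shooter (heart rate below the
    threshold) gets steadier aim; an elevated heart rate
    introduces a penalty.
    """
    if heart_rate_bpm < threshold_bpm:
        return boost
    return penalty
```

In practice a real implementation would smooth the heart-rate signal and calibrate the threshold per player rather than using a fixed cutoff.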
With the ability to quantify a wide array of player emotion – frustration, enjoyment, engagement, boredom, fatigue, etc. – the design space of gameplay experience can multiply to create experiences that are not possible with traditional controller inputs.
I suppose a lot of you now want to get hooked up to these things so you can EXPERIENCE THE GAME, and whatnot.
Currently, Valve is looking into the use of physiological signals to dynamically adjust gameplay based upon the arousal of the player, to create gameplay elements that directly depend upon player emotional state, to investigate alternative methods of player control, to quantify response to gameplay while playtesting, and to examine potential uses in matchmaking/social gameplay experiences.
In particular, this talk will cover how Valve is incorporating actual measurements of player arousal into the algorithms governing the AI Director in Left 4 Dead 2, how we created a mod of Alien Swarm that dynamically adjusts difficulty based upon the player’s skin conductance level (SCL) response, and how we are using eye movements as replacements for a mouse and gamepad in Portal 2.
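The core loop of an SCL-driven difficulty adjuster can be sketched in a few lines. This is not Valve’s algorithm – the function, the ±10% band around baseline, and the step size are all assumptions for illustration: nudge difficulty down when the player’s recent skin conductance runs above their calibrated baseline, and up when it runs below.

```python
def update_difficulty(scl_samples, baseline, difficulty,
                      step=0.05, lo=0.5, hi=2.0):
    """Nudge a difficulty scalar toward keeping arousal near baseline.

    scl_samples: recent skin-conductance readings (microsiemens)
    baseline:    the player's calibrated resting SCL
    difficulty:  current difficulty multiplier, clamped to [lo, hi]
    """
    mean_scl = sum(scl_samples) / len(scl_samples)
    if mean_scl > baseline * 1.1:    # player aroused: ease off
        difficulty -= step
    elif mean_scl < baseline * 0.9:  # under-stimulated: ramp up
        difficulty += step
    return max(lo, min(hi, difficulty))
```

Called once per adjustment window, this keeps difficulty inside a clamped range so a noisy sensor can never drive the game to an extreme.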
For details on these experiments, and some extra stuff we didn’t put in here, head over to EDGE.