IEZA Framework

From Wikipedia, the free encyclopedia

The IEZA framework is a two-dimensional framework that describes the auditory environment of video games. It was developed by Sander Huiberts and Richard van Tol at the Utrecht School of the Arts between 2003 and 2008, and can be used for both the analysis and the synthesis (conceptual design) of sound in computer games.


The IEZA framework uses two dimensions to describe sound in computer games. The first dimension makes a distinction between sound emanating from the fictional game world, such as the footsteps of a game character, and sound coming from outside the fictional game world, such as a musical score. Stockburger (2003) describes this distinction by using the terms diegetic and non-diegetic. The second dimension makes a distinction between sound related to the activity of the game on one hand, and sound related to the setting of the game on the other.

Four domains are formed by the two dimensions: Interface, Effect, Zone and Affect.
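The mapping from the two dimensions onto the four domains can be sketched as a small decision function. This is an illustrative example only; the function and its parameter names are hypothetical and not part of the published framework:

```python
# Illustrative sketch: mapping the two IEZA dimensions onto the four
# domains. The function and its parameter names are hypothetical, not
# part of the published framework.

def ieza_domain(diegetic: bool, relates_to: str) -> str:
    """Classify a game sound into an IEZA domain.

    diegetic   -- True if the sound emanates from the fictional game world
    relates_to -- "activity" or "setting"
    """
    if relates_to not in ("activity", "setting"):
        raise ValueError("relates_to must be 'activity' or 'setting'")
    if diegetic:
        return "Effect" if relates_to == "activity" else "Zone"
    return "Interface" if relates_to == "activity" else "Affect"

# The article's own examples: a character's footsteps are diegetic sound
# related to activity; a musical score is non-diegetic sound related to setting.
print(ieza_domain(True, "activity"))   # prints "Effect"
print(ieza_domain(False, "setting"))   # prints "Affect"
```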


Interface

Sound in the Interface domain expresses what is happening in the game without belonging to its fictional world. In many video games this is sound related to activity in the HUD, such as sounds synced to health and status bars, pop-up menus and the score display. Although sound in this domain often follows interface sound design conventions from ICT (abstract iconic and non-iconic signs), many games intentionally blur the boundary between Interface and Effect by mimicking diegetic sound. In Tony Hawk's Pro Skater 4, for example, Interface sounds consist of the skidding, grinding and sliding sounds of skateboards.


Effect

Sound in the Effect domain expresses activity in the game world. It usually consists of a mix of one-shot sound events (triggered either by the player or by the game itself), such as the sound of an explosion, and continuous sound streams, such as the sound of a continuously burning fire. Effect sound often mimics the behavior of sound in the real world, and in many games it is the part of the game audio that is dynamically processed using techniques such as real-time volume changes, panning, filtering and acoustic simulation.
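The kind of real-time processing applied to Effect sounds can be illustrated with a minimal sketch of distance-based attenuation and constant-power stereo panning. These formulas are standard audio practice assumed for illustration, not something specified by the framework itself:

```python
# Assumed example of dynamic processing for a diegetic (Effect) sound:
# inverse-distance attenuation combined with a constant-power pan law.
import math

def effect_gains(distance: float, pan: float, ref_distance: float = 1.0):
    """Return (left, right) channel gains for a diegetic sound source.

    distance -- listener-to-source distance (>= 0)
    pan      -- stereo position, -1.0 (hard left) to +1.0 (hard right)
    """
    # Inverse-distance attenuation, clamped so nearby sources don't clip.
    volume = ref_distance / max(distance, ref_distance)
    # Constant-power pan law keeps perceived loudness even across the field.
    angle = (pan + 1.0) * math.pi / 4.0   # map [-1, 1] onto [0, pi/2]
    return volume * math.cos(angle), volume * math.sin(angle)

left, right = effect_gains(distance=2.0, pan=0.0)  # centred source, 2 units away
# both gains ≈ 0.354 (half volume, split evenly between channels)
```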


Zone

Sound in the Zone domain expresses the setting (for example the geographical, cultural and/or topological setting) of the game world. A zone can be understood as a distinct spatial setting that contains a finite number of visual and sound objects in the game environment (Stockburger, 2003, p. 6). It might be a whole level of a game, or one of a set of zones that together constitute a level. Zone sound is often designed so that it reflects the consequences of gameplay on the game's world.


Affect

Sound in the Affect domain expresses the setting (for example the emotional, social and/or cultural setting) of the game. Affect is often designed, using real-time adaptation, so that it reflects the emotional state of the game or anticipates upcoming events.
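The real-time adaptation described above is often implemented as crossfading between music layers as the game's emotional state changes. The sketch below is a hypothetical example; the layer names and the single `tension` parameter are illustrative, not part of the IEZA framework:

```python
# Hypothetical example of Affect-style adaptation: crossfading between
# a calm and a tense music layer. Layer names and the 'tension'
# parameter are illustrative, not part of the IEZA framework.

def affect_mix(tension: float) -> dict:
    """Return per-layer volumes for a two-layer adaptive score.

    tension -- the game's emotional intensity, clamped to [0.0, 1.0]
    """
    t = min(max(tension, 0.0), 1.0)
    return {"calm_layer": 1.0 - t, "tense_layer": t}

print(affect_mix(0.25))  # prints {'calm_layer': 0.75, 'tense_layer': 0.25}
```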

Applications of IEZA

The IEZA framework is featured in a book chapter by Ulf Wilhelmsson and Jakob Wallén (2011), in which the authors combine IEZA with Walter Murch's model for the production of film sound (1998) and Gibson's theory of affordances (1977). The framework has also been used as a conceptual framework for the functioning of game audio in relation to immersion (Huiberts, 2010). Whitehead (n.d.) fuses several frameworks for game audio, among them the IEZA framework, and Conway (2010) used the framework in an analysis of digital football games. IEZA has been evaluated as a design resource in educational, academic and practical settings at the Utrecht School of the Arts' Adaptive Music Systems Research Group under Jan IJzermans.

References


  • Huiberts, S. & Tol, R. van (2008). IEZA: a framework for game audio. Retrieved December 1, 2008.
  • Stockburger, A. (2003). The game environment from an auditive perspective. In: Copier, M. and Raessens, J. (Eds.), Level Up, Digital Games Research Conference (PDF on CD-ROM). Utrecht, The Netherlands: Faculty of Arts, Utrecht University.
  • Wilhelmsson, U. and Wallén, J. (2011). A combined model for the structuring of computer game audio. In: Grimshaw, M. (Ed.), Game Sound Technology and Player Interaction: Concepts and Developments. University of Bolton, UK.
  • Huiberts, S. (2010). Captivating Sound: the Role of Audio for Immersion in Games. Doctoral thesis, University of Portsmouth and Utrecht School of the Arts, Portsmouth. Online version
  • Murch, W. (1998). Dense clarity - clear density. Retrieved March 10, 2010.
  • Gibson, J. (1977). The theory of affordances. In: Shaw, R. E. and Bransford, J. (Eds.), Perceiving, Acting and Knowing. New Jersey: LEA.
  • Whitehead, I. (n.d.). Sound for Interactive Games: Theoretical Concepts and Implementation Practice. Retrieved February 7, 2012.
  • Machen, S. Game Audio Rules - Game Audio Paper. Leeds Metropolitan University. Retrieved February 5, 2013.
  • Conway, S. (2010). If it's in the game, it's in the game: an analysis of the football digital game and its players. Doctoral thesis, University of Bedfordshire.
  • Åsén, R. (2013). Game Audio in Audio Games: Towards a Theory on the Roles and Functions of Sound in Audio Games (student paper). Högskolan Dalarna. Retrieved January 5, 2014.
  • Cudworth, A. L. (2014). Virtual World Design. A K Peters/CRC Press.
