I've added Interactions to the links section. Especially interesting was the special issue "More funology: games", which featured the following articles:
Pagulayan, Randy and Steury, Keith. "Beyond usability in games"
Chao, Dennis L. "Computer Games as Interfaces"
Laurel, Brenda. "Narrative construction as play"
The ACM requires a subscription to view these articles.
Friday, 14 September 2007
Thief 3 Usability Testing
Shannon Lucas and Denise Fulton present a brief summary of their usability testing of Ion Storm's Thief 3: Deadly Shadows in an article called "What We Learned Evaluating the Usability of a Game".
They made use of the "Rapid Iterative Testing and Evaluation" (RITE) methodology developed by Pagulayan et al. at the Microsoft Game User Research Group to guide designers and programmers during the early Alpha stages of the game.
One interesting observation is that although they didn't have time to review the videotaped footage themselves, the tapes proved useful as a resource for programmers to observe their AI under real operating conditions. It doesn't sound like they made use of data visualisation or game metrics; instead they relied on subjective evaluation of the players' first-hand experience. For example, the conclusion that "8 out of 10 players had difficulty using the blackjack tool" appears to be based on player reports rather than measurements of the weapon's use during gameplay.
"What We Learned Evaluating the Usability of a Game". Usability and User Experience Community of the Society for Technical Communication. October 2004 issue (Vol 11, No. 2).
Labels: fulton, lucas, microsoft game user research, pagulayan
Monday, 3 September 2007

Psychology in Game Research
There's a lightweight article over at The Escapist titled "The Perception Engineers" about the role of psychology in game usability analysis, with comment from Daniel Gunn, Randy Pagulayan and Tim Nichols from Microsoft Game User Research.
Labels: gunn, microsoft game user research, nichols, pagulayan, psychology
Wednesday, 22 August 2007
Halo 3
Yesterday Wired ran an article about Randy Pagulayan's team at Microsoft Game User Research and their user testing of Halo 3.
It's an entertaining read with a few tidbits on how they conduct their usability evaluation process. Highlights for me include the observation that dying can be fun. This is clear to a game player or developer, but might come as a surprise to a usability specialist focussed on effective and efficient interfaces. That "skin meters, cardiac monitors, and facial electromyographs" were needed to deduce this is quite surprising; I'd have thought a questionnaire would suffice. Of course, this depends on the kind of game. If you get to see an entertaining animation, or a tactically useful slow-motion replay of your death, then there's clearly 'value' for the player who died. However, in a multiplayer deathmatch game where you have to sit and wait for several minutes until all of the other players finish, dying is clearly going to be a frustrating experience, unless there is some activity to take part in, such as chatting with other dead characters or trailing live players.
Most interesting for me is the description of the tools Pagulayan's team devised to capture and visualise playthroughs. This is where I see a hybrid programmer/usability expert being extremely useful to game development. Conditionally compiled code could be added to a particular build of the game to enable data tracking, perhaps offloaded to disk or to a database on the local network. The other side of this process would be to construct tools to analyse and visualise the data. It would work like a semi-independent tools and QA team, feeding analysis back to the designers.
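As a rough illustration, here's a minimal C++ sketch of the tracking side. The TELEMETRY_BUILD macro, the event format and the log file path are all my own assumptions rather than anything from Pagulayan's team; in a normal build the calls compile down to nothing.

// Hypothetical sketch of conditionally compiled data tracking.
#include <cstdio>

#ifdef TELEMETRY_BUILD
inline void track_event(const char* name, float x, float y, float t) {
    // Append one line per event. A real system might batch these,
    // or send them to a database on the local network instead.
    if (FILE* f = std::fopen("telemetry.log", "a")) {
        std::fprintf(f, "%s,%.2f,%.2f,%.2f\n", name, x, y, t);
        std::fclose(f);
    }
}
#else
// Retail and vanilla test builds pay no cost for the tracking calls.
inline void track_event(const char*, float, float, float) {}
#endif

int main() {
    // Example: record where and when a player died. The analysis
    // tools would later read telemetry.log and, say, plot deaths
    // over the level map.
    track_event("player_death", 132.5f, 47.0f, 603.2f);
}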
Labels: halo, microsoft game user research, pagulayan