Position:
1) Most games usability studies create novel, domain-specific heuristics defined by one or two researchers.
2) Heuristics should instead be defined by experts on the basis of prior, proven application to the domain.
3) No broadly accepted heuristics exist that apply across the diversity of video games.
4) As a result, researcher-created heuristics are under-analysed yet overly trusted.
5) UX practice is not broadly employed in the games industry.
Proposal:
Create a methodology for quantifiably evaluating existing heuristics through user testing, so that industry can understand, prioritise and apply them according to their demonstrated value.
e.g., from Federoff's "Heuristics and Usability Guidelines for the Creation and Evaluation of Fun in Video Games":
"Controls should be intuitive and mapped in a natural way"
To evaluate this heuristic, the terms 'intuitive' and 'natural' must first be defined in a more concrete way. One approach would be to conduct UX testing across representative user samples to confirm that players' natural and intuitive expectations are actually met.
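As a sketch of what "more concrete" could mean here: one option is to define 'intuitive' operationally, e.g. "at least 80% of first-time players perform the intended action on their first attempt". The function names and the 80% threshold below are my assumptions for illustration, not anything taken from Federoff:

```python
# Hypothetical operationalisation of "controls should be intuitive":
# measure the first-try success rate across a representative sample of
# first-time players, then compare it against an agreed threshold.

def first_try_success_rate(trials):
    """Fraction of participants whose first attempt matched the
    intended control action.

    trials: list of booleans, one per participant
            (True = first attempt was the intended action).
    """
    if not trials:
        raise ValueError("no trials recorded")
    return sum(trials) / len(trials)

def meets_intuitiveness_criterion(trials, threshold=0.8):
    """Pass/fail for the heuristic under the assumed 80% criterion."""
    return first_try_success_rate(trials) >= threshold

# Example: 9 of 10 participants used the expected control on first try.
observed = [True] * 9 + [False]
print(first_try_success_rate(observed))         # 0.9
print(meets_intuitiveness_criterion(observed))  # True
```

Turning each heuristic into a measurable criterion like this is exactly what would let industry compare and prioritise heuristics by demonstrated value.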
Notes:
Federoff concedes,
"If game heuristics were further studied and verified, then heuristic evaluations (Nielsen, 1994) and heuristic walkthroughs (Sears, 1997) would also be ways in which games could be evaluated quickly and cheaply"
"A last suggestion for further research is to verify the following compiled list of heuristics identified in the literature and case study."
c.f.,
Laitinen, Sauli. "Do usability expert evaluation and test provide novel and useful data for game development?". (Journal of Usability Studies, 2.1, February 2006), pp. 64-75
Nielsen says,
"My recommendation is normally to use three to five evaluators ..."
How to Conduct a Heuristic Evaluation
Metrics for Heuristics: Quantifying User Experience (Part 1 of 2)
Metrics for Heuristics: Quantifying User Experience (Part 2 of 2)
@inproceedings{142834,
author = {Jakob Nielsen},
title = {Finding usability problems through heuristic evaluation},
booktitle = {CHI '92: Proceedings of the SIGCHI conference on Human factors in computing systems},
year = {1992},
isbn = {0-89791-513-5},
pages = {373--380},
location = {Monterey, California, United States},
doi = {http://doi.acm.org/10.1145/142750.142834},
publisher = {ACM},
address = {New York, NY, USA},
}
@inproceedings{1145583,
author = {Guillermo J. Covella and Luis A. Olsina},
title = {Assessing quality in use in a consistent way},
booktitle = {ICWE '06: Proceedings of the 6th international conference on Web engineering},
year = {2006},
isbn = {1-59593-352-2},
pages = {1--8},
location = {Palo Alto, California, USA},
doi = {http://doi.acm.org/10.1145/1145581.1145583},
publisher = {ACM},
address = {New York, NY, USA},
}
Kline, R., Seffah, A., Javahery, H., Donayee, M., Rilling, J. (Dept. of Computer Science, Concordia University, Montreal). "Quantifying developer experiences via heuristic and psychometric evaluation". Proceedings of the IEEE 2002 Symposia on Human Centric Computing Languages and Environments, 2002, pp. 34-36. ISBN: 0-7695-1644-0. DOI: 10.1109/HCC.2002.1046339
Heuristic Evaluation Quality Score (HEQS): A Measure of Heuristic Evaluation Skills
Making Usability Recommendations Useful and Usable
A Structured Process for Transforming Usability Data into Usability Information
Post-Modern Usability
Heuristic Evaluation of a User Interface for a Game-Based Simulation
Quantitative Design
Towards the Design of Effective Formative Test Reports
Wiberg, Charlotte (2003). A Measure of Fun: Extending the scope of web usability. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-210 (2008-07-06)
Desurvire, H., Jegers, K., Wiberg, C. (2007) Developing A Conceptual Framework for Analysis and Design of Evaluation Methods. Presented at the workshop ‘Beyond Current User Research: Designing Methods for New Users, Technologies, and Design Processes’ at the CHI 2007 conference, San Jose CA, USA, April 2007.
Desurvire, H., Jegers, K., Wiberg, C. (2007) Evaluating Fun and Entertainment: Developing A Conceptual Framework Design of Evaluation Methods. Accepted at the workshop ‘Facing Emotions: Responsible experiential design’ at the INTERACT 2007 conference, Rio de Janeiro, Brazil, September 2007.
Jegers, K. & Wiberg, C. (2006) Early User Centered Play testing of Game Concepts for Pervasive Games. Presented at the Player Centred Design Workshop, CHI 2006 conference, Montreal.
Sunday 6 July 2008