Show simple item record

dc.contributor.advisor: Ellis, Newton C.
dc.contributor.advisor: Mitta, Deborah A.
dc.creator: Dykstra, Dean Julian
dc.date.accessioned: 2024-02-09T20:43:10Z
dc.date.available: 2024-02-09T20:43:10Z
dc.date.issued: 1993
dc.identifier.uri: https://hdl.handle.net/1969.1/DISSERTATIONS-1526964
dc.description: Vita
dc.description: Major subject: Industrial Engineering
dc.description.abstract: The need for cost-effective usability evaluation has led to the development of alternative usability methods. Reducing the number of participants in a usability test or using heuristic evaluation are the two approaches most often associated with discount methods. In this dissertation, an alternative heuristic method was developed based on the concept of a double specialist (knowledge of usability techniques and of a software product domain). Double specialists were previously shown to provide heuristic evaluation superior to that of regular usability professionals, primarily because of their knowledge of usability problems typically found within the given domain. Similar knowledge, based on competitive usability testing, was used to devise a domain-specific heuristic checklist. The product domain chosen for the research was online calendars. The calendar checklist was compared to general heuristics and to streamlined usability testing with two test subjects. Fifteen software developers and fifteen usability professionals were each randomly assigned to one of the three evaluation methods. They were then provided with a previously unseen software calendar and were asked to find as many usability problems as possible. Participants using the calendar checklist found more total problems, and a significantly higher percentage of user-oriented problems and severe problems, than did those using the general heuristic method, suggesting that domain-specific heuristics do provide an improved heuristic approach. Streamlined usability testing, however, was found to be the most effective discount usability method. Usability professionals and software developers did not differ in their ability to find user-oriented or severe problems. It was concluded that when time and resources permit, usability testing is the method of choice. When user testing is not feasible, however, domain-specific heuristics may provide results approaching the effectiveness of the double specialist.
dc.format.extent: xiii, 215 leaves
dc.format.medium: electronic
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.rights: This thesis was part of a retrospective digitization project authorized by the Texas A&M University Libraries. Copyright remains vested with the author(s). It is the user's responsibility to secure permission from the copyright holder(s) for re-use of the work beyond the provision of Fair Use.
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Major industrial engineering
dc.subject.classification: 1993 Dissertation D9956
dc.title: A comparison of heuristic evaluation and usability testing : the efficacy of a domain-specific heuristic checklist
dc.type: Thesis
thesis.degree.discipline: Industrial Engineering
thesis.degree.grantor: Texas A&M University
thesis.degree.name: Doctor of Philosophy
thesis.degree.name: Ph. D.
thesis.degree.level: Doctoral
dc.contributor.committeeMember: Koppa, Rodger J.
dc.contributor.committeeMember: Simmons, Dick B.
dc.type.genre: dissertations
dc.type.material: text
dc.format.digitalOrigin: reformatted digital
dc.publisher.digital: Texas A&M University. Libraries
dc.identifier.oclc: 34482860


