CS 352

HCI

What is HCI?

Human-computer interaction is: “concerned with the design, evaluation, and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them” (ACM SIGCHI).

Other definitions:

  • Optimizing user's interaction with system, environment, or product.
    • How do we bring tech. in line with user expectations?

Usability Engineering vs. Interaction Design

Interaction design definition: Designing interactive products to support people in their everyday and working lives.

Usability engineering definition: Nielsen doesn't really give one; it's a systematic process by which you can create usable software.

Active Areas of HCI

  • Novel interfaces/techniques.
  • Design & design practices.
  • Communication.
    • How medium affects the message.
  • Accessibility & universal computing.
  • Work efficiency.

Why should we care?

  • Computers affect most people:
    • 89% of US has access to computers. 65% are online.
    • Have to deal with businesses and gov't agencies.
  • Computers are everywhere.
  • Success often depends on ease of use more than power or features.

Case for HCI

  • Nielsen
    • Increase learnability.
    • Increase efficiency.
    • Increase memorability.
    • Decrease errors.
    • Increase satisfaction.
  • Preece
    • Utility?
    • Effectiveness?

User Experience Goals (Preece)

Preece introduces user experience goals:

  • Satisfying.
  • Enjoyable.
  • Fun.
  • Entertaining.
  • Helpful.
  • Motivating.
  • Aesthetically pleasing.
  • Supportive of creativity.
  • Rewarding.
  • Emotionally fulfilling.

Are these always good/desirable?

  • Depends upon the application.
  • E.g. emotional fulfillment in Excel.

Case Against Usability

There are places where usability may not be good:

  • Security systems.
  • Medicine (e.g. medicine bottles).
  • Complex system operators (make it too easy → operator stupidity).

Examples of poor usability:

  • Home electronics with the only controls being the remote. (Lose the remote and get hosed.)
  • Scientific calculators/too much functionality in too little space.
  • QWERTY for English.

Evolution of Usability

History of Computer Interaction

40s-50s: Batch

  • Computer performed one task at a time.
  • No interaction once computation started.
  • Switches, wires, punch cards, and tapes for I/O.

60s-70s: Command line

  • Computers hit “big business.”
  • More varied tasks.
  • Text processing, email, etc.
  • Teletype terminals.

80s-present: WIMP

  • (Windows, Icons, Menus, Pointer)
  • Computers in the home, everyday tasks, no training.
  • From multi-user to multitasking systems.
  • WIMP interface allows you to do several things simultaneously.
  • Has become the familiar GUI interface.

People in Computing History

Innovator: Ivan Sutherland

  • SketchPad – 1963 – first interactive drawing program on computer.
  • Hierarchy.
  • Master picture with instances.
  • Constraints.
  • Icons.
  • Copying.
  • Light pen input device.
  • Recursive operations.

Douglas Engelbart

  • Invented mouse/pointer.

Paradigm: Direct Manipulation

You are interacting with the image on the screen.

  • '82 Shneiderman describes appeal of rapidly-developing graphically-based interaction.
    • Object visibility.
    • Incremental action and rapid feedback.
    • Reversibility encourages exploration.

Paradigm: Metaphor

Metaphor paradigm involves mapping the real world onto computer use.

  • All use is problem-solving or learning to some extent.
  • Relating computing to real-world activity is an effective learning mechanism.
    • File management on office desktop.
    • E.g. financial analysis as spreadsheets.

Project

Library Field Trip

  • Going to learn how to watch people, environment, etc.
  • Learn how to take notes.

Proposal

  • Find a usability problem.
  • Research problem.
  • Create a proposal.

Research

Usability and Design Process

Basic Questions

  • What is usability in terms of design/requirements?
    • It's essentially the same thing, come at from different angles.
    • HCI from user standpoint, software engineering from developer's standpoint.
  • When and where in design/build process do you use usability?
    • Everywhere.

User-Centered Design Process

  1. Identify users
  2. Identify activities/contexts.
  3. Identify needs.
  4. Derive requirements.
  5. Derive design alternatives.
  6. Build prototypes.
  7. Evaluate prototypes.
  8. Iterate.
  9. Ship, validate, maintain.

Understanding Users

  • Need to take into account:
    • Who users are
    • What activities are being done
    • Where interaction takes place
      • Conditions may exclude or require certain interface choices, e.g. use of sound.
  • Need to optimize interactions users have with a product.
    • Such that they match user activities and needs.

Understanding Users (2)

Who are the users?

  • Those who interact directly w/ product.
  • Managers of direct users.
  • Receivers of the product's output.
  • Purchasers of the product.
  • Users of competitor's products.

Populations and Sampling

  • Identify user groups.
    • Make sure all are represented in your study.
    • Make sure more than one person from each group is represented in your study.
  • How much to study?
    • Time.
    • Subjects.
  • Random vs. non-random sampling.
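
Making sure every group is represented by more than one person amounts to stratified sampling: random within each group, deliberate across groups. A minimal sketch in Python (the group names and sizes are invented for illustration):

```python
import random

def stratified_sample(groups, per_group, seed=0):
    """Draw the same number of people from each user group: random
    within a group, deliberately non-random across groups, so every
    group is represented more than once."""
    rng = random.Random(seed)
    return {name: rng.sample(members, per_group)
            for name, members in groups.items()}

# Hypothetical user groups for a library study.
groups = {
    "students":   ["s%d" % i for i in range(30)],
    "staff":      ["t%d" % i for i in range(10)],
    "librarians": ["l%d" % i for i in range(5)],
}
picked = stratified_sample(groups, per_group=2)
```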

Studying Users

Each method of learning about users will be more appropriate depending upon the context.

  • Questionnaires.
    • Difficult to create in unbiased way.
    • Reach lots of people quickly.
  • Interviews.
    • Time-consuming.
    • Noisy analysis.
    • Can reveal information you didn't think of.
  • Focus groups.
    • Various levels of structure.
    • One person can dominate.
  • Naturalistic Observation
    • Ethnomethodological.
    • Contextual inquiry.
    • Participatory design.
  • Documentation.

Naturalistic Observation

What are needs?
  • Users often don't know what's possible.
  • Users can't tell you what they need to achieve goals.
  • Instead, look at existing tasks:
    • Context
    • Information required
    • Collaboration
    • Why it's done that way
  • Envisioned tasks:
    • Can be rooted in existing behavior.
    • Can be described as future scenarios.
  • Based upon observation, design a system that meets users' needs well.
Ethnography
  • “Writing the culture.”
  • Ethnographer takes part in the world for extended periods of time as a largely passive observer.
    • “World” ⇒ company, society, family, etc.
  • Observe everything taking place: Activities, environments, interactions, practices, etc.
  • Hidden assumption: We don't always know what we know or do, make the implicit explicit.
    • Not limited to any particular scope.
    • Gather info on all observations.
  • Noisy, but very rich and detailed data.
  • Two types: Participant observation and Contextual inquiry.
  • Participant observation
    • “Going native”
    • Ethnographer part of culture.
    • May be difficult to relate back.
    • May influence outcome.
    • Tries not to make judgements about why events happen.
  • Contextual inquiry
    • Ethnographer less embedded in the culture.
    • Apprentice relationship to actor.
    • Some observation followed by questions to the actor to clarify meaning.
    • Tends to get more focused data.
    • Prone to rationalizations and tall tales.
      • When have to explain/vocalize, actor may come up with a story.
  • Problems with ethnography.
    • Data is very disorganized.
      • Difficult to know what to do with it.
      • Needs further refinement before it can be applied.
        • Use cases
        • Scenarios
        • Task analysis
        • Workflow models
    • Ethnography is often a good starting point, but rarely sufficient on its own.
Observation
  • Subject knowledge/consent.
    • Public vs. private places.
    • If the observee has expectation of privacy, you must have consent.
  • Medium
    • Notebook
    • Audio tape
    • Video/photographs
      • Puts people less at ease. (⇒ modifies behavior to some extent)
  • Transcription
    • Within 24 hours.
    • Cleaned up version of notes.

Ethics

Learning Objectives

  • Discuss ethical concerns.
  • Role of IRB.
  • Principles and origins of Belmont report.
  • Principle of informed consent & considerations surrounding.
  • Responsibilities to participants before, during, after study.

History

  • Nuremberg doctor trials.
  • Milgram obedience experiments.
  • Thalidomide study.
  • Untreated syphilis study.
  • Human radiation experiments.

Nuremberg doctor trials

  • Nazi physicians charged with conducting inhumane experiments on civilians and prisoners.
  • High altitude experiments:
    • 40% participants died.
  • Parachuting into cold water.
    • 30% died.
  • Wound, burns, amputation, chemical and biological agent exposure.
    • Mortality of 25%, many disabled or scarred for life.
  • Code of ethics developed in aftermath.
    1. Informed consent.
    2. Anticipated results should justify experiment.
    3. Human experiments should be based upon animal results.
    4. Physical and mental suffering and injury should be avoided.
    5. There should be no expectation of death or disabling injury.
    6. Degree of risk should be weighed by potential benefit.
    7. Proper preparation & precautions should be taken.
    8. Only qualified scientists should conduct medical research.
    9. Subject has right to end experiment at any time.
    10. Scientist must be prepared to end experiment if subject at risk.
  • Code did not have much effect.

Milgram obedience experiments

  • Designed to answer the question “Could it be that Eichmann and his million accomplices in the Holocaust were just following orders?”
  • Subject was “teacher,” learner was an actor paid by researcher, following a script.
  • Subject asked to administer electrical shocks when learner was wrong.
  • Psychologists said 1 in 1,000 would administer maximum shock (thus obeying supervisor).
  • In reality, 63.75% administered the maximum shock.
  • “I observed a mature and initially poised businessman enter the lab smiling and confident. Within 20 minutes he was reduced to a twitching, stuttering wreck, who was rapidly approaching a point of nervous collapse.” S. Milgram in Obedience to Authority.

Belmont Report

Basic Principles
  • Respect for persons.
  • Beneficence.
  • Justice.
Institutional Review Board
  • IRB has authority to approve, require modification, or disapprove all research activities.
  • Purpose: Review research and determine if the rights and welfare of human participants involved in research are adequately protected.
  • Safeguard mechanism.
  • Informed consent is a process of information exchange between the investigator and the prospective participant that takes place before, during, and after the study.
  • Comprehension: Investigators responsible for ascertaining that the participant has comprehended the information.
  • Voluntariness: Agreement to participate in the research constitutes a valid consent only if voluntarily given.
    • Must not be coerced through any means, including incentive.
Investigator's Responsibilities
  • Investigators bear ultimate ethical responsibility for their work with human participants.
  • Other responsibilities include:
    • Compliance with laws.
    • Assuming fiscal management.
    • Supervising/training of students, post docs, and residents.
    • Complying with terms and conditions of sponsor's award.
    • Submission of all technical, progress, invention, and financial reports on timely basis.

Project Description

  • Groups of 4-5
  • Four phases:
    • Proposal (2/7)
    • Prototype 1 (2/21)
    • Evaluation plan (2/28)
    • Final system & evaluation (3/16)

Proposal

Identify a real usability need, for a real population.

  • Describe the problem (current breakdown and the ideal situation).
  • Document the problem (how you know there is a problem to begin with).
  • Who are your users.
  • Ideas about a solution.

Prototypes

  • Build a presentation for a design gallery.

Evaluation Plan

Based upon feedback from prototype and problem you identified:

  • Write a convincing and realistic evaluation plan to see if you have reached all/some of your objectives.
  • Perform said evaluations on your fellow students.

Final Presentation

  • Just like prototype presentation, but you add in your final design, how it evolved, etc.

Other Study Techniques

Cognitive Walkthrough

  • Subject is usually an expert in UI, etc.
  • Think-Aloud protocol is part of cognitive walkthrough:
    • User describes verbally what he's thinking while performing tasks.
      • What they believe is happening.
      • Why they take an action.
      • What they are trying to do.
    • Researcher takes notes about task and actions.
    • Makes fewer assumptions about why things are happening.
  • Very widely used.
  • Yields good results with few people.
  • Potential problems:
    • Can be awkward for participant.
    • Can modify way user performs tasks.

Alternative

  • What if thinking aloud will be too disruptive?
  • Can use post-event protocol.
    • User performs session, then watches video and describes what she was thinking.
    • Sometimes difficult to recall.
    • Opens up door of interpretation/rationalization.

Related: Diary Study

  • Subject asked to keep journal of daily activities.
    • Record actions, reasons, and other observations.
  • Not always subjective.
  • Prevents researcher from having to be everywhere 24/7.

Interviews

  • 3 types:
    • Structured:
      • Well-defined agenda.
      • Efficient.
      • Require training.
    • Unstructured:
      • No agenda.
      • Let subject go in whatever direction they need to go.
      • Difficult to not influence direction.
      • Inefficient.
      • Less training.
    • Semi-structured:
      • Good balance of training/efficiency.
      • Often appropriate.

Semi-Structured Interviews

  • Predetermine data of interest - know why you are asking questions, don't waste time.
  • Guidelines:
    • Stay concrete (specific).
      • “So when the new guy joined the team and hadn't gotten his email account set up yet, what happened then?” vs. “What generally happens here when someone new joins the team?”
    • Signs to look for:
      • Interviewee waves hand and looks up at ceiling ⇒ generalization coming.
      • Use of passive voice, “generally”, “usually”, “should”, “might”

Surveys

  • General Criteria
    • Make questions clear and specific.
    • Ask some closed questions with a range of answers.
      • Sometimes also have a no opinion option, or other answer option.
    • Do test run with two or three people.
  • Likert Scale
    • Usually odd # of points: 5, 7 point scale; agree to disagree.
    • Could also use words for each level.
    • Sometimes need to use black & white answer questions to get a normalization range.
  • Other Typical Questions
    • Rank importance of each of these items…
    • List the four most important tasks that you perform (open question).
    • List the pieces of information you need to have before making a decision about X, in order of importance.
    • Are there any other points you would like to make?
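
A small sketch of scoring Likert responses (the labels and responses are invented; medians are usually safer than means for ordinal data):

```python
from statistics import median

# Map a 5-point agree/disagree Likert scale to numeric scores.
SCALE = {"strongly disagree": 1, "disagree": 2, "no opinion": 3,
         "agree": 4, "strongly agree": 5}

responses = ["agree", "strongly agree", "no opinion", "agree", "disagree"]
scores = [SCALE[r] for r in responses]

# Report the median rather than the mean: Likert data is ordinal,
# so averaging the labels' numeric codes can mislead.
summary = {"median": median(scores), "n": len(scores)}
```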

Participatory Design

  • Scandinavian history.
    • Scandinavia has fairly strong labor unions.
    • Workers involved in all decisions.
  • Emphasis on social and organizational aspects.
  • Based on study, model-building, and analysis of new and potential future systems.
  • User is a part of the team.
    • Immediate feedback.
    • Sanity checking.
    • Much tighter feedback cycle.

User Centered Design

Input & Output

  • Gather data:
    • Surveys/questionnaires.
    • Interviews.
    • Etc.

Represent Data

Task Outline
  • List what task is about.
  • Add progressive layers of detail as you go.
  • Know in advance how much detail is enough.
  • Can add linked outlines for specific subtasks.
  • Good for sequential tasks.
  • Does not support parallel tasks well.
  • Does not support branching well.
Use Cases/Scenarios
  • Describe tasks in sentences.
  • More effective for communicating general idea of task.
  • Scenarios: “informal narrative description”
    • Focus on tasks/activities, not system (technology) use.
  • Use Cases
    • Focus on user-system interaction, not tasks.
    • How to do a task using the system, not what tasks to do.
Hierarchical Task Analysis
  • Graphical notation & decomposition of tasks.
  • Goals – what the user wants to achieve.
  • Tasks – do these to achieve the goals.
  • Looping, conditionals integrated.
  • See slides for example hierarchy.
  • Types of Plans:
    • Fixed sequence
    • Optional tasks
    • Waiting events
    • Cycles
    • Time-sharing
    • Discretionary
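
A minimal sketch of how an HTA could be encoded as data, with goals decomposed into subtasks plus a plan per level (the "borrow a book" goal and its plan strings are invented for illustration):

```python
# Hierarchical task analysis as nested dicts: each node has a goal,
# an optional plan, and a list of subtasks.
hta = {
    "goal": "borrow a book",
    "plan": "fixed sequence: 1-2-3",
    "tasks": [
        {"goal": "find the book",
         "plan": "optional: 1.1 only if location unknown",
         "tasks": [{"goal": "search catalogue", "tasks": []}]},
        {"goal": "take book to desk", "tasks": []},
        {"goal": "check out book", "tasks": []},
    ],
}

def leaf_tasks(node):
    """Collect the lowest-level tasks: the actions a user actually performs."""
    if not node["tasks"]:
        return [node["goal"]]
    return [g for child in node["tasks"] for g in leaf_tasks(child)]
```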
ER Diagram
  • Objects/people with links to related objects.
    • Stress relationship between objects and actions.
  • Close to the type of thing you would say to a DB designer or programmer.
  • More difficult for user to understand.
  • No way to represent knowledge, ideas, motivation, etc.
  • Lends itself better to specifying what a developer needs to create.
Flow Charts
  • Many types.
    • Decisions
    • Actions
    • Information flow
  • Combines ERD with sequential flow, branching, parallel tasks.
  • Tracks something being moved around.
  • Visually appealing, easy to understand.
  • More abstract than HTA.
  • Much quicker overview of system.

Midterm

  • What is usability engineering/HCI/user-centered design?
    • Define
    • Describe process/target problems
    • Arguments for UE/HCI/UCD in historical context
  • Basics of human subjects research
    • Some history/background
    • Importance of Milgram experiments
    • Basics of Belmont report
  • Studying Users
    • Describe methods discussed in class
    • Argue pros & cons of each, different variations
    • Propose an approach to studying a given hypothetical place/situation, and argue why
    • How to organize and analyze data

Prototyping & Design

What is a prototype?

A prototype is a simplification of a system.

In interaction design, it could be:

  • Screen sketches
  • Storyboard
  • Slide show
  • Video simulation
  • Lump of wood (Physical mock up)
  • Software with limited functionality

Why prototype?

Put many ideas out there. By making prototypes, you can evaluate many options effectively.

Facilitates evaluation:

  • Stakeholders can see, hold, interact with.
  • Team members can communicate more effectively.
  • Test ideas yourself.
  • Encourages reflection.
  • Answer questions, support designers in choosing among alternatives.

What to prototype?

  • Work flow, task design
  • Screen layouts and information display
  • Difficult, controversial, critical areas

Compromises

All prototypes involve compromises. For software prototyping there may be slow response times, sketchy icons, limited functionality, etc.

Low Fidelity Prototyping

  • Rough prototype of system.
  • Uses medium unlike the final medium.
  • Quick, cheap, easily changed.
  • Encourages high-level criticism: surfaces problems with conceptual models and fundamental usability or functionality issues.
  • Users unafraid to suggest major changes.

Storyboards

  • Often used with scenarios.
  • Indicate a series of events.

High Fidelity Prototyping

  • Looks and behaves like a subset of the final system.
  • Commonly used tools: Director, Visual Basic, Smalltalk
  • Users may think they have a full system (problem)
  • Get at details of design (layout, icons, colors)
  • Should not think of prototype as part of finished system (no recycling)

Medium Fidelity?

  • Somewhere in between.
  • High production values, no/limited interaction.
  • E.g. Photoshop
  • Tests details of the design without committing.
  • Because no functionality, less pressure from users.

Prototyping & Evaluation

  • (Early)
  • (Low fidelity)
  • Rough out on paper
  • Cognitive walkthrough
  • Formative evaluation
  • (Late)

See slides.

Formative Evaluations

  • Done on low fidelity prototypes.
  • Wizard of Oz - smoke and mirrors to simulate working system.
  • GOMS and action analysis - uses models to predict certain attributes of prototypes.
  • Cognitive walkthroughs.
  • Heuristic evaluations - artificial evaluation using a top-ten list of mistakes or good practices.

Project 2 - Initial Prototypes

Prepare a prototype that answers two things:

  • What is the problem?
    • Who are users?
    • What are their needs?
    • What are constraints?
  • What is your solution?
    • Present multiple prototypes.
    • Sketches, storyboards, mockups.
    • Why for each. Pros/cons.

Human Stuff

Cognitive Processes

  • Attention
  • Perception and recognition
  • Memory
  • Learning
  • Reading, speaking, listening
  • Problem-solving, planning, reasoning, decision making

Senses

  • Sight, hearing, touch important for design of current interfaces.
  • Smell, taste?
  • Balance and proprioception (where limbs etc. are physically)

Key Sense Concepts

  • Absolute threshold
    • Thresholds that determine what range you can sense.
    • Upper and lower thresholds.
  • Signal detection theory
    • Ability to tune in/out stimuli.
  • Just noticeable difference
    • How much change is required before you sense the difference?
  • Sensory adaptation
    • We react to change.
    • Absence of change leads us to lose sensitivity (cf. physiological nystagmus, the constant tiny eye movements that prevent retinal adaptation).

Vision

  • Visual Angle
    • Total: 200 degrees.
    • High-res: ~15 degrees.
  • Rods
    • 120 million.
    • B&W
    • 1000x more sensitive than cones.
  • Cones
    • 6-7 million.
    • 64% red.
    • 32% green.
    • 2% blue.
  • Phenomena
    • Color perception:
      • 7-8% males cannot distinguish red from green.
      • 0.4% of women.
    • Peripheral vision is largely movement oriented.
    • Stereopsis:
      • (Stereopsis is ability to see in three dimensions.)
      • Monocular (size, interposition, perspective, parallax)
        • Ability to detect depth with one eye.
      • Binocular (retinal disparity, accommodation)
        • Ability to detect depth with two eyes.

Hearing

  • Capabilities
  • Frequency: 20-20,000 Hz
  • Loudness: 30 - 100 dB
  • Location: 5˚ source & stream separation
  • Timbre: Type of sound (lots of instruments)
  • Often take for granted how good it is.

Motor System

  • Our output system.
  • Capabilities
  • Range of movement, reach, speed, strength, dexterity, accuracy.
  • Workstation/device design.
  • Often cause of errors:
  • Wrong button.
  • Double-click vs. single.

Principles

  • Feedback is important.
  • Minimize eye movement.

The Model Human Processor

  • Classic study from CS perspective of how brain works
    • Microprocessor-human analogue using results from experimental psychology.
    • Provides a view of the human that fits much experimental data.
    • But it is a partial model.
  • Focus is on a single user interacting with some entity (computer, environment, tool).

Memory

  • Perceptual “buffers”:
    • Brief impressions
  • Short-term (working) memory:
    • Conscious thought, calculations.
    • Different store for visual vs. auditory memory.
    • Order of seconds.
  • Long-term memory:
    • Minutes, hours, days, years, decades.
    • Long term, large storage space.
    • Access is harder, slower.
    • Retrieval depends upon network of associations.
    • Memory Structure
      • Episodic Memory
        • Events and experiences in serial form.
        • Helps us recall what occurred.
      • Semantic memory
        • Structured record of facts, concepts, skills.
        • One theory says it's like a network.
        • Another uses frames & scripts.

Decision Making Models

Understanding cognition important because it helps you understand how to teach people.

  • Production systems
    • If-then rules
    • Work like a computer program
  • Connectionism (big idea in AI)
    • Neural networks
    • Hidden Markov models
    • Bayesian networks
    • Modeled after a network of nodes
      • Nodes are factoids or rules.
      • Connections between.
      • Any time you follow a path that leads to success, strengthen the path.
      • Any time you make a mistake, weaken the path.
  • Mediated action
    • Actions must be interpreted in context
    • Tools, setting, culture affect
    • Objects “tell” us how to use it
    • The way things are shaped and presented affects how people interact with them
  • Case-based reasoning
    • Learn from experience, reasoning same as memory
    • Plans, schemes, and automation
    • Take closest matching experience and modify it to meet requirements
  • External/embodied cognition
    • Emphasizes reflexes and stuff
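
The "production systems work like a computer program" idea can be sketched as a few if-then rules fired against a working memory until nothing new follows. The rules and facts here are invented; real cognitive architectures are far richer:

```python
def run(rules, memory):
    """Forward-chain a production system: fire every rule whose
    condition holds until no rule adds a new fact to working memory."""
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            # A rule fires when its condition facts are all present
            # and it would add at least one new fact.
            if condition <= memory and not conclusion <= memory:
                memory |= conclusion
                changed = True
    return memory

# Hypothetical if-then rules about desktop interaction.
rules = [
    ({"double-click on icon"}, {"open item"}),
    ({"open item", "item is folder"}, {"show folder contents"}),
]
facts = run(rules, {"double-click on icon", "item is folder"})
```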

Evaluation

  • Evaluation is part of the design cycle.

Why evaluate?

  • If you make a mistake and don't catch it, it'll screw you later.
  • If we think of design as iterative process, we need to evaluate whether we're getting better.
  • Also, at each stage of design we make assumptions. We need to check whether those assumptions match reality.

What is evaluation?

  • Different from requirements gathering:
    • Testing a hypothesis.
    • Often use different methods, more focused.
  • Methods you choose depend on debates:
    • Quantitative vs. qualitative.
    • Controlled vs. ecological validity
    • Cost vs. relevance.

Steps Involved

  • Formulate hypothesis.
    • Hypothesis = a precise, testable statement.
    • Important to have hypothesis for data analysis.
  • Design a test plan.
    • Picking a method.
    • Selecting users.
    • Writing out procedure.
  • Get IRB permission.
  • Deal with users.
  • Deal with data.

Testing Methods

  • Formative
  • Artificial/Controlled
    • Isolate variables, level playing field.
    • Removes “noise” from data.
    • Thoroughly documented.
    • Focus only on your question.
  • Issues:
    • Putting people in contrived environment causes changes in how people interact.
    • Results from controlled experiments can't be directly compared to real world.

Hypothesis Testing

  • Example hypotheses:
    • X is better/larger/faster than Y.
    • X improved more than Y.
  1. Specify null hypothesis (H0) and alternative hypothesis (H1).
    1. Define H1 = true iff H0 = false.
  2. Select significance level. Typically P = 0.05 or P = 0.10
  3. Sample population and calculate statistics.
  4. Calculate probability (p-value) of obtaining a statistic at least as extreme as the one observed, assuming H0 is true; reject H0 if p is below the significance level.

(SEE SLIDES)
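
The steps above can be sketched with a permutation test, which needs no distributional assumptions. The task-completion times for the two prototypes are invented for illustration:

```python
import random
from statistics import mean

def permutation_test(a, b, n=10000, seed=0):
    """Two-sample permutation test.
    H0: a and b come from the same distribution.
    p-value: fraction of random label shufflings whose mean difference
    is at least as extreme as the observed one."""
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    extreme = 0
    for _ in range(n):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if diff >= observed:
            extreme += 1
    return extreme / n

# Invented task-completion times (seconds) for two prototypes.
proto_x = [12.1, 11.8, 13.0, 12.5, 11.9]
proto_y = [14.2, 13.9, 14.8, 13.5, 14.1]
p = permutation_test(proto_x, proto_y)
# Reject H0 at the 0.05 level if p < 0.05.
```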

Dealing with Data

  • Academic honesty key.
    • Falsifiability of results.
    • Need for meticulous records.
      • Keep records unmodified.
    • Objectivity.
    • Peer review.
    • Replication.
      • Not done in software design.

Statistical Significance

  • Statistical significance means: Two populations differ to a significant extent along some variable.
  • Statistical significance does NOT mean noteworthy.
  • Typically in either rate of occurrence, or the value of some result.
    • E.g. group A may be 2× as likely as group B to do well on tests (statistically significant), yet the difference in scores may be too small to be noteworthy.

Significance: Type I and II Errors

  • What does significance mean?
    • Type I: False positive (rejecting a true H0).
    • Type II: False negative (failing to reject a false H0).
  • Set significance to balance risks of type I or II errors:
    • When might low type I and high type II (vice versa) be preferable?
  • These types of errors may arise from equipment limits, etc.
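
A quick simulation makes the Type I error rate concrete: when H0 is true, a test at level alpha should wrongly "detect" an effect about alpha of the time. The fair-coin setup and counts are invented for illustration:

```python
import random

def false_positive_rate(alpha_cutoff=1.96, trials=2000, n=100, seed=1):
    """Simulate experiments where H0 is true (a fair coin) and count
    how often we wrongly declare the coin biased. The empirical
    Type I error rate should be close to the nominal level (~0.05
    for a 1.96-standard-deviation cutoff, by normal approximation)."""
    rng = random.Random(seed)
    # Standard deviation of heads count under H0 is sqrt(n * 0.25).
    cutoff = alpha_cutoff * (n * 0.25) ** 0.5
    rejections = 0
    for _ in range(trials):
        heads = sum(rng.random() < 0.5 for _ in range(n))
        if abs(heads - n / 2) > cutoff:
            rejections += 1
    return rejections / trials

rate = false_positive_rate()
```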

Predictive Models

  • Models used to predict human behavior, responses.
  • Stimulus-Response
    • Hick's law:
      • Decision time to choose among N equally likely alternatives.
      • T = Ic · log2(n + 1)
        • Ic = time to recognize each item ≈ 150 ms
      • Useful for pilot tests.
    • Fitts's law.
      • Time it takes to select something on screen.
      • ID = log2(d/w + 1.0)
        • d = distance; w = width of target; ID = index of difficulty
  • Cognitive - human as interpreter/predictor - based on Model Human Processor:
    • Keystroke Level Model:
      • Puts together lots of mini-models, comes up with larger coherent model.
      • Assigns times for basic human operations - experimentally verified.
      • Based upon MHP.
      • Accounts for:
        • Keystroking: Tk
        • Mouse button press: Tb
        • Pointing: Tp
        • Hand movement between kbd/mouse: Th
        • Drawing straight line segments: Td
        • “Mental preparation”: Tm
        • System response time: Tr
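
The predictive models above reduce to back-of-envelope arithmetic. A sketch with the constants from the notes (150 ms per item for Hick's law) and typical published KLM operator times; treat all the constants and the example menu/button sizes as rough assumptions:

```python
import math

def hicks_time(n, ic=0.15):
    """Hick's law: decision time (seconds) to choose among n
    equally likely alternatives, at ic seconds per item."""
    return ic * math.log2(n + 1)

def fitts_id(d, w):
    """Fitts's index of difficulty (bits) for a target of width w
    at distance d; movement time grows linearly with this."""
    return math.log2(d / w + 1.0)

def klm_time(ops, times=None):
    """Keystroke-Level Model: sum operator times for a sequence
    like 'HPBK' (home hands, point, button press, keystroke)."""
    # Typical published values in seconds; rough defaults only.
    times = times or {"K": 0.2, "B": 0.1, "P": 1.1, "H": 0.4, "M": 1.35}
    return sum(times[o] for o in ops)

t = hicks_time(7)           # choosing among 7 menu items
idx = fitts_id(d=240, w=30) # a 30 px button 240 px away
k = klm_time("HPBK")        # reach for mouse, point, click, one key
```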

Within-Subject or Between-Subject Design

  • Between-subjects: one pool uses prototype 1, a separate pool uses prototype 2.
    • No learning or transfer effects between conditions.
  • Within-subjects: the same subjects use both prototypes.
    • Removes between-person variation (cleaner statistics, less noise), but order effects must be counterbalanced.

Heuristic Evaluation

Discount Usability Engineering

  • Cheap
    • No special labs/equipment.
    • The more careful you are, the better it gets.
  • Fast
    • On order of 1 day to apply.
    • Standard usability testing may take a week.
  • Easy to use
    • Can be taught in 2-4 hours.
  • Reliance on discount UE can lead to sloppiness.
    • Easy to ignore more thorough evaluation methods.
    • Not all you need.

HE Overview

  • Developed by Jakob Nielsen.
    • Involves a set of guidelines – heuristics.
    • Rules come from real-world experience.
  • Helps find usability problems in UI design.
  • Small set (3-5) of evaluators examine UI.
    • Independently check for compliance with usability principles (heuristics).
    • Different evaluators will find different problems.
    • Evaluators only communicate afterward; findings are then aggregated.
  • Can perform on working UI or sketches.
  • Most important ideas:
    • Independent analysis.
    • Performed on sketches or UI.

Process

  • Evaluators go through UI several times.
    • Inspect various dialogue elements.
    • Compare with list of principles.
    • Consider other principles/results that come to mind.
  • Usability principles:
    • Nielsen's “heuristics”.
    • Supplementary list of category-specific heuristics.
      • May come from competitive analysis & user testing of existing products.
  • Fixes for violations may be suggested by heuristics.

Nielsen's Original 10 Heuristics

  • Simple & natural dialog
  • Speak user's language
  • Minimize user's memory load
  • Consistency
  • Feedback
  • Clearly marked exits
  • Shortcuts
  • Precise & constructive error messages
  • Prevent errors
  • Help and documentation

Heuristics -- Revised Set

Visibility of System Status

  • Keep user informed about what is going on.
  • Example: Pay attention to response time.
    • 0.1 sec: Feels instantaneous; no special indicator needed.
    • 1.0 sec: Limit for the user's flow of thought to stay uninterrupted.
    • 10 sec: Max. duration for the user to stay focused on the action.
    • For longer delays, use progress bars.
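
The response-time thresholds translate directly into a choice of status indicator. A sketch, with the indicator names invented for illustration:

```python
def feedback_for(seconds):
    """Pick a status indicator from an estimated response time,
    using the response-time limits summarized above."""
    if seconds <= 0.1:
        return "none"            # feels instantaneous
    if seconds <= 1.0:
        return "cursor change"   # noticeable, but flow of thought kept
    if seconds <= 10.0:
        return "busy indicator"  # attention starts to wander
    return "progress bar"        # show progress for long waits
```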

Match between system and real world

  • Speak user's language.
  • Follow real world conventions.

Consistency & Standards

Aesthetic and minimalist design

  • No irrelevant info in dialogs.

HE vs. User Testing

  • HE much faster.
  • HE doesn't require interpreting user's actions.
  • User testing far more accurate.
    • Takes into account actual users and tasks.
    • HE may miss problems and find false positives.
  • Good to alternate between HE and user testing.
    • Find different problems.
    • Don't waste participants.

HE Results

  • Single evaluator achieves poor results.
    • Only finds 35% of usability problems.
  • 5 evaluators find ~75% of problems.
    • If they work as a team, it's back down to 35%.
  • Why not more evaluators?
    • Adding evaluators costs more.
    • Many more evaluators won't find many more problems.
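
Nielsen and Landauer model the fraction of problems found by i independent evaluators as 1 − (1 − L)^i, where L is one evaluator's hit rate. A sketch using the 35% single-evaluator figure from the notes (the actual curve varies by project, which is why the predicted totals are only rough):

```python
def fraction_found(i, single_rate=0.35):
    """Expected fraction of usability problems found by i independent
    evaluators, assuming each finds a given problem with probability
    single_rate. Shows the diminishing returns of adding evaluators."""
    return 1 - (1 - single_rate) ** i
```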

Evaluation (2) & Wrap-Up

Evaluation Pt. 2

Usability Testing: The Usability Lab

  • A specially designed room for conducting controlled experiments observing a task.
  • Cameras, logging systems, people track what users do.
  • Good lab costs $$$.
Observation Room
  • Three cameras capture subject, subject's monitor, and composite picture.
  • One-way mirror plus angled glass captures light and isolates sound between rooms.
  • Room for several observers.
  • Digital mixer for mixing of input images and recording to media.
Other Capture - Software
  • Modify software to log user actions.
  • Can give time-stamped keypress/mouse events – sync with video
  • Commercial software available ($$$)
  • Two problems:
    • Too low level, want higher level events
    • Massive amount of data; need analysis tools
Complementary Methods
  • Think-aloud protocols
  • Pre/post surveys
  • Participant screening/normalization
  • Compare results to existing benchmarks
school/classes/cs352/start.txt · Last modified: 2007/05/28 06:45