Ch. 6

Computational Ethnography: Automated and Unobtrusive Means for Collecting Data In Situ for Human–Computer Interaction Evaluation Studies
6.1 Introduction

Health information technology (HIT) has often failed to deliver its promised benefits in healthcare, and has sometimes even been harmful.

Lack of usability is one of the key factors, so HCI evaluations are important.
Traditional HCI evaluation methods:

  1. Expert Inspection (e.g. heuristic evaluation)

    1. Usability expert does specific tasks on the software or device and decides if it conforms to established principles of usability (the “heuristics”)

    2. Useful when

      1. Widely recognized usability standards exist

      2. Goal of evaluation very specific (e.g. improve accessibility or eliminate patient safety hazards)

  2. Usability experiments carried out in laboratory settings

    1. Formative evaluation comparing multiple design alternatives

    2. Summative evaluation to correct usability pitfalls

    3. Quantitative (e.g. time for task completion, number of keystrokes and mouse clicks, error rates)

    4. Qualitative (e.g. participant verbalizations expressing cognitive processes or commenting on usability issues they encounter). Some are randomized.


  • Conducted in controlled environments: evaluators or test users perform predefined tasks in a manipulated environment that prevents distractions.

  • Designed to represent end users’ work, but not exhaustive

  • Often focus on isolated individual tasks, without considering co-workers, task dependencies, chaos, interruptions, and communication failures in clinical work.

  1. Field Studies (ethnographical observation and contextual inquiries)

    1. in situ – in its original place

    2. collecting data in actual job environment

    3. Shadow clinicians in medical facility to observe individual work and interactions with patients and other care providers.

    4. Use Computer-supported cooperative work (CSCW) – Coordinating and collaborating using computers

    5. Use distributed cognition – Research that accounts for how individuals, artifacts, and their environment interact collectively

    6. Use Social Computing – Humans performing social behaviors with computers

  2. Perception solicitation through questionnaire surveys, interviews, or focus groups


Limitations of these traditional methods:

  1. research subjects and usability experts hard to recruit because studies are time consuming

  2. sample sizes often small, so hard to generalize

  3. test subjects don’t behave the same in studies (Hawthorne effect)

    1. (named after the Hawthorne Works factory studies, where changing light intensity produced short-term productivity effects)

  4. Self-reports subject to cognitive biases and recall errors, such as social desirability bias: the wish to be viewed favorably by others (e.g. not wanting to appear to lack competence in adapting to new technology)

    1. Or assess individual usability items based on overall impression of the intervention (Halo effect)

  5. Produce specific data covering only a small fraction of the behaviors of interest.

  • Computational Ethnography uses automated and unobtrusive ways to collect in situ data on real end users’ actual, unaltered behaviors using software or devices in real settings.

  • Based on the premise that user experience with modern technology leaves digital traces behind, like webpage history, keywords entered in search engines, audit trails logging access to health records, and paging/phone logs

6.2 Computational Ethnography

Ethnography: from Greek ἔθνος ethnos (“folk, people, nation”) and γράφω graphō (“I write”).

  • Method originally used by cultural anthropologists

  • Ethnographers

  • find meaning in lives of a culture group

  • Develop ‘thick’ description of everyday life and practice

  • Long-term engagement with people they study in their everyday lives

  • Participant and non-participant observation: act as active participants, or maintain a detached distance.

HCI Ethnography In Healthcare

  • Produce vivid and nuanced accounts of how different players (clinicians, clerical staff, administrators, patients, and families) engage with technology during early adoption and adaptation phases and after regular use is established

  • Attention to the longitudinal (same variables over a long time) and distributed nature of the care process, and the complex interplay between people, technology, and organization

  • Shows the cultural and social contexts of technology use and how designs should respond to them

Ethnography Limitations

  • Time consuming (months to years)

  • Not objective: an interpretive account of the lives of those studied

  • Difficult for observers not trained in medicine to understand what’s happening

  • A lot of the work is invisible or difficult to observe, e.g. communication via technology (pagers) and documentation done remotely or after work hours


  • HIPAA and the Medicare Conditions of Participation require accurate records of what is done to patients, communications surrounding patient care, and access to and modifications of patient medical records.

Computational Ethnography

  • Thickness of ethnography, strength of automated approach.

  • “a family of computational methods that leverages computer or sensor-based technologies to unobtrusively or nearly unobtrusively record end users’ routine, in situ activities in health or healthcare related domains for studies of interest to human-computer interaction.”

  • Automated data collection is more objective, less intrusive, more inclusive (can observe across space and time in ways not possible for human observers), and more scalable for data collection, aggregation, and analysis

  • Must be both computational and ethnographic: can’t be done outside a real-life setting, and must involve computers.

6.2.1 Definition

6.2.2 Common Sources of Computational Ethnographical Data

Computer Logs

Under HIPAA and the HITECH Act, HIT systems must have auditing capability; e.g. Electronic Health Record (EHR) systems must keep security audit logs or audit trails.

Guidelines for:

  • Types of auditable events (creation, modification, deletion, printing health information)

  • Metadata (date, time, patient ID, user ID)

  • Tamper proof and immutable audit trails.

Logs tell:

  • Event (when, by whom, patient)

  • Type (chart access, placing orders) and an ID for details

  • IP address of device access

  • Potentially locational data that can be combined with time stamps to reconstruct spatiotemporal distribution.

  • Helps with workflow and temporal rhythm studies, distributed cognition and social information processing, and patient handoff research.

  • Ex. Audit logs in an EHR reveal clinicians’ time to author clinical notes and the proportion of notes viewed by others


  • Some things are missed, like on-screen activities (e.g. a user moving a window to reduce clutter)

  • Timestamps only tell when an event occurred, not how long it took or whether it was done through the optimal path
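As an illustration of the kind of analysis audit trails support, the sketch below derives note-authoring time from timestamped records. All records, event names, and field layouts here are hypothetical, not taken from any real EHR.

```python
from datetime import datetime

# Hypothetical audit-trail records: (timestamp, user ID, patient ID, event type).
log = [
    ("2017-03-01 09:00", "drA", "pt1", "note_create"),
    ("2017-03-01 09:12", "drA", "pt1", "note_sign"),
    ("2017-03-01 10:30", "drB", "pt1", "note_view"),
    ("2017-03-01 11:00", "drA", "pt2", "note_create"),
    ("2017-03-01 11:25", "drA", "pt2", "note_sign"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

def authoring_minutes(log):
    """Minutes between each note_create and the matching note_sign
    by the same user for the same patient."""
    open_notes, durations = {}, []
    for ts, user, patient, event in log:
        key = (user, patient)
        if event == "note_create":
            open_notes[key] = parse(ts)
        elif event == "note_sign" and key in open_notes:
            durations.append((parse(ts) - open_notes.pop(key)).total_seconds() / 60)
    return durations

print(authoring_minutes(log))  # [12.0, 25.0]
```

Real audit trails carry richer metadata (patient IDs, IP addresses, event IDs), but the same pairing logic applies.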

Screen Activities

  • Tracks Mouse cursor trails, mouse clicks/drags, keystrokes, window activation, and window movements

  • Video or image recording

  • Can also include video and audio context

    • Facial expressions, body gestures, conversations between clinician and patient

  • Can also be recorded as logs, which are easier to analyze

  • Video and audio more difficult to analyze

  • Provides data on time efficiency (completion time), operation efficiency (mouse clicks), and error rates (wrong or unnecessary moves)

  • Ex. Task complexity and interruptions affect clinician performance in error rates, resumption lag, and task completion time when creating and updating electronic medication charts.

  • Combining with other data sources helps

  • Morae: used for usability studies and market research.

  • TURF: used for EHR usability assessment
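A minimal sketch of how a screen-activity log yields the efficiency measures above (completion time, operation counts); the event stream and event names are invented for illustration.

```python
# Hypothetical screen-activity log: (seconds from session start, event type).
events = [
    (0.0, "task_start"),
    (2.1, "mouse_click"),
    (3.4, "keystroke"),
    (5.0, "mouse_click"),
    (8.2, "task_end"),
]

# Operation efficiency: number of mouse clicks during the task.
clicks = sum(1 for _, kind in events if kind == "mouse_click")

# Time efficiency: task completion time from start to end marker.
start = next(t for t, kind in events if kind == "task_start")
end = next(t for t, kind in events if kind == "task_end")
completion = end - start

print(clicks, completion)  # 2 clicks, 8.2 seconds
```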

Eye Tracking

  • Screen-activity capture can’t tell when a clinician is reading an EHR before meeting a patient, or examining a computer-generated drug safety alert before acting on it

  • Head and eye movement is tracked

  • Can study how clinicians find patient records, or whether clinicians skip computer-generated advisories without reading them

  • Uses optical sensors to capture the vector between the pupil center and corneal reflections, by shining infrared or near-infrared non-collimated light on the eye

  • Often used to test website usability and user attention for ad placement

  • Used for autism, anxiety and depression, and surgeon training and skill

  • Plotted as a heat map

  • Also shows when a clinician looks away from the screen (perhaps at the patient, revealing patient–provider communication)

  • Vendors: Tobii Technology, SensoMotoric Instruments
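The heat maps mentioned above are essentially binned counts of gaze samples; a minimal sketch, with made-up gaze coordinates:

```python
# Hypothetical gaze samples as (x, y) screen coordinates in pixels.
samples = [(100, 50), (105, 55), (400, 300), (102, 52), (410, 310)]

CELL = 100  # grid-cell size in pixels

# Count samples per grid cell; a heat map colors cells by these counts.
heat = {}
for x, y in samples:
    cell = (x // CELL, y // CELL)
    heat[cell] = heat.get(cell, 0) + 1

hottest = max(heat, key=heat.get)
print(hottest, heat[hottest])  # cell (1, 0) drew 3 of the 5 samples
```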

Motion Capture

  • Concern that computers hurt patient–clinician interaction

  • Less eye contact, rapport, and emotional support; conversation interference; less psychosocial questioning and relationship maintenance; and irrelevant computer-driven inquiries that miss the patient’s issues.

  • Analyze vocalization, body orientation, body gestures

  • Microsoft Kinect, with an infrared depth sensor, captures distance and angle relative to the camera, body movements (kinematics through motion of body joints: head, shoulder, elbow, wrist, hand), and head orientation

    • The built-in microphone can help automatically segment voice data, clinician visual attention, and turn-taking behavior

    • Answers: what is the body language of clinicians when with patients and computers?

    • Sensor-based = automated analysis

    • Depth, skeletal, and voice direction data recorded as digitized coordinates to tell relative positions at any given time.

    • Automatically segments the progression of a clinical consultation into stages (greetings, physical exam, conversing seated, leaving the room)

    • Nonverbal communication can be automatically annotated along with gazing behavior

    • Deep analysis and relatively cheap
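As a sketch of what skeletal data permits: given head positions (meters, relative to the sensor) and a head-facing vector, one can estimate interpersonal distance and whether the clinician is oriented toward the patient. The coordinates and facing vector below are invented for illustration, not actual Kinect output.

```python
import math

# Hypothetical 3D joint positions in meters, relative to the sensor.
clinician_head = (0.2, 0.6, 1.5)
patient_head = (-0.8, 0.5, 2.0)

def norm(v):
    return math.sqrt(sum(c * c for c in v))

def distance(a, b):
    return norm(tuple(p - q for p, q in zip(a, b)))

# Interpersonal distance between clinician and patient at this frame.
d = distance(clinician_head, patient_head)

# Angle between the clinician's (hypothetical) head-facing vector and the
# direction toward the patient; small angles suggest the clinician is
# looking at the patient rather than at the screen.
facing = (-0.9, 0.0, 0.4)
to_patient = tuple(p - c for p, c in zip(patient_head, clinician_head))
cos = sum(f * t for f, t in zip(facing, to_patient)) / (norm(facing) * d)
angle = math.degrees(math.acos(cos))

print(round(d, 2), round(angle, 1))
```

Repeating this per frame, with the voice-direction data, is what allows automatic segmentation of a consultation into stages.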

Real-Time Locating Systems

  • Can answer location questions other methods can’t

  • Is the physical layout of an outpatient clinic or inpatient ward optimally designed to facilitate patient care delivery, and does the introduction of HIT systems reduce face time among healthcare coworkers?

  • Radio-frequency identification (RFID) tags emit signals to and respond to base stations

  • Tells spatial locality and possible communication

  • With timestamps, enables exploration of spatiotemporal data like clinician movement patterns, team dynamics of aggregation and dispersion, and workflow deficiencies

Other Types of Computational Ethnographical Data

Paging/phone logs, email messages, internet traffic, and data and metadata from barcode scanners and medical devices.

6.2.3 Analyzing Computational Ethnographical Data

Coding Computational Ethnographical Data

A coding scheme for labeling events must be defined first, e.g. the event name and event type.
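For instance, a coding scheme can be represented as a lookup table mapping raw log event types to agreed (event name, event category) codes; the events and codes below are hypothetical.

```python
# Hypothetical coding scheme: raw event type -> (event name, event category).
scheme = {
    "chart_open": ("Chart access", "viewing"),
    "note_create": ("Note authoring", "documentation"),
    "order_entry": ("Order placement", "ordering"),
}

raw_events = ["chart_open", "note_create", "chart_open", "order_entry"]

# Apply the scheme so that downstream analyses see uniform labels.
coded = [scheme[e] for e in raw_events]
print(coded[0])  # ('Chart access', 'viewing')
```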

Analyzing Computational Ethnographical Data

Useful ways to analyze computational ethnographic data

  1. Temporal data mining – Helps create time-stamped event sequences that show when events happen. Shows temporal interdependencies of events. Reveals sequential patterns and bottlenecks

    1. Ex. Sequential pattern analysis: interrelated events chronologically arranged.

    2. Originally developed to analyze shopping behavior.

    3. Example: each letter is an activity. Given the sequences abegcdhf, eabhcd, and abhcdfg, the subsequence abcd is a frequent recurring pattern.

  2. Time-stamped events allow transition analysis and calculation of switching costs, especially for transitions between tasks of different natures, which can cause cognitive mistakes.

  3. Transition probabilities allow Markov chain analysis to explain which next events are most probable.
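The sequential-pattern example above can be checked mechanically: a pattern is frequent if it occurs, in order but not necessarily contiguously, in enough of the sequences. A minimal sketch:

```python
# The three activity sequences from the example above (one letter = one activity).
sequences = ["abegcdhf", "eabhcd", "abhcdfg"]

def contains(seq, pattern):
    """True if pattern occurs in seq in order (not necessarily contiguously)."""
    it = iter(seq)
    return all(ch in it for ch in pattern)  # 'in' advances the iterator

def support(sequences, pattern):
    """Fraction of sequences containing the pattern."""
    return sum(contains(s, pattern) for s in sequences) / len(sequences)

print(support(sequences, "abcd"))  # 1.0 – occurs in all three sequences
```

A real sequential pattern miner would enumerate candidate patterns and keep those whose support exceeds a threshold; this sketch only scores a given candidate.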

6.2.4 Limitations of Computational Ethnography

  • Doesn’t explain behavior, so mixing it with qualitative approaches like interviewing, contextual inquiry, and ethnographical observation is encouraged

  • Not always complete: computer-based capture might miss hallway and bedside conversations

  • Data may come from various sources making it difficult to synchronize and perform integrated analysis

  • Data used may be collected for a different purpose making it harder to analyze.

6.3 Case Studies

6.3.1 Understanding Clinicians’ Navigation Behavior in EHRs

  • Study of how ambulatory (outpatient) primary care physicians interacted with an EHR

  • Reengineered for real-time capture of UI interactions: mouse clicks and keystrokes

  • Also used audit trails recording EHR retrieval, creation, and modification

  • 17 Major EHR features provided by system for clinicians to perform documentation or chart viewing tasks

  • Event sequences constructed based on how features were sequentially accessed, which may show how actions were carried out

  • HMXAD would mean history of present illness, medication, physical examination, assessment & plan, diagnosis

  • ADAD and DADA most common.

    • Suggests that when physicians use the EHR, they access assessment & plan and diagnosis together

    • Medication and Order also go together

  • Transitions with probability > 0.5 were highlighted (bolded) in the study’s transition diagram

  • Shows physicians navigate chronologically

  • May lead to better understanding of cognitive process

  • May improve EHR usability by reducing mouse clicks for frequent task transitions

  • Ex. History of Present Illness was the most frequently used feature after login, so it should have a prominent place. Features that sequentially occur together should be placed beside each other on screen.
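The transition analysis in this case study can be sketched as follows. The letters follow the study's notation above (A = assessment & plan, D = diagnosis, M = medication), but the sequences themselves are invented for illustration, not the study's actual data.

```python
from collections import Counter, defaultdict

# Hypothetical feature-access sequences in the study's letter notation.
sequences = ["ADAD", "DADA", "AMAD"]

# Count observed transitions between consecutive feature accesses.
pair_counts = defaultdict(Counter)
for seq in sequences:
    for src, dst in zip(seq, seq[1:]):
        pair_counts[src][dst] += 1

# Normalize to transition probabilities P(next feature | current feature),
# the quantities a Markov chain analysis works with.
probs = {
    src: {dst: n / sum(c.values()) for dst, n in c.items()}
    for src, c in pair_counts.items()
}
print(probs["A"])  # {'D': 0.8, 'M': 0.2}
```

High-probability transitions (e.g. A to D here) are the candidates for placing features beside each other on screen.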

6.3.2 Analyzing the Dynamics of Provider-Patient Interactions

  • Study examining communication among a limited English proficiency patient, a physician, and an interpreter.

  • Captured multiple data streams to examine physician, patient, and interpreter interactions.

  • Includes speech, eye contact, gestures, and body orientation

  • Used two Kinects and ChronoViz to analyze the simultaneous body actions, voice, and gaze of multiple participants.

  • Physician and patient sit side by side with the EHR in front of the physician; the patient sits on the edge of an exam table; the interpreter sits six feet in front of the patient.

  • 12 encounters captured, two were analyzed in-depth because of complexity.

  • One with an English-fluent patient, the other with an LEP (limited English proficiency) patient

  • Interpreter functioned as a middleman, speaking after the physician and patient, except when the interpreter couldn’t translate because of objects in use like paper or the EHR

  • Both sessions showed different gestures:

  • Deictic (pointing at the EHR or paper), iconic (hand showing the shape of a cyst), and beat gestures (hand palm up)

  • With no interpreter, the physician used the gesture types about equally; the patient used more beat gestures

  • With an interpreter, the physician used iconic gestures more often, while gestures between the interpreter and patient were exclusively iconic (used to describe the size, shape, and location of an injury)

  • Someone was always in the dark: either the doctor waiting to hear what the patient said, or the patient waiting to hear what the doctor said.

  • No way to tell if interpreter was accurate

  • The patient or physician might be looking at the interpreter while the other was talking, missing important cues

  • Even more complicated when the interpreter was left out by not being able to see the EHR or other objects such as paper.

  • Neither the interpreter nor the patient could see the display, since no pointing at the EHR by them was observed.

  • Suggests the EHR is less helpful for LEP patients

6.4 Conclusions
