User Engagement Scale (UES)
Qualitative/Quantitative:
The assessment instrument uses quantitative and/or qualitative data
- Quantitative
Type of Instrument:
The type of the assessment instrument
- Survey
Number of Items:
Number of items in the assessment instrument
30
Subscale Information:
Names of each of the subscales and the number of items for each of the subscales
- Focused Attention (8)
- Perceived Usability (7)
- Aesthetic Elements (5)
- Reward Factor (10)
Language Availability:
Language(s) in which the assessment instrument is available
- English
Brief Description:
Brief summary description of the assessment instrument
The User Engagement Scale (UES) is an assessment instrument designed to measure user engagement in human-computer interaction contexts. It consists of a set of items that capture various dimensions of engagement, such as focused attention, perceived usability, aesthetic appeal, endurability, novelty, and felt involvement.
Website:
Website providing access to and/or describing the assessment instrument
Not Found
Instrument and/or related documentation:
Citing Literature - Development/Original:
Reference for publication describing the development of the assessment instrument
O'Brien, H. L., & Toms, E. G. (2010). The development and evaluation of a survey to measure user engagement. Journal of the American Society for Information Science and Technology, 61(1), 50-69. https://doi.org/10.1002/asi.21229
O'Brien, H. L., Cairns, P., & Hall, M. (2018). A practical approach to measuring user engagement with the refined user engagement scale (UES) and new UES short form. International Journal of Human-Computer Studies, 112, 28-39. https://doi.org/10.1016/j.ijhcs.2018.01.004
Citing Literature - Empirical Use/Application:
Reference for publications on the application of the assessment instrument
Benz, C., Riefle, L., & Satzger, G. (2024). User Engagement and Beyond: A Conceptual Framework for Engagement in Information Systems Research. Communications of the Association for Information Systems, 54, 331-359. https://doi.org/10.17705/1CAIS.05412
Version:
Number/name of the most recent version of the assessment instrument
Long form (UES-LF)
Related Instruments:
Not Found
Implementation Science Considerations
- Blueprint for Dissemination
- Choosing Wisely Deimplementation Framework
- Community Based Participatory Research (CBPR)
- Consolidated Framework for Implementation Research
- EQ-DI Framework
- Knowledge Transfer and Exchange
- Pronovost's 4E's Process Theory
- Real-World Dissemination
- Transcreation Framework for Community-engaged Behavioral Interventions to Reduce Health Disparities
- conNECT Framework
- Use of evaluative and iterative strategies
Constructs Assessed:
Constructs assessed by the assessment instrument (linked to constructs included in the D&I models webtool)
Theories, Models, Frameworks Relevant:
Implementation Outcomes:
The relevance of the assessment instrument to various implementation outcomes
Not Found
Implementation Strategies:
The implementation strategy or strategies evaluated by the assessment instrument
Phase of Implementation Process:
Phase of implementation process when the assessment instrument can be used
Not Found
Intended Focus
- Individual (Patient, Community Member)
- Community Members/Patients
- Clinical Outpatient
- Clinical Inpatient
- Residential Care
- Community Organization
- Public Health Agency
- School
- Workplace
Levels of Data Collection:
The level(s) from which the assessment instrument collects data
Intended Priority Population:
Intended priority population from whom data are collected using the assessment instrument
Intended Priority Setting:
Intended priority setting in which the assessment instrument is used
Policy:
Whether the assessment instrument is relevant to policy
Not Found
Equity Focus:
Not Found
Psychometric Properties
- Unspecified Validity
- Unspecified Reliability
Scoring:
The assessment instrument produces a composite score (see the scoring sketch following this section)
Yes
Norms:
Measures of central tendency and distribution for the total score are based on a small, medium, or large sample size
Not Found
Responsiveness:
The ability of the assessment instrument to detect change over time (i.e., sensitivity to change or intervention effects)
Not Found
Validity:
The extent to which an instrument measures what it is intended to measure accurately
Reliability:
The extent to which results are consistent over time, across raters, across settings, or across items intended to measure the same thing
Factor Analysis:
A statistical method that uses the correlation between observed variables to identify common factors
Yes
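The record indicates that the UES-LF yields a composite score from 30 items across four subscales (Focused Attention, 8 items; Perceived Usability, 7; Aesthetic Elements, 5; Reward Factor, 10). The sketch below is a minimal illustration only, not the instrument's official scoring procedure: it assumes 5-point Likert items, subscale scores computed as item means, and a composite taken as the mean of the four subscale scores. The item column names and example data are hypothetical; consult O'Brien et al. (2018) for the published scoring guidance.

```python
# Minimal sketch (not the official scoring procedure): compute hypothetical
# UES-LF subscale scores and a composite, assuming 5-point Likert items
# averaged within each subscale and a composite equal to the mean of the
# four subscale means. Item names below are invented for illustration.
import pandas as pd

# Item counts taken from the record: FA=8, PU=7, AE=5, RW=10 (30 total)
SUBSCALES = {
    "focused_attention": [f"FA{i}" for i in range(1, 9)],
    "perceived_usability": [f"PU{i}" for i in range(1, 8)],
    "aesthetic_elements": [f"AE{i}" for i in range(1, 6)],
    "reward_factor": [f"RW{i}" for i in range(1, 11)],
}

def score_ues(responses: pd.DataFrame) -> pd.DataFrame:
    """Return per-respondent subscale means and an overall composite.

    `responses` holds one row per respondent and one column per item,
    with values on a 1-5 Likert scale (an assumption, not a requirement
    stated in the record).
    """
    scores = pd.DataFrame(index=responses.index)
    for name, items in SUBSCALES.items():
        scores[name] = responses[items].mean(axis=1)  # mean of the subscale's items
    scores["ues_composite"] = scores[list(SUBSCALES)].mean(axis=1)  # mean of subscale means
    return scores

if __name__ == "__main__":
    import numpy as np
    rng = np.random.default_rng(0)
    all_items = [item for items in SUBSCALES.values() for item in items]
    fake = pd.DataFrame(rng.integers(1, 6, size=(3, len(all_items))), columns=all_items)
    print(score_ues(fake).round(2))
```

Some applications report subscale scores only, or average all 30 items directly; whichever convention is chosen should follow the published scoring guidance rather than this sketch.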
Pragmatic Properties
- Guidance to Administer
- Guidance to Analyze
- Guidance to Interpret
- Medium: Asynchronous collection of data
Time to Administer:
The amount of time required to complete the assessment instrument
Not Found
Secondary Data:
Not Found
Cost:
Cost associated with access to the assessment instrument (some instruments might require a login)
Not Found
Literacy:
Readability of the items is reported
Not Found
Interpretation:
Expertise needed for interpretation of data is reported
Not Found
Training:
Expertise needed to use the assessment instrument is reported
Not Found
Resources Required to Administer:
Resources needed to administer the assessment instrument (FTE for data collector, equipment, etc.)
None/Low
User Guidance:
Guides are provided to support administration of the assessment instrument/data collection, and/or analysis of data from the assessment instrument, and/or interpretation of data, and/or action/decision on how to use data
Obtrusiveness:
Degree of intrusion the participants will experience because of the data collection when using the assessment instrument (e.g., assessment instruments that rely on use of secondary data or automated data will be less obtrusive)
Interactivity:
Data collection and/or result generation involves interactive components
Not Found