UX Design Research Guide
A strategic framework with 20 methods.

Why Conduct UX Design Research?
UX (user experience) design research is the systematic study of target users and their needs. Each method takes a different approach and produces different data and insights. UX researchers must be capable of performing a diverse set of methods to strategically discover problems and opportunities throughout the design process.
Conducting UX Design Research early in the design process and integrating its insights ensures time and resources are used efficiently. Continuing to perform UX Design Research throughout the design process increases the value of the final design, so the return on investment (ROI) of UX Design Research is substantial.
Overall, UX Design Research ensures designs are…
1. effective at meeting the needs of users.
2. ethically sound.
3. delightful for users.
4. efficient for users.
5. optimized for profitability.
Strategic Framework
While each method is unique and produces different insights, a strategic framework should always be followed to ensure the process is ethical and understood by all stakeholders. As with all research on human subjects, it is important that the research is conducted with integrity.
Step 1: Research Matrix
Creating a research matrix helps researchers determine which research method to use and why.¹
Step 2: Research Protocol
Communicating a research protocol ensures the research process is transparent and thorough for all stakeholders.²
Step 3: Performing Research
Collecting qualitative and quantitative data is the priority at this stage, and it is important that researchers are prepared to record anything, even the unexpected.³
Step 4: Data Analysis
Analyzing the collected data yields useful insights that can be clearly articulated to everyone on the design team.⁴
Step 5: Deliverables
At the end of the process, useful deliverables (such as user journey maps or use case diagrams) are provided to stakeholders.
20 Methods of UX Design Research
Each research method can be characterized by whether it produces qualitative or quantitative data, whether it studies participants' attitudes or behaviors, and whether participants' interaction with the design is natural or scripted.
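These three dimensions can be modeled as a simple data structure, which makes it easy to filter a method library by the kind of insight needed. Below is a minimal Python sketch; the `Method` class, the example entries, and the `find` helper are purely illustrative, not part of any standard tool:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Method:
    """One UX research method, classified along the three dimensions."""
    name: str
    data: frozenset         # subset of {"qualitative", "quantitative"}
    studies: frozenset      # subset of {"attitudes", "behaviors"}
    interaction: frozenset  # subset of {"natural", "scripted", "none"}

# A few entries from the method library below, encoded for filtering.
LIBRARY = [
    Method("A/B Testing", frozenset({"quantitative"}),
           frozenset({"behaviors"}), frozenset({"natural"})),
    Method("Interviews", frozenset({"qualitative"}),
           frozenset({"attitudes"}), frozenset({"none"})),
    Method("Eyetracking", frozenset({"quantitative", "qualitative"}),
           frozenset({"behaviors"}), frozenset({"natural", "scripted"})),
]

def find(data=None, studies=None, interaction=None):
    """Return methods matching every dimension the caller specified."""
    return [m for m in LIBRARY
            if (data is None or data in m.data)
            and (studies is None or studies in m.studies)
            and (interaction is None or interaction in m.interaction)]

# Which methods yield quantitative data about user behavior?
print([m.name for m in find(data="quantitative", studies="behaviors")])
```

A research team could extend such a library with all twenty methods and query it when filling out a research matrix.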
- A/B Testing
Testing designs by randomly assigning users to different variants of a design and measuring the differences in their behavior.
Quantitative Data | Behaviors of Users | Natural Interactions
- Card Sorting
Analyzing how users organize individual labels written on notecards, either according to criteria that make sense to them or into groups assigned by researchers.
Quantitative & Qualitative Data | Attitudes of Users | No Interactions
- Clickstream Analysis
Collecting, analyzing, and reporting aggregate data about which pages a visitor views on a site or digital product, and in what order.
Quantitative Data | Behaviors of Users | Natural Interactions
- Concept Testing
Gathering user feedback on a developed idea or prototype.
Quantitative & Qualitative Data | Attitudes of Users | Natural & Scripted Interactions
- Customer Feedback
Feedback from a sample of users who volunteer, typically through a survey link, to answer open-ended and/or close-ended questions.
Quantitative & Qualitative Data | Attitudes of Users | Natural Interactions
- Desirability Studies
Participants select among different qualities of a prototype or potential aspects of a design, revealing user preferences.
Quantitative & Qualitative Data | Attitudes of Users | Natural & Scripted Interactions
- Diary/Camera Studies
Participants are given a mechanism (a diary or camera) to record and/or describe aspects of their lives relevant to a design over a longitudinal period.
Quantitative & Qualitative Data | Attitudes of Users | Natural Interactions
- Email Surveys
Participants receive a survey by email that records responses to written questions.
Quantitative & Qualitative Data | Attitudes of Users | No Interactions
- Ethnographic Field Studies
Researchers meet with participants in their natural environment, where participants encounter the design being studied.
Qualitative Data | Attitudes & Behaviors of Users | Natural Interactions
- Eyetracking
Using an eye-tracking device to measure where participants look as they perform tasks or interact with a design.
Quantitative & Qualitative Data | Behaviors of Users | Natural & Scripted Interactions
- Focus Groups
Small groups of participants are guided through a discussion about a set of topics, providing verbal and written feedback through discussion and/or exercises.
Qualitative Data | Attitudes of Users | Natural & Scripted Interactions
- Intercept Surveys
Participants respond to a survey that is triggered during the use of a design.
Quantitative Data | Attitudes of Users | Natural Interactions
- Interviews
Participants meet one-on-one with researchers to discuss in depth what they think about aspects of a design.
Qualitative Data | Attitudes of Users | No Interactions
- Moderated Remote Usability Studies
Participants use tools such as screen-sharing software or a camera while a researcher remotely observes their use of a design.
Quantitative & Qualitative Data | Behaviors of Users | Scripted Interactions
- Participatory Design
Participants are given design elements or creative materials to construct their ideal experience, expressing what matters most to them.
Qualitative Data | Attitudes of Users | Natural & Scripted Interactions
- True-Intent Studies
A type of intercept survey that targets live visitors with questions once they finish using a design, to learn who is using the design and whether it was successful for them.
Quantitative Data | Behaviors of Users | Natural Interactions
- Unmoderated Remote Panel Studies
A panel of trained participants has video-recording and data-collection software installed on their personal devices to record their use of a design while they think aloud.
Quantitative & Qualitative Data | Behaviors of Users | Scripted Interactions
- Unmoderated UX Studies
A software application provides instructions to participants, records their actions, and asks predetermined follow-up questions.
Quantitative & Qualitative Data | Behaviors of Users | Scripted Interactions
- Usability Benchmarking
Evaluating a design's user experience with metrics that compare its performance to a meaningful standard.
Quantitative & Qualitative Data | Behaviors of Users | Natural & Scripted Interactions
- Usability-Lab Studies
Researchers meet one-on-one with participants to study how they perform tasks with a design.
Qualitative Data | Behaviors of Users | Scripted Interactions
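For the quantitative methods above, the analysis step usually means comparing metrics between groups. As a concrete example, the measurement side of an A/B test can be sketched with a standard two-proportion z-test; the conversion numbers below are invented for illustration:

```python
from math import sqrt

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-score comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: variant B converts 120/1000 users vs A's 100/1000.
z = ab_test_z(100, 1000, 120, 1000)
print(round(z, 2))  # |z| > 1.96 would be significant at the 5% level
```

Here the z-score is about 1.43, so this hypothetical difference would not be statistically significant; the same comparison logic applies to clickstream metrics or true-intent survey counts.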
7 Key Takeaways
- Researchers need to understand why they are conducting research in the first place, both for the design's users and for the company producing the design.
- Have a clear process for deciding which research methods to use and when to use them.
- Articulate a clear and transparent protocol for conducting research prior to conducting it so all stakeholders are aware and informed.
- Gather rich data in a sensitive manner.
- Present data analysis and insights so they can be understood by anyone and everyone.
- Build a library of research methods.
- Understand each research method's purpose: whether it collects qualitative or quantitative data, whether it studies attitudes or behaviors of participants, and whether interactions are scripted or natural.

Citations
1. Choguill, Charles L. “The Research Design Matrix: A Tool for Development Planning Research Studies.” Habitat International, vol. 29, no. 4 (2005): 615–626. https://www.sciencedirect.com/science/article/pii/S019739750500038X.
2. Cameli, Matteo, et al. “How to Write a Research Protocol: Tips and Tricks.” Journal of Cardiovascular Echography, vol. 28, no. 3 (2018): 151–153. doi:10.4103/jcecho.jcecho_41_18.
3. LeCompte, Margaret D., and Judith Preissle Goetz. “Ethnographic Data Collection in Evaluation Research.” Educational Evaluation and Policy Analysis, vol. 4, no. 3 (1982): 387–400. doi:10.3102/01623737004003387.
4. Perer, Adam, and Ben Shneiderman. “Integrating Statistics and Visualization: Case Studies of Gaining Clarity during Exploratory Data Analysis.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2008.