Discovery for assessment authoring app at Pearson

During my first four months at Pearson I worked on discovery tasks for the software I was supporting: a web-based application used to author assessments for learners. If you’re a teacher who wants to create a quiz or test for students, you might use this application.

My manager and I collaboratively determined the discovery tasks that would support our internal goals, which were, broadly, to improve the user experience of the application.

I completed these four discovery tasks:

  1. I defined the key elements of the application.

  2. I reviewed competitor products.

  3. I performed stakeholder interviews and analyzed the data.

  4. I conducted user interviews, administered surveys and analyzed the data.

Define the key elements of the application

What: I created a Nested Object Model to map the key elements of the assessment application. The model is part of the Object-Oriented UX framework, created by Sophia Prater. Read more about it at ooux.com.

Nested Object Model of the assessment authoring application

Why: I wanted to understand what’s important to the user and the business (i.e., the objects, relationships and attributes) so I would be able to better support the application.

How: I dug through every part of the application and talked to subject matter experts to understand the most important objects, relationships and attributes. I gathered it all into a spreadsheet.
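The structure captured in that spreadsheet, with each object owning its attributes and its nested (related) objects, can be sketched as data. A minimal illustration in Python; the object and attribute names below are invented examples, not the actual model:

```python
# A minimal sketch of a Nested Object Model as a data structure:
# each object lists its attributes and the objects nested under it.
# All names here are hypothetical examples, not the real model.
object_model = {
    "Assessment": {
        "attributes": ["title", "grade level", "status"],
        "nested_objects": ["Question", "Author"],
    },
    "Question": {
        "attributes": ["prompt", "question type", "points"],
        "nested_objects": ["Answer Choice"],
    },
    "Answer Choice": {
        "attributes": ["text", "is correct"],
        "nested_objects": [],
    },
}

def related_objects(model, name):
    """Return the objects nested under the named object."""
    return model[name]["nested_objects"]

print(related_objects(object_model, "Assessment"))
```

Keeping the model in a structure like this (or a spreadsheet with the same shape) makes it easy to answer questions such as "what does an Assessment contain?" when discussing the application with the team.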

Outcome:

  • The team is aligned on terminology so we can communicate about the application effectively.

  • As a new employee, this task helped me understand the most valuable parts of the application, and future new employees can use the model as a reference as well.

  • When I started supporting the app from a user experience and user interface perspective I was well-versed in the data structure, which meant I had already grasped some of the complexities of the application and was better prepared to support it.

Review competitor products

What: I reviewed the user interfaces and experiences of various competitors, as well as products with similar functionality that many of our users might have experience with, for instance other assessment applications and even Google Forms. This is an ongoing project to be continued as the need arises.

Why: To improve user experience we need to be aware of how the digital products our users frequent are working. As Jakob Nielsen says, “Users spend most of their time on other sites. This means that users prefer your site to work the same way as all the other sites they already know.”

How: I signed up for a number of trials and looked at help documentation. I focused on areas most relevant to the application I would be supporting, for instance the dashboard, table of contents, assessment creation process, question types and collaboration features.

Outcome: I became familiar with how other organizations were structuring similar digital tasks. It gave me ideas about how we could improve our user experience. I captured the information that I thought would be most relevant to my work to refer to as needed.

Stakeholder interviews

What: We identified stakeholders, individuals from specific parts of the business who oversee everything related to the tool, and set up interviews with five of them.

Why:

  • To hear stakeholder perspectives about the application we support 

    • Understand their thoughts about business objectives 

    • Understand their thoughts about who our users are and how we’re serving them 

    • Not to solicit a wish list of solutions and features 

  • Establish relationships with stakeholders to facilitate buy-in and support 

  • Identify potential users to interview 

How: Neither my team nor I had conducted stakeholder interviews before, so I researched best practices and recommended a process. In collaboration with my team we finalized the interview format and questions. A colleague and I scheduled and conducted the interviews. Finally, I performed a thematic analysis to group our findings into themes and created a presentation to communicate them.

The artifacts I created for the stakeholder interviews include a thematic analysis, a contact list and a process document.

Outcome:

  • We learned stakeholder perspectives about their lines of business and their perception of who the users are. From this I began to document our user groups, something that had not previously been done.

  • We learned about the digital and print publishing process for each line of business, which has implications for the applications we support.

  • We established relationships with stakeholders who agreed to future contact including surveys, further interviews and reviews of wireframes and prototypes.

  • We put together a list of 12 users to contact for user surveys and interviews.

  • I documented our process, questions, stakeholder and user contacts and all communications necessary to schedule interviews and follow-up after interviews. This work will inform our process for future stakeholder interviews.

User interviews

What: We interviewed and collected survey data from users to understand how they use the application, what they like about it and what might be slowing down their workflows.

Why: Mapping to our internal goals, we want to identify and address process slowdowns in order to improve the user experience.

How: We identified users of the application through our stakeholder interview process. We decided on a two-pronged strategy to get the most feedback while still being able to manage the process with a small team. I contacted 10 users to ask them either to speak with us or to fill out a survey; the questions for each were the same. Later, we cast a wider net thanks to a leader in the organization who contacted more users for us.

Thematic analysis that I created to help analyze our user research study. Essentially, post-it notes on a digital whiteboard.


After receiving the survey data and conducting the interviews I performed a thematic analysis (above) to compile the findings. Finally, I presented the findings to the appropriate team.
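Mechanically, a thematic analysis amounts to tagging each research note with a theme and grouping the notes by theme, much like clustering post-it notes on a whiteboard. A minimal sketch in Python; the notes and theme labels below are invented for illustration, not actual study findings:

```python
from collections import defaultdict

# A minimal sketch of thematic grouping: each research note is tagged
# with a theme, then notes are grouped and ranked by theme.
# The notes and themes below are invented for illustration.
notes = [
    ("Saving a draft takes too many clicks", "workflow speed"),
    ("Unclear which question types support images", "documentation"),
    ("Reordering questions is tedious", "workflow speed"),
    ("Hard to find collaboration settings", "navigation"),
]

themes = defaultdict(list)
for note, theme in notes:
    themes[theme].append(note)

# Themes with the most supporting notes surface first.
for theme, grouped in sorted(themes.items(), key=lambda t: -len(t[1])):
    print(f"{theme} ({len(grouped)} notes)")
```

Counting how many notes support each theme is one simple way to decide which findings to foreground in a presentation.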

Outcome: Ahead of a stronger focus on the user experience of the assessment application, we now have several areas to tackle that both users and stakeholders agree are problem areas. Essentially, we have a research-backed direction for our upcoming work.

I also continued compiling findings about user groups that will be useful to our entire team.



Next

Case study: article & topic page templates