When I arrived at the point in my MSIS program at which I needed to put together my Professional Experience Project (PEP), I knew that I wanted to do research and usability testing. Luckily, there was a company in town looking for someone with just my skills and interests. I joined the Harvest team in April 2015.
My first task was to complete a competitive analysis of 6 delivery services and communicate my findings in a slide deck. Beyond a few initial questions that the Harvest leaders wanted me to research, the assignment was quite open-ended, leaving me free to determine which aspects of these delivery services were most informative for our purposes.
This freedom required me to formulate my own strategy. I began with the goal of learning as much as I possibly could about each of the 6 delivery services, downloading each of their apps and grabbing screen captures to illustrate every part of the food delivery task flow — and there is a highly defined flow, unvarying from one service to another. That is:
- Enter address (either through location services or by entering it manually)
- Browse restaurant selections
- Select restaurant; browse menu. Rinse and repeat until —
- Add menu selections to cart
- Enter payment information
- Place order
I knew that it was important to closely analyze each of the services, despite my previous experiences as a user with some of them, because my personal experiences and perceptions would be no substitute for thorough analysis and user feedback. My primary focus was ultimately how the apps supported the users' attempts to make decisions about their purchases, answering questions such as:
- Does the app have search capabilities? How thorough is the search? Can users search only by restaurant name, or can they search by type of food or specific menu item?
- How browsable is the app? Are there pictures to attract the eye? How are lists of restaurants ordered?
- How much detail is available about each restaurant? Is it easy to find restaurants that offer vegan or gluten-free fare?
- When are users prompted to create an account? What payment options are available to users?
After I completed the competitive analysis and presented my primary findings in a slide deck, the development team made alterations to the Harvest website in preparation for usability testing.
I tested the website with 11 professionals from Austin tech companies. Each participant was tested individually. Sessions began with a scenario: "You get home from work, exhausted, at 9 p.m. There's nothing in the fridge — you didn't have time to go shopping last week. What do you do for dinner?" After that, I showed the Harvest website in its current form to the participant and asked him or her to go through the process of placing an order for delivery without actually placing the order. I encouraged users to be as brutally honest as they liked while critiquing any aspect of the website. The last step was administering a demographic questionnaire to learn about the participant's household and food ordering habits.
Throughout this process, I scribbled in my yellow legal pad, ultimately ending up with 40+ pages of notes. Making these notes useful (as in, not a scrambled mess) required additional work. I did some initial coding to collect general themes, such as references to money/pricing, feature requests, and negative comments. I then grouped similar comments within these themes to determine how many test participants had similar responses to the website. Knowing how frequently a comment or feature request was made helped me prioritize the findings for my final recommendations.
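The grouping-and-counting step amounts to a frequency tally of coded comments per theme. The analysis itself was done by hand on paper; the sketch below is only a hypothetical illustration of the logic, with invented themes and comments:

```python
from collections import Counter, defaultdict

# Hypothetical coded notes as (theme, comment) pairs, as they might
# have been transcribed from the legal pad. All entries are invented
# for illustration.
coded_notes = [
    ("pricing", "delivery fee unclear"),
    ("feature_request", "search by cuisine"),
    ("negative", "checkout button hard to find"),
    ("pricing", "delivery fee unclear"),
    ("feature_request", "search by cuisine"),
    ("feature_request", "save favorite restaurants"),
]

# Group similar comments within each theme and count how many times
# each one was raised.
by_theme = defaultdict(Counter)
for theme, comment in coded_notes:
    by_theme[theme][comment] += 1

# Rank comments by frequency within each theme to prioritize findings.
for theme, counts in by_theme.items():
    for comment, n in counts.most_common():
        print(f"{theme}: {comment} ({n} mentions)")
```

Sorting each theme's comments by mention count is what turned 40+ pages of raw notes into a ranked list of findings.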
I presented my findings in two stages: first, "quick fixes" that wouldn't require building out new features but would improve the user experience before the Harvest website rolled out in beta, and second, "feature requests" that would require more time to implement but that several users wanted to see.
The research, planning, and analysis that went into this project required that I make several independent decisions regarding my methodology. In the process, I applied lessons from many of my classes, including: Usability, Information Architecture & Design, Human-Computer Interaction, Understanding Research, and Information Resources in Business.
One of my most valuable takeaways was the interviewing skill that I developed throughout the usability testing sessions. My participants included people who were familiar with usability testing and people who were new to it. Some freely shared their opinions about every element of the webpage, telling stories and using examples from past experiences. Others needed more prompting to open up about their ideas.
Overall, I gained some fantastic real-world experience through this project that built on my coursework.