Elizabeth Ferron:

UX Research Guru & Usability Testing Maven

Who I am

I'm a user experience researcher. I'm good at what I do because I love learning about people, diving into data and feedback, and discovering themes that influence design decisions. Bringing together diverse, conflicting perspectives and deciding how to move forward with them is what I do best.

I have experience working with design agencies, Fortune 500 companies, tech startups, libraries, and archives. My work and my education at UT Austin's School of Information have allowed me to explore many different fields and have prepared me to fit happily into a variety of work environments.

In my spare time, I'm often enjoying this era of Peak TV (too many favorites to list), reading thinkpieces on the state of American culture and society (among other things), or checking out one of Austin's many fine eating establishments (preferably within walking distance).

What I do

I am currently doing user experience research for HCL DNA - Digital & Analytics in a client-facing role. I work with my primary client, USAA, to assist their Innovation team with ethnographic research, user research, concept testing, and usability testing.

Previously, I was the UX Researcher at Design for Use, an Austin UX agency. Clients that I worked with in this role included GSD&M, the U.S. Air Force, Builder Homesite Inc., and Wellsmith. DFU ran an agile workflow with an initial discovery sprint followed by design sprints.

Here's a walkthrough of what these sprints looked like for me:

  1. Discovery sprints begin with stakeholder interviews, during which the team gathers information about product vision, project goals, users, and competitors.
  2. During the stakeholder interviews, we get access to internal documents, info from the client's previous research, and any other relevant materials. Here's where I learn as much as possible about the client, their users, and their users' needs in as little time as possible.
  3. Sometimes clients already have user personas documented from previous research efforts. If they don't, here's where we buckle down and begin the process of creating user personas.
  4. Next up is a user stories workshop during which the DFU team works with stakeholders to write down what we understand user needs to be, based on user personas. (This part involves lots of sticky notes and, preferably, a huge whiteboard to stick them on.)
  5. A stakeholder interview summary, competitive analysis, usability audit, content audit, and/or user flows follow as needed.
  6. After design sprints kick off, each sprint involves at least one day (though often more) of usability testing. I work with clients to determine recruiting needs, write screeners, recruit participants for the studies, and schedule those participants.
  7. Depending on the project, client needs, and resources, I conduct usability testing sessions either remotely or in-person, moderated or unmoderated. Tools used for this include: GoToMeeting (for all moderated testing sessions to allow the designers and clients to observe), UserTesting (for unmoderated testing sessions), and Lookback.io (for mobile testing).
  8. Throughout testing days, I collaborate with DFU designers and strategists to make revisions to the testing protocol and prototype as needed. After the sessions wrap, I immediately share out top level findings within the design team so our designers can get to work.
  9. I then create a slide deck and present a summary of findings to the client to get their feedback on those findings and any resulting design modifications.
  10. The next sprint gets underway, and steps 6 through 9 repeat until the project is complete!

Each project, with both HCL DNA and Design for Use, is an exercise in becoming deeply familiar with a previously unknown (or shallowly known) subject in under a week. This is what I love the most about user research: diving into the unfamiliar and emerging a semi-expert.

What I've done

For more details on other work that I've done, here's a sampling of my projects from the past two years. First up is my work doing competitor & user research with Harvest, an Austin startup; my other projects in design, information architecture, and usability testing follow below. Enjoy!

 
Harvest: Competitor & User Research
The Project

My work for Harvest focused on competitive analysis and usability testing. I performed a thorough analysis of 6 delivery services to compare their designs, labeling, and features. I then carried out a usability testing plan for the in-development Harvest website to determine what worked, what didn't, and what new features users wanted Harvest to develop.

A little background on Harvest: They're a food delivery service aggregator operating in Austin, TX. Their aim is to provide users with a single website (or eventually mobile app) to visit when placing orders for delivery. They combine the restaurant options of 11 delivery services to provide the greatest overall reach for delivery options in the Austin area.

The Story

When I arrived at the point in my MSIS program at which I needed to put together my Professional Experience Project (PEP), I knew that I wanted to do research and usability testing. Luckily, there was a company in town looking for someone with just my skills and interests. I joined the Harvest team in April 2015.

My first task was to complete a competitive analysis of 6 delivery services and communicate my findings in a slide deck. Beyond a few initial questions that the Harvest leaders wanted me to research, the assignment was quite open-ended, leaving me free to determine which aspects of these delivery services were most informative for our purposes.

This freedom required me to formulate my own strategy. I began with the goal of learning as much as I possibly could about each of the 6 delivery services, downloading each of their apps and grabbing screen captures to illustrate every part of the food delivery task flow — and there is a highly defined flow, unvarying from one service to another. That is:

  1. Enter address (either through location services or by manually entering)
  2. Browse restaurant selections
  3. Select restaurant; browse menu. Rinse, repeat until —
  4. Add menu selections to cart
  5. Enter payment information
  6. Place order

Essentially.

I knew that it was important to closely analyze each of the services, despite my previous experiences as a user with some of them, because my personal experiences and perceptions would be no substitute for thorough analysis and user feedback. My primary focus was ultimately how the apps supported the users' attempts to make decisions about their purchases, answering questions such as:

  • Does the app have search capabilities? How thorough is the search? Can users search only by restaurant name, or can they search by type of food or specific menu item?
  • How browsable is the app? Are there pictures to attract the eye? How are lists of restaurants ordered?
  • How much detail is available about each restaurant? Is it easy to find restaurants that offer vegan or gluten-free fare?
  • When are users prompted to create an account? What payment options are available to users?

After I completed the competitive analysis and presented my primary findings in a slide deck, the development team made alterations to the Harvest website in preparation for usability testing.

I tested the website with 11 professionals from Austin tech companies. Each participant was tested individually. Sessions began with a scenario: You get home from work, exhausted, at 9 pm. There's nothing in the fridge — you didn't have time to go shopping last week. What do you do for dinner? After that, I showed the Harvest website in its current form to the participant and asked him or her to go through the process of placing an order for delivery without actually placing the order. I encouraged users to be as brutally honest as they liked while critiquing any aspect of the website. The last step was administering a demographic questionnaire to learn about the participant's household and food ordering habits.

Throughout this process, I scribbled into my yellow legal pad, ultimately ending up with 40+ pages of notes. Making these notes useful (as in, not a scrambled mess) required additional work. I did some initial coding to collect general themes, such as: references to money/pricing, feature requests, and negative comments. I then grouped similar comments within these themes to determine how many test participants had similar responses to the website. Knowing how frequently a comment or feature request was made helped me prioritize the findings for my final recommendations.
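For the curious, the prioritization step boils down to simple frequency counting once the coding is done. Here's a minimal sketch in Python of that tallying, with made-up theme names and data standing in for my actual notes:

```python
from collections import Counter, defaultdict

# Hypothetical coded comments as (participant, theme) pairs -- in reality
# these came from hand-coding 40+ pages of handwritten notes.
coded_comments = [
    (1, "pricing"), (1, "feature request"), (2, "pricing"),
    (3, "negative comment"), (3, "pricing"), (4, "feature request"),
]

# Total mentions per theme.
theme_counts = Counter(theme for _, theme in coded_comments)

# Distinct participants per theme: the number that actually drives
# prioritization, since one vocal tester shouldn't outweigh a concern
# shared across the group.
participants_per_theme = defaultdict(set)
for participant, theme in coded_comments:
    participants_per_theme[theme].add(participant)

for theme, mentions in theme_counts.most_common():
    reach = len(participants_per_theme[theme])
    print(f"{theme}: {mentions} mentions from {reach} of 11 participants")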

I presented my findings in two stages: first, "quick fixes" that wouldn't require building out new features but would improve the user experience before the Harvest website rolled out in beta, and second, "feature requests" that would require more time to implement but that several users wanted to see.

My Thoughts

The research, planning, and analysis that went into this project required that I make several independent decisions regarding my methodology. In the process, I applied lessons from many of my classes, including: Usability, Information Architecture & Design, Human-Computer Interaction, Understanding Research, and Information Resources in Business.

Among my most valuable takeaways were the interviewing skills that I developed throughout the usability testing sessions. My sessions included individuals who were familiar with usability testing and others who were entirely new to it. Some freely shared their opinions about every element of the webpage, telling stories and using examples from past experiences. Others needed more prompting to open up about their ideas.

Overall, I gained some fantastic real-world experience through this project that built on my coursework.

Yan Engines: Business Research

In the first semester of my graduate program, I found myself thrown into the deep end. That is, I found myself gathering market research for a real company with real research needs. That company was Yan Engines, developer of the D-Cycle engine, which has greater fuel economy and torque than traditional engines. A team of four, including me, worked together to answer the following questions:

  1. What markets could Yan Engines pursue?
  2. How does the distributed power generation market look in China?
  3. What Texas trucking companies could benefit from a partnership with Yan Engines?

To answer these questions, we spent weeks digging through a variety of databases, including Capital IQ, Thomson ONE, Business Source Complete, Factiva, and LexisNexis. We also made use of resources that don't require access to a university library or very expensive database subscriptions — resources like Google, Yahoo Finance, SEC filings, and annual reports — so that our client could replicate some of our research if desired.

Our final deliverables (slide deck and document) included data on the market size of China's diesel generator set industry, the challenges of entering the Chinese market, summaries of several international companies in the market, and information on the impact of NAFTA on the Texas and U.S. trucking industries.

iSANApp: Product Design

iSANApp prototype

The Project

How do new students make it through their first few days of class? They're a little confused, to be sure. A little nervous. A little excited. Everything is unfamiliar, but promising. What do they do to get the information they need to get through those first few weeks, months, and semesters, and how do they even know what information they need?

Answering these questions and making these times easier for students was the goal of iSANApp, the iSchool Augmented Navigation App (clever, right?). The app "creates a seamless connection between the users, the physical environment, and the available iSchool resources" (that's a quote from our documentation), incorporating maps of the iSchool building and the surrounding environment, class schedules, instructor profiles, student groups, the IT lab equipment, and calendars for keeping track of it all.


But before we even knew what features to include in the app, we performed a thorough contextual inquiry and analysis. We interviewed both iSchool students and IT lab staff, the former to determine what parts of student life were most important to them and the latter to assess the flow of information within the IT lab. We then broke our interview notes down into basic ideas (such as "It is difficult for me to access the IT lab site and resources online") and wrote each idea on a Post-it note. From there, we built a Work Activity Affinity Diagram (WAAD) with the Post-its, grouping the ideas until we had formed general themes: the IT lab, bulletin boards, classes, social events, iSchool administration, and the iSchool website.

Next we defined the workflow of the many exchanges of information that take place in the current system, based on our interviews and observations. We noted the roles that were involved — students, administrators, student group leaders, the iSchool website, etc. Then we created workflow models to visually illustrate these roles and information exchanges.

After gaining an understanding of the current system, we were ready to begin thinking about the design of our new app and the role that it could play in simplifying these processes. From our WAAD, we determined the interaction design requirements for the app and were able to situate the app in a new workflow model. We also created a task structure model to represent the primary user tasks within the app.

Another key aspect of our design process was creating a user persona and sketching the user's interactions with the app with storyboards. We finally built out our sketches into a prototype using Axure. The final step was a pilot usability test with five iSchool students during which we assessed usability problems and gathered suggestions for improvement.

My Thoughts

One challenging aspect of this project was that our team was part of the audience for our own product. It was an app made for iSchoolers, by iSchoolers, so we had to be careful to avoid allowing our own perceptions and biases to influence our design decisions. Interviewing other students and creating a persona were important steps towards designing for the users and not ourselves, but, especially in the interviews, we had to closely examine our questions in order to ensure that they were not leading.

When it came to prototyping our app, we had to consider which interactions needed to be represented for the purposes of our usability testing and which were unnecessary. Time constraints meant that we needed to be strategic: get enough prototyped that it worked for our purposes but not so much that we went over deadline (or missed sleep) to finish it.

Michigan DMV: Information Architecture
The Project

When it comes to up-to-date tech and design, there's a general sense that we can't expect too much of our government websites. Not to knock them! The time and resources are simply not there in many cases.

For this project, my team members envisioned a new possibility: what if a government entity did have the time, did have the resources, and, most importantly, had three fledgling-yet-skilled information professionals doing the bulk of the web work for them? What could that website become?

So, optimistically, we set about redesigning the Michigan Secretary of State website with three sets of keen eyes towards the structure of its information.

The Story

We began our redesign of the Michigan SOS site with a critique of the current site. To achieve a degree of thoroughness beyond what our own perspectives (those of twenty-something graduate students) could provide, we created user personas to help us conceptualize and understand the user base — a challenge considering the variety of people who need to use the site's services. Our personas included:

  1. Emily, the 16-year-old looking to get her new driver's license who tends to use her iPhone for everything and wants fast answers to her questions
  2. Greg, the 42-year-old teacher who needs to switch the license plates on his mother's car and hopes that looking for this information won't waste too much of his busy day
  3. Frank, the 79-year-old retiree who wants to retake his driving test but isn't very comfortable using the Internet

In our initial suggested changes, we emphasized accessibility and ease-of-use, especially for users with little online experience. We realized the importance of a powerful search engine, since users come to the website with very specific questions and do little browsing, if any. Our recommendations included adding a prominent search bar, reorganizing the website content, and introducing a minimalist design.

Next up in our process was a competitive analysis of the California, Washington, and New York DMV websites. Each of the team members picked one site to closely analyze and share with the group. We identified strengths and weaknesses of each site, mostly in terms of features and organization. One of our primary conclusions was that Michigan's lack of a clear hierarchy of information made it difficult — if not impossible — for users to find content beyond the initial 'parent' pages linked to on the home page. The other DMV sites did a better job of addressing this issue.

To address the organization issues of the Michigan DMV site, I proposed a card sorting exercise. I used about 80 Post-it notes to represent most of the site's pages of content. Then I recruited three friends to sort the pages. I used an open card sort method, asking the participants to group the pages in whatever way made the most sense to them. After about an hour and a half of discussion (with some disagreements), we had a wall of Post-its clustered into the categories that would shape our redesign.
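For sorts too large to eyeball on a wall, a common way to analyze open card sort results is a pairwise co-occurrence tally: count how many participants placed each pair of pages in the same group, and let the strongest pairs suggest categories. Here's a minimal sketch, with hypothetical card names standing in for the real pages:

```python
from collections import Counter
from itertools import combinations

# Hypothetical groupings from an open card sort: one list of card names
# per group, per participant. The real sort used about 80 cards.
sorts = {
    "P1": [["Renew license", "Replace license"], ["Register to vote"]],
    "P2": [["Renew license", "Register to vote"], ["Replace license"]],
    "P3": [["Renew license", "Replace license", "Register to vote"]],
}

# Count how many participants placed each pair of cards in the same group.
pair_counts = Counter()
for groups in sorts.values():
    for group in groups:
        for pair in combinations(sorted(group), 2):
            pair_counts[pair] += 1

# Pairs grouped together most often are candidates for living under the
# same navigation category.
for (a, b), n in pair_counts.most_common():
    print(f"{a} + {b}: grouped together by {n} of {len(sorts)} participants")
```

Tools like Optimal Workshop automate this kind of analysis for bigger studies.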

Next up in our process was redesigning the site. We used the results of the card sorting to plan the organization of content, but our primary goal was to reduce visual clutter and emphasize the search feature.

Finally, we tested the redesigned site with six individuals with a variety of ages and backgrounds to reflect our user personas. The tests consisted of four tasks: finding out how to renew a vehicle's plates, finding the nearest DMV location to a particular zip code, creating a login profile, and providing feedback to the Michigan DMV.

We found that the test participants generally appreciated the design of the website and were comfortable using the search engine as their primary means of discovering information. Several users expressed confusion with the vocabulary of the side navigation. If we were to continue with the project, addressing this confusion would be a top priority.

My Thoughts

Redesigning a government website was an interesting experience. Anyone who's ever used a government website has probably had a moment in which they thought, Why can't I find what I need? How hard could it really be to make this thing work? The truth is that a content-heavy, information-rich site presents many design difficulties — doubly so when its audience is most of a state's population. Unexpectedly, I came away from this experience with a newfound respect for government sites. Sure, they could be better. And should be. But making a wealth of content easy to navigate and logically organized is challenging, and it takes time. And that's why the world needs information architects!

CASA of Travis County: Usability Testing
The Project

All websites need to be user-friendly. Food delivery sites need to help users make their selections quickly and easily — and place their order. Government sites need to be easily navigable so that users can find the important information they need — and avoid making calls or going to an office in person to get answers to their questions.

Websites for nonprofits need to communicate the organization's goals and values; how well a site succeeds at this directly affects the likelihood that individuals will give their time or resources to the organization.

With this in mind, my usability team planned and carried out a usability test plan for CASA of Travis County, a nonprofit that appoints volunteer special advocates for children who have been abused or neglected. Our testing aimed to discover the ways in which the user's experience of the CASA website could be improved, with the idea that improving this experience would benefit CASA's mission.


(It should be noted that, though we planned and executed this project in a serious and methodical way, we did not share our findings with CASA. Incidentally, the CASA of Travis County website was redesigned by an Austin agency sometime in the months after this project. And, wouldn't ya know it? Some of their changes to the site were ones that my team would have made ourselves.)

For our test plan, we considered the primary reasons that an individual might go to the site: to learn about the organization, to find out how to volunteer with the organization, and to donate money or other resources to the organization. We created 5 tasks for each user to complete during the usability testing sessions based on these goals. These tasks were:

  1. You are considering volunteering with CASA, but would like to learn more about the people it helps. Find Adrian's story in the Children's Stories section of the site.
  2. Find instructions for becoming a volunteer and find out when the next information session will take place.
  3. Make a donation of money to CASA online.
  4. You would like to donate to CASA, but cannot afford to give money. Find other ways to contribute to the organization.
  5. You are hoping to find full-time employment within the CASA organization. Find the application process.

We tested nine users: four in the Information eXperience and computer labs at the UT iSchool, two in their home environments, and three remotely using GoToMeeting. We specifically sought out users who met the parameters for volunteering with CASA in terms of age and life circumstances. We gathered demographic data as well as quantitative and qualitative feedback on their use of the CASA website.

During each testing session, one team member would administer the test and explain the process to the user, one would take notes, and one would time each task and assist as needed. We all took turns with each role.

We kept track of usability metrics such as scenario completion (whether or not a user finished a task), critical errors (errors that prevent the user from completing a task), non-critical errors (errors that lead the user to complete the task in an inefficient way), and time on task (the average time it took for each user to successfully complete each task). The impact and frequency of particular errors were also recorded in order to judge the overall severity of a problem and prioritize our proposed solutions.
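To make those metrics concrete, here's a rough sketch of how session logs like ours roll up into per-task numbers. The task name and figures are invented for illustration:

```python
from statistics import mean

# Hypothetical session logs, keyed by task: completion flag, error
# severities observed, and seconds spent. Real data covered 5 tasks x 9 users.
results = {
    "Find volunteer instructions": [
        {"completed": True,  "errors": [],               "seconds": 95},
        {"completed": True,  "errors": ["non-critical"], "seconds": 140},
        {"completed": False, "errors": ["critical"],     "seconds": 300},
    ],
}

for task, sessions in results.items():
    completion_rate = sum(s["completed"] for s in sessions) / len(sessions)
    criticals = sum(s["errors"].count("critical") for s in sessions)
    non_criticals = sum(s["errors"].count("non-critical") for s in sessions)
    # Time on task is averaged over successful completions only.
    times = [s["seconds"] for s in sessions if s["completed"]]
    avg_time = f"{mean(times):.0f}s" if times else "n/a"
    print(f"{task}: {completion_rate:.0%} completion, "
          f"{criticals} critical / {non_criticals} non-critical errors, "
          f"mean time {avg_time}")
```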

In addition to these quantitative means of evaluation, we asked the users to complete a brief questionnaire about their overall impressions of the website: opinions on its appearance, suggested improvements, and likes/dislikes.

After compiling all of this feedback and data, we were able to make actionable suggestions for improvement. It was important to us that we not only point out problems but also propose solutions.

My Thoughts

This project taught me a lot about the usefulness of quantitative data in usability testing. Not only do clients (and people in general) like to see numbers and statistics and percentages — these figures actually help us researchers organize our own thoughts and base our priorities on something that feels firm. Even though the number of users tested was relatively small (though large enough to reveal consistent patterns) and even though we were (and always are) dealing with something as variable and imprecise as human perception, pulling quant in with the qual made our work better, without a doubt.

Resume

Experience

HCL DNA - Digital & Analytics, San Antonio, Texas
User Experience Researcher
January 2017-Present

  • Collaborate with client team in the banking industry on a variety of projects
  • Moderate interviews, both scheduled and spontaneous
  • Write protocols for moderated and unmoderated interviews
  • Plan and launch unmoderated usability studies
  • Create personas, empathy maps, and user journeys
  • Plan longitudinal research studies on digital ethnography platforms
  • Run concept testing efforts

Design for Use, Austin, Texas
User Experience Researcher
September 2015-January 2017

  • Conducted preliminary design and user research to gather insights into user base, competitors, and best practices
  • Developed and mapped user stories that contributed to design requirements
  • Planned and moderated usability tests (remotely and in-person) within an agile workflow
  • Planned and launched unmoderated usability studies
  • Recruited, screened, and scheduled participants for usability studies
  • Planned and carried out longitudinal user studies
  • Created, ran, and interpreted the results of card sorting exercises
  • Presented research findings to clients and other project stakeholders
  • Communicated and worked with a variety of clients, including advertising agencies, health care startups, and ISPs
  • Collaborated with UX designers and strategists

Grainger, Austin, Texas
Freelance UX Researcher
July 2016-August 2016

  • Analyzed user interviews and survey results
  • Created slide decks that communicated prioritized research findings
  • Presented research findings to internal stakeholders
  • Conducted tree tests and heuristic evaluations of complex product taxonomies
  • Collaborated with UX researchers and taxonomists

Harvest, Austin, Texas
UX Consultant
April 2015-August 2015

  • Researched competitors and market to gather design and feature requirements for website
  • Planned and moderated usability testing sessions to gather demographic information and qualitative feedback
  • Communicated research findings visually with slide decks and verbally with presentations to the design and development team
  • Worked with design team to strategize design requirements for usability testing
  • Managed scheduling system for usability testing

Dolph Briscoe Center for American History, Austin, Texas
Monographs Intern, Library Unit
March 2014-August 2015

  • Cataloged newly acquired book collections and kept detailed records of cataloging work
  • Researched monograph materials in OCLC's WorldCat database to ensure that records were correct
  • Trained serials intern in serials and monographs processing
  • Staffed public service desk and answered reference questions
  • Planned and led team effort during week-long shelf shifting project

Dolph Briscoe Center for American History, Austin, Texas
Serials Intern, Library Unit
December 2013-March 2014

  • Performed end-processing on newly acquired periodicals in catalog
  • Researched serials in WorldCat database to ensure proper cataloging
  • Prepared and organized library materials for binding process
  • Selected materials to add to library collection

Dell Children's Medical Center, Austin, Texas
Volunteer/Intern, Family Resource Center
September 2013-May 2015

  • Searched PubMed and other medical databases to find articles relevant to doctors' and nurses' literature requests
  • Processed and maintained library materials
  • Assisted medical librarian with special projects
  • Welcomed patrons to the FRC and introduced them to FRC services
  • Checked out books, DVDs, and other library materials to patients and family members

Goodwill Computer Museum, Austin, Texas
Volunteer
September 2013-December 2013

  • Organized and inventoried newly acquired artifacts including Atari, Commodore, and IBM machines
  • Served as support staff at Austin Museum Day 2013

Texas A&M University Libraries, College Station, Texas
Student Worker I, Government Documents
November 2011-May 2013

  • Maintained collection of government documents published by the U.S. Government Printing Office
  • Organized, shelved, and labeled documents for ease of use
  • Contributed to long-term cataloging projects
  • Assisted library patrons in finding documents
  • Repaired damaged books and instructed fellow student workers in repair techniques

 
Skills
  • UX Research
  • Project Planning
  • Usability Testing
  • Test Protocol Writing
  • User Interviews
  • Diary Studies
  • Information Architecture
  • Card Sorting
  • Tree Testing
  • Competitive Analysis
  • Survey Design
  • Qualitative Research
  • Quantitative Research
  • Business Research
  • Requirements Gathering
  • Persona Development
  • Writing
  • Storyboarding
  • Recruiting
  • Ethnographic Research
  • Presentations/Slide Decks
  • Statistical Analysis
  • Concept Testing
  • Prototyping
  • Optimal Workshop
  • UserTesting
  • Axure RP
  • Photoshop
  • Camtasia
  • Excel
  • HTML/CSS/JS
  • PHP
  • MySQL
 
Education

University of Texas at Austin, Austin, Texas
Master of Science in Information Science
August 2013-August 2015

  • Relevant Coursework: Usability, Information Architecture & Design, Human-Computer Interaction, Database Management, Understanding Research, and Information Resources in Business

Texas A&M University, College Station, Texas
Bachelor of Arts in English, Minor in Sociology
August 2009-May 2013

Say hi!

If you'd like to get in touch with me, email me at:

eliz DOT ferron AT gmail DOT com

Thanks for stopping by!