Research into how to build trust for people purchasing fruits and vegetables online.

 

Info: Class project for Research Methods for HCI taught by Dr. Carrie Bruce in Fall 2017 at Georgia Tech.

Team: Meghan Galanif, James Hallam, Ashok Krishna, Rachel Chen, Sijia Xiao

My Roles: Discovery research, contextual inquiry, semi-structured interviews, design, evaluation plan creation, presentation


The Problem Space & Discovery Research

Our team decided to explore ways to improve the fruit and vegetable purchasing experience.  We began by examining both in-store and online shopping and looking for opportunities for improvement.  We conducted the following discovery research to learn about the different contexts in which people purchase fruits and vegetables (produce):

  • In-store: We observed grocery store produce departments. This method let us watch people navigate the department, select produce, and make a purchase without obstructing the process. It also allowed us to record observations and gain insight into shoppers' behaviors before going a step further and conducting interviews.
  • Online: We conducted semi-structured interviews with people who shop for produce online, and personally placed orders with AmazonFresh and Instacart to try the home delivery experience ourselves. We chose semi-structured interviews to surface common themes and pain points in the online produce shopping experience.

In-Store Background Research


After conducting in-store observations, we created two artifacts: a site map of a major grocery store chain's produce department and a task analysis for purchasing an apple in-store.

[Image: site map of the produce department]
 

Task Analysis: Purchasing an apple in-store:

[Image: task analysis for purchasing an apple in-store]
 

Online Background Research

For our background research on online produce shopping, we performed a competitive analysis to investigate and compare several existing websites, and we completed orders and deliveries with both AmazonFresh and Instacart to experience the services firsthand.

[Image: AmazonFresh produce page]
 

As with the in-store apple purchase, we made a task analysis for purchasing an apple online:

[Image: task analysis for purchasing an apple on AmazonFresh]
 

Context Comparison

Shopping for produce online is a very different experience from shopping in-store. We compared the two contexts and drew insights into what shoppers want and need by building an affinity diagram from everything we learned in the on-site observations, literature review, semi-structured interviews, and competitive analysis of online services.

Insight: It is difficult to buy fruits and vegetables online without the sensory experience shoppers get in-store.  When shopping for produce, people like to:

  • See the color and condition of the skin
  • Smell for freshness
  • Feel the texture of the skin for blemishes
  • Taste fruit samples

None of this is currently possible when shopping online.

Question: How does online produce shopping differ from typical online shopping?

  • Produce isn't durable: it does not have a stable shelf life, and timing peak freshness with delivery can be a problem.
  • Produce varies from item to item: fruit from the same lot and the same farm still differs piece to piece, and people have shopping preferences about those differences.
  • Consumers are picky about produce: Americans are notoriously selective about the appearance of produce, and millions of tonnes are rejected every year due to cosmetic imperfections.
 

In-Store Shopping Experience

  • Direct interaction with produce
  • Easy browsing for new items
  • Get produce immediately
  • Requires trip to store
 

Online Shopping Experience

  • Description of produce and stock photo
  • Familiar Amazon experience
  • Convenience of delivery
  • Need to be available for delivery window
  • Only know what you get after delivery
 

Context Decision

Ultimately, our team decided to focus on online produce shopping, because it is a rapidly growing industry that could revolutionize the way people get their groceries, and because of its potential societal benefit. Online grocery shopping is growing at a rate of 25% per year, and 85% of online grocery shopping transactions contain produce (Springer, 2017). We thought online produce shopping could benefit from design thinking and HCI solutions, since the primary barrier to consumer adoption was found to be the lack of sensory information (Loria, 2017). Fruits and vegetables are also necessary for a healthy life, and online produce delivery may lower prices and broaden access to fresh produce for people who might not otherwise have it.

Narrow Scope to AmazonFresh

We decided to narrow our scope to the online grocery delivery service AmazonFresh due to the recent acquisition of Whole Foods by Amazon, and because of the potential to research how reviews influence trust in the online purchasing of fruits and vegetables.

Online Produce Shopping Context & User Needs Research

We deployed an online survey and conducted semi-structured interviews and contextual inquiries of people shopping for produce online.

Survey

We deployed an online survey to gather opinions from a larger number of people about their produce shopping habits and online produce shopping behaviors. The survey asked about online produce shopping habits, pain points, and how respondents use reviews when deciding what to buy.

[Images: survey excerpts]
 

Contextual Inquiry

We performed three contextual inquiries with people who shop for produce online, observing them as they used AmazonFresh. The goal was to watch users in the act of shopping for fruits and vegetables on the service. The contextual inquiries also supplemented the survey: survey results are self-reported, while contextual inquiry data comes from observed behavior.

We chose contextual inquiry because it allowed us to observe users in the act of shopping for fruits and vegetables online. Through observation and verbal probing we could discern why a user performed an action on AmazonFresh and what factors led them to purchase an item; specifically, we could follow the user's click flow and see which features or action buttons on the site led to a purchase. We could also ask questions as participants shopped and hear what they were thinking. Our project required this rich qualitative data because we were interested in the specific reasons people do or do not buy produce online. We were aware that data from so few contextual inquiries would not be representative of all users, but we still wanted to use the method for in-context data gathering.

The contextual inquiries were conducted with AmazonFresh shoppers. Sessions were roughly one hour long and consisted of observing a user as they shopped for produce. Each researcher had a list of key factors to ensure notes were taken on the most important information; however, since this was an in-context activity, no interview script was followed.

Following the contextual inquiries, the team held an affinity mapping session. Individual insights were grouped by thematic similarity, such as users feeling it is risky to buy produce with bad reviews.


Initial findings:

  1. Online shoppers want convenience
  2. In-store shoppers like to see the produce they are buying
  3. Online shoppers want to know exactly what they are going to get before buying something for the first time
  4. People read reviews, scanning for specific keywords, e.g. "sweet" or "moldy"
  5. People will keep reordering if there are no changes in quality
  6. People preferred reviews from local people
  7. Negative reviews had a major effect on selection
  8. Bad online ratings made a purchase feel risky
 

Qualitative Data Coding

We examined reviews of the AmazonFresh service to find what people liked and disliked about it, identify pain points, and look for patterns that could inform potential design improvements.

[Image: AmazonFresh service reviews]
 

Empathy Map & Personas

Empathy Map

[Image: empathy map]
 

Personas

We created three personas from the evidence gathered in the semi-structured interviews, contextual inquiries, survey, and AmazonFresh user reviews. The personas tell the story of the kinds of people who use online grocery delivery services and their motivations for doing so. They helped us show others who we were ultimately designing for, and keep track of the features and functionality we would need to consider in our design.

Design Requirements

  1. More control over reviews - focus on local reviewers
  2. Ability to specify preferences (ripeness, appearance) while ordering
  3. Make it easier to scan reviews for keywords (see the sketch after this list)
  4. Make it easier to complete reviews after purchase
  5. Make it easier to report a problem with an order
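
To make requirement 3 concrete, below is a minimal sketch of how a keyword scan over review text could work, for example to back a 'most common words in reviews' display. It is illustrative only: the review strings, stopword list, and top_keywords function are hypothetical and not part of AmazonFresh or our prototype.

```python
from collections import Counter
import re

# Hypothetical review text standing in for real AmazonFresh reviews.
reviews = [
    "Bananas arrived green and firm, perfect ripeness for the week.",
    "The bananas were bruised and already turning brown on arrival.",
    "Sweet, fresh bananas. Delivery was on time.",
]

# Words too generic to be useful as scannable keywords.
STOPWORDS = {"the", "and", "were", "was", "on", "for", "a", "of", "to", "in"}

def top_keywords(texts, n=5):
    """Count the most common non-stopword tokens across the given review texts."""
    tokens = []
    for text in texts:
        tokens += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    return Counter(tokens).most_common(n)

print(top_keywords(reviews))
# e.g. [('bananas', 3), ('arrived', 1), ('green', 1), ('firm', 1), ('perfect', 1)]
```

The same counts could also be used to highlight keywords such as 'sweet' or 'moldy' directly in the review list.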

Design Iteration & Testing

First, we made a set of sketches of design choices.  We then held feedback sessions on those sketches.

 

Sketches of Interface

 

Wireframes

Product page wireframe - Design additions: 1. Ripeness meter, 2. Appearance selector, and 3. '17 reviews from your area'

 

Reviews page wireframe - Design additions: 1. Filter reviews to 'Reviewers in my area' and 2. Most common words in reviews

 

Mobile app interface for leaving an order review - Design features: 1. order info at top, 2. a 'Write a review' button for each item, 3. item review screen, 4. add photo/video button, 5. report-a-problem screen (triggered when a review is below 3 stars)
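
The report-a-problem trigger noted in feature 5 is a simple threshold check. Here is a hedged sketch of that branching, assuming a hypothetical next_screen helper and screen names; only the 3-star threshold comes from the wireframe annotation.

```python
PROBLEM_THRESHOLD = 3  # from the wireframe annotation: ratings below 3 stars open the report-a-problem screen

def next_screen(star_rating: int) -> str:
    """Illustrative only: pick the screen that follows an item rating in the mobile review flow."""
    if star_rating < PROBLEM_THRESHOLD:
        return "report_problem"  # steer the shopper toward resolution instead of a purely negative review
    return "thank_you"

assert next_screen(2) == "report_problem"
assert next_screen(4) == "thank_you"
```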

 

Interactive Prototype

Two interactive prototypes were implemented in JustinMind: one of a banana product page and one of a mobile app order review flow. Users can click through a prototype of an AmazonFresh page selling bananas and make preference selections, and can leave a review for their order and report a problem. Here are the links to the prototypes:

 

Evaluation

We employed two methods: cognitive walkthroughs and moderated user testing.

 

Moderated User Testing

We performed moderated user testing of the two prototypes (product page and mobile review) using the think-aloud walkthrough method. During and after each walkthrough, participants were asked follow-up questions about their experience.

Five study participants were recruited to evaluate both prototypes in a user testing session. For the desktop prototype, users were asked to go through the process of buying bananas. For the mobile app prototype, users were asked to perform three tasks (detailed below).

Evaluation Goals

  • Identify any usability issues with the interface

  • Identify any areas of confusion

  • Determine the utility of added features

  • Determine if the design decisions would entice people to leave product reviews

  • Determine if designs improved the online produce shopping experience

Method Justification

We chose think-aloud walkthroughs because we wanted participants to click through the prototype and say out loud what they were considering and thinking about specific design elements. We also wanted to see which design elements participants chose to interact with while using the clickable prototype. For the mobile app prototype, we gave participants three tasks to complete so that they would interact with every design element.

We kept the structure of the evaluation session similar to the one used for the sketches and wireframe designs, so that we could compare data across all design iterations and note improvements between them.

This method has known constraints. We had a relatively small number of participants, which limits the conclusions we can draw from the data: individual preferences may have a noticeable effect, and it was difficult to eliminate those biases from the reporting. Asking participants to review a simulated order was another limitation; they had to imagine a problem with an order that did not actually exist, which could lead to inaccurate reports of their experiences and places a burden on participants that would be unnecessary if we were testing the system with real orders.

 

Conclusion

People want the benefit of reviews without writing them.  An improved way for people to leave reviews is necessary, since we found that people are unlikely to write a positive review, even when their experience was positive. 

We also found that reviews may benefit if we shift the focus from negative reviews to problem resolution.

Our prototype changed how reviews worked on AmazonFresh. People would leave fewer negative reviews when they could report a problem and get a resolution instead, and batch reviewing improved the likelihood that positive reviews would be left.

 

Sources

Loria, K. (2017, April 18). Produce boosts AmazonFresh sales to $10M in Q1. Retrieved September 24, 2017, from http://www.fooddive.com/news/grocery--produce-boosts-amazonfresh-sales-to-10m-in-q1/440602/

Springer, J. (2017, August 29). Early adopters capturing more online grocery sales. Retrieved September 24, 2017, from http://www.supermarketnews.com/online-retail/early-adopters-capturing-more-online-grocery-sales