
Simplified UX for Enterprise Solutions 

Role: Product Manager

Collaboration: SAP Design & Co-Innovation (DCC) Team

Project Summary

The objective of this initiative was to provide a better user experience for critical parts of the business (e.g., Order-to-Cash). Employees required intensive learning, training, and documentation, and the complex ERP user interfaces increased the probability of user errors, reducing productivity and, in many cases, frustrating end-users.

Responsibilities

As part of this initiative, I co-led Design Thinking workshops (user research, ideation, and usability testing activities) for several Nestlé divisions, including Waters, Purina, Prepared Foods, and Nutrition and Health Science.

As a result, we were able to increase end-user productivity and reduce both error rates and training time. The work also produced a significant shift in stakeholders' mindset toward user-centered design, showing that designing for users can yield immediate and long-lasting results.

 

This work was presented at SAPPHIRE 2016 in Orlando: Provide a Simplified and Integrated Visual Experience.

User-Centered Design Approach

For each engagement, we went to where the users were. This allowed us to understand the problems to solve and the users' environment, and from there to map out user journeys, ideate, and create designs that were tested with users. The feedback captured in each round fed the next iteration of prototypes and, ultimately, the final product.

Process | Step-By-Step

  1. What and how to measure results

  2. User research: observe and interview users

  3. User research: interpret and derive insights

  4. Design & prototype: solve pain points

  5. Design & prototype: assess technical feasibility

  6. User testing

  7. Development: design handoff and UAT 

  8. Measure results

1. Strategy: What and how to measure

One question that always comes up is how to measure user experience. We defined what to measure, and how, at the beginning of each engagement. These are some of the factors we considered when measuring results:

Before getting started

The System Usability Scale (SUS) is a cheap, quick, and easy way to understand how users perceive a website or solution, and it served as a baseline before we moved into the user research phase. We also used it to compare results before and after design changes were implemented.

SUS survey results collected before design changes.
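As a quick aside on how SUS scoring works (a minimal illustrative sketch, not tooling from the project): each of the ten items is answered on a 1-5 scale, odd-numbered items contribute (score - 1), even-numbered items contribute (5 - score), and the sum is multiplied by 2.5 to yield a 0-100 score. The responses in the example are hypothetical.

def sus_score(responses):
    # Standard SUS formula: ten Likert responses (1-5);
    # odd-numbered items contribute (score - 1),
    # even-numbered items contribute (5 - score),
    # and the total is scaled by 2.5 to a 0-100 range.
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based, so even
                for i, r in enumerate(responses))    # indices are odd-numbered items
    return total * 2.5

# Hypothetical example: one participant's answers to items 1-10
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0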

2. User Research | Observe and interview users 

From the SUS respondents, we selected end-users who were willing to participate in the research. For each project, we observed 8-10 users at their workplace, collecting artifacts, notes, and pictures.

User workplace and artifacts collected during observations and interviews. 

3. User Research | Interpret data & derive insights

The end-user journey map and the persona gave us a detailed understanding of users' tasks and the ability to identify pain points based on our observations. The persona helped us understand who the users are and how to tailor the solution specifically to their needs.

Persona defined after user observations and interviews.

Mapping the user journey.

4. Design & Prototype

Design and prototyping were divided into the following activities:

  • Minimum workflow exercise – by mapping out current screens, fields, and flows, we highlighted the path users take to accomplish a given task.

 

  • Wireframing – based on the minimum workflow exercise and the user journey, we wireframed the desired experience.

 

  • Visual design – putting it all together by following UX best practices for navigation and other elements, and applying the look & feel to the prototype.

Prototyping.

5. Assess Technical Feasibility

We understood early on the importance of assessing the technical feasibility of our designs. First, we took into consideration users' needs as well as business requirements. Second, we balanced great design against good performance. It is crucial to ensure that designs are feasible before validating them with end-users.

We validated technical feasibility with development teams before moving into visual designs. 

Usability testing.

6. User Testing

This step served to validate the designs and gather feedback by observing users. It is important to organize tasks into scripts so users can perform the activities in a setting that allows them to provide valid feedback, and designs can then be adjusted accordingly.

 

We targeted an average of 8-10 users per round of usability testing and gathered results that helped us iterate on the designs and make adjustments before implementation.

7. Development handoff and UAT

After the designs were signed off, we officially handed them off to the development team. Throughout the development and testing of the solution, we met with the developers on a regular basis to answer questions and monitor progress. Once development was completed, we participated (along with end-users) in user acceptance testing (UAT), and shortly after we launched the solution in the production systems.

8. Post-launch and Measuring results

For every project, we piloted the solution initially with the users who had participated in the research and usability testing activities, and expanded to the broader group as we measured positive results.

We once again used the SUS survey to measure quantitative results, along with open questions for qualitative feedback. In most cases the results were positive, with a few opportunities for improvement.

Additionally, we measured the number of steps required for several end-user tasks before and after the redesign (a couple of examples are shown below).
