One hundred million dollars' worth of Hershey's Kisses, Jolly Ranchers, and Whoppers: all packaged, prepared, and waiting to ship out to grocery stores around the country for the lucrative Halloween season. The trucks were waiting for their orders, grocery chains were waiting to stock their shelves, and shareholders were counting on the profits. The only problem: the ERP system made fulfilling the Halloween orders impossible.
In fall 1999, this scene was Hershey's executives' nightmare come true. The company's new Enterprise Resource Planning (ERP) system left $100 million worth of orders unfulfilled, sending the stock down 8 percent and driving a 19 percent year-over-year drop in profits that quarter.
“We fully expected a strong finish in the second half of the year. Instead, the implementation of the final phase of the Corporation’s enterprise-wide information system created problems in the areas of customer service, warehousing and order fulfillment,” CEO Kenneth Wolfe said that year.
It was one of the first IT debacles to have such an impact on a company's stock price. Across the many, many retrospectives, one theme emerges for Hershey's: the company rushed the rollout against recommendations and, as a result, cut corners on user testing.
This mistake happens all too often in legacy system modernizations. Legacy systems are complex, monolithic, and high-risk, and they can take years to modernize. When the C-suite starts pressuring to compress the timeline, crucial periods of user testing seem like an easy target. How much testing do you really need?
More than you think. Unless you are on the ground with your users, observing their behaviors, their interactions with the system, and their frustrations on the job, your assumptions about how your users operate and what they need will sink your project, just as they sank the 25 percent of software development projects that fail outright and the 60 percent that produce ineffective products. A legacy modernization is too important not to be certain your users will use the new system effectively and correctly. Buried in the spaghetti code is valuable business logic, along with years of workarounds accumulated as users and the business bent the system to their changing needs. Simply examining the system and making assumptions about how it fits into users' job duties will not yield a product that makes those users more efficient, and it will miss the mark on ROI goals.
You are going to get user feedback: the good, the bad, and especially the ugly; if your solution causes significant headaches, everyone will hear about it. The difference between that post-launch feedback and upfront user research is your ability to act on it. As the project inches closer to completion, the cost of a change rises significantly, by as much as 10 times at each phase of development. The insights you gain from rigorously testing a prototype lead to changes that are far cheaper than those you will rush to implement post-launch.
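To see why that rule of thumb matters, here is a minimal sketch of the math, assuming a hypothetical $100 fix and the oft-cited 10x multiplier per phase (the phase names and dollar figure are illustrative, not from any specific project):

```python
# Illustrative sketch: the rule of thumb that the cost of a change
# grows roughly 10x at each successive development phase.
PHASES = ["requirements", "design", "development", "testing", "post-launch"]

def change_cost(base_cost: float, phase: str, multiplier: float = 10.0) -> float:
    """Estimated cost of making one change in a given phase, assuming the
    cost multiplies by `multiplier` at every phase after requirements."""
    return base_cost * multiplier ** PHASES.index(phase)

base = 100  # a hypothetical $100 change caught during requirements
for phase in PHASES:
    print(f"{phase}: ${change_cost(base, phase):,.0f}")
```

Under those assumptions, the same fix that costs $100 during requirements costs $1,000 during design and $1,000,000 after launch, which is why testing a prototype early pays for itself many times over.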
Building User Feedback Into Your Timeline
First, take time to understand your business objectives. What goals does the modernization hope to achieve? What KPIs must the project hit to be considered a success? The keys here are concrete numbers and face time with every key stakeholder.
After identifying the KPIs, it's time to get out in the field to conduct user research, combining contextual inquiries with user interviews. A contextual inquiry means standing side by side with users as they go about their day. Schedule as much time in the field as it takes to develop a complete picture of the day-to-day duties of each user persona. You can't skip the insight contextual inquiries provide: in interviews alone, users will omit crucial parts of the process. After mapping the workflows and data from the interviews and inquiries, merge those findings with your business objectives to inform the requirements and design of the solution.
After design, start user testing early. Give users a prototype of the product and ask them to perform key functions. Do not rescue them when they struggle; only jump in if they are close to shutting down. Observe where they struggle and ask them to pause and explain their thinking. If they start hesitating or heading down the wrong path, be sure to ask why and try to identify which design cues are sending them astray.
The product is not complete after it’s developed. It is absolutely vital to a project’s success that the software undergo an extensive round of quality assurance testing to identify bugs. To learn more about developing and delivering successful enterprise systems, check out our eBook below.