If you build it, they will use it. Won’t they?
Implementing and managing core systems is an amazing feat of balance. Even with clear mandates and best intentions, teams have conflicting opinions about how to optimize their processes. Each data field, mouse click, list value, and business rule needs to be assessed against trade-offs among quality, user experience, and processing efficiency. And each user views the workflow and processes slightly differently.
Faced with such challenges, project teams often settle for a solution optimized for getting the system live. What impact do these decisions have on the end-user experience? Have we truly positioned our project for optimal success?
Interestingly, taking a system “live” has historically meant that it became static – changes were difficult and costly. Newer systems and designs, however, allow the system to become a “living” system that can be adapted and tweaked as experience, feedback, and results accumulate. There must be some stability and consistency for users and data (see Avoiding a Rules Hangover), but this doesn’t need to be as rigid as it has been in the past. Another balance to maintain.
Ideally, a program begins by defining the primary project objectives in terms of business value ahead of the implementation. The implementation teams then use this information to maintain focus. Sub-goals and metrics are defined and aligned to the objectives to ensure that the goals follow the SMART criteria (Specific, Measurable, Achievable, Relevant, Time-bound).
Success of the project is in part the act of going live, and credit belongs to the teams that persevere through cycles of requirements gathering, implementation, and testing. But success is also measured by the value delivered to the organization and the benefits to end users after implementation.
We need to set a reminder for ourselves to observe the impact of those decisions and compromises on our results. We often talk about realizing Return on Investment (ROI) and optimizing the Total Cost of Ownership (TCO) of a project or system investment. That means ensuring adoption rates are maximized and that workflows indeed flow. How often have we bought something for a cool feature, only to never use that option because it ended up feeling cumbersome or difficult to use?
Once the learning curve levels off and the new-car smell has faded, it is time to revisit the project – depending on the system and phase, this typically falls around the first anniversary of the implementation. By then, users will have adopted the features they like and found workarounds or alternatives for the parts they aren’t so fond of. Measurable results will be stabilizing, and the post-implementation effects can begin to be quantified.
Key questions we can ask at this time to evaluate the success of the project include:
Are users leveraging the system features and options?
Do they have all the tools they need? Are there hints, tips, user guides, or messaging that can be clarified?
Are steps being skipped in manual processes or extra steps being added?
Are the straight-through processing rates as expected?
Are users aware of the simplest paths to complete common processes?
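Several of these questions can be grounded in simple usage metrics. The sketch below is purely illustrative – the event fields, names, and example data are hypothetical, not part of any Guidewire product – but it shows how adoption and straight-through processing rates might be computed from workflow event records:

```python
from dataclasses import dataclass

@dataclass
class WorkflowEvent:
    # Hypothetical record: one row per completed workflow instance.
    user: str
    used_new_feature: bool   # did this instance touch the feature being evaluated?
    manual_touches: int      # 0 means the work item flowed straight through

def adoption_rate(events):
    """Share of workflow instances that used the new feature."""
    if not events:
        return 0.0
    return sum(e.used_new_feature for e in events) / len(events)

def straight_through_rate(events):
    """Share of instances completed with no manual intervention."""
    if not events:
        return 0.0
    return sum(e.manual_touches == 0 for e in events) / len(events)

events = [
    WorkflowEvent("ana", True, 0),
    WorkflowEvent("ben", False, 2),
    WorkflowEvent("cai", True, 0),
    WorkflowEvent("dee", True, 1),
]
print(adoption_rate(events))          # 0.75
print(straight_through_rate(events))  # 0.5
```

Tracked over time, even rough ratios like these reveal whether users are embracing a feature or quietly working around it.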
If you didn’t pre-define your goals, you can still review your results. Start with your corporate goals, then look for trends and changes in the data that support them. Meeting and observing the users of the systems – the employees, agents, and policyholders – can yield insights into how the systems you are implementing are actually used and how the processes align to corporate goals and user needs.
By continuing to monitor operational results, and modifying your system and processes when necessary, you can keep improving the user experience and the results of your project long after go-live.
Need more help? Guidewire’s Value Consulting team has pioneered processes to support ROI capture across the full project lifecycle. We’d be happy to help as you continue your path to a fully optimized solution.