Experience. Like many words in the English language, the word experience takes on different meanings in different contexts. For example, I have experience developing software, which is taken to mean that I have written code that produced an app.
But I have also had the experience of breaking a finger, taking out a mortgage, swimming, and using software written by other developers. These experiences do not necessarily translate to expertise; they merely mean that I have engaged in an activity and can relate to others who have done the same. I can also share my impressions of those experiences, both good and bad.
When technology firms use the term “experience,” it generally refers to the way users engage with their products and services. But too often the focus of that experience boils down to, “Did it work?”
For example, I like to play video games, which means I use a pair of headphones. I have had many sets, most of which were clearly not designed for people who wear glasses. Most have been uncomfortable, and the experience of using them was not a pleasant one. Discovering a company that designed headphones with glasses wearers in mind resulted in a far better experience, and one that engenders a sense of loyalty and advocacy for the brand.
They all worked and had 100% uptime. But one brand stood out above the others because it went beyond uptime and focused on the experience of wearing the product for long periods of time. Needless to say, I’m a customer for life. Would buy again. Five stars.
This kind of focus on customer experience is one every company should pay attention to as it progresses on its digital transformation journey. Working correctly should not be the (only) goal of your app or device.
In today’s digital-as-default world, nearly constant uptime should be assumed. The availability of an app or device should not be the question. The question should be what kind of experience I will have with it.
Basing customer experience on traditional measures of uptime is not a winning strategy. Plenty of apps are available all the time, and many of them still end up being deleted not because of aggressive advertising or notifications, but because they are simply “confusing.” An app that is available but presents confusing navigation options or unclear guidance through a complicated process results in a poor experience. Failing to ensure an app interface is just as usable on mobile as it is on my large display makes for a poor experience. Failing to maintain consistency between services and products makes for a poor experience.
And let’s not fall into the trap of thinking that experience doesn’t matter just because an app will be primarily accessed by machines, scripts, or other services. There is still a human being in the mix: the developer who must build a mechanism for their app to interface with that API. Their experience is important, because APIs that are poorly documented or inconsistent will certainly result in frustration and perhaps abandonment of the effort. Uptime of the API isn’t the end of the experience; it’s the beginning.
Whether you’re providing an API or a service, delivering a product or guidance through a process, a human being is going to experience what you’re offering. One of the definitions of experience is based on human reaction:
(noun) an event or occurrence that leaves an impression on someone
And perhaps that’s why we find it easier to focus on uptime; uptime can be measured and tracked. Impressions are a human reaction that has no default digital signal. We must actively seek to understand the human factor. Luckily there are digital signals that can help us understand whether an impression is good or bad.
Measures like length of engagement, daily and monthly activity rates, and the percentage of a process completed can provide general insights. For example, a complex onboarding or registration flow is one of the top reasons users cite for deleting an app. Instrumentation can provide insights into that process and expose sources of frustration. If most users abandon a process at step two, it behooves us to examine that step and determine what is getting in the way.
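As a sketch of what that analysis can look like, the snippet below computes per-step abandonment rates from hypothetical instrumentation data (the event data, step count, and function name are all illustrative, not from any particular analytics product):

```python
from collections import Counter

# Hypothetical instrumentation output: the furthest onboarding step
# each user reached. In practice this would come from event telemetry.
furthest_step = {
    "u1": 4, "u2": 2, "u3": 2, "u4": 1, "u5": 4, "u6": 2, "u7": 3,
}

TOTAL_STEPS = 4  # assumed length of the onboarding flow

def abandonment_by_step(furthest, total_steps):
    """Return, for each non-final step, the fraction of users who stopped there."""
    counts = Counter(furthest.values())
    n = len(furthest)
    return {
        step: counts.get(step, 0) / n
        for step in range(1, total_steps)  # reaching the last step = completion
    }

rates = abandonment_by_step(furthest_step, TOTAL_STEPS)
worst = max(rates, key=rates.get)  # the step where most users give up
```

With this toy data, step two is where most users abandon the flow, which is exactly the signal that tells us where to look for friction.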
Instrumentation is key to understanding how users engage with technology. That’s true for user interfaces designed for consumers and it’s true for interfaces and APIs designed for consumption by developers and engineers. Instrumenting interfaces provides designers and developers with the insights they need to ensure a good impression.
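To make the idea concrete, here is a minimal sketch of instrumenting an interface: a decorator that records latency and outcome for each call. The endpoint name, metrics store, and handler are hypothetical; a real system would ship these records to an analytics or observability backend rather than hold them in memory.

```python
import time
from collections import defaultdict
from functools import wraps

# Minimal in-memory metrics store (illustrative only).
metrics = defaultdict(list)

def instrumented(endpoint):
    """Decorator that records latency and outcome for each call to an endpoint."""
    def wrap(fn):
        @wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            outcome = "error"
            try:
                result = fn(*args, **kwargs)
                outcome = "ok"
                return result
            finally:
                # Record the call whether it succeeded or failed.
                metrics[endpoint].append(
                    {"latency_s": time.perf_counter() - start, "outcome": outcome}
                )
        return inner
    return wrap

@instrumented("register")
def register_user(name):
    # Hypothetical registration handler standing in for a real interface.
    if not name:
        raise ValueError("name required")
    return {"user": name}

register_user("ada")
try:
    register_user("")
except ValueError:
    pass
```

Patterns like this give designers and developers the raw signals, per-call latency and failure rates, from which impressions can start to be inferred.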
These capabilities are constantly evolving, and “digital analytics” can provide even greater depth to understanding how a user interacts with an app or device.
While we often use digital body language to detect threats (like bots) and assess risk, we are just beginning to explore the potential of these signals and apply them to engagement with apps and devices.
As we dive into this year’s State of Application Strategy survey, we’re seeing that customer experience continues to be a significant focus of digital transformation efforts across every industry. We’re not surprised. In a digital-as-default world, technology is replacing human customer service as the default way in which companies engage with customers—the first time, and nearly every time after that.
CX is increasingly an important part of the development lifecycle. It is a discipline that focuses on leaving a good impression on someone. That makes it a human-first approach to technology, not merely a measure of uptime.