Learning is going to be sticky.

When the first iPhone launched in 2007, the features and capabilities of the phone were revolutionary, but it’s often forgotten that the app store as we know it today wasn’t available. There was a tremendous hunger among developers who saw the future land rush, but couldn’t build native applications yet.

A year later, when the App Store launched, the phone took off and we really entered a new era.

From the moment that people were able to add applications to their devices, effectively customizing their phone to their life, the dynamic changed. Competition wasn’t really possible in the way it had been before.

Prior to the App Store, we all looked at phones as a collection of features. This one could do email really well; over here we had a cool music player. One person’s favorite form factor was another’s annoyance. You would buy the set of devices that matched your life; the phone was just a part of it.

After the App Store, we simply made our phone the collection of features we wanted and needed. Instead of competing with a single iPhone feature set, all would-be competitors needed to compete with a great solution for each and every unique combination of hardware and software. By making a personal device that was actually personal, Apple rapidly shifted the landscape and customer expectations: the value an individual saw in their device wasn’t what came in the box, but rather the personalized solution it provided.

This is just beginning to happen again in a new way. It has the potential to vastly alter how people think about the technology they use, how long they keep things, and what is expected of hardware and software in the coming years.

Software, using the capacity of machine learning and hardware-accelerated neural networks, is going to begin learning a lot more about us. Our preferences, habits, desired routes, processes, and routines are all a few years at most away from being captured by the tools we use.

I suspect that the majority of these insights into our behavior will be welcome, though of course there is an opportunity for the companies applying this technology to take it too far, or to combine it in ways that invade privacy. We should be vigilant about that, while also welcoming some of the life improvements that will come along with it.

Some simple examples to help illuminate this:

  • There are two exits from the highway you can take to get to your house. You prefer the second one because there are fewer traffic lights, even though the distance is slightly longer. The navigation in your car insists that the first is better. Soon, your navigation system should be able to recognize that for some reason you always take the second exit. It doesn’t need to know why, just that it can stop yelling ‘recalculating route’ at you.
  • When you get home from work every day, it’s usually around the same time. In the summer months it’s still light out, but in the winter it’s dark. The first thing you do when you get home is turn on the light, but only in the winter. Systems in your home should see and learn this, automatically turning the light on for you, after sunset, in the winter (a rough sketch of what learning this might look like follows this list).
  • You get paid every other Thursday, and once a month, after the second paycheck, you transfer some money into your savings account. It’s a manual process; you could set it up to happen on a particular date each month, but your paycheck date moves around a little and you want to make sure it happens after that second one, so you do it manually. This behavior isn’t that complex, and it should be automatable.
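
To make that concrete, here is a minimal sketch of what capturing the light habit above could look like, assuming a home system that logs arrivals, light switches, and local sunset times. The event shape, the names, and the 14-day threshold are all illustrative assumptions, not any real smart-home API.

```python
# A minimal sketch of a "learn a small habit, then automate it" loop for the
# light example above. Everything here is hypothetical: the event shape, the
# sunset lookup, and the 14-day threshold are illustrative assumptions, not a
# real smart-home API.

from dataclasses import dataclass
from datetime import datetime


@dataclass
class Arrival:
    arrived_at: datetime    # when the person got home
    light_turned_on: bool   # did they switch the light on within a few minutes?
    sunset_at: datetime     # local sunset that day (assumed to come from elsewhere)


def should_automate_light(history: list[Arrival], min_after_dark: int = 14) -> bool:
    """Propose the automation only once the habit looks consistent.

    The rule being captured: the light goes on right after arriving home,
    but only when the arrival is after sunset.
    """
    after_dark = [a for a in history if a.arrived_at > a.sunset_at]
    in_daylight = [a for a in history if a.arrived_at <= a.sunset_at]

    if len(after_dark) < min_after_dark:
        return False  # not enough evidence yet

    dark_rate = sum(a.light_turned_on for a in after_dark) / len(after_dark)
    daylight_rate = (
        sum(a.light_turned_on for a in in_daylight) / len(in_daylight)
        if in_daylight
        else 0.0
    )

    # "Almost always after dark, almost never in daylight" -> suggest automating.
    return dark_rate > 0.9 and daylight_rate < 0.1
```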

Each of these behaviors is a small difference from a default, the kind of thing that we all handle every day. The ability to capture that small variation, personally for you, and apply it correctly is at the heart of the upcoming personalization trend, and it’s going to change how people think about products.

Take a voice assistant like Alexa. So far, a lot of the value in Alexa has been enablement-focused: I bought Alexa and now I can turn my lights on and off with my voice. These enablement features, however, are usually handled through platform APIs that can be used by a wide range of products. Telling Alexa to do it is the same as pushing a button in an app; it’s just a different kind of keyboard.
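
A hypothetical sketch of that point: the app button and the voice command are just two front ends reaching the same platform call, which is why the assistant layer itself is easy to replace. The SmartLightAPI class and its set_power() method are invented for illustration, not a real vendor or Alexa API.

```python
# A hypothetical illustration of the "different kind of keyboard" point: the
# app button and the voice skill are two front ends reaching the same platform
# call. SmartLightAPI and set_power() are invented for this sketch, not a real
# vendor or Alexa API.

class SmartLightAPI:
    """Stand-in for a vendor's lighting platform API."""

    def set_power(self, room: str, on: bool) -> None:
        print(f"{room}: lights {'on' if on else 'off'}")


lights = SmartLightAPI()


def handle_app_button(room: str, on: bool) -> None:
    # The button in the phone app...
    lights.set_power(room, on)


def handle_voice_command(utterance: str) -> None:
    # ...and the voice command end up at the exact same call.
    text = utterance.lower().strip()
    if text.startswith("turn on the "):
        lights.set_power(room=text.removeprefix("turn on the "), on=True)
```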

There is limited stickiness to these products because it’s still very possible to swap Alexa out for Google Home and get the same kind of results.

Products and systems that learn about you will alter this dramatically, and as fundamentally as iPhone + App Store altered the phone market. The connection to the API is completely replaceable, and invisible to the customer. What is unique is the learning that system has developed about me, my habits, and my preferences. I’ve trained it, and in doing so I’ve created a set of differences that are unlikely to be replicable by another product out of the box. Even with an equal capacity to learn, I’m still faced with the daunting task of re-training.

Products that learn will become increasingly valuable in our lives, but they will also become increasingly sticky: leaving them behind will be more and more difficult, because the value that comes from that learning is unique to me and irreplaceable.