I had a Fitbit Flex. I don’t anymore. To track my progress while walking, I had to constantly sync the bracelet to my smartphone to monitor my steps in real time. The syncing didn’t work very well, and there was no way to have the phone’s display persistently show my progress. If I wanted to see how many steps I had taken, I had to sync again every few minutes. At least the Fitbit did its syncing wirelessly over Bluetooth, unlike the Jawbone UP, which required you to physically plug the bracelet into your phone. Though Fitbit and Jawbone have their fans, I thought the Flex was a royal pain to use. It was another device to manage and charge…not to mention the $100 expense.
I recently got a Samsung Galaxy S5, which has a built-in pedometer that appears to offer all of the pedometer functionality the Fitbit had, along with many other side benefits. Notably, I always have my smartphone with me, so I don’t have to think about putting a bracelet on. It tracks my steps in real time, it’s one less device I have to think about or charge, and of course I saved the $100. I’m averaging over 10,000 steps/day this month. I sold my Fitbit on Bookoo.
Just as the wearables revolution is getting underway, one of the leading companies, Fitbit, is already getting sideswiped as smartphones subsume capability that was previously provided through an attached device. Samsung clearly intends to use the Galaxy as its platform for health monitoring: the S Health app that comes with the phone plainly anticipates other devices and modalities. For example, the S5 has a built-in heart rate monitor (nice!). No doubt Samsung’s watches, and whatever else is to come, will provide additional sensors that leverage the intelligence and extensibility of the smartphone as well as its gorgeous display. And less than eight weeks after the Galaxy S5 started shipping, Apple announced similar capability in iOS 8.
Fitbit had better have a response ready—or at least a good pivot in the works. It’s one thing to have a business model selling hardware devices and sensors that are essentially wireless peripherals for smartphones, but what else could be built into the smartphone that would kill off any opportunity for peripheral makers?
When I first heard about cameras in phones, my thought was, “How am I going to get the images off the phone?” I was thinking about the paradigm of a camera, not a wirelessly connected image capture device (which also subsumed the capability of my $400 camcorder). The 16-megapixel camera in the S5 is better than the Canon Sure Shot I used to have, and because it’s on a smartphone, the apps get refreshed and updated constantly…something that’s not very practical with a good old digital camera.
When cameras were first added to smartphones, no one predicted business opportunities like Instagram or Snapchat. When GPS functionality was added to smartphones, no one anticipated applications like Foursquare, Tinder and Uber.
We have seen time and time again that when capability is added to a platform, innovation shows up in unforeseen ways. This was essentially Microsoft’s business model with Windows—provide the platform and let other people add value to it through applications—and the same dynamic is occurring on smartphones. When Windows had 97% market share, every time Microsoft added capability to the OS, it deprived some other company of its livelihood. The same thing happened on the hardware side. When I bought my first PC in 1989, I spent $750 for a 9,600 baud modem as an accessory.
If you are making wearables, you had better make something that can’t be Hoovered into a smartphone, because sooner or later, if it can be, it will be. Fitbit et al. must be focusing on sensors and use cases whose capability cannot be incorporated into the smartphone architecture.
Today phones know where we are and how we’re moving (a radical idea just 10 years ago). In the near future they’ll know everything about the environment we’re in (temperature, humidity, elevation, existence of pollutants, air quality, ozone levels, etc.) as well as everything going on with our body functions (basal temperature, heart rate, EKG, breath analysis to detect disease states, blood tests, etc.). Let’s call it “ubiquitous situational awareness.”
One of my clients, Spec Sensors, is developing low-cost, ultra-low-power sensors for smartphones that detect gases like hydrogen and methane. When we talked to potential investors two summers ago, we got a lot of “I don’t get it” responses. Today the Internet of Things is one of the hottest sectors around. “Why should I build that into the platform?” doesn’t need a direct answer anymore. By adding the capability, at very little marginal cost, the value of the system goes up by a multiple of the cost of the added component. It’s a nice differentiator too.
“Today’s smartphone has been (and will continue to be) taken to a whole new level of performance and form factor thanks to the enabling power of MEMS integrated with sensors,” says Karen Lightman, Executive Director, MEMS Industry Group.
When cameras were added to phones, people already owned cameras and understood what to do with them. Ditto for GPS. But why do people need methane detectors? After the capability is added, someone will create the application that justifies why it was essential.
Today we can sequence the human genome, and the price is falling fast. If it follows Moore’s Law (or some variant thereof), it may soon be cheap enough to do the sequencing on your smartphone. While the proteins in your body are coded by your genome at birth, a significant amount of protein expression is driven by environmental variables (exposure to toxins, viruses, trauma, etc.). Dr. Christopher Wild calls this the “exposome.”
With environmental sensors on your smartphone, you will be able to perpetually monitor and record the environment you are in. The combination of genomic data and environmental data means that the ability to predict your health and disease states, and your susceptibility to various medical conditions, will improve immeasurably. There’s no doubt in my mind that the processor on your smartphone will be the engine that drives this.
When Al Jolson spoke the first dialogue in movie history in The Jazz Singer, he said, “You ain’t heard nothin’ yet.” Folks, you ain’t seen nothin’ yet.
Neil Kane (@neildkane) runs Illinois Partners, which helps companies, universities and investors with innovation strategies and technology commercialization.