But where Shazam could really help Siri’s ears is with HomePod. Apple wants its new home speaker to “reinvent home music,” but if all it does is sound good, that’s hardly revolutionary. If Apple can leverage its Shazam acquisition to build serious smarts into HomePod, it could be a difference-maker. We will already be able to ask Siri to play things like the most popular song of 1986, but Shazam could amplify that knowledge considerably. It would be great to tap your AirPods and ask, “Play the song that goes like this …” or “Play that Ed Sheeran song about Ireland.” Shazam might not be able to do that today, but the groundwork is certainly in place, particularly when paired with Apple’s own AI-driven musical capabilities.
And it could go beyond simple song identification, too. Apple could use Shazam to create personalized playlists right on HomePod, based on your listening habits and tastes. Apple Music already builds mixes that are pretty great, but Apple’s machine learning could use what HomePod hears to assemble customized playlists for each time of day, ones that play only in our homes. That alone could be a reason to spend $350 on a HomePod.
Archive for December 2017
I know a lot of people turn off haptic feedback on their smartphones. That is because, I have now learned, essentially every Android smartphone has absolutely awful haptics. Your $930 Galaxy Note8 has haptic feedback that is, frankly, bad. So does every other Android phone. Yes, the difference is that clear after switching to the iPhone X.
Apple’s Taptic Engine doesn’t just buzz – it clicks, it taps, it knocks. And it can do so with an incredible range of intensity and precision. If I had to draw an analogy, it’s like having used crappy $10 earbuds your entire life and then someone hands you a pair of $300 open-back Sennheisers. You didn’t know your music could sound that much better until your ears heard it for themselves. The same goes for the Taptic Engine: you won’t get it until you’ve used it.