Ideally, this transformation can be trained to recognize a user’s instances of “Hey Siri” in varying environments (e.g., kitchen, car, cafe, etc.) and modes of vocalization (e.g., groggy morning voice, normal voice, raised voice, etc.). Our output is then a low-dimensional representation of speaker information, hence a speaker vector.
On each “Hey Siri”-enabled device, we store a user profile consisting of a collection of speaker vectors. As previously discussed, the profile contains five vectors after the explicit enrollment process. In the Model Comparison stage of Figure 1, we extract a corresponding speaker vector for every incoming test utterance and compute its cosine score (i.e., a length-normalized dot product) against each of the speaker vectors currently in the profile. If the average of these scores is greater than a pre-determined threshold, then the device wakes up and processes the subsequent command. Lastly, as part of the implicit enrollment process, we add the latest accepted speaker vector to the user profile until it contains 40 vectors.
In addition to the speaker vectors, we also store on the phone the “Hey Siri” portion of their corresponding utterance waveforms. When improved transforms are deployed via an over-the-air update, each user profile can then be rebuilt using the stored audio.
Source: Personalized Hey Siri – Apple
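The decision rule described in the excerpt is simple enough to sketch. Below is a minimal illustration of the average-cosine-score acceptance test and the implicit enrollment cap; the threshold value and vector dimensionality are my own illustrative choices, not figures Apple has published.

```python
import numpy as np

THRESHOLD = 0.6       # illustrative; the actual threshold is not public
MAX_PROFILE_SIZE = 40  # implicit enrollment stops growing the profile here

def cosine_score(a, b):
    """Length-normalized dot product between two speaker vectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def accepts(profile, test_vector):
    """Wake up iff the average cosine score against the profile exceeds the threshold."""
    scores = [cosine_score(v, test_vector) for v in profile]
    return float(np.mean(scores)) > THRESHOLD

def maybe_enroll(profile, test_vector):
    """Implicit enrollment: append an accepted vector until the profile holds 40."""
    if len(profile) < MAX_PROFILE_SIZE and accepts(profile, test_vector):
        profile.append(test_vector)
    return profile
```

Starting from the five explicit-enrollment vectors, `maybe_enroll` would grow the profile one accepted utterance at a time, which is why storing the raw “Hey Siri” audio (next paragraph) matters: the whole profile can be re-derived when the transform improves.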
As good as this sounds, though, there are some limits that Google isn’t discussing. The Pixel 2 line might not need dual cameras to do portrait modes, but that also means you aren’t getting optical zoom, a wide-angle lens or other perks that come with dual cams. If you’re too far from a concert stage to get a good shot, it won’t matter how good that one camera sensor might be. And given that the Pixel 2 phones use the same Snapdragon 835 chip as Android phones from earlier in 2017, you probably won’t capture 4K video at 60 frames per second.
There’s also the question of whether or not synthetic camera tests like this tell the whole story. While the original Pixels did end up having excellent cameras in practice, there were still flaws (for example, that lack of optical image stabilization) that didn’t become fully apparent until the public got its hands on the hardware. The DxO score is a good sign, but it’s worth being skeptical about Google’s claims until more people have had a chance to try the Pixel 2’s camera tech for themselves.
Source: Google claims Pixel 2 has the best camera, just like the first Pixel
On paper, the multi-core result of the hexa-core A11 is 50 percent faster than the octa-core Snapdragon 835. As I mentioned above though, Geekbench doesn’t test other parts of the SoC. Things like the DSP, the ISP and any AI-related functions will influence the day-to-day experience of any devices using these processors. However, when it comes to raw CPU speed, the A11 is the clear winner.
This can be a bit hard for Android fans to stomach. So what is the reason? First, we need a bit of a history lesson.
What is different about Apple’s CPU cores?
There are several key things to recognize about Apple’s CPU cores.
First, Apple had a head-start over just about everyone when it comes to 64-bit ARM-based CPUs.
Second, Apple’s SoC efforts are tightly coupled to its handset releases.
Third, Apple’s CPUs are big and in this game, big means expensive.
Fourth, Apple’s CPUs have big caches.
Fifth, and finally, Apple’s plan of making processors with wide pipelines at (initially) lower clock speeds has come to fruition. In very broad terms, SoC makers can either make a CPU core with a narrow pipe, but run that pipe at high clock frequencies; or use a wider pipe, but at lower clock speeds. Like a real world water pipe, you can either pump water at high pressure through a narrower pipe or at lower pressure through a wider pipe. In both cases you can theoretically achieve the same throughput. ARM falls squarely in the narrow pipeline camp, while Apple is in the wider pipeline camp.
Source: Why are Apple’s chips faster than Qualcomm’s? – Gary explains | AndroidAuthority
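The water-pipe analogy reduces to simple arithmetic: peak instruction throughput is roughly issue width times clock frequency. The widths and clocks below are illustrative round numbers of my own, not measured figures for any real A11 or Snapdragon core, but they show how two very different designs can land on the same peak.

```python
def peak_throughput(issue_width, clock_ghz):
    """Peak instructions per second, in billions: width x frequency."""
    return issue_width * clock_ghz

# Narrow pipe at a high clock vs. a wide pipe at a low clock
# (illustrative numbers, not real chip specs).
narrow = peak_throughput(issue_width=3, clock_ghz=2.8)
wide = peak_throughput(issue_width=6, clock_ghz=1.4)

print(narrow, wide)  # the same peak throughput, ~8.4 billion instructions/s
```

In practice a wide core only wins if it can keep its extra execution slots fed, which is where Apple’s large caches (the fourth point above) come in.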
I think we might be seeing another moat built, this time across the fields of Augmented Reality (AR), Machine Vision (MV), and, more generally, Machine Learning (ML).
At last week’s WWDC, Apple introduced ARKit (video here), a programming framework that lets developers build Augmented Reality into their applications. The demos (a minute into the video) are enticing: A child’s bedroom is turned into a “virtual storybook”; an Ikea app lets users place virtual furniture in their physical living room.
As many observers have pointed out, Apple just created the largest installed base of AR-capable devices. There may be more Android devices than iPhones and iPads, but the Android software isn’t coupled to hardware. The wall protecting the massive Android castle is fractured. Naturally, Apple was only too happy to compare the 7% of Android smartphones running the latest OS release to the 86% of iPhones running iOS 10.
Source: Apple Silicon and Machine Learning – Monday Note
Apple Pay has the largest percentage of supporting US merchants with 36% accepting the mobile payment service today, up from 16% last year, research released by Boston Retail Partners (BRP) reveals. Some 22% of retailers are planning to accept Apple Pay in the next 12 months and 11% plan to do so within the next one to three years, while 31% are adopting a “wait and see” approach.
Source: Apple Pay most popular mobile payment service among US retailers, survey finds • NFC World
- 1000 watts on a specialized server
- 100 watts on desktops
- 30 watts on laptops
- 5 watts on tablets
- 1 or 2 watts on a phone
- 100 milliwatts on an embedded system
That’s four orders of magnitude. Modern CPU design is the delicate art of placing an inferno on the head of a pin.
Source: An Inferno on the Head of a Pin
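The “four orders of magnitude” claim is just the ratio of the endpoints of the list above. A quick check (the 1.5 W phone figure splits the quoted “1 or 2 watts”):

```python
import math

# Power budgets from the list above, in watts.
budgets_watts = {
    "specialized server": 1000.0,
    "desktop": 100.0,
    "laptop": 30.0,
    "tablet": 5.0,
    "phone": 1.5,      # "1 or 2 watts"
    "embedded": 0.1,   # 100 milliwatts
}

span = max(budgets_watts.values()) / min(budgets_watts.values())
print(math.log10(span))  # log10(1000 / 0.1) = 4.0, i.e. four orders of magnitude
```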
Between Dec. 13 and Dec. 31, 2016, Apple AirPods accounted for 26 percent of wireless headphone e-commerce revenue, which was more than any competitor. More impressively, the day Apple AirPods were released, Dec. 13, was the largest single day for online headphone spending for all of 2016.
Source: Early Apple AirPods Sales Are Huge Despite Supply Constraints
Also interesting is the boost that Bose and Sony, the leaders in the high end, got; but how much of that bump is due to the iPhone 7?