According to Xconomy, Apple has assembled a small team of developers focused on speech technology and is working on a secret update to Siri.
Siri, the voice-activated personal assistant, has been riddled with issues; its Android counterpart, Google Now, is not only much faster but also much more accurate. Now that the initial hoopla around Siri has died down and analysts and pundits are willing to criticize it, Apple is looking to improve the feature.
Xconomy said it doesn't know exactly what Apple is working on, but the company has gathered some of the biggest names in speech technology and put them in an office outside of Boston. Not only that, but many members of the Boston Apple team previously worked at Voice Signal Technologies, a speech software company that was purchased in 2007 by Nuance (the company that sells its speech-recognition technology to Apple for Siri) for an astounding $293 million. Xconomy wrote about a few of the people who are working for Apple in Boston.
The group includes Gunnar Evermann, who stayed at Nuance for nearly four years before joining Apple in July 2011. Evermann previously worked for Apple in California, but moved back to the Boston area recently, helping to spark the company’s office in this area, sources said. His job is listed as “manager, Siri Speech.”
Evermann is joined by Larry Gillick, who was a vice president of research at Nuance after the VoiceSignal acquisition. More recently, he worked as a consultant and as chief scientist at EnglishCentral, a venture-backed online language instruction startup. Gillick’s job title at Apple is “chief speech scientist, Siri.”
Also on the team is Don McAllaster, listed as a senior research scientist who joined Apple in 2012. McAllaster has a long resume as a top research scientist with companies in the speech field, stretching back to Dragon Systems, where he also worked alongside Gillick.
Xconomy hypothesized that Apple may be preparing to move the tasks performed by Nuance in-house, so that they can be fully controlled by Apple. This makes sense, since Apple loves to run everything on its own terms (see: Apple Maps, Apple moving chip development in-house, etc.).
This also implies that Apple could be working on making Siri work offline. TechnoBuffalo notes that, deep within the innards of iOS 7, developers spotted lines of code that would let iOS 7 devices perform offline dictation. Right now, iOS 7 dictation works just like it did in iOS 6: a Wi-Fi or cellular data connection is required. But with offline dictation, many of Siri's functions could be accessed without a connection, which should appease a lot of Siri haters.