In depth review: Apple's iPhone 4S running iOS 5
Apple has included one application intended to demonstrate what you can do with the new horsepower of the A5 chip: the Siri voice assistant. It offers an even more intuitive interface than multitouch: you just say what you want. Siri replaces the far more limited functionality of the Voice Control feature introduced in iOS 3, and is invoked the same way: hold the Home button or mic/headphone remote button for a second or two and Siri pops up.
Alternatively, you can set Siri to activate when you raise the phone to your mouth, but that didn't always seem to work in testing, even with exaggerated hand motions. Turning the phone off and on again seemed to fix it, at least some of the time. The phone's screen needs to be on (Siri won't turn it on itself), although the phone doesn't need to be unlocked for motion-activated Siri to work (when it works at all).
Siri works without having to unlock the screen, and by default it will activate without needing a passcode, although you can turn that behavior off in Settings. That's something you should do if you don't want people picking up your phone and asking Siri to look up contact information, a function that isn't restricted even when you have a passcode set. You can also dial numbers you look up without entering the passcode, so if you're concerned about unauthorized calls, you'll want to disable Siri's option to bypass the lock screen.
At the same time, however, if you ask to review your email (or search the web!), Siri will ask for your passcode first, explaining that it can't help until you enter it. So be aware that Siri's passcode security filtering is a bit of a mixed bag.
Siri still a work in progress
Apple describes Siri as still being in beta, which means it only supports a few languages and doesn't always provide ideal results. There are two major problems that occur fairly regularly. The first is misrecognizing key prepositions and other minor words that dramatically change the meaning of what you are saying.
For example, ask to perform a conversion and Siri frequently mistakes "in" for "and," providing nonsense results: "what is 60 mph and km/h?" results in a Wolfram Alpha page dutifully responding "60mph | 1km/h." This kind of query shouldn't be too difficult for Siri to correctly figure out from the context, but you can also refine your queries to be harder to mistake by saying something more like: "convert 60mph to kilometers per hour."
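For reference, the conversion Siri should have returned is simple arithmetic; a minimal sketch (the function name here is ours for illustration, not anything Siri or Wolfram Alpha exposes):

```python
# 1 mile is exactly 1.609344 km, so 60 mph is about 96.56 km/h --
# the answer "convert 60 mph to kilometers per hour" should produce.
MILE_IN_KM = 1.609344

def mph_to_kmh(mph: float) -> float:
    """Convert miles per hour to kilometers per hour."""
    return mph * MILE_IN_KM

print(round(mph_to_kmh(60), 2))  # 96.56
```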
It's also possible to tap the words Siri has recognized you as saying and edit the text to match what you meant, although this only works on the most recent command; you can't scroll up and edit earlier commands to resend them, an odd limitation with no obvious reason behind it.
A second way Siri gets confused is when you exploit one of its more powerful abilities: contextual questions. Ask about breakfast restaurants in one place, then ask about another area, and Siri will formulate a follow-up response tied to the first. If you suddenly begin asking something unrelated, however, Siri will keep chatting up Yelp reviews about restaurants until you say "start over," which of course is also the solution to that particular problem.
A very useful new interface for mobile devices
While asking Siri questions is fun, the most powerful thing Siri does is provide hands-free assistance with some of your apps. Being able to pull up Maps directions from a voice request is very useful when you're driving, and quite a lot more useful than having to open Maps, select directions, tap the text box, dictate where you want to go, and then enter it.
The latter is an example of "plain old" voice recognition, a very useful feature that Android and Windows Phone 7 both already offer, but not in the same class as Siri. Prior to Siri, iOS 4 only handled a few Voice Commands and didn't offer any built-in voice dictation feature. To perform dictation, you needed to download an app, and your dictated text only worked within that app. Other mobile platforms already allow you to dictate just as you would enter text, at least in select places.
Siri is unique in that it doesn't just listen for a few commands or just allow you to dictate text anywhere you can type (although it does those things too, depicted above). What Siri does that isn't matched elsewhere is build an intelligent profile about you (including the identity of family members you might want to call, for example), listen for the intent of what you're saying, and allow you to navigate via your voice entirely, providing a new interface that's even more intuitive and natural than multitouch gestures.
Once you begin a task (say, by asking Siri to take a Note, find your location on the Map, or pull up today's scheduled events), you can then jump in with your hands and begin interacting via multitouch: correcting text errors with iOS's touch-to-select features, pinching to zoom your map, or using Siri's voice dictation to flesh out additional details of the Note or text you started. Siri makes a great springboard for beginning work, and for some tasks it can carry out impressively complex operations from simple spoken instructions, giving feedback on the changes it makes (such as shifting a calendar appointment).
Only if Siri doesn't understand what you want does it revert to searching the web for you via your default search engine. You can also ask to "google," "bing" or "yahoo" a certain phrase (such as "Yahoo movie times"). Google hates being used as a verb, but Microsoft would love the opportunity for Bing to enter the general population's vocabulary and Yahoo is likely just as happy to be invited. Siri has no particular preference; it just does what you ask, or uses your default Safari search engine if you don't specify.
There are a few things Siri should know how to do but doesn't: it won't Spotlight search your phone, it can't take pictures from the Camera app ("I'm not much of a photographer," it says), it won't launch a given app ("open Photos" obtains the reply "I'd like to, but I'm not allowed to. Sorry about that"), and there's not yet a public API that allows third party developers to intercept requests that might invoke their apps ("tweet blah blah blah" or "play electro using Pandora").
A springboard for web services
Siri also taps into a wide variety of specialized responses that are more likely to be useful than a generic web search, primarily Yelp reviews, Yahoo Finance & Weather, and the Wolfram Alpha knowledge base. There's lots of potential for expanding upon these, but it's interesting to see Apple continuing to partner with web service providers (as it did with Google on the original iPhone for Maps, videos and search) rather than trying to build its own competing version of everything (as Google has done with Answers, Buzz, Local, Google+ and even Android).
Apple's Siri integration with Yahoo services, Wolfram Alpha (a firm Steve Jobs partnered with over and over throughout his career) and Yelp reflects similar iOS partnerships Apple has recently made with Twitter, or previously with Microsoft on Exchange Server ActiveSync, and which it has attempted to make with Facebook (something that hasn't worked out as smoothly). Going forward, it will be interesting to see what new third party services Apple integrates into iOS and Mac OS X, even as it expands its own web services with iCloud.
However, Siri on iPhone 4S is a first release by Apple, barely out of the gate, and is great at doing what it's set up to do so far, which includes consulting your inbox, notes, calendar and contacts; composing and addressing new texts or emails; creating new Calendar events or Notes; setting Reminders, alarms and timers; checking weather, stocks and the world clock; pulling up maps or directions; searching the web via Google, Bing or Yahoo; and consulting Wolfram Alpha.
Apple has also added an arm of Siri functionality that ties into the brand new Find Friends app, which is not included in iOS 5 but must be downloaded separately. This suggests Apple has plans to extend Siri's core functionality, and will likely invite third parties to tap into the system once it's ready.
Siri also seems to work well even in a fairly noisy location, such as a busy restaurant with plenty of background chatter. Again, it's more likely to get confused by context or mumbling than by imperfect speech recognition. Its ability to pull the important words out of what you say is quite impressive, and it frequently recovers from misspoken commands by anticipating what you meant to say.