As part of my day job at Bloomberg, our team just released a pretty mammoth update to our iPad app. In one monthly release cycle we added pencil support, picture in picture video, iOS multitasking, plus our own in-app split screen multitasking. I've written a fairly techy blog post over at Bloomberg about it, go have a read!
There will be a million of these posts by Monday so I'll keep this brief but not to the point: others have done that already.
- It's a lovely object
- You pick it up the wrong way around exactly 50% of the time
- It feels like magic that it also works as an accelerometer
Incidentally, you charge it via the Lightning port on the front. So it's okay for some but not others, huh Apple?
Well, game. I said brief. I went with Asphalt 8 because everyone else will have gone for Crossy Road. Plus I love a good racing game: years ago my friends bought me a steering wheel for my PlayStation 2, and it wasn't even my PlayStation 2. Control-wise, I assumed I'd be sliding my thumb around on the touchpad bit of the remote to steer - I'd completely forgotten the controller has accelerometers - and so the moment the game asked me to hold the remote sideways to steer was a proper "ooooh" moment. The next surprise for me, and bear in mind I'm not a big console gamer, was that steering via a remote is so much better when the controller isn't also the screen. Time after time I've bought racing games on the iPhone, iPad mini and iPad, and each time forgotten how annoying it is to have steering affect what you can see. While the controller is a bit small and light to really feel like a steering wheel (you think?), it's really quite good for casual gaming. Furthermore, not touching the screen makes the experience so much more like playing on a console than on an iPhone or iPad.
All this contributes to the big thing here: games do not feel like iOS games. This is a big thing.
Y U No Help Me With Music Siri?
I swear the WWDC presentation showed Siri controlling music? Yet "Sorry, I can't help you with that" is all I get every time I ask for Taylor Swift. Or even Ryan Adams. I guess Siri doesn't want to show bias.
Apps. On your TV. There's not a lot to say really - it's all about the individual apps, and this is a review of the Apple TV, not the content. Which brings us to the more important question of:
I was expecting the UK channels to take a few days to make it to the Apple TV, although seeing as someone got an iPlayer app working before the box was released, surely it can't take them too long? Now TV is even more embarrassing - it's listed on Apple UK's site as a partner. Where are my bingeworthy boxsets, Sky? The forums are getting angry, quite rightly since the explanations aren't forthcoming.
In the meantime, you can chuck your whole Mac screen to the TV with the press of a button, so no need to get up from your sofa and smash the thing into a billion pieces just yet.
On-screen you're moving a cursor around with your thumb, and as you'd expect it moves in the direction your thumb goes. But when this means going a long way from the top to bottom of a screen, it actually feels like you're scrolling - and the direction is the opposite of "natural scrolling". It's not really a problem, I just think it's quite interesting. File Under Unexpected UX Consequences.
Unexpected Bonus App
"CATS! Right Meow". Trust me.
Get ready for an awful first-world problem rant. Indulge me.
DPD were scheduled to deliver my Apple TV on launch day, October 30th. I was at work. Apparently to DPD "leave with a neighbour" means "don't try leaving with a neighbour".
And so I elect to "collect from Southwark depot". A tube, then a bus, then a walk to the address given by DPD and I find myself at Royal Mail's delivery office. Right. Not DPD. A helpful Royal Mail employee directs me on my cross-country walk to DPD's office. I say office: a service road to a truck delivery centre and a portacabin. I ask the lone man in the portacabin, who's on the phone, and he gets me to sign a visitor pass and - I shit you not - gives me a high-vis vest and directions to the "customer entrance". Across the truck park where a high-vis must be worn to get to the "customer entrance". At the "customer entrance" I buzz at a door for a few minutes. No one lets me in: I am Jack's unsurprised spleen. I put my head around a wall to find the door doesn't even lead to a building, it is a doorway to more outside, where a queue of confused people are standing in high-vis jackets. They let me through the pointless door. I wait in line and after 20 minutes I receive my parcel. My Uber has already been and gone. Surge pricing begins. I sob.
I don't really: I go out for steak.
It was marginally easier than this.
This is only Apple's fault in the sense they picked DPD as a delivery company, from an admittedly poor lineup. Tip: use Royal Mail. Don't use HDN. Don't use DPD. Screw you, DPD.
Siri was released to great fanfare in 2011 as the exclusive headline feature of a tock-year iPhone - the iPhone 4s (née S). The device itself was a great rev - the first serious camera on an iPhone and a huge processor speed bump meant it was one of the years that iOS really flew. If you're going to be on a two-year iPhone cycle, people in the know will tell you the s models are the ones to get. While they lack the wow factor of a new visible hardware design, they tend to be much better equipped in terms of processing power-to-OS-feature ratio, and the physical designs themselves are tweaked to perfection thanks to the lessons learned from the previous model (think: antennagate, scuffgate, bendgate, kill me now).
That year, Siri was indeed impressive, at least compared to other digital assistants or to people who hadn't used voice recognition since DragonDictate circa '93. The voice recognition was up there with the best people had previously seen (thanks to it basically being the best people had previously seen). Siri managed what seemed like a great leap forward by having the cloud do the heavy lifting: most of the processing is done on remote servers rather than locally, which helps in terms of speed, accuracy, and long-term improvement, which we'll get to in a moment. So Siri was a short-term success, but after a few weeks the initial excitement and intrigue died down and Siri use dwindled to mostly starting timers and reminding us to put the bins out.
This has been the path of digital assistants many times over, but this time, unlike many before, it seemed Siri had done just enough to make sure digital assistants wouldn't go away. By 2014 Google and Microsoft had entered the space (with Google Now and Cortana respectively), and this year has seen news of Facebook's upcoming assistant 'M'.
Apple didn't stand still though, and between 2011 and 2015 Siri has improved considerably. Firstly, all that processing happening on remote servers was improving Siri's recognition and understanding. Secondly, Siri now has access to the current context of the device it's running on, such as the time of day, the apps running and the content on the screen. Thirdly, the iPhone 6s is over ten times faster than the iPhone 4s. Finally, on new hardware Siri is always listening.
Fast-forward four years then, and Siri is clearly the best way to carry out a whole variety of tasks on your iOS device. And so what's next? Well, this autumn Siri is making the jump to the living room with the 4th generation Apple TV. If there's an input mechanism that Siri can beat, it's an on-screen keyboard controlled by a D-pad. On the Apple TV, Apple have allowed Siri to use previous commands as context, as a way to hone a search. Think "Show me all the Woody Allen films. Just the funny ones". Siri on Apple TV is going to be huge.
The next logical step for Siri will be the Mac. OS X Mavericks brought dictation: using the same cloud-based voice recognition but without the full Siri experience, you could now dictate instead of type. Many are expecting Siri to come to the Mac soon, but how will Siri need to adapt on a platform with such a different set of input methods?
I think the answer is this: Siri needs to stop being modal and start multitasking.
Siri on iOS is used in two main situations: when touch is not possible as an input mechanism, and when voice is more efficient, convenient or intuitive. Up until very recently, only the first of these really applied: there were very few cases where it was more efficient to use your voice if you were already holding an unlocked device in your hand. As of iOS 9 though, Siri is context-aware, greatly opening up the times and places voice can be the smarter input choice over touch.
One thing that doesn't seem to have changed though is the way use of Siri interrupts your flow: you stop using touch, you use voice to interact and achieve a goal, then you go back to using touch. In developer-speak, Siri acts modally. Like a settings dialog or an alert view. While Siri is active you are blocked, deliberately, from interacting in any other way with your device. On iOS, currently, this doesn't matter too much: generally you will only be doing one thing at a time anyway. But with multitasking now available on the latest iPads and the iPad Pro launching next month - which really showcases the multitasking use-cases - this modal paradigm will become a sticking point. And when Siri makes it to the Mac, it will be very noticeable - I would go as far as to say unworkable - if Siri blocks the UI and everything stops in its tracks while you talk. The answer is that Siri's interface needs to become modeless so that it can listen to commands while we carry on interacting the way we always do.
Imagine this: you are browsing recipes in Safari and want to save one to your recipes collection. Right now, you can say: "Hey Siri, add this to my recipes note" and the link will be appended to the end of your note entitled Recipes. While this is, let's be honest, pretty impressive, why stop there? Why should you not carry on scrolling through the website while you carry out this task? You can multitask, your touch-input methods can multitask: why not your voice input?
Another example: you're writing in a text editor on your iPad, and you remember something for later: "Hey Siri, remind me to take the recycling out when I leave the house later". But why stop the flow of writing while Siri listens and acts?
One more: first thing in the morning and you want to open a few documents that you're going to be working on, and you want to check your calendar to see what time your first interruption/meeting will be. Two actions that can be carried out with your keyboard and mouse or trackpad, or with voice: "Open the last three documents from yesterday", "What's my schedule like today?". But surely the most efficient path here would be carrying out one with your hand and the other with your voice.
If and when voice input mechanisms and the digital assistants they drive can be always on and modeless like this, we will have input multitasking. Don't underestimate how powerful this will be, and how much it will change the way we interact with our devices.
Hey Siri, can you multitask yet?