5 Lessons From Product For Developers

This October I attended the London JAM Conference 2015 - Sharing the Stories Behind Great Products. I'm primarily a developer, and so I went along hoping that some of the content would be relevant to my role, and the conference did not disappoint. These are my 5 key take-aways from the day.

Note: While the conference had a great line-up of speakers and all were inspiring, each of these lessons is based upon a point or points from a particular speaker. I've developed each of these ideas into my own for this article, but I've credited the original speaker under each.

1. Five Whys

(From Pip Jamieson, The Dots)

The Book of Five Rings is a famous work by kenjutsu practitioner Miyamoto Musashi on business strategy, later found to be applicable to martial arts and sword fighting. Five Whys is the second most famous concept to come from Japan about 5 things.

A technique used by the under-fives for millennia, it was formalised by Toyota founder Sakichi Toyoda (yes, the same one we stole all those other 'lean' ideas from) as a way to get to the root cause of a problem. It's simple: ask "why?" 5 times when questioning a problem. The reason Five Whys is important enough to go on this list is that it's so simple, yet so easy to forget to do. There's a great example here from the guys at Buffer.

As a developer, try applying Five Whys when next tackling a bug: either apply it to the problem you're trying to solve, or to your proposed solution. If applied to the problem, chances are you'll uncover the root issue rather than just the special case your bug happens to be. Applied to your proposed solution, you'll make sure you're actually solving the right problem. It can work in both directions, and it's like rubber-ducking on speed.

Put down that burger, I said 'Whys'!
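
To make the bug-hunting version concrete, here's a hypothetical Five Whys chain written out as comments next to the eventual fix. The scenario, the function and the conclusions are all invented; the shape of the questioning is the point:

    import Foundation

    // Bug (invented for illustration): the app crashes when a user taps "Save"
    // on the profile screen.
    //
    // 1. Why? saveProfile() force-unwrapped avatarURL, which was nil.
    // 2. Why was avatarURL nil? The avatar upload hadn't finished when Save was tapped.
    // 3. Why was Save enabled mid-upload? The button's state isn't tied to upload state.
    // 4. Why isn't it tied? Upload progress only lives in the networking layer.
    // 5. Why? The view model has no concept of in-flight work at all.
    //
    // Root cause: the UI can't see long-running tasks. Guarding the optional (below)
    // stops the crash, but the chain above says the real fix is exposing upload state
    // to the view model and disabling Save until the upload completes.
    func saveProfile(avatarURL: URL?) {
        guard let avatarURL = avatarURL else { return }
        print("Saving profile with avatar at \(avatarURL)")  // persist the profile here
    }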

2. Idea Incubator

(From Pip Jamieson, The Dots)

A good development team can be a great source of product ideas, but these ideas rarely see the light of day. This can be due to prejudice against ideas coming from Dev, or simply because they don't come up at the right time or get heard by the right people. It can be unproductive for a dev team to discuss ideas that aren't in the current sprint or in the backlog, but you don't want to discourage the principle. Idea Incubator is an outlet for this creativity: every two months, organise a formal internal pitch session where anyone can present their idea to Product. Keep it lightning-style: two or three minute pitches, and five minutes for feedback.

The benefits are legion:

  • Everyone is given a chance to contribute to the product, which promotes inclusion.
  • The sprint is not blown off course by discussing ideas that aren't yet relevant.
  • It provides a natural filter for ideas: it might seem like a great idea now, with all this other work to do, but will it still seem like a good idea in a few weeks?
  • Another source of potentially great ideas!

3. Be A Customer

(From James Gill, GoSquared)

When developing or launching a new product, it's very clear that the on-boarding experience needs to be refined to perfection: firstly, to give the best possible first impression to your customers, and secondly, to make sure attrition is kept to a minimum. However, if you work on a well-established or long-running product, or a product people have no choice but to use, this part of your experience can easily be neglected. As a developer you may work in a sandbox that is far from the standard customer experience, or not use the same sign-in process as your customers. Even worse: you might not even realise!

Once a month, sign up or log in to your app or site or service. How was it? Fast? Slow? Confusing? As a developer you may not be able to control all aspects of this process, but focus on the parts you can. Was the keyboard set to the right style? Did the spinners freeze? Look at the code: could it be more efficient? Run faster? Do things in parallel?
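
On the "do things in parallel" point, here's a minimal sketch of the kind of change a sign-in walkthrough tends to prompt. fetchProfile and fetchFeatureFlags are hypothetical stand-ins for two independent requests made during sign-in; the point is simply that neither should have to wait for the other:

    import Foundation

    // Hypothetical stand-ins for two independent requests made during sign-in.
    func fetchProfile(_ completion: @escaping (Data?) -> Void) {
        DispatchQueue.global().async { completion(nil) }   // pretend network call
    }
    func fetchFeatureFlags(_ completion: @escaping (Data?) -> Void) {
        DispatchQueue.global().async { completion(nil) }   // pretend network call
    }

    // Kick both requests off at once rather than chaining them one after the other,
    // and only continue once both have returned.
    func loadInitialData(completion: @escaping () -> Void) {
        let group = DispatchGroup()

        group.enter()
        fetchProfile { _ in group.leave() }

        group.enter()
        fetchFeatureFlags { _ in group.leave() }

        group.notify(queue: .main) { completion() }
    }

Two sequential waits collapse into one - a small change in code, but exactly the kind of thing you only spot by going through the flow as a customer.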

For those aspects you can't directly change - there could be a UX or business reason why the process is a certain way - it's still better to find and discuss them, so everyone understands what decisions have been made and why. Even in the worst case where you might not get a good explanation, you'll at least build up some empathy for your customers!

4. Use Processes, Don't Fight Them

(From Tim Davey, OneFineStay)

There are many ways to upset a developer, and enforcing process where it doesn't belong is right up there with telling them they're using the wrong text editor. When software is developed in a large team or organisation there are places where decisions must be made, or workflows adhered to, in order to keep the team running as a cohesive unit. They might seem arbitrary, but they are there to promote consistency, reduce complexity and therefore improve overall efficiency.

Use process in the areas where, as a team, you have a pre-agreed position on a decision that needs to be made regularly, or where there is a rule-of-thumb and it can usually be followed. By using process in these places, you reduce decision fatigue in your team and improve the quality of the decisions that they must make.

Culture is the way a collective of people can make individual decisions. By agreeing on and abiding by certain processes, you can create a team culture that encourages free-thinking but is fast, efficient - and most importantly safe - as the decisions are made within a consistent framework that you all fundamentally subscribe to.

5. Never Don't Not Use Data

(From Graham Paterson, Intuit)

The benefits of using data to assess your product are clear: it can show you why your product is succeeding or failing. What isn't so clear is the danger of not using data. If you don't use data, your product may be succeeding - or failing - and you will have no idea why.

Without data:

  • you won't know why you have succeeded
  • you won't know why you have failed
  • you will miss illogical cases

This is why data-driven analytics aren't just a luxury: they are a necessity. Of course, data is never an excuse to not listen to customers too: the two need to go hand in hand to keep your product on track.
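
Even a tiny amount of instrumentation makes those "why" questions answerable. Here's a deliberately minimal sketch - AnalyticsEvent and Analytics are hypothetical placeholders rather than any real SDK - where the only idea is that every key step of a funnel records something you can query later:

    import Foundation

    // A hypothetical, minimal event logger. In a real app these events would be
    // queued and sent to whatever analytics backend you use.
    struct AnalyticsEvent: Codable {
        let name: String
        let properties: [String: String]
        let timestamp: Date
    }

    enum Analytics {
        static func track(_ name: String, _ properties: [String: String] = [:]) {
            let event = AnalyticsEvent(name: name, properties: properties, timestamp: Date())
            if let data = try? JSONEncoder().encode(event),
               let json = String(data: data, encoding: .utf8) {
                print(json)   // placeholder for "send to backend"
            }
        }
    }

    // Usage: instrument the sign-up funnel so drop-off is visible.
    // Analytics.track("signup_started")
    // Analytics.track("signup_completed", ["method": "email"])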

Notes On Apple TV

There will be a million of these posts by Monday so I'll keep this brief but not to the point: others have done that already.

That Remote

  1. It's a lovely object
  2. You pick it up the wrong way around exactly 50% of the time
  3. It feels like magic that it also works as an accelerometer

Incidentally, you charge it via the Lightning port on the front. So it's okay for some but not others, huh Apple?

Games

Well, game. I said brief. I went with Asphalt 8 because everyone else will have gone for Crossy Road. Plus I love a good racing game: years ago my friends bought me a steering wheel for my Playstation 2, and it wasn't even my Playstation 2. Control-wise, I assumed I'd be sliding my thumb around on the touchpad bit of the remote to steer - I'd completely forgotten the controller has accelerometers - and so the moment the game asked me to hold the remote sideways to steer was a proper "ooooh" moment. The next surprise for me, and bear in mind I'm not a big console gamer, was that steering via a remote is so much better when the controller isn't also the screen. Time after time I've bought racing games on the iPhone, iPad mini and iPad and each time forgotten how annoying it is to have steering affect what you can see. While the controller is a bit small and light to really feel like a steering wheel (you think?), it's really quite good for casual gaming. Furthermore, not touching the screen makes the experience so much more like playing on a console than an iPhone or iPad.

All this contributes to the big thing here: games do not feel like iOS games. This is a big thing.

Y U No Help Me With Music Siri?

I swear the WWDC presentation showed Siri controlling music? Yet "Sorry, I can't help you with that" is all I get every time I ask for Taylor Swift. Or even Ryan Adams. I guess Siri doesn't want to show bias.

Apps

Apps. On your TV. There's not a lot to say really - it's all about the individual apps, and this is a review of the Apple TV, not the content. More important, then, is the question of:

Missing Apps

I was expecting the UK channels to take a few days to make it to the Apple TV, although seeing as someone got an iPlayer app working before the box was released, surely it can't take them too long? Now TV is even more embarrassing - it's listed on Apple UK's site as a partner. Where are my bingeworthy boxsets, Sky? The forums are getting angry, quite rightly, since explanations aren't forthcoming.

In the meantime, you can chuck your whole Mac screen to the TV with the press of a button, so no need to get up from your sofa and smash the thing into a billion pieces just yet.

Unnatural Scrolling

On-screen you're moving a cursor around with your thumb, and as you'd expect it moves in the direction your thumb goes. But when this means going a long way from the top to bottom of a screen, it actually feels like you're scrolling - and the direction is the opposite of "natural scrolling". It's not really a problem, I just think it's quite interesting. File Under Unexpected UX Consequences.

Unexpected Bonus App

"CATS! Right Meow". Trust me.

Addendum

Get ready for an awful first-world problem rant. Indulge me.

Delivery

DPD were scheduled to deliver my Apple TV on launch day, October 30th. I was at work. Apparently to DPD "leave with a neighbour" means "don't try leaving with a neighbour".


And so I elect to "collect from Southwark depot". A tube, then a bus, then a walk to the address given by DPD and I find myself at Royal Mail's delivery office. Right. Not DPD. A helpful Royal Mail employee directs me on my cross-country walk to DPD's office. I say office: a service road to a truck delivery centre and a portacabin. I ask the lone man in the portacabin, who's on the phone, and he gets me to sign a visitor pass and - I shit you not - gives me a high-vis vest and directions to the "customer entrance". Across the truck park, where a high-vis must be worn, to get to the "customer entrance". At the "customer entrance" I buzz at a door for a few minutes. No one lets me in: I am Jack's unsurprised spleen. I put my head around a wall to find the door doesn't even lead to a building; it is a doorway to more outside, where a queue of confused people are standing in high-vis jackets. They let me through the pointless door. I wait in line and after 20 minutes I receive my parcel. My Uber has already been and gone. Surge pricing begins. I sob.

I don't really: I go out for steak.

It was marginally easier than this.

This is only Apple's fault in the sense they picked DPD as a delivery company, from an admittedly poor lineup. Tip: use Royal Mail. Don't use HDN. Don't use DPD. Screw you, DPD.

Siri's Next Trick Needs To Be Multitasking

Siri was released to great fanfare in 2011 as the exclusive headline feature of a 'tock'-year iPhone - the iPhone 4s (née 4S). The device itself was a great rev - the first serious camera on an iPhone and a huge processor speed bump meant it was one of the years that iOS really flew. If you're going to be on a two-year iPhone cycle, people in the know will tell you the s models are the ones to get. While they lack the wow factor of a new visible hardware design, they tend to be much better equipped in terms of processing power-to-OS-feature ratio, and the physical designs themselves are tweaked to perfection thanks to the lessons learned from the previous model (think: antennagate, scuffgate, bendgate, kill me now).

That year, Siri was indeed impressive, at least compared to other digital assistants or to people who hadn't used voice recognition since DragonDictate circa '93. The voice recognition was up there with the best people had previously seen (thanks to it basically being the best people had previously seen). Siri managed what seemed like a great leap forward by having the cloud do the heavy lifting: most of the processing is done on remote servers rather than locally, which helps in terms of speed, accuracy, and long-term improvement, which we'll get to in a moment. So Siri was a short-term success, but after a few weeks the initial excitement and intrigue died down and Siri use settled down to mostly starting timers and reminding us to put the bins out.

This has been the path of digital assistants many times over, but this time, unlike many before, it seemed Siri had done just enough to make sure that digital assistants wouldn't go away. By 2014 Google and Microsoft had entered the space (with Google Now and Cortana respectively), and this year has seen news of Facebook's upcoming assistant 'M'.

Apple didn't stand still though, and between 2011 and 2015 Siri has improved considerably. Firstly, all that processing happening on remote servers was improving Siri's recognition and understanding. Secondly, Siri now has access to the current context of the device it's running on, such as the time of day, the apps running and the content on the screen. Thirdly, the iPhone 6s is over ten times faster than the iPhone 4s. Finally, on new hardware Siri is always listening.

Fast-forward four years then, and Siri is clearly the best way to carry out a whole variety of tasks on your iOS device. And so what's next? Well, this autumn Siri is making the jump to the living room with the 4th generation Apple TV. If there's an input mechanism that Siri can beat, it's an on-screen keyboard controlled by a D-pad. On the Apple TV, Apple have allowed Siri to use previous commands as context, as a way to hone a search. Think "Show me all the Woody Allen films. Just the funny ones". Siri on Apple TV is going to be huge.

The next logical step for Siri will be the Mac. OS X Mountain Lion brought dictation: using the same cloud-based voice recognition but without the full Siri experience, you could now dictate instead of type. Many are expecting Siri to come to the Mac platform soon, but how will Siri need to adapt on a platform with such a different set of input methods?

I think the answer is this: Siri needs to stop being modal and start multitasking.

Siri on iOS is used in two main situations: when touch is not possible as an input mechanism, and when voice is more efficient, convenient or intuitive. Up until very recently, only the first of these really applied: there were very few cases where it was more efficient to use your voice if you were already holding an unlocked device in your hand. As of iOS 9 though, Siri is context-aware, greatly opening up the times and places voice can be the smarter input choice over touch.

One thing that doesn't seem to have changed though is the way use of Siri interrupts your flow: you stop using touch, you use voice to interact and achieve a goal, then you go back to using touch. In developer-speak, Siri acts modally, like a settings dialog or an alert view. While Siri is active you are blocked, deliberately, from interacting in any other way with your device. On iOS, currently, this doesn't matter too much: generally you will only be doing one thing at a time anyway. But with multitasking now available on the latest iPads and the iPad Pro launching next month - which really showcases the multitasking use-cases - this modal paradigm will become a sticking point. And when Siri makes it to the Mac, it will be very noticeable - I would go as far as to say unworkable - if Siri blocks the UI and everything stops in its tracks while you talk. The answer is that Siri's interface needs to become modeless, so that it can listen to commands as we carry on interacting the way we always do.

Imagine this: you are browsing recipes in Safari and want to save one to your recipes collection. Right now, you can say: "Hey Siri, add this to my recipes note" and the link will be appended to the end of your note entitled Recipes. While this is, let's be honest, pretty impressive, why stop there? Why should you not carry on scrolling through the website while you carry out this task? You can multitask, your touch-input methods can multitask: why not your voice input?

Another example: you're writing in a text editor on your iPad, and you remember something for later: "Hey Siri, remind me to take the recycling out when I leave the house later". But why stop the flow of writing while Siri listens and acts?

One more: it's first thing in the morning, you want to open a few documents that you're going to be working on, and you want to check your calendar to see what time your first interruption/meeting will be. Two actions that can be carried out with your keyboard and mouse or trackpad, or with voice: "Open the last three documents from yesterday", "What's my schedule like today?". But surely the most efficient path here would be carrying out one with your hand and the other with your voice.

If and when voice input mechanisms and the digital assistants they drive can be always on and modeless like this, we will have input multitasking. Don't underestimate how powerful this will be, and how much it will change the way we interact with our devices.

Hey Siri, can you multitask yet?