Why a physical interface is better than a touchscreen

Ben King

Industrial Designer

Ben is an Industrial Designer working in the Professional Category at Design Partners. He specialises in physical interaction design, connected products, design strategy and invention. He loves helping companies find the right product to develop. His work has been named one of the best inventions of 2015 by Time, has been awarded patents, has raised $250,000 in crowdfunding and has won multiple awards.


For years computer interaction has been getting less physical as touchscreens have prevailed, and we've been missing a trick. Physical interaction triggers subconscious interfaces that let us do things easily without thinking about them. Touchscreens, by contrast, require a lot of brainpower because of their lack of tactility.

Blame Steve Jobs

Since Steve Jobs launched the iPhone ten years ago, our fingertips have spent their days pinching and swiping glass screens. But for me, it's a frustrating experience because I never feel in control. If I have an important task to do – like filling in my tax return or emailing my boss – I just don't want to be using an iPad or my phone. I want to be at my computer with a mouse in one hand and a keyboard at my fingertips.

The computer mouse is an icon of interaction design. The G502 was the best-selling gaming mouse in the world from 2015 to 2017.

The problem is that touchscreens are boring to touch. We're human beings, after all, and have been using our hands to explore the world from the moment we were born. Our hands are full of sensors, all of which the iPhone ignores because it relies primarily on our vision. Around 30 per cent of our brain is dedicated to vision and only 8 per cent to touch, which is why touchscreens require so much mental effort.

At Design Partners we’ve been working on tools that people can learn to use automatically with minimal mental effort, tools that tap into our subconscious interfaces. This is the instinctive behaviour we employ every day: when we’re cleaning our teeth, eating cereal with a spoon, or doing what I’m doing right now as I write this, tapping on a keyboard and clicking on a mouse. Our brains learn how to perform these tasks automatically.

Designing for our senses

Designing objects that maximise our senses is key to creating these subconscious interfaces. While amazing things are happening in voice and audio interfaces, I would argue that touch, of all the senses, is particularly important because humans have evolved to use tools with our hands.

Think about all the information your hands receive when you write with a pen, for example: the smoothness of the paper, how the ink is flowing, how hard you are pressing, and how well the pen moves across the page as you write.

A great tool will provide our hands with loads of tactile information in a way that a touchscreen can’t, which means the caveman with a stone axe is engaging more effectively with his senses than a person with an iPhone.

Blind Navigation

In our work with Logitech building the Harmony remote range, we had a central idea of 'Blind Navigation'. Our designs have carefully considered button layouts, shapes and feedback so people can instinctively feel their way around the controllers without interrupting their focus, whether that's on music, a movie or a chat with friends.

This ambition to create subconscious interfaces is central to a lot of our work. The surgeon's tools we are building strive to keep the focus on the task, not the tool. Equally, the tools we have built for 3Dconnexion allow engineers to physically manipulate their way around 3D environments, keeping their focus on what they are drawing.

In developing SpaceMouse with 3Dconnexion, we defined a new interaction for navigating digital 3D environments.

Touchscreen hangover

Microsoft was the first to wake up to the importance of the physical side of computing and shake off the touchscreen hangover. Its Surface product launched with a pen – a tool humans have been using for thousands of years – as part of the kit. It works alongside a physical keyboard and mouse, and for maximum hands-on computing it comes with a screen puck that you can use to switch tools while you're working.

The Apple Pencil launched shortly after Microsoft's pen, confirming that physicality is back and will only become more important as augmented reality gains traction. AR removes the screen altogether, but the first wave of platforms has yet to come up with an interface that maximises the potential of the technology.

A great natural interface should be physical, and it can be as simple as a mouse. Suffice it to say that it's something Design Partners is working on as we explore the idea that the future of computing will be as much about feeling digital content as looking at it.

See how Design Partners maximises the sense of touch to develop a way for engineers to physically manipulate their way around 3D environments.
