I was recently shown a blog post by Golden Krishna called “The best interface is no interface at all”. Reading it made me cringe. What’s worse, a UX designer shared it with me. IMHO, Krishna couldn’t be more wrong. And it’s all about perspective.
The History of the UI
As Krishna points out in his article, a fateful relationship between Apple and Xerox introduced the world to the graphical user interface. Gone were the days of command-line keystrokes, odd command abbreviations, and white text on a black background. Sure, your favorite game was perhaps less popular, but all in all, end users welcomed the graphical interface with open arms.
Afterwards, we got the Palm Pilot with its stylus (no doubt a derivative of “stylish” because, let’s be honest, a stylus rocks) and then the iPhone and the plethora of touch-based devices that have followed. All of these have relied on the user interface.
The State of the UI Today
Deny it all you want, but the user interface has irreversibly changed the world of computing as we know it. Analysts believe that mobile web browsing will surpass desktop browsing in 2014. It’s hard to browse an interface-less web on an interface-less device. Apps such as FiftyThree’s Paper have not only seen millions of downloads, but won the respect of some of the most revered design companies around. It’s semi-impossible to jot down ideas, create vibrant drawings, or otherwise interact with your touch-based device without a user interface.
Principle 1: Interfaces are Useful to Enhance Natural Processes
Krishna has some valid points, and I’m not here to dispute the fact that in some cases, the user interface is overused. No, I probably don’t need to check my Twitter feed while I’m filling my water glass from the fridge, but wouldn’t I want to know if I’m running low on condiments? My fridge is connected to the internet, so it has access to my calendar and sees that I am hosting a big BBQ this weekend. You can’t have hot dogs without ketchup. It’s a fact**.
I can already walk up to my car, unlock the doors, and drive away. And I don’t need my smartphone to do it. So the argument isn’t that user interfaces are unnecessary, it’s that they aren’t necessary for everything. Again, do I need my Twitter feed while I’m driving? No. Do I want my car to check traffic reports between my current location and destination and show me any roadblocks or closures? You betcha. Do I want to be shown that the pressure in the back left tire is a little low, along with the location of the nearest gas station or auto shop? Absolutely. The interface is not the problem. In these cases, the interface augments the user’s experience. It should give the user enough information to make a decision, and then get out of the way.

Krishna cites Square’s Auto Tab feature as a prime example of interfaceless superiority (never mind that you need to unlock your phone, find the app, tap its icon, wait for it to load, search for the business by name or location, and then enable Auto Tab for that individual location). But Auto Tab has some pretty severe drawbacks: it can’t be used at mobile merchants, and it won’t work if two Auto Tab-supporting merchants are located too close to each other. Why? Because Square relies on the GPS data from your phone to figure out where you are. It can’t pin down a merchant that keeps moving around (sorry, food trucks), and the aGPS chips in phones aren’t accurate enough to distinguish between two merchants right next to each other. Sure, Google introduced NFC and Google Wallet, but Wallet is only supported in the US, and fewer than 5% of smartphones on the market even have NFC hardware built in. Bluetooth 4.0 is a much more promising technology than NFC, but even so, no one wants their phone broadcasting their personal information at all times in the hope that their car or TV might pick it up.
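To make that ambiguity concrete, here’s a minimal sketch (in Python, with made-up merchant data; Square’s actual implementation is not public) of location-based merchant matching. The idea: pick the merchant nearest the GPS fix, but if the runner-up sits inside the fix’s error radius, there is no reliable answer and the system has to give up or ask the user.

```python
import math

def nearest_merchant(fix, merchants, accuracy_m):
    """Pick the merchant closest to a GPS fix, or None if ambiguous.

    fix: (lat, lon) in degrees; merchants: dicts with a "loc" key;
    accuracy_m: the fix's horizontal error radius in meters.
    """
    def dist(a, b):
        # Equirectangular approximation: fine at storefront scale.
        r = 6371000  # Earth radius, meters
        x = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
        y = math.radians(b[0] - a[0])
        return r * math.hypot(x, y)

    ranked = sorted(merchants, key=lambda m: dist(fix, m["loc"]))
    if not ranked:
        return None
    # If another merchant is within the error radius of the winner,
    # the phone can't tell which storefront you're actually standing in.
    if len(ranked) > 1 and dist(ranked[0]["loc"], ranked[1]["loc"]) < accuracy_m:
        return None
    return ranked[0]
```

With a typical aGPS accuracy of tens of meters, two adjacent storefronts fall inside one error radius and the function correctly refuses to guess, which is exactly the failure mode described above.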
There are security concerns beyond the scope of this post to consider, which is why BLE support is handled on an app-by-app basis on iOS. And we can argue about skeuomorphism versus flat design another time. It’s been done. Which brings me to my next point.
Principle 2: Leverage Computers to Cater to the User
There’s a reason user experience and user interface are two separate disciplines at the studio where I work. The two are not the same, but they should complement each other. As we move toward Web 3.0 and the Internet of Things, the user interface becomes more important, not less. Krishna claims that technology should be omniscient:
Your TV turns on to the channel you want to watch
Well, maybe if you’re this guy, but the rest of us probably need to tell our TVs what we’re in the mood for. Our TVs can get smart enough to recommend what we might like based on the time of day, the day of the week, or even who is sitting in front of them, but believing we can live in a world where we just walk around and things get bought, music plays, and TV channels change is a bit unrealistic. So saying that “No UI is about machines helping us” is actually not true. UIs are necessary so we can tell computers how best to help us. Interface for the sake of interface isn’t the answer. Interface for the sake of the user is. Interface for the sake of augmenting the user experience is.
Principle 3: Create a system that adapts for people
I’m actually going to keep this principle titled the same as it originally was, because I think it’s very true. And I also believe that good UI does this. One of the arguments for skeuomorphic design back in 2007, when the first iPhone was released, was that design modeled on the real world made it easier for users to adapt to this new kind of technology. My notepad looks like a notepad? Cool! I know how to use a notepad. My iBooks sit on a bookshelf? That makes me feel comfortable, because I’m used to books on a bookshelf. Sure, skeuomorphic design can be and was taken too far, but it served its purpose for the time. As users became more comfortable with the technology, skeuomorphism lost its relevance in software interface design.

Apple totally redesigned iOS with the launch of iOS 7. They probably spent millions of dollars in design and engineering hours to bring it to market. And sure, some people may have gotten a little dizzy, but they didn’t have to start all over again learning how to use their iPhones. So saying that user interfaces need full redesigns, and that those redesigns force users to relearn everything, just isn’t true. Apple has done a fantastic job of building and maintaining its Human Interface Guidelines, which tell both designers and developers how to use the toolsets of the iOS SDK and platform to create beautiful, consistent user experiences across multiple applications.
Krishna cites Trunk Club as an example of how the no-UI concept works. And he’s right, it is a fashion startup. I’d even further agree that if you’re a startup, thinking of yourself as a company that provides a service, rather than a company that builds an app, usually leads to more successful results. But does that mean Trunk Club isn’t a software company, or that it isn’t using software to enhance the lives of its users? Check out their website. It has a sleek, flat design in line with many startups’ websites. Download their app and you can browse featured trunks. Trunk Club undoubtedly uses this information to curate the kinds of clothes they send in your next trunk. And this isn’t new. Plenty of other subscription-based startups that wouldn’t call themselves “software companies” are using technology, along with user-facing interfaces, to learn more about individual users and tailor their services to those users’ needs. Just check out NatureBox or Fancy. Fancy is a great example of using a simple user interface to match products to a user’s preferences: you scroll through a list of pictures of items and fancy the ones you like. Done.
Suffice it to say, I don’t think all interfaces are good all of the time. But I do think that interfaces, used properly, can drive the user experience and help people interact with technology that augments their daily lives. I just don’t like all-or-nothing approaches to topics like this, and I think an article about bad user experience design should make us value good user experience design more, not herald the end of all interfaces.
** I actually eat my hot dogs plain.