The blurred lines between the iPad and Mac

When Apple unveiled their sneak peek project to run iOS apps on the Mac, one of the main sources of excitement was how this might affect hardware in the future. People are hypothesising that touch on the Mac is inevitable, that mouse pointer support on the iPad isn’t far behind, and that one day iOS may run on a device with a laptop form factor. But is that really the case?


Those shouting that touch will one day be added to the Mac because it now runs iOS apps should consider that the entire Mac user interface has been designed and refined over decades to work with a mouse pointer. Additionally, when an iOS app is ported to the Mac, its UI elements are converted into native macOS UI elements, directly opposing any intention of touch input.

If there’s one thing Apple is good at, it’s building software with a narrative arc spanning years, and cutting touch out of iOS apps on the Mac is a pretty clear sign of their direction here.

The other reason not to add touch to the Mac: we already have pretty great hardware for touch input; it’s called the iPad.

One of my favourite analogies about Apple products is by Phil Schiller, who explains that the job of the iPhone is to be so good that it challenges why you need an iPad, and that the iPad is there to challenge why you need a Mac.

Once Apple’s ‘iOS apps on the Mac’ project is ready for primetime and apps are able to interchange between using UIKit and AppKit, could we not be in a future where any app can run on macOS or iOS, and adapt to each interface as necessary? Powerful Mac apps could run natively on iOS, the home of the touch interface, whilst also running natively on macOS, the home of the keyboard and pointer interface. We could stop calling them ‘iOS apps on the Mac’ and ‘Mac apps made for iOS’, and just call them ‘apps’; once these two frameworks are interchangeable, that’s all they should be.

Granted, iOS may need to gain new UI features, such as the menubar and windowing, to achieve parity with what the Mac version of an app can offer. This may even result in parts of iOS’s aesthetic becoming more like macOS’s, such as the use of window shadows and titlebars, but these aren’t unreasonable or unfeasible to design and adapt into a touch-based paradigm, nor does it mean that these two interfaces are merging.

So, when developing an app on these interchangeable frameworks, it is the job of the iPad to be the epitome of what can be done with a touch only interface. Evolve the iPad any further by adding keyboards and pointers and you end up with the Mac’s interface.

And of course, it is the job of the Mac to be the epitome of a keyboard and pointer interface. Add touch and you end up with an iPad. 

These two interaction paradigms need to coexist. To blend them together would dilute the strength of each experience. 

But why should they coexist? So that the user can choose which paradigm they prefer. The Mac is the iPad with a keyboard and pointer that we want, and the iPad is the Mac with touch that we want. The line between touch and pointer is very much not blurred. The hardware for the future of computing is already here, and now it’s up to the software.


This section is for all of those ‘what about…’ points:

 

…the fact that using an iPad is a vastly different experience. iOS and macOS are such different beasts. 

Right now they are; however, macOS has learnt from iOS over the years, and will no doubt continue to do so. The system will continue to be simplified and locked down like iOS, adopting its terminology, cues, and usability wins to ultimately make macOS feel like the grown-up, ‘pro’ version of iOS people are craving today. Ultimately, you want to be able to switch between iPads and Macs and understand that they are part of the same computing family, yet one of them is designed to be touched, and the other, clicked.

 

…the iPad with Smart Keyboard. Does that not contradict these two well-defined hardware groups and interaction paradigms? 

Not really. In my eyes, the Smart Keyboard for iPad is just a tool to allow for quicker typing, not a paradigm shifter. Its destiny isn’t to become the bottom half of a MacBook, where there is also a trackpad. If a time comes when that does happen, you really would want to switch the interface to that of an OS designed for pointers. macOS, perhaps.

 

…some kind of hybrid OS, where UI elements change size based on whether there is touch input or not

Look at your iPad. Now look at your Mac. Those interfaces look different for a reason; they are making the most of the space available based on how you interact with them. A hybrid UI is jargon for a compromised UI; a game of tradeoffs to decide which features to sacrifice to make touch work, and where usability should suffer so that pointers make sense. 

I’m 100% supportive of a hybrid OS that could literally switch between iOS and macOS interfaces as the hardware changes. For example, if you did attach an iPad to the base of a MacBook, it could indeed switch from running iOS to macOS, and you know what, with these interchangeable frameworks, that might just be possible (and I would love it).

 

…a future where there is no iOS and macOS, but one ‘Apple OS’

Feasible, and the advantages are plain to see. iOS and macOS already share a lot of underlying code, and as iOS’s capabilities expand, the risk of reimplementing what macOS already has grows. So, share the base system, but still let UIKit and AppKit do their thing to deliver the right interface for the hardware.

I suppose this could also be called a hybrid OS, however the key difference here is that there isn’t a hybrid interface. Here, the interface is still unique to whether there is a touchscreen or pointer.


Starting at StarLeaf

I’m thrilled to be starting work as a UX Designer at StarLeaf next month. StarLeaf brings people together through the power of messaging, meetings, and calling, and I can’t wait to be a part of the team to help create industry-leading communication products and experiences.

Designing Friction For A Better User Experience

Allegedly, Facebook did some experimenting on a security checkup process, in which examining the user’s privacy and security settings took only a few milliseconds and so wasn’t perceived as thorough enough. To improve the perception, Facebook added some delay, along with a fake progress bar, so that users got a better sense of the thoroughness of the process.

We naturally design to reduce friction, yet sometimes friction is needed to actually enhance the user experience. This article by Zoltan Kollin provides a wonderfully comprehensive overview, with examples, of where adding delays and additional steps is a desirable quality within a product.

Halide: How to Design for iPhone X (without an iPhone X)

Fascinating article from the developers of the highly acclaimed Halide camera app for iOS on how they redesigned the app for the iPhone X before it was even announced.

When it comes to reading, most of us read from left to right, but as humans we reach things from the bottom up.

If you design with this in mind, it’s called ‘Reachable UI’.

This is a way of thinking that more designers need to seriously consider; as devices get taller, interactions need to be increasingly accessible from the bottom of the screen.

I found it was quite difficult to figure out what was ergonomically sound without an actual device to test on.

Then, Ben built an iPhone X.

I love this. Since the app was being designed before the iPhone X had been revealed, let alone shipped, it was absolutely necessary to get a feel for its proportions.

Buttons that require a tap were put in the area that was best for interacting

The bottom quarter of the screen. Makes sense.

We adjusted these to fit the ergonomics of the new device; for exposure adjustment, we ensured you could compensate for at least 5 EV (exposure values) with your thumb, giving you great exposure adjustment without requiring serious finger gymnastics.

The importance of ergonomics within an app’s design cannot be overstated. Not only does this make the UI more functional, but to the user the entire app feels like a better thought out and more cohesive experience – not a battle against the screen to access functions.

In the case of Halide, buttons that require taps are in the bottom quarter of the screen, and functions that can be controlled with less accuracy such as a swipe, in the prime space where thumbs can pivot yet don’t need to reach the opposite side of the device.

Testing on a physical mockup proved valuable, speeding up the learning process in-house rather than after the app had shipped, and leading to a better first experience for users.

Website redevelopment of gunatitjyot.org

gunatitjyot.org is the website of my religious organisation, for which I lead technical development with the help of a team of volunteers. The website serves an audience around the world wanting to keep up to date with the organisation’s latest activities, events, and spiritual material, as well as new visitors coming to learn about the religion.

In late 2017, the team, which I led, began to evaluate how the website could provide a better user experience for visitors. This included the kinds of content on offer, how they were organised and structured on the site, and the visual design of the website.
