MacCritic


Sunday, April 20, 2014

Skeuomorphism Revisited

I was highly critical of iOS 7's new UI when it came out: I hated on the stark white look, trashed the sub-minimalist approach, belittled the conflicting design principles, and called it downright Microsoftian.

I am simply a Mac lover and an iOS lover and I liked the fact that iOS had a similar look and feel to OS X. iOS was originally marketed as basically "OS X on a phone" where you could view "the real internet" (instead of the gimped, mobile versions of sites that other phones were limited to). It really lived up to this feeling.

I also loved many of the skeuomorphic apps from Apple and third-party devs. Sure there were some horrific atrocities like the old Game Center, but looking back on it, I think for the most part skeuomorphic design is superior.

There are two primary advantages of skeuomorphism that I have not seen discussed anywhere else, and I wonder whether Apple even considered them in the absence of Steve Jobs:

(1)

The first is that it makes each app feel much more distinct.

An example: one of my favorite third-party apps, Grafio, had this beautiful design with wood textures. When I was in Grafio, I knew I was in Grafio. But the new version of Grafio removed all that nice design and replaced it with stark black icon bars and thin little 'minimalistic' icons. Now, when I'm switching between apps, iPhoto and Grafio look very similar, and I just don't feel a distinct difference.

By giving a photo-realistic look to textures, tools, and buttons, skeuomorphism lets app designers take full advantage of the beautiful Retina displays we now have and make their apps feel totally distinct. Yet so many apps that previously had such interfaces have ditched them and all gone to the same stark look. The more I use all these apps, the more I miss the old distinctness. When I go back to my iPad (which shall always remain on iOS 6, sorry), suddenly each app has that distinct feeling again; each one is its own little work of art :D

Sure, there is nothing in iOS preventing custom UI; but for some reason many developers ditched their custom UIs anyway. Perhaps that's because it now seems easier to make your app look different from a system app while still using the canned UI of iOS 7, since you can customize the color scheme more than you could with the canned UI of iOS 6. However, there are only so many color schemes that look good, and many third-party apps gravitate toward the same ones.

The only thing that sets them apart from each other now is the use of different icons, but the icons are so thin and slight that, again, there just isn't much distinctness between them.

----

(2)

The second is that skeuomorphism fools the brain. It gives things a feeling of physicality and playfulness. It subconsciously makes you want to touch things to see what their texture feels like. It naturally inspires curiosity and inspection.

Like it or not, we do still live in the physical world. Our brains evolved over millions of years to distinguish things from each other by texture and feel. More neurons in the brain are dedicated to the nerves in the fingertips and to the visual system than to almost anything else. Two of the chief things that make us human are our superior color vision and extreme manual dexterity.

Vision and touch are in fact tied together: "eye-hand coordination" goes far beyond just being able to coordinate action. It also entails Hebbian adaptation: the ability to "feel" a touch sensation *before* you have actually touched something. Seeing a texture anticipates feeling it; this predictive ability of the imagination and subconscious is evolution's solution to the problem of lag: by "feeling" something before it is touched, you can avoid touching something bad before it's too late. (Think of a hot stove.)

All sensations of texture take place fully in the mind. Sensory data from the fingers (touch) can be one source of what causes the mind to perceive a texture, but sight data from the eyes can also trigger it, as can memories. Just as you can imagine touching a leaf and remember the feeling of it, so also you can see a picture of a leaf and reach out to touch it, and even if it's behind some glass, just the action of doing this can cue your mind to "feel" the leaf more realistically than if you closed your eyes and just tried to imagine it.

On a certain level, you don't need "real" (physical, mechanical) haptic feedback to have haptic feedback, because, to quote Morpheus, "Your mind makes it real." You just need realistic, familiar textures and realistic-looking skeuomorphic design, and boom, you have haptic feedback -- probably better than whatever crappy mechanical feedback we'll see whenever the physical kind finally arrives. On a subconscious level, your brain anticipates feeling the textures it sees, and it experiences that anticipation even though no neural impulses actually come from the fingers.

Steve Jobs understood this. He famously said regarding OS X's Aqua interface, "One of the design goals was when you saw it you wanted to lick it." Steve understood there is a subconscious element to things and that a good UI ought to leverage all the processing power not only of the computer, but of the user's subconscious mind.

After all, Neo, what's the difference between something that tastes like chicken, and "real" chicken?

...

So why did Apple give up that technology in iOS 7? I would argue that they were forced to; it was not entirely by choice. They realized that in order to migrate iOS to a variety of screen sizes, they could not keep kicking the pixel can down the road.

The only legitimate problem with skeuomorphic design is that it's highly reliant upon bitmaps. We saw with the iPad Mini that the UI simply shrank, something Jobs wanted to avoid because the original iPad UI was designed to be just the right physical size for actual fingers to use. However some people have better manual dexterity and close-up eyesight than others, and Apple sells plenty of iPad minis as a result. I don't see a problem there.

But on the iPhone, Apple knows it can't stay competitive by simply making the screen bigger and decreasing its DPI. They want a UI that will scale properly across devices while its elements remain the same physical size.

That's why iOS 7 got rid of so much UI chrome: Apple wants the UI to be based solely on vector elements (like fonts and lines) that can scale perfectly if the screen size changes. That's also why they are pushing developers to use Auto Layout constraints, which are a set of rules defining where each element gets drawn according to its distance from other elements or the screen boundaries, instead of a fixed position on a coordinate grid that is directly mapped to pixels.
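
To make that concrete, here's a minimal sketch in Swift of the same idea (my own illustration, not Apple sample code; the button, the offsets, and the view controller name are made up): one button positioned by an absolute frame versus by constraints that describe its relationship to the edges of its superview.

```swift
import UIKit

class SketchViewController: UIViewController {
    let doneButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(doneButton)

        // The old way: an absolute frame, tied to one particular point/pixel grid.
        // doneButton.frame = CGRect(x: 270, y: 20, width: 44, height: 44)

        // The Auto Layout way: rules relative to other elements and the screen edges,
        // which the layout engine re-solves whenever the available size changes.
        doneButton.translatesAutoresizingMaskIntoConstraints = false
        view.addConstraints([
            NSLayoutConstraint(item: doneButton, attribute: .trailing, relatedBy: .equal,
                               toItem: view, attribute: .trailing, multiplier: 1, constant: -16),
            NSLayoutConstraint(item: doneButton, attribute: .top, relatedBy: .equal,
                               toItem: view, attribute: .top, multiplier: 1, constant: 20),
            NSLayoutConstraint(item: doneButton, attribute: .width, relatedBy: .equal,
                               toItem: nil, attribute: .notAnAttribute, multiplier: 1, constant: 44),
            NSLayoutConstraint(item: doneButton, attribute: .height, relatedBy: .equal,
                               toItem: nil, attribute: .notAnAttribute, multiplier: 1, constant: 44),
        ])
    }
}
```

When the superview's size changes, the constraint-based version simply re-solves; the frame-based version stays wherever its hard-coded coordinates put it.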

I've been in favor of a fully vector-based UI for OS X and iOS for many years, since in a world of varying screen sizes and varying screen DPIs it would be the only way to have a truly resolution-independent, WYSIWYG UI.

However, even though Apple said several years ago that they would move to a vector-based UI for OS X, they never did, because it would have been too much to ask everyone to convert all their UIs to vector-based ones. Every app would have to get redone. It's a monumental undertaking, and Apple did not have a way to force developers to do it. Besides, on a Mac you can just change your screen resolution if you want the UI elements to look bigger, or use the zoom feature. And apps run inside their own windows on the Mac, so it doesn't matter much if a window gets smaller.

Now, on iOS, Apple does have the ability to force developers to do things. That's good, because it means they can require the use of APIs like Auto Layout and Text Kit to make apps that will scale to differing screen sizes. This becomes a manageable task when you can stop worrying about scaling textures and design all your icons as vector-based ones. You can take a desktop-publishing-style approach to design and achieve a nice look through means other than skeuomorphism.

...

But have we lost something important with the move away from skeuomorphism? I'd argue that perhaps we have. How can we get back there and strike more of a balance between resolution independence, and psychohaptic feedback?

One idea would be to use dynamically generated textures based on fractals or other forms of procedural math, or leverage OpenGL to scale bitmapped textures in a way that looks better using bump-mapping and dynamic light sources linked to the accelerometer.
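
As a rough sketch of what the accelerometer-linked lighting half of that idea could look like with today's public APIs (this is just my own guess at the wiring, using Core Motion and a plain Core Animation shadow instead of real OpenGL bump-mapping; the class name and the scaling factors are invented):

```swift
import CoreMotion
import UIKit

// Hypothetical view whose drop shadow shifts as the device tilts,
// so the "light source" appears to stay fixed in space.
final class TiltShadowView: UIView {
    private let motion = CMMotionManager()

    override func didMoveToWindow() {
        super.didMoveToWindow()
        guard window != nil, motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 30.0
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            // Nudge the shadow with the tilt; signs and scale would need tuning.
            self.layer.shadowOffset = CGSize(width: a.x * 6, height: a.y * 6)
            self.layer.shadowOpacity = 0.4
            self.layer.shadowRadius = 4
        }
    }

    override func willMove(toWindow newWindow: UIWindow?) {
        super.willMove(toWindow: newWindow)
        if newWindow == nil { motion.stopAccelerometerUpdates() }
    }
}
```

The numbers would obviously need tuning, but the point is that the "light" reacts to how you hold the device, which is exactly the kind of subtle physical cue skeuomorphism trades on.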

This is where I think we are ultimately headed, probably not as soon as iOS 8, but somewhere down the road.

Check out the new Frax app for iOS to get a sense of what kind of real-time textures are possible:

http://www.pocketmeta.com/frax-hd-ip...ngenious-5940/

The issue right now, of course, is battery life: processor-expensive textures everywhere just kill it. However, if the textures are not updated in real time but simply rendered once when the screen size is determined, and then cached, the cost is negligible.
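
Here's a rough sketch of that render-once-and-cache approach (the TextureCache and makeNoiseTexture names are my own, and the noise generator is just a cheap stand-in for whatever fractal or procedural math you would really use):

```swift
import UIKit

// Hypothetical cache: render a procedural texture once per (name, size), then reuse it.
final class TextureCache {
    static let shared = TextureCache()
    private var cache = [String: UIImage]()

    func texture(named name: String, size: CGSize,
                 generator: (CGSize) -> UIImage) -> UIImage {
        let key = "\(name)-\(Int(size.width))x\(Int(size.height))"
        if let cached = cache[key] { return cached }   // already rendered at this size
        let image = generator(size)                    // the expensive part: done only once
        cache[key] = image
        return image
    }
}

// Stand-in generator: a cheap paper-grain texture drawn with Core Graphics.
func makeNoiseTexture(size: CGSize) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(size, true, 0)
    defer { UIGraphicsEndImageContext() }
    UIColor(white: 0.93, alpha: 1).setFill()
    UIRectFill(CGRect(origin: .zero, size: size))
    for _ in 0..<Int(size.width * size.height / 40) {
        let x = CGFloat.random(in: 0..<size.width)
        let y = CGFloat.random(in: 0..<size.height)
        UIColor(white: .random(in: 0.78...0.95), alpha: 1).setFill()
        UIRectFill(CGRect(x: x, y: y, width: 1, height: 1))
    }
    return UIGraphicsGetImageFromCurrentImageContext() ?? UIImage()
}
```

A view would ask the cache for its texture wherever it learns its final size (layoutSubviews, say), paying the rendering cost only the first time each size appears.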

Apple has a lot of groundwork to do before it can realize something like resolution-independent skeuomorphism in a battery-efficient way through system APIs, with fractal textures and OpenGL and so on. However, I would not rule out the possibility, or assume that they moved away from skeuomorphism purely by choice.

Because if you look closely, skeuomorphism is still present in certain places in iOS 7: the frosted, translucent glass of Control Center and Notification Center; the light paper texture of the Notes app. It's present in the things that can scale independently of resolution, as one would predict if I'm right: they haven't completely eschewed skeuomorphism, but rather been forced away from it in order to migrate to a variety of screen sizes.

I don't think Jony Ive fails to understand the value of psychohaptic feedback; I think they wanted to make a bold move and push the envelope in a new direction to freshen things up. That's not to say the pendulum couldn't swing back toward implementing psychohaptic feedback via skeuomorphism in the places where it really does help the interface feel more interactive and draw the user in.

So I am highly critical of iOS 7's notion that, all of a sudden, these sorts of skeuomorphic cues are no longer useful now that we've been using touch interfaces for a couple of years. It just seems idiotic to me to arrogantly assume that after a few short years Apple might have undone millions of years of evolution and fundamentally changed human nature. Sorry, but I'm right.

-----

Problems with the Deference Edict

Another aspect of the iOS 7 design principles that has hurt apps has been the "get out of the user's way" bit. "Deference" was the buzzword Apple chose for this idea: "The UI helps users understand and interact with the content, but never competes with it."

This is why Safari's UI hides from you when you scroll around. I won't deny that it's great when used right, and Safari uses it right (although I think Safari's UI is too 'thick' when it is on screen). Deference is something that great UIs have always featured: the original WriteNow for Mac had a ruler that "hid" behind the text window to maximize the text area on the old Mac's 9-inch screen.

Yet due to the deference edict, many app designers have felt compelled to make their UIs smaller and more cramped. This has resulted in buttons that are too close together. Where controls were previously well placed for one-handed thumb use, several of my favorite apps now demand that I use my index finger to carefully push tiny, thin buttons. Grafio is a good example of this; it used to have sidebars that reduced the visible canvas area a bit but made it much easier to select tools with my left thumb.

However the fundamental problem with "deference" is that it's hard to use correctly. Let me explain.

Prior to the "deference" edict, in iOS 6 and earlier, the primary consideration of UI design on iOS was the physicality of using the device: if I'm holding it in my hand and using this interface, where should the buttons be so that they sit in the most ergonomic positions for one-handed use? How about two-handed? The best UIs were the ones that placed controls where they were easy to reach and unlikely to be pressed by mistake. They often accomplished this by using the easiest kinds of gestures, like swipes, to reduce the number of buttons needed on the screen at any given time.

Under the "deference" edict, however, the UI designers' general MO became, "How do we get this clutter off the screen?" So they started asking questions like, "Why is this sidebar here? Can't we just move this icon down to the bottom bar?" Someone might respond, "But sir, then all the buttons would be pretty close together down there." Yet they would end up doing it anyway, because, well, ergonomics must be sacrificed to free up a bit more screen real estate for 'content.'

That's forgetting of course that these apps are *tools*, not just *content consumption portals*. Tools should be ergonomic. Form should follow function. 'Content' is important, yes, but just like with minimalistic furniture, what good is a chair that is not comfortable? A couch that hurts your back? Who cares what it looks like if you don't want to sit in it?

Most of the attempts at "deference" I've seen in the apps I use have just resulted in icons crammed together along the bottom row (when they used to live in a sidebar or side panel), or interface elements that annoyingly flit away with no obvious way to get them back. The eBay app is the worst example of this; it hides the most useful options until you push a button to "see more." Now I have to spend more time tapping and less time using. I don't care that when the menu is on screen it conceals less of the 'content'; it always conceals some of it, and now it's there for longer than I wanted because I have to tap more.

----

Reservation of basic gestures

Even as Apple demanded "deference" from developers, it reserved for system functions the most useful basic gesture for getting rid of interface elements: swiping. The Facebook app, for example, used to have a great tray that popped out when you swiped from left to right, but now that that's the universal "back" gesture, they had to add yet another button to their UI. I did not see this as an improvement, and it wasn't Facebook's fault.

Even worse, the system swipe now competes with other swiping, like swiping to scroll. Often when I'm trying to scroll around a page, I inadvertently activate the "back" or "forward" gesture. It's just annoying.

I do like the universal back and forward gestures, don't get me wrong; I just don't think they helped developers implement "deference," and in several kinds of situations they have definitely lowered the percentage of my gestures that get interpreted correctly.

----

Removal of features

All you have to do is read the Apple support forums about Numbers, Pages, iMovie, etc. Many important features that people relied on were simply removed.

What's the problem? Why remove critical features?

Surely not budget; Apple, the richest company in the world, can afford to hire however many developers it wants to make sure that every useful feature gets included in its apps. Surely not capability; Apple has no lack of talent in-house.

No, I think it's something deeper. I've noticed it with video game companies, I've noticed it with application developers. I think it's a kind of trend in software development and release that's developed ever since cloud-based distribution became common.

It's the feeling that you can always update it later so it's OK to just release something in an unfinished state. It's skipping a true beta test and just releasing beta software that's missing core features from previous versions.

Was an order given to each team at Apple saying, "Pick five features to ditch from your app"? I don't think so. I think it's more that when they change lots of fundamental frameworks, shift to 64-bit, and enforce iCloud compatibility, they basically have to redo the entire app from scratch anyway. It's a huge refactoring job, and in order to hit release deadlines they had to pick some things to leave out.

But Apple's culture of secrecy means they don't tell any of us WHY they did it. We just have to speculate. It leaves users frustrated; they don't understand how an "upgrade" could remove features. Apple needs to be more transparent about these things and let its flock know: "Hey, if you need feature X, Y, or Z, don't upgrade yet, because those won't be back for a while."

Take iMovie for example: many people didn't find out until after updating that the latest version of iMovie for Mac cannot import projects from the latest version of iMovie for iOS. You have to downgrade both in order to have compatibility. But how do you downgrade both once you've already upgraded? It's a very complicated procedure, and it's only possible if you had the foresight to keep regular backups of all your iTunes .ipa files.

The forces of capitalism won't directly compel Apple to keep its iLife or iWork customers happy: if people feel burned by these apps, Apple doesn't instantly go out of business. An app developer whose sole income comes from a single app, on the other hand, has a direct motivation not to piss off its customer base.

Apple must therefore go out of its way to listen to customer feedback and ensure that its customers remain satisfied.

However, I've noticed an alarming trend among people who work in software: they think they know better than their customers. This has become part of Apple's corporate culture, and it's really annoying. Everyone starts thinking they are Steve Jobs, free to remove some core feature as if it were the floppy drive and their product were the iMac. Just because Steve Jobs once removed the floppy drive from the iMac in the 1990s does not mean that removing a core feature over customers' complaints is automatically a good idea.

The floppy drive took up physical space in the iMac and added to the cost of producing each unit. In software, removing a core feature might save developer costs up front and possibly speed up development, but keeping it doesn't make each copy of the app cost more to produce. And if removing it results in fewer people purchasing the app or recommending it to others, it will hurt sales and greatly lower the profit the app earns.

The more customers you piss off, the fewer people recommend you to others, and the more haters you gain. It won't be noticed in the short term, especially with bundled apps like iMovie, but in the long term it will cause your product line to lose its competitive edge and erode your profit margins.

The thing is, I honestly believe Apple does not intend to upset its customers in this way. I just think there is something wrong in their product planning and testing that is leading to this sort of thing. Maybe they don't interview enough users of the actual products to find out how they actually use them. Maybe they rely too much on the usage data they collect, and something about it is misleading. Perhaps they don't do enough open public betas where real users test the new version and report on how well it continues to meet their needs.

Whatever the case may be, I sincerely hope Apple is making changes to address this issue, because there are a lot of other ways in which I feel they are not listening to their customers, and it has caused people I know to switch platforms and give up on Apple.

One problem I see is that Apple does not interact with people on its support forums. It should take a page from what many other companies do and keep closer tabs on this. It should have a team of people who communicate with users and relay the complaints and concerns from users back to the engineering teams, just like they do for their phone lines. Customers expect this. They don't understand how they could post a problem on the forums and then nothing gets done about it.

They should make bug reporting and beta testing more open and available to non-developers. Frankly, the feedback portal they have is simply inadequate. You can't file a ticket and follow its progress. You feel like you're just pissing in the ocean. I know Apple listens to the feedback it receives, but users never get the sense that it really does.

One of the worst things is that developers are the ones doing all the beta testing for new iOS releases, and when we install a beta we're told to back up and install it in a very particular way. So who outside of Apple is beta testing the "normal" update and install procedures that ordinary users will actually follow? Nobody. Is it any wonder that over-the-air updates and "non-clean" installs typically have so many problems? Those procedures never get beta tested! But they could, if Apple had a corps of real-user testers.

Ultimately, no company is perfect, but periodically I must issue a rant. I love many things Apple is doing, but I'm not here to talk about that today. I'm a hardcore, lifelong Mac user and developer, and none of that will ever change. I'm just stating my opinions in the hope that they might inspire some debate, so I can refine my views if I'm wrong about something. Then I can send that refined view to Apple as properly considered feedback.