What You're Buying Is an Ecosystem that Comes with a Phone

A recent trend on technology websites has been to dumb down fairly complex and multifaceted matters into bite-size, ELI5-type articles, with the basic stance that you, dear reader, are in need of having something explained to you.

I'm not really sure why this is or where it came from, but I can see marketing folks running around the editorial room shouting, "Hey guys, great, great stuff... but we need something that captures people's attention — they're seeking fast answers. Millennials, ok? MILLENNIALS!! OK!! Any questions?"

These articles are often called things like "Here's the best TV for you", "We've picked your next fast food", "Your next car is electric", "Eat nothing but carrots", and so on.

The Verge is an online tech and culture website I usually take pleasure in reading. However, their latest addition to the buzzfeedy clickbait genre outlined above is called The Best Phone You Can Buy Right Now (2017). Their suggestion is the Samsung Galaxy S8:

“Samsung’s Galaxy S8 and S8 Plus is the best phone for most people. It’s available across all four US carriers and unlocked. It has the best display on any smartphone right now, a head-turning, premium design, a top of the line camera, reliable battery life, and fast performance.”

Confusingly, they later suggest that the iPhone 7 is an option “If you’re not into the S8’s curvy design or are locked into Apple’s ecosystem”.

There are many possible angles from which one could approach this article, such as who it is actually for (assuming that most people who find their way to a semi-obscure website like the Verge have probably heard of both Samsung and Apple already) — but let’s just focus on the “ecosystem” part mentioned in passing above.

As a disclaimer — I'm a long-time user of both Apple's iPhone (since model 1) and Android. I currently have and use both an iPhone 6S and a Nexus 6P. I'm not a strong proponent of either and see pros and cons with each platform and ecosystem. In this, I think I differ from most of the authors of these articles, which unfortunately tend to be written by people with a very strong preference for either Android or iOS. Android fans tend to stress the superior tech specs of Android phones, especially per dollar spent, while Apple fans tend to stress the quality of the user experience and of the third-party apps.

This actually gets us to the point here. When you choose a phone in 2017, you're not primarily picking a phone; your main decision is choosing between two competing, behemoth ecosystems: Apple or Google.

Sure, there’s a bit of overlap: Google's ecosystem in particular can bleed into Apple's, and Apple has made a few lame attempts at providing some of its services on other platforms. But basically, you pick one or the other as your go-to place for things like cloud storage and backup, mail and calendar, and media consumption.

The phones that are currently for sale are in some sense just the latest physical incarnations of these competing ecosystems, whether they are from Apple, Samsung, Google, Nokia, HTC, Huawei, LG, OnePlus, Sony, or what have you.

There’s a new iPhone every year and a never-ending stream of new Android phones – although the one that counts, the latest Galaxy from Samsung, is also updated on a yearly basis. The tech specs of the latest and greatest phones in terms of CPUs, storage, cameras, screens, and so on are incredibly similar, to the point of being identical, across almost all the so-called ‘flagship’ phones. Sure, there are some small differences, but these are mostly only relevant until the competition’s upgrade cycle kicks in, which tends to level everything out again. So right now, as duly noted by the Verge, the Galaxy S8 “has the best display on any smartphone right now”.

A few observations here. First, what’s meant by “best display”? Second, note the “right now” disclaimer.

The S8 has a great display, no question about it. It’s one of the first phones to shrink the bezels around its main screen to such an extent that the phone and screen become one. This isn’t really a device with a screen on it; it’s more a screen wrapped around a device. The screen's curved edges add to the feeling that what you’re holding is a screen, not a 'device'. The Galaxy’s screen is also very bright, has a ridiculously high resolution for its size, and uses HDR and other display techniques to boost colors and contrast to make the image pop.

Many people find that these effects produce a “better” image. Personally, I find these images artificial, plastic, and fake-looking, and I much prefer the more natural image reproduction offered by the iPhone and most of Apple's products (as well as by some other Android phone manufacturers, for that matter). What this does point to, however, is that “best” is a difficult concept when it comes to things where personal preference matters.

Similarly, the Galaxy’s camera is indeed “top of the line”, as argued in the Verge's article, but if you look at side-by-side comparisons of images taken with different smartphones you see that, first, they differ very little in anything measurable (they all have very similar color depths, resolutions, etc.), and, second, where they do differ is almost entirely in their subjective aesthetics — color temperature, saturation, differences in fake bokeh, low-light compensation, etc. — most of which is a result of differences in post-production software processing, not hardware specs. Again, “best” here is in the eye of the beholder.

Additionally, the differences between the phones tend to be evened out very quickly. Apple's new phone is due to come out later this year — and, by the way, it seems to look very similar to the S8, Andy Rubin's Essential Phone, and the LG G6 and V30. The next Pixel? Chances are it'll be an all-screen phone too (or not, surprisingly enough).

What this generation of phones shows us more clearly than ever is that phones-as-devices are becoming less and less important; they are gradually becoming just windows into the software — software which in itself is just the machinery needed to access the underlying ecosystem. More than ever, what sets phones apart in 2017 is what’s on the screen, not the screen itself or what’s literally behind it.

The argument is that the choice you make when buying a phone is first and foremost not about the phone you buy, but about the ecosystem you invest in.

If we buy this argument, then there are three relevant questions you need to ask yourself:

  1. What user interface do you prefer, Android or iOS? Both have pros and cons, yet they are becoming increasingly similar in some key areas. In my view, iOS offers a much more polished user experience, and I'm to this day surprised that Android isn't catching up faster in this area.
  2. Which ecosystem is right for you, Google’s or Apple’s? Both have pros and cons. In my view, Google's ecosystem is better for your own stuff (such as the integration between Gmail, Google Drive, and Google Photos), whereas Apple's ecosystem is much better when it comes to entertainment. iTunes has a bad rep, sure, but it's also very functional.
  3. How privacy-conscious and concerned are you? If you are concerned about privacy, you would naturally gravitate towards Apple's ecosystem. This is not to say your data is safe with Apple, but unlike Google, Apple's key "product" isn't your data. What's a bit counterintuitive in all this is that Google is probably less likely to get hacked than Apple, so in some sense your data is probably more secure with Google — but on the other hand, Google is using that data in a variety of ways, not all of which you're aware of.

Looking at it this way — when you take a photo with your phone, what happens to that image after you've taken it? What would you want to happen to it, or not happen to it?

Edward Tufte and Interaction Design

Went to a full-day course today here in NYC given by Edward Tufte, a legend in the field of information visualization. I've always enjoyed reading Tufte's work. His historical odysseys, writing style, thoughtfulness, and carefully crafted books should be required reading for everyone from business-school MBAs to CS students to the arts, the social sciences, journalism, and design education. His books are that good and that important. For me, the central theme in Tufte's work has been the integration of text and visuals, content and form, which has been an important source of inspiration for my own work.

I'm not sure what I had in mind for a one-day course, and I did enjoy most of it, but I left feeling a little short-changed. Perhaps the course is mostly meant as a basic introduction to his work rather than a deepening of it? His presentation style is also slow, quite likely knowingly so—deeply rooted in the 'old American male professor' genre—but still slow. I like that he's not using PowerPoint slides but instead builds up a story (slowly) around a couple of key images that he zooms in and out of (although, to be honest, some of his pictures looked a lot like slides!). I also really liked that he just jumped right into it, on the hour, without even the shortest "hello and welcome." Refreshing!

Tufte's work is hence utterly relevant to so many different areas and has impact and implications for even more fields. Yet one of the areas I feel his work is relevant to, but that he hasn't quite grasped, is my own field—interaction design. He's hovering over a host of relevant topics here, for sure, but doesn't quite get the details right, which unfortunately for him is exactly what he keeps calling other people out on, so...

First of all, his thinking in this space seems a bit dated. For instance, Tufte kept referring to some Dell laptop where the scrollbars apparently covered 11% of the screen real estate. That's a relevant anecdote if the year is 2003, but not really in 2016. In a world of smartphone apps, retina displays, tablets, hamburger menus, world wild west, notifications, creative online typography, etc. etc., there are so many other, more recent examples of the same idea (i.e. badly designed interface elements that hide rather than promote content) that this rather archaic anecdote serves more to confuse than to enlighten. There were a few other examples like this as well, including web designers misapplying the short-term memory 'magic number 7 +/- 2' theory. I'm sure this has happened, but probably not in the last decade, and it's certainly not a common occurrence anymore, if it ever was.

There are so many examples of Tufte's work that are still highly relevant for today's web and app designers, so why not talk about these instead? He mentioned one in passing: that many sites today are conceived and designed to be responsive, i.e. aiming to provide an 'optimal reading experience' regardless of what device or resolution you use—in effect separating content from layout and function. I think it's fair to say that Tufte's collected works can be seen as a critique of this very relevant and timely design idea. So why not spend time massacring this? I would.

Tufte also mentioned that he was one of maybe 10 people in the world who think theoretically about these issues. Again, he's been at it for a while, and this was maybe, even probably, true at some point, but it also seems a bit dismissive of what has happened in the field in the last 20-30 years. Folks from all kinds of (academic) disciplines are doing it now: Human-Computer Interaction (HCI), interaction design, philosophy of technology, and design research, as well as a host of non-academic thinkers utilizing blogs, internet-based magazines, professional conferences and workshops, etc.

Third, Tufte's only substantial idea about interaction design (at least as it was framed at this talk) echoed the Heideggerian notion of not letting the interface get in front of the content. This idea was popularized in HCI and interaction design by Winograd & Flores back in 1987, and even earlier than that within the philosophical field of philosophy of technology. It is an idea I've drawn on heavily in my own work, as have several other researchers and practitioners in interaction design.

Yet the problem here seems to be the very distinction between form and content. In all his work, Tufte shows that these work best when they are considered hand in hand.

Let's look at this in a bit more detail. One of Tufte's recurring rhetorical refrains is that you now have better tools at your disposal in your smartphone than those you use at work, and that we should all rise up and demand at least equally good tools for 'work'. That's fair, but what Tufte misses here is that this also means that there is a substantial overlap between "work" and "leisure"—or whatever we want to call it, i.e. when we're not actually 'working'. I think one of the fundamental shifts in interaction design over the last 15 years or so is that the computer just isn't something we think of as a machine for work anymore. It's so much more than that. We use our PCs, laptops, and smartphones to mindlessly scroll through Facebook, play games, pass time, buy stocks, watch movies, find plumbers, stalk coworkers on LinkedIn, read a text, write novels, keep swiping left and only occasionally right, anonymously rant on online fora just because we can, check out new music, do some work stuff (mostly emails), create graphs for our kid's soccer team, pass the time before we can get out of here, plan the holiday in Cape Cod, and so on and on and on. All using the same magical machine.

With this in mind, I think Tufte's explicit notion that the primary role of interaction design is to make the interface disappear in favor of content is still a relevant perspective in many ways. However, it also carries a rather old-school, work-oriented perspective lurking underneath—the assumption that the user's only goal in using a computer or a smartphone is to get to the 'content' that the interface is hiding. It often is, but not always. Such a view is not enough to understand what interaction design is today. In what I have called the 'third wave of HCI', we see, for instance, websites and apps where the interface is knowingly designed to be unclear and fuzzy, and it is the user's task (or fun, to be more precise) to figure it out. Here, the interface and the content blend into one—they become the same thing. The interaction itself becomes valuable and meaningful, not just the so-called 'content' that it is supposed to hide or show. Computer games have always had an element of this. What makes Flappy Bird irresistible is the interaction, not the content.

At the end of the day, literally, I left Tufte's talk not bedazzled, but hopeful. Tufte's thinking is still relevant for interaction design; it might just require someone with more detailed knowledge of the area to interpret, see, and further develop its significance. One potential path is the current interest in digital assistants such as Siri, Alexa, Cortana, and 'Ok Google'. Applying Tufte's information visualization principles to these would probably reveal quite a few design obstacles to overcome in the next couple of years.

That said, I ended up taking a lot of notes and did get to doodle a bit too. This one, for instance, I call "A Bear with Many Faces" (yes, of course I name my doodles!)

Towards Integrated Headphones? On Apple and the Rumored Removal of the Headphone Jack

Quite a few people, not least tech bloggers, seem to almost violently oppose one aspect of design I’ve always thought Apple does well, at least on the hardware side—that good design is as much about taking things away as it is about adding things. Apple has done this with the floppy drive, the CD drive, the VGA port, and lately—with the rather incredible MacBook—with all but one port. Others have quietly followed a couple of years later.

Some of these bloggers are now upset that the next iPhone might not have a traditional headphone jack: the 3.5 mm stereo connector. This connector, which of course is analogue, derives from a design invented in the 19th century for use in telephone switchboards. The same basic design is still in use for connecting a wide variety of audio peripherals, from headphones to electric guitars.

That the connector is old is not the reason it might have passed its prime, however. In many ways, the 3.5 mm jack is the perfect analogue audio connector. It rotates 360 degrees, you can charge while you listen to music, headphones do not have to be charged, and so on. Yes.

If the rumors are true, I am sure Apple has converging reasons for wanting to remove it. First, phones keep getting thinner and thinner, yet with huge batteries inside, and at some point size and real estate do become an issue. Here, Apple's engineers probably struggle with the length of the connector, not necessarily its diameter. Second, I would be surprised if there is not an element of selling-new-headphones here too. Beats, after all, is an Apple company, and yes, yes, you do wonder which company would be ready with a line of USB-C headphones in case Apple decides to go with the new connector. Apple's ecosystem is important, but the firm still makes a lot of money selling stuff. If they decide to go with the Lightning port, which I dearly hope they won't, they will also force manufacturers to pay up for using it. Additionally, on a paranoid note, the move could potentially be DRM-oriented, but surely that's not the case, right? RIGHT?

Third—and worth spending a bit more time on—we have ‘other technical reasons’:

Here, a USB-C port (let's hope they don't go with the Lightning connector, although knowing Apple that's probably not unlikely) would in the long run actually offer increased compatibility between devices. As the traditional headphone jack is analogue and really just envisaged to transport an audio signal, various ‘hacks’ have been made to its design along the way to allow it to do more. The iPhone, for instance, uses a 4-conductor (so-called TRRS) phone connector for its headset to allow for a microphone and control buttons for pause/play and volume. Other vendors have made other design choices. This means that Bose’s quite excellent QuietComfort 20 noise-cancelling earpieces come in two different versions, one for Apple devices and one for Android. As an active user of both an iPhone 6 and a Nexus 6P, this is surprisingly annoying. I’m begging for the industry to widely adopt USB-C as the standard for all kinds of peripherals, regardless of type. I think Apple should get some credit for paving the way (and taking the bullet, too), and I’m surprised other vendors aren’t following suit—yes, looking at you, Samsung and Google.

Another reason, and for me personally the primary one, why I’m not so sure dumping the headphone jack is such a bad idea is also related to it being analogue. This one, however, has to do with sound quality, something I care about.

As the traditional headphone jack feeds the headphones you plug into it with an analogue signal, this signal has to be converted from digital to analogue and then amplified before it can leave the phone through the jack. In other words, in the signal chain from wherever in the cloud your music lives to your ear, there has to be a digital-to-analogue converter (a DAC), an amplifier, and some form of speaker system (such as your headphones). Today, everything except the latter typically lives inside your smartphone or your computer. Almost without exception, these amplifiers are underpowered, of poor quality, or both, which in turn makes them unable to drive anything other than the crappy earpieces that came with your phone. Hence, even if you have a good pair of headphones, they do not really sound as good as they could on your mobile device.

There are at least two things to consider here. First, there has been a trend towards wireless Bluetooth headphones. As often tends to happen, this technology was released before it was ready for prime time. Early implementations of Bluetooth headphones were buggy and the sound quality was terrible. While the technology has come a long way in the last couple of years, Bluetooth still has its limitations and quirks, mainly to do with the fact that it uses the same wireless frequency, the 2.4 GHz band, as literally everything else: wireless mice and keyboards, WiFi, microwaves, you name it. Still, Bluetooth audio is becoming a viable alternative to the headphone jack.

Second, among an increasingly broad circle of people who are actually interested in audio quality—even outside of the rather narrow and highly specialized group often referred to as audiophiles—there has been a trend towards getting dedicated external audio units. Musicians are getting devices such as Universal Audio’s Apollo to be able to record, mix, and master music professionally. Connoisseurs of music are buying external DACs and amplifiers to improve the sound quality, such as Meridian’s rather great Explorer2. What these devices have in common is that they are external and that they connect to the computer through USB, FireWire, or Thunderbolt.

Thus, by removing the digital-to-analogue conversion and the amplification from the device, Apple actually opens the door to a new breed of “integrated headphones” where the DAC, the amplifier, and the headphone itself can be matched to perfection by the maker.
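To make the argument a bit more concrete, here is a minimal sketch, in Python, of where the stages of the playback chain live in the two scenarios. The Stage class and the stage names are mine, purely for illustration; this is not any real audio API.

    from dataclasses import dataclass

    @dataclass
    class Stage:
        name: str
        lives_in: str  # "phone" or "headphone"

    # Today's typical chain: the DAC and the amplifier are squeezed inside the phone,
    # and an analogue signal leaves through the 3.5 mm jack.
    analogue_jack_chain = [
        Stage("digital source (stream or file)", "phone"),
        Stage("DAC (digital-to-analogue converter)", "phone"),
        Stage("headphone amplifier", "phone"),
        Stage("drivers", "headphone"),
    ]

    # "Integrated headphones": the phone hands off a purely digital signal over
    # USB-C (or Lightning), and the headphone maker matches DAC, amp, and drivers.
    integrated_chain = [
        Stage("digital source (stream or file)", "phone"),
        Stage("DAC", "headphone"),
        Stage("headphone amplifier", "headphone"),
        Stage("drivers", "headphone"),
    ]

    for label, chain in [("analogue jack", analogue_jack_chain), ("integrated", integrated_chain)]:
        print(label + ": " + " -> ".join(f"{s.name} [{s.lives_in}]" for s in chain))

The point is simply that the conversion and amplification stages don't disappear when the jack does; they move to a place where a headphone maker can actually control and match them.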

Make no mistake, I’m convinced that this will result in an explosion of rather terrible integrated headphones over the coming years, but I’m also convinced that serious companies can use this to their advantage and come up with well-balanced, well-thought-out, and, of course, well-sounding combos. For the benefit of mankind. Well.

 

Apologetic Interfaces

The Mac OS X menu for Bose's SoundTouch music system comes across as almost a little heartbreaking. It is as if it doesn't really believe in itself. Why else would 'Quit' be the first menu option?

The first steps toward a new breed of "Apologetic Interfaces"?

Picture the creative meeting where this was decided. A few people around a conference table, half-empty coffee cups, a few post-it notes. "Let's see, what would The User want to do..." Silence. After a while, from the other side of the table, "Well... it would be... ehhh.. some users might... like to... ehm...shut down the, ehm, quit." "Ah, excellent! Let's write that down on the whiteboard!" "Anything else?"

This example aside, maybe it's not such a bad thing that applications step it down a little. While it's natural that an application wants to tell the user as soon as possible that, "Hey! Guess what, there's a new version of me out!", the problem is that, with all the apps you have on your computer, there's just too much distracting shouting going on all the time from a lot of different places.

If I open up Word, I do so because I either want to write something down or read a document. Unless there's a minor crisis (let's say an earthquake) I DO NOT want Office Update to take focus away from the document that's forming in my head and inform me of "critical update #1.6.28343".

Similarly, if I open up, say, VLC, especially while giving a talk, I do so because I want to show my audience a lovely little video clip, not give them the breaking news that version 2.1.5 has improved the reliability of MKV and MAD file playback.

An old-school, Unapologetic Interface, but not without finesse

Mindlessly checking for an update the moment the user opens the application is just bad, thoughtless design. You open an application because you want to do something with it, right now. There are so many ways in which an update could be handled more nicely and humanely, without getting in the way of the user's intent.

A simple solution would be to gather the information and download the update in the background, and then hold off the notification until the user either decides to quit the application or becomes idle. Or update it automatically in the background. Or let the OS handle it, if that's an option. Apple knew about this problem as well; that's why they implemented the App Store and the Notification Center. That's all pretty great, but then again some of the most notorious apps aren't using it. Looking at you, Microsoft and Adobe.

One of the finesse-less, usual suspects

If you look closely at the picture above, you'll see that VLC comes with a solution to this that isn't without finesse. By clicking that small (and offset) checkbox, you can choose to automatically download and install updates in the future (given that you then click 'Install Update'). That's actually a pretty elegant design idea.

Yet, there are two problems with this approach. First, I doubt many people actually notice this option. With this kind of interface, you get drawn to the "Install Update" button. Or, if you're in fact giving a presentation, you just click whatever button you can as fast as possible to get this annoying window out of sight. Second, a more general concern with automatic updates is that new isn't always better. If an app goes from, say, version 1.4.23 to 2.0, it may actually be wise to stick with the old version for a while and let them work out the bugs before you update. Or you simply don't like the new look and feel. Or, which is getting increasingly common, version 2.0 really means the same functionality as version 1 but now with ads all over the place.

So when it comes to software updates, I'm leaning more and more towards update-as-you-quit as the more humane approach, with minor, bug-fix updates automatically installed in the background.
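If you wanted to sketch that policy in code, it might look something like the following. This is just an illustrative Python sketch of the decision logic argued for above; the function names, the idle/quit flags, and the simple major-vs-minor version test are my own assumptions, not any vendor's actual update API.

    def parse_version(version: str) -> tuple:
        """Turn '1.4.23' into (1, 4, 23) for comparison."""
        return tuple(int(part) for part in version.split("."))

    def update_action(installed: str, available: str,
                      user_is_idle: bool, app_is_quitting: bool) -> str:
        current, new = parse_version(installed), parse_version(available)
        if new <= current:
            return "do nothing"
        if new[0] == current[0]:
            # Minor or bug-fix release: fetch and install quietly in the background.
            return "install silently in background"
        # Major release: never steal focus at launch; wait for a natural pause.
        if app_is_quitting or user_is_idle:
            return "offer the update now"
        return "hold the notification"

    # A 1.4.23 -> 1.4.24 bug fix installs quietly; a jump to 2.0 waits until quit.
    print(update_action("1.4.23", "1.4.24", user_is_idle=False, app_is_quitting=False))
    print(update_action("1.4.23", "2.0.0", user_is_idle=False, app_is_quitting=True))

The sketch prints "install silently in background" for the bug-fix case and "offer the update now" once the app is quitting, which is roughly the behavior I'm arguing for.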

In light of this, maybe SoundTouch's approach could be seen as the humble beginnings of an entirely new breed of interfaces, "apologetic interfaces", characterized by low self-esteem and an awareness of their own propensity to annoy.

"I'm so sorry for wasting your precious time and valuable screen real estate, Dear User, but before we part I would like to let you know that there is a new me for you. No pressure, just letting you know." 

Come to think of it, too much of that could become annoying as well.