Plug ‘n Play

by Jinryu

I remember, back in the 90s when I was in high school, one of my English teachers saying how much he hated the information age.  It kept inventing words that were either ugly or read as one thing but meant another.  He didn’t like, for example, words like “login” and its variants, “log-in,” “logon,” etc.


I find I agree with him.  Words don’t mean much nowadays in tech.  I might just be chasing shadows on this, but I think it largely has to do with capitalism’s influence, specifically marketing.  What words mean doesn’t matter anymore– we’ve made proper nouns out of what would normally be common nouns.

It’s a bit hard for me to explain this idea, because I haven’t sorted it out myself.

But take, for instance, USB.  That was something only a small fraction of people knew about in the 90s– now, even though my parents, in their 50s, have no clue how it works, they still know what USB means when I tell them to plug something into a USB port.

In reality, it stands for “universal serial bus,” which to the layman makes no sense whatsoever.  I’m more annoyed by words like “universal,” because it really isn’t.

Similarly, a big idea that came out of the 90s was “plug ‘n play,” the idea that you could just plug something in and it would work.  Well, it’s almost 20 years later, and we still don’t have that experience across the board, even when the people releasing software and hardware say we do.  You’d think that plugging a printer into a laptop would be simple enough– the plugs only fit the right ports, for instance.  But on the software side? Things don’t work that simply.

Yeah, I’m sure it’s difficult because all these manufacturers and developers can’t agree on standards– but occasionally it annoys the shit out of me when even a single company can’t keep up with its own departments.

Just now, I was installing the latest version of Firefox on my laptop.  Well, it wasn’t a fresh install, more like a reset– I had so many plug-ins that had died over the past little while that resetting things was the easiest way to clear the slate.  After that, all I wanted to do was re-sync my bookmarks, history, passwords, etc.– which were all in the cloud.  There should be a redundant copy in my mobile phone’s Firefox as well.

How long did it take me to set up the synchronisation? Over 30 minutes.  I was doing something else in other tabs while looking up how to do it– but for a feature that’s supposed to make my life easier, it was surprisingly difficult to set up.  I’m not an idiot, either– I consider myself a lot more tech-savvy than the average person.

It just so happens that the settings for syncing on Android phones and on the Debian version of Firefox are buried in inconsistent places, and rely on all these extra steps with this ridiculous “sync key” pairing code, when they could probably get the same thing done with a traditional login-and-password interface.


I just want my bookmarks to be the same on my phone as on my laptop.  This should be easy– but even between two Firefox products, the process is painful.

Yes, the world is a lot easier than in the days of DOS.  I would never have been able to get my parents into basic computing back then.  But still– it seems like the conceptual stuff, like user interfaces (where you decide to bury your settings and options), should be the easy part.  It doesn’t have to look pretty– you just need to make it consistent and logical.


But there are always these niggling little details about interfaces that annoy me and ruin what could otherwise be an excellent experience.  That causes confusion and stunts adoption.


It’s not just a problem on computers– it’s also a problem in games.  User interfaces are incredibly important– they’re the technology equivalent of “people skills” in the business world.  Without a good connector between the product and the user/client, it doesn’t matter how much great work went into the product– it will still be shit, because the client can’t access it properly.

I’ll give you some examples of user interface aggravations:

  • Facebook.  I understand that Facebook wants to add features, but if they’re going to do it, they should add them across all platforms simultaneously instead of rolling them out on one platform and not another.
  • Microsoft Office.  Whenever you change something as fundamental as the menu system, you need to keep in mind that the transition is probably going to lose you a lot of clients.  While logically grouping certain features together may make sense, you’re playing with fire if you group functions differently from how they used to be.
  • Firefox: There’s a hella lot of inconsistency across platforms and versions as to where you configure things.  In some versions, there’s an “Options” menu; in others, you find “Preferences” under the “Edit” menu.  In others still, it’s under “Tools.”  In some versions, Sync is a feature of one of the above– in others, it’s nested in the phone’s system settings (as opposed to the in-application configuration).  What is this, hide and seek? That’s just annoying as hell.
  • Microsoft Windows 8: …Enough said.
  • Any university’s website, which has multiple logins for your email, your schedule, your Moodle / Blackboard, the library…

You wouldn’t last in the service industry if the interface for your service, the salespeople, weren’t trained to get clients what they want.  So why is it that in the world of technology, we place so little importance on user interfaces?  What makes technology an acceptable place to get away with mediocre client-facing investment?

It might be a bit too much to ask, but if developers spent time on beta testers who tested not just a single product but integration and consistency across multiple platforms, maybe users would be a bit happier?
