[good stuff woefully excised]
But what is multitasking, really, and why does the iPhone model seem different? I mean, with some exceptions (for example, listening to music while you surf the web), you’re not really doing two things at once. In most cases you are actually doing two things sequentially, say, writing a term paper in iWork and looking up sources in Safari, not writing a term paper in iWork while looking up sources in Safari. Why does it seem like you’re doing both at once on a desktop?
Actually, Safari is one of the few iPhone apps that do run in the background. I frequently take advantage of this by requesting a page, clicking Home, and then launching another app, knowing that the page will be loaded and ready when I switch back to Safari. I’d love it if I could do the same with, say, Tweetie, or any other app that loads lots of data over an often sucky Internet connection. This shortcoming is one of the major reasons the iPhone feels (and, I assume, the iPad will feel) less efficient than a full-size computer.
In other words: no, I don’t personally multitask, strictly speaking, but it’s damn nice that my computer can.
- Memory-heavy apps. In tandem with the slow-loading issue is the fact that, with some apps, you can practically see the battery indicator draining as you’re using the application. A move toward old-school draconian memory management seems likely, as users become less and less tolerant of hoggy applications.
This is more an issue of CPU use than memory use. The iPhone is already fairly draconian when it comes to low-memory situations; if your app is using too much, you (the developer, that is) get a warning, at which point you have to start freeing RAM or be terminated. But it’s mainly games and other CPU-intensive apps that drain your battery.
- State (hat tip, Neven). This, I’d say, is the single most crucial point. By remembering where you last were in a given app, developers could easily make it appear almost exactly like multi-tasking. You open Safari; it’s on the last website you read before you closed the app. You open Notes; it’s on the blurb you’d just written. You’re pretty much multi-tasking at this point. You’re writing your term paper while researching on the web.
Yes, absolutely. Persistent state is the reason Tweetie is my favorite Twitter app and the Tumblr app is… well, not my favorite Tumblr app.
One reason so few apps store state is that the iPhone APIs don’t make it particularly easy to do so. The single best thing Apple could do to improve iPhone/iPad usability would be to provide a way to freeze an app’s entire runtime state in carbonite on exit and thaw it on the next launch, so the app wouldn’t even know it had exited (except maybe it has to reestablish network connections and oh look, suddenly it’s three hours in the future).
If every app did that, I bet no one would complain about the lack of multitasking. (Well, no one but me, but I’m a huge ass.)
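The freeze-and-thaw idea is simpler than it sounds. This isn't any real iPhone API — just a toy sketch in Python, with the file name and state fields invented for illustration:

```python
import json
import os

STATE_FILE = "app_state.json"  # hypothetical path; a real app would use its own sandbox

def save_state(state):
    """On exit, serialize whatever defines 'where the user was' (last URL, scroll position, draft text)."""
    with open(STATE_FILE, "w") as f:
        json.dump(state, f)

def restore_state(default):
    """On launch, pick up exactly where the user left off, or fall back to a fresh start."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return default

# Quitting Safari-the-sketch with the last page remembered:
save_state({"last_url": "http://example.com/article", "scroll": 420})
# The next launch looks like nothing ever quit:
state = restore_state({"last_url": "about:blank", "scroll": 0})
```

The full carbonite version would snapshot the entire runtime rather than a hand-picked dictionary, but even this much — a dozen lines per app — is what separates Tweetie from the Tumblr app.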
- Jason Permenter: We should have just gotten one pot. I'm so stupid.
- Tea shop waitress: Yeah.
Let’s say I have a tumblr draft that I’ve spent a little time on. ;)
And let’s say I’m finally happy with this draft, what with it having all the proper emoticons and hyperbole, so I decide to publish it. :)
And then let’s say that An Error Has Occurred when I try to publish this draft. I get that. Shit happens from time to time when you are trying to maintain a live blogging platform with hundreds of thousands of users. No biggie. :/
But wouldn’t it be nice if tumblr didn’t lose the whole draft, so that it shows up neither on the dashboard nor in my drafts? Like, say, oh, I don’t know, by waiting for the draft to successfully publish before deleting it from drafts? :(
Mother pusbucket. :-S
There is a thing in database lingo called a transaction. Basically, it’s a group of related commands (subtract X dollars from account A; add X dollars to account B) of which it is absolutely guaranteed that either all will complete (payment) or none will (no payment), in which case the data “rolls back” to where it was before. This prevents an error in the middle of the sequence from causing an inconsistency (disappearing money; creditor lawsuit; FDIC investigation; loss of banking charter).
Pretty much all modern databases of any sophistication support transactions, and they’re super easy to use if, well, if you know what you’re doing.
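For the curious, here’s the money example using Python’s built-in sqlite3 module (the table, accounts, and amounts are made up; the rollback behavior is real):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('A', 100), ('B', 0)")
conn.commit()

def transfer(conn, src, dst, amount):
    """Move money between accounts: either both updates land, or neither does."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on any exception
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
            raise RuntimeError("An Error Has Occurred")  # simulate failure mid-sequence
    except RuntimeError:
        pass  # the rollback already happened; no money is half-transferred

transfer(conn, "A", "B", 25)
# A still has 100 and B still has 0 — the failed transfer left no trace.
```

Swap “subtract from A, add to B” for “publish post, delete draft” and you have exactly the guarantee tumblr wasn’t giving me.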
I don’t know why I’m bringing this up now. Guess I just thought it was interesting.
(In the car, I put on “Up with People” by Lambchop for my wife to hear. Several minutes pass. Suddenly, she points at the dashboard.)
Wife: I wonder how this is made.
Me: Really? Do you?
Wife: Yes, I think the pattern is interesting.
Me: No no, it’s lovely. It’s just that I put on a song that takes six minutes to build up to the good part, I think we’re both enjoying it, but then right when it starts getting groovy you’re like “PLASTIC IS NEATO!”
Wife: I don’t like you for five seconds.
Me: Are the five seconds over yet?
Wife: Yes. But they existed.
In a great post called Always On, Adam Lisagor wrote about a subtle feature of the built-in iPhone camera app that makes your photos less blurry. It was a perfect I-see-what-you-did-there: a rare example of designers and engineers refusing to take the easy way out when clearly they could have gotten away with it. In the same spirit, I decided to write about something that impressed me in the desktop Mac OS. This might be old news, but I just noticed it.
You know how it’s easier to click on things that are at the edges of the screen, because you don’t have to aim as precisely? Well, that’s because of something called Fitts’s Law, which is fancy-talk for the fact that the bigger a target is, the easier it is to “acquire” with the mouse. Specifically, what matters most is the size of the object along the axis of motion of the mouse, which means the edges and corners can be considered infinitely large because you can’t overshoot them. So the best places to put a commonly clicked user interface element are at the edge of the screen (which is infinite along either the horizontal or the vertical axis) and in the corner (infinite along both axes). Acquiring a corner or an edge takes no time or thought at all; you just heave the mouse in that general direction until it stops.
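The fancy-talk has an actual formula behind it (this is the Shannon formulation of Fitts’s law; the constants below are arbitrary placeholders, since they depend on the user and the pointing device):

```python
import math

def movement_time(distance, width, a=0.0, b=0.1):
    """Fitts's law: time to acquire a target grows with distance and shrinks
    with the target's size along the axis of motion. a and b are empirical
    constants; the values here are arbitrary."""
    return a + b * math.log2(distance / width + 1)

# A 20-pixel-wide button 800 pixels away:
button = movement_time(800, 20)

# An edge-pinned target is effectively infinitely deep along the axis of
# motion, so the difficulty term collapses toward zero:
edge = movement_time(800, 1e9)
```

However far away the edge is, the log term goes to zero as the effective width goes to infinity — which is the math behind “just heave the mouse in that general direction until it stops.”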
Apple knew this when they decided to a) put the Apple menu in the upper-left corner and b) make its clickable area extend all the way leftwards (see below), so you could activate it just by slinging the mouse vaguely up and to the left.1
The same ease-of-acquisition applies to the Spotlight menu, which is in the upper-right corner. And, of course, Mac OS X’s menus are glued to the top of the screen, so for purposes of Fitts’s law, the entire menu bar is infinitely tall.
But what happens when you have two monitors arranged side by side, so what used to be corners are now just ordinary points along a horizontal edge of one big desktop? The menu bar can’t span two screens, so whichever end is on the inside loses that Fitts magic and becomes much harder to click, right?
Nope. Apple’s engineers are smarter than that. If the OS detects that your pointer is approaching one of those “internal corners” at a steep enough angle and with sufficient velocity, it pretends the other screen doesn’t exist for a moment and stops the pointer right where you want it.
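I have no idea what Apple’s actual heuristic looks like, but a toy version of the idea — thresholds and geometry entirely made up — might go like this:

```python
def should_stick(vx, vy, angle_ratio=1.0, speed_threshold=500.0):
    """Decide whether to pin the pointer at an internal corner.

    vx: horizontal velocity toward the neighboring screen (px/s)
    vy: vertical velocity toward the edge holding the menu bar (px/s)

    Stick only on a fast, steep approach. A slow or shallow move is
    probably headed for the other screen, so let it through.
    """
    speed = (vx * vx + vy * vy) ** 0.5
    if speed < speed_threshold:
        return False
    return abs(vy) >= angle_ratio * abs(vx)

should_stick(vx=100, vy=900)  # fast, steep fling at the menu bar: stick
should_stick(vx=900, vy=100)  # fast but shallow, headed for screen 2: pass through
should_stick(vx=50, vy=60)    # slow drift near the corner: pass through
```

The point of the speed check is the “leaves you alone” half of the design: deliberate, slow movement near the false corner behaves exactly as if the feature didn’t exist.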
To me, this is perfect user interface design. It anticipates your expectation and silently arranges things so that you aren’t surprised — most likely, you don’t even notice anything funny is going on. And in every other case where you’re moving the mouse near a false corner, it leaves you alone to do as you please.
This isn’t a feature Apple advertises — it’s not even a user preference because it’s completely unobtrusive, and probably 99% of users will never encounter it because they stick with a single screen — but if you need it, it’s there, quietly making your life a tiny bit easier.
1In contrast, Microsoft seemed to have heard a rumor that corners were useful when it decided where to put the Start button in Windows 95, but since the button itself was inset by one pixel, you couldn’t actually click it by aiming for the corner. (They finally fixed this blunder six years later, in Windows XP.)
- Wife: How far apart do you think we should have our kids?
- Me: At least six feet at all times.
Great question! But I must say I’m a bit surprised to hear it from someone so obviously intelligent and handsome.
The answer, of course, is not to use pronouns at all. Just speak in the universal language of sweet, sweet love.