Wednesday, December 31, 2008

Zero Visibility Vehicles (ZVV)

Ever been out driving and gotten stuck behind a big SUV?

And it blocked your view of anything happening in front of it in the lane?

So if there was a sudden slowdown of the line of cars in front of you in that lane, you wouldn't have any early warning?
If there was a sudden accident or collision, you also wouldn't have any visible early warning, because you couldn't see through or over the vehicle in front of you?

Yeah, it's horrible. And with the massive rise in SUV ownership in America, it now happens to me almost every day, whereas it didn't, say, 20 years ago.

I call that situation "ZVV" in my mind, which stands for Zero Visibility Vehicle.

It's annoying and dangerous. As a rule, I try to avoid, and extricate myself from, situations where I would be behind a ZVV.
I don't just call it the SUV effect, because SUVs are only one kind of vehicle that can cause it; trucks and vans, for example, cause it too.

Though I'd love to drive around knowing I had the safety of armor plating protecting me, like a tank, I would also feel bad about creating a ZVV situation for the people behind me. Since my main ethical guideline in life is the Golden Rule ("Do unto others as you would have them do unto you."), this ZVV effect is one of the reasons I haven't rushed to buy an SUV of my own even as the rest of society seemed to be doing so. The gas mileage was a lesser concern, though also important.

Mike

Tuesday, June 17, 2008

Call of Duty 4: Suggested Improvements

Here's a list of suggested improvements to the game Call of Duty 4: Modern Warfare, for the Sony PS3. It's a great game, one of the best FPSes I've ever played, and it's finally knocked Counter-Strike off as my favorite FPS. And that says a lot, because CS, in its various iterations, has been my favorite for a long time. That said, despite how great the game is, it has some flaws. Nothing too horrible, but most are annoying or are glaring opportunities to improve the user experience. Here they are:

1. fix team assignment balancing
Your algorithm currently will put 5 level-50+ players on one team and only 1 on the other. Are you on crack? This assignment almost always produces teams that are seriously unbalanced skills-wise, with lopsided match results as a consequence.

2. give players a way to turn off other players' voice chatter
99% of it is the audio equivalent of garbage (teens saying "you suck!", kids making funny sounds, dogs barking, lots of juvenile humor, etc.), at least when total strangers are playing together randomly on the internet.

3. honor a player's desire to leave a particular server/session
Currently when you quit a server/session, then try to join a new game that otherwise has the same traits (game mode/style), your system will often reassign you to the same exact server/session you just left. This is annoying. In the case of an explicit quit action on the part of the user, there is a *reason* why the user left: typically because the map sucked, the players sucked, it was a lopsided steamroll fest, or there was possible cheating behavior going on. Whatever the reason, the user indicated he didn't want to play on it by leaving it.

4. improve/widen time span in which user can indicate he doesn't want to see his death replay
And/or give the user a configurable option to specify whether he always wants to see it, or never. Currently there's a very quirky time window in which the system polls for a control action indicating the user doesn't want to see his death replay. Sometimes I do want to see it, sometimes I don't. Often, in the heat of battle, I want to get back in quickly: I don't care that much about seeing where my killer was (perhaps I know or am confident of his location, or I just don't care) and want to resume playing immediately.
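A footnote on point 1: the balancing fix doesn't require anything clever. Here's a hypothetical sketch (player names and levels invented, and I have no idea what the game's actual matchmaking code looks like) of a snake draft by player level:

```python
def balance_teams(players):
    """Assign players to two teams with a snake draft by skill level.

    players: list of (name, level) tuples. Returns (team_a, team_b).
    """
    ranked = sorted(players, key=lambda p: p[1], reverse=True)
    team_a, team_b = [], []
    for i, player in enumerate(ranked):
        # Pick order A, B, B, A, A, B, B, A, ... so the strongest picks
        # alternate between the two sides instead of stacking one team.
        if i % 4 in (0, 3):
            team_a.append(player)
        else:
            team_b.append(player)
    return team_a, team_b
```

With six players at levels 55, 52, 50, 20, 10, and 5, this yields team totals of 85 vs. 107 — not perfect, but far better than all the high-level players stacked on one side.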

Friday, May 30, 2008

Time Machine Statements

Something I've always found funny, and sometimes annoying, is statements like these:

"The market for X will be 4 times what it is today by 5 years from now."
"It will take 4 years for the price of SSD drives to become competitive to traditional disk drives."
"Wall Street analysts agree that X will be selling at Y by the same time next year."

And so on...

I call these Time Machine Statements.

Because they are statements that can only be made with a straight face by someone who has just stepped out of a time machine, returning from a voyage to the future. In the magic Time Machine. Which, of course, doesn't exist. And therefore, what they are saying is nonsense. It's not factual. It's bullshit.

At best what they're saying is, "Given a certain set of assumptions, and given current trends, and assuming nothing else happens today, or tomorrow, or a year from now, that could impact this prediction and that I have not already taken into consideration, then if I take a certain number and apply a certain equation or algorithm to it, it yields a value of Y around T units of time from now."

That's it. Basically, they've taken some numbers, made a few simple assumptions, abstracted away most of the nitty-gritty details of REALITY, and come up with a pretty graph. And the graph goes up or down or whatever. And all they're doing is giving you a summary description of that graph.
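To make that concrete, here's what the entire analytical machinery behind a typical Time Machine Statement boils down to (the starting figure and growth rate are invented):

```python
def project(current_value, annual_growth_rate, years):
    """Naive compound-growth projection: the whole 'Time Machine
    Statement' is this one line plus a pile of unstated assumptions."""
    return current_value * (1 + annual_growth_rate) ** years

# "The market will be 4x today's size in 5 years" is just a claim that
# growth will average roughly 32% per year, uninterrupted, for 5 years:
print(project(1.0, 0.32, 5))  # ~4.0
```

Nothing about the future is actually known here; the output is only as good as the assumed rate, which is exactly the point.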

But what they have not done is describe the future in a factual way. Because they have not stepped out of a Time Machine.

As a counterpoint, I'd like to give some examples of the types of statements you can make about the future, with a straight face, and without the benefit of a Time Machine.

"It will be darker tonight than it was at noon."
"I live in Chicago and I can assure you it will be colder and darker, with shorter days, during the winter than it will be during the summer."

That's a smart and 99.999%+ reliable way of making believable statements about the future. (Notice I did not say 100%: perhaps there will be some celestial event or nuclear conflagration that night which lights up the sky enough that it's as bright as day. But on 99.9999%+ of the days experienced by all of humanity so far in the past, it appears, that has not been the case.)

No Time Machine needed. No crack pipe need be smoked.

Sunday, April 6, 2008

The Man Taylor Lives

Sad day, just heard that Charlton Heston died.

At first I was shocked, because I thought he had already passed away a few years ago.
I generally am not affected much when I hear a total stranger has died, especially an actor. But I grew up watching him in movies, and several of his characters were idols of mine, in various ways, as a kid: from Moses, to Ben-Hur, to astronaut George Taylor from Earth in the original movie Planet of the Apes. Some people in the supposedly "liberal" camp painted him in a negative light for some things he said in the last decade or so, but I think such criticisms are not important; plus, no man is perfect.

I do think that positive role models are something everybody needs in their lives. Especially in childhood. Plus, as children, we have a much greater need to fantasize, live in stories, and play out imaginary situations in our minds. I think that's a very necessary and healthy part of growing up. It may be a built-in behavior with a beneficial purpose, and so it needs to happen, or bad things will happen later in life.

Regardless, I have very strong memories of some of his movies, and his roles in them. Though I am not a religious person, I do like many of the central stories and myths in the Bible, and Moses and The Ten Commandments are right up there.
I'm also a big fan of science fiction and fantasy. And so he's in my pantheon for Planet of the Apes, The Omega Man, and Soylent Green.

One consolation is that the product of his work will live on. We'll always have his movies. Somewhere in the alternate dimension that exists when you're watching Planet of the Apes for the first time, the astronaut Taylor -- a man who speaks! -- is down on his knees on a beach on what he thought was an alien world, staring up at the ruins of the Statue of Liberty, with the terrible realization dawning, and he's yelling out his curse at the fate of his world, damning those who have ruined it. One of the best endings to any story, of all time. And also a powerful moment in movies and science fiction. Damn you all to hell, indeed.

Saturday, March 29, 2008

Magical DSL's

DSL's are great. But they're not magic. And they're not new.
They're everywhere, and have been for a long time.

HTML is a DSL.

Python is a DSL.

The Unix shell and command-line "wiring" syntax is a sort of DSL for the orchestration of program execution.

A program is a DSL. Take the 'ls' program, for example. A DSL? How so? Was it made with Ruby? It WASN'T made with Ruby?!?! Then how can 'ls' be a DSL? Easy. The DSL it provides is composed of its own name, 'ls', plus the set of invocation arguments it supports. So 'ls -la' means something different than 'ls -latr'. Those are just two of the many phrases expressible in this DSL. If it's a DSL, then what is the domain? The domain (the D in DSL) is a file system. The language (the L in DSL) is for expressing something about files on a file system. Or rather, for expressing some query you have about files on a file system. Either way, it's not magic. And not new.

The configuration file content read by your programs also acts as a DSL. The domain? The domain of your specific program's behavior, of course.

An API is a DSL. APIs have the rather useful but unsexy and seemingly overlooked quality of being a way to create DSL's out of what is otherwise a general-purpose programming language. So you start with a general language at the bottom (C, Java, Python, whatever), then on top of that build another language which is more specific to the problem you're trying to solve, or the idea you're trying to convey. That top layer is the API.
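To illustrate that layering, here's a hypothetical toy in Python (none of this corresponds to any real library): a tiny file-query API whose chained calls read like phrases in a little language for the file-system domain, while plain general-purpose Python sits underneath:

```python
import os

class Files:
    """A toy file-system query API: a mini-DSL layered on Python."""

    def __init__(self, path="."):
        self._path = path
        self._filters = []

    def larger_than(self, n_bytes):
        self._filters.append(lambda f: os.path.getsize(f) > n_bytes)
        return self  # returning self lets the calls chain like phrases

    def with_suffix(self, suffix):
        self._filters.append(lambda f: f.endswith(suffix))
        return self

    def names(self):
        out = []
        for name in os.listdir(self._path):
            full = os.path.join(self._path, name)
            if os.path.isfile(full) and all(t(full) for t in self._filters):
                out.append(name)
        return sorted(out)

# Reads almost like a sentence in the domain's language:
#   Files("/var/log").with_suffix(".log").larger_than(1024).names()
```

Nothing magical happened; the "language" is just method names chosen to fit the domain, exactly the point being made above.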

English is a DSL.
French is a DSL. Though the domain is almost exactly the same as English. Not exactly, but pretty close. There's no law that says you can't have multiple completely separate and distinct DSL's that exist, that all apply to the same domain. They just treat that domain a little bit differently, by having different qualities or strengths.

Algebraic notation is a DSL:
f(x) = x ^ 2
By the way, this DSL was created BEFORE Ruby. I know, I know... Matz, what do you say to that?!?!

Music notation is a DSL.

If there were a language for describing or casting magic spells that too would be a DSL. And it would be the only DSL that might justify being treated as if it were magical. :)

All of the above are DSL's.

If a DSL is any syntax or linguistic protocol used for doing something or expressing something, then why all the fuss? We've had these and known about them for years. I'm not talking a few years. Not even a decade. I'm talking thousands of years, perhaps millions, depending on exactly how inclusive you want the definition to be. I'm sure we could make a reasonable case for Egyptian hieroglyphics being a DSL, and the grunting of Neanderthal cavemen.

One last point. Yes, I know you like Ruby. And I know you like Lisp. I see the admirable qualities of both those languages, and they're shared by other general purpose programming languages as well. But please, don't think they enable you alone to make DSL's, or that you can only create them using those languages. It's been done without them. It's being done without them. It's been done before them. It'll be done again, without them, I suspect, after they're forgotten and we've all moved on to the next shiny language that perhaps hasn't been invented yet. It's been done. Been there. Did it. Got the T-shirt.

Wake up. Get over it. DSL's are not magical. And they're very old. Move along, folks. Nothing to see here. The next shiny hype-able meme is just around the corner, in the hallway to your left. Just follow the signs. But no flash pictures, please.

------------------

By the way, if you're a language geek you may want to check out a short piece of creative writing of mine, called The Suggestions System. Here's the link to it on my other blog:
http://grograma.blogspot.com/2007/05/suggestions-system.html

Monday, February 18, 2008

Lemonade Stand

One of the first experiences I had as a child with running a business was a lemonade stand.
The idea of a Lemonade Stand, as a sort of pattern or model for business, still sticks in my mind today. I think that's because it's both very simple and has all the key elements that make something a business enterprise and not just a work-for-hire, labor-for-a-paycheck situation.
It's very simple. You just make lemonade, set up a stand somewhere, offer it for a price, try to attract people, and try to make a profit on each sale while growing the volume of your business. There's no office politics. You don't need a resume. You just make something somebody wants, offer it for a price, let the world know about it, use the money to pay the bills and maybe have some left over, improve, rinse, repeat. That's pretty much the essence of business. I love it.

Now, as a practical option in today's world, for an adult to consider, it's not really a very good idea. But for kids, or as a textbook example, it's a really smart idea and a useful thing to study.
Here are some of the things that I think are bad about a lemonade stand, or certain kinds of problems or negatives associated with it, that make it less than ideal in the real world:
  1. lemonade is now a mass-produced commodity made by large companies with deep pockets and with lots of advertising
  2. you have to have a physical presence at the lemonade stand to run it. This hurts your ability to take a day off, or to scale up by adding more stands in other locations. It's not a total showstopper, because you can sneak around this requirement by hiring others to man the stands while you simply manage them. Then hire managers to manage the folks running the stands, and you oversee and direct at the highest level, and reap the profits. The problem is that the further you evolve in this direction, the closer you get to the entity we just described in point number one: you're now competing against a pre-existing large corporation that is already doing this, been there, done that
  3. your business model is fairly generic and can be easily copied by competitors or ambitious employees, UNLESS you have a truly unique recipe that tastes significantly better than the other vendors' AND you keep the recipe top secret AND there's no way to reverse-engineer it. All of which is highly unlikely, so, again, you're back to a business that can be easily copied.
So trying to get rich (or even break even) by starting a lemonade stand business is probably not a wise idea in today's world. But as a model for talking about business, for educating about business, or for introducing kids first-hand to what it's like to run a business, it's great.

Thursday, January 24, 2008

iPod Touch: Improvement Ideas

I love the iPod Touch and it's quickly become my favorite little mobile electronic gadget. Very useful. Very fun. Very cool looking. Sleek. Addicting. Lots of smart choices and design decisions went into it. Apple rocks once again.
But it ain't perfect. At least for me. So I'm documenting here a list of problems I've had, or ideas on how to improve it.
I have an iPod Touch 16GB model with the 1.1.3 firmware and the January 2008 Software Upgrade.

Safari
  1. a way to click on a link such that it opens in a new, separate tab/window from the current one. This ability is one of the killer features of modern desktop browsers such as Firefox. Actually it's what I do in both Firefox and Safari, on both my PC and my Mac: I frequently scan through a blog-type web page and click on any link that interests me in such a way that each opens in a new separate tab, without grabbing focus (they begin fetching into a background tab), and I continue reading the original page. Only once I'm done digesting the original page do I go look at the newly opened tabs. Since the fetching occurred in the background, I often don't have to wait once I switch to those tabs/windows: the page has finished loading and rendering, so I can begin reading immediately. This is a time saver and a huge convenience and user-experience win.
  2. hitting the Back arrow should not re-issue the previous request (a refresh) but rather just switch back to the rendering of the previous state. Currently you are forced to wait for a new request/response round trip to finish before you can see/use the previous page. This is unnecessary and annoying. It seems smarter to me that if a user truly wants to refresh/re-issue a page request, he simply hits the Refresh button. This would be more orthogonal and intuitive behavior, and more consistent with how browsers on desktop computers work. It could reduce battery drain by reducing WiFi antenna use, and bandwidth/traffic on your provider, in exchange for more memory use on your iPod Touch. ALSO, once it has returned to the previous page's view, it should be focused on the same spot as before, in terms of (x,y) position and zoom level.
  3. issue #2 compounds the problem caused by issue #1, and vice versa
  4. double-tapping sometimes does not zoom in as I expect. Instead it seems to bounce or reject my zoom request. I'm then forced to do the two-finger-spread-out motion, at the same spot, to make it zoom in. Not a huge issue, but a minor annoyance, as it would be easier and faster to do the quick double-tap. The two-finger spread makes more sense if I want to control precisely how much zoom in/out occurs; if I don't care and just want it bigger/closer, the double-tap would suffice. Overall the link/button tapping, and the page zooming & panning features of the browser UI, work great and are a joy to use. This is perhaps the only minor issue I can think of offhand.
  5. fix the crash bug(s). Safari crashes "to desktop" about once a day during use. At least the core GUI/OS stays up and I can usually get back to where I was, but it's annoying and unnecessary that I take that time and focus hit.
Music
  1. smart sound volume limiting. The problem right now is that I frequently have to fiddle with the volume when a new song starts playing, because the playback volume is either painfully loud or far too quiet. I've turned on the Volume Limit feature and tried setting that bar at different levels. No matter what I set it to, it seems to mess up the volume due to the logic it's applying. I've also played with the EQ and selected Loudness as the EQ profile. I did not notice any change, nor do I know what the heck that does. I may not have done enough research into this issue, so if this problem has an existing solution, somebody please let me know. I don't remember having this much trouble with music volume on my classic iPod (5th gen? 30GB, black, video). The rule I want the player to follow is something like: "By default, play the song at the default volume specified in the sound file. UNLESS the maximum volume the player would emit would exceed some arbitrary volume limit in the user options, in which case the max volume played should be precisely that limit level, and the rest of the sounds within that file should be automatically adjusted downward in volume by a proportional amount, ideally without distortion (but meeting this goal is not as important as the requirement to cap max volume). ALSO, if the highest-volume sound within a file is too low (too quiet), then dynamically adjust the volume of the entire sound file upward, maintaining proportions where possible, such that the highest-volume sound is no less than some minimum level specified in the user options." Both the hard minimum and hard maximum volume limits would be specifiable in the device Settings. The user could hear a sample tone, played at the volume level he's considering, so he can get a sense of how loud or quiet that level is. I don't see any reason why, via hardware and/or software, this behavior can't be enforced.
Maybe it's possible and I just haven't RTFM enough yet. Maybe it's a planned future upgrade.
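For what it's worth, the rule described above is simple to express in code. A sketch, using a made-up 0.0 to 1.0 amplitude scale and limit values of my own choosing (nothing Apple documents):

```python
def normalize_peak(samples, min_peak=0.3, max_peak=0.8):
    """Scale a track so its loudest sample lands within [min_peak, max_peak].

    samples: amplitudes in the 0.0-1.0 range. Every sample is scaled by
    the same factor, so relative dynamics within the track are preserved.
    """
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return samples  # silence: nothing to scale
    if peak > max_peak:
        factor = max_peak / peak   # too loud: scale down to the cap
    elif peak < min_peak:
        factor = min_peak / peak   # too quiet: scale up to the floor
    else:
        factor = 1.0               # already in the comfortable band
    return [s * factor for s in samples]
```

A real player would do this per-track at decode time (or read a pre-computed gain tag), but the core logic is no more than the above.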
YouTube
  1. specify an existing account on YouTube to use, so that Favorites and Subscriptions are the same or synchable.
  2. a way for the user to manually re-order Favorites, similar to how he can re-order Safari Bookmarks (unless the app is associated with a YT.com account, in which case, use the same order as used there.)
  3. smart sound volume limiting (just as I'd like to have in the Music app)
Notes
  1. a way to sync content with your desktop computer. Notes is a great tool for taking and reviewing notes when I'm out and about. I mostly use it for TODO lists, but sometimes also for recording brainstorms or design ideas. But then the content becomes somewhat "dead", because I can't easily access it from my desktop computer. The email option does not cut it, because it sends a one-way snapshot to the destination address. The data on the far side (the received email) cannot be modified there unless you're willing to create a fork, or lose all new changes (they won't propagate back to your iPod Touch, thereby getting out of sync with the version there.)
Weather
  1. 'update failed' should not cause the previous weather report to be blanked out
Stocks
  1. the graph time-period logic/plotting was done by a crack monkey. It literally hurt my brain trying to reconcile the number differences implied between the various period sizes; I saw so many contradictions and non-correlations. Things that should have overlapped didn't. Things that did overlap would report different values for the same time slice. Maybe your app isn't at fault and it's just blindly rendering the data it gets from the source feed. In which case, the data feeds are provided by a crack monkey.
Google Maps
  1. a way within Google Maps to make it show the location (a pin?) of every contact in your Contacts data that has an address. Since a contact can have an address, and Google Maps can show addresses, it makes sense that there should be lots of useful two-way connections between these two apps. What's good already? The fact that, starting in a Contact, I can click on an address and it takes me into Google Maps focused on that address. That's useful. It would also be useful if I could start with a particular location/address and then make it show (on the map, perhaps via pins) or list (in some scrollable report?) all contacts whose address is within X range, for example. This entry is more of a thinking-aloud, nice-to-have entry, not a description of any serious problem or bug, so it's no big deal to me if nothing like this ever gets added. Things like the non-cache-using Back button in Safari, or the auto-psychotic-text-clobbering feature of the Keyboard, are much bigger issues.
Flash
  1. add it! It's missing, and yet it's almost everywhere on the web, used in different roles. I understand what some of your reasons may be for not supporting it (1: to be anti-Adobe, 2: to encourage non-"proprietary" alternatives, 3: to encourage a proprietary Apple alternative, 4: preserve CPU, 5: preserve battery, .... other reasons?) but my web experience is worsened on a daily basis by not having Flash work on your device. You would get me to spend more of my web-use time on the iPod Touch rather than my desktop computer if you supported it. On a more personal note (though one that I'm sure lots of other people share), I have a business that provides Flash-based web applications. It sure would be nice if my customers, and I as the developer, could access these Flash apps on your device. It would increase my market, and increase the usefulness of your device to those folks. Seems like almost everybody would win. Again, unless one or more of the 5 example concerns I cited above weigh more heavily in your decision-making.
Keyboard
  1. an option to turn off the autofix/suggestions functionality. In many cases it's annoying and imposes makework on me, imposing a sort of "Microsoft Knows Best" anti-pattern of UI behavior. I often type using personal slang, industry-specific terminology or lingo, acronyms and abbreviations (to save typing), and I am frequently bitten by this autofix/suggest thing clobbering my input, changing it to something else, or otherwise requiring me to perform an additional input task (moving my finger from its original position over the keyboard up to the text body and tapping to dismiss the suggestion, then moving back to the keyboard again), thus unnecessarily slowing me down and annoying me. Fixing what is obviously or very likely a spelling error ('jsut' should be 'just' 99%+ of the time) is a good thing. Clobbering or distorting an acronym ('amd' becoming 'and') is bad. Especially in the Notes app, where it's perfectly acceptable if my saved note contains spelling errors, because I will still know what it means. If I could turn off this feature in Notes, and thus increase the number of spelling errors saved in my notes, but also experience less clobbering/distorting of what I really did intend to type, I would be happy with that tradeoff. If in my saved notes the app has clobbered or twisted my input into something from which I cannot "work back" or deduce what I intended to say, that is the fault of your software, not me. The autofix/suggest functionality may be a net win in certain apps/contexts (web browsing & form input, contacts, calendar) but a net loss in others, like Notes. And I'm sure it would vary by person. A perfectionist with great spelling who uses lots of lingo and abbreviations may hate this feature, whereas the opposite type of person may love it. Make it an option and let the user decide. Easy win!
General
  1. the ability to sync readable documents (such as PDF and TXT) from my desktop and read them on my iPod Touch anytime at will, even offline. Basically, do for them what the device already does for images and video. I'd love to have fiction and non-fiction books, as well as reference manuals and guides, on my iPod wherever I go. I could read a book or consult a manual whenever I needed to or had the time.
  2. Image sync fails on common formats, and fails mysteriously. When I sync images, some of the images on my desktop computer consistently fail to transfer, with no reason given. I just get a popup error message telling me one of the filenames that failed. (A JPEG file, in my case.) I wish it provided more detailed information on exactly which files failed to sync, and exactly why. Or, simply, it should "just work". If the reason is simply that the files are in an unrecognized or unsupported encoding, then just tell me that, with details, so I can try to rectify the situation by converting them on my own. I have a sneaking suspicion that I haven't dug into this issue enough yet, however, so it may be my fault for not doing enough RTFM. But it is one of the issues biting me, so I thought I'd list it for the sake of completeness.
  3. Copy-and-paste text between apps. So I can select a chunk of text on a web page, then switch over to the Notes app and paste that text into a notes document. Or into an entry in Contacts, or Calendar, etc. I realize the trick is dealing with modality and input commands, but there's probably a way to make it work. The 'move cursor with finger' functionality you provide in forms works pretty darn well, so perhaps text selection could work like that.
  4. Web page WiFi portals are too manual and crack-addled. Using AT&T WiFi account access to the Internet (the kind where you request anything in a browser, it redirects you to a web login page, you submit your WiFi login credentials, and after that all your web & internet access requests work) has two pain points, or opportunities for improvement. One: it would be nice to have some way to store (or have it remember) my login credentials on the iPod, so I don't have to manually type them in and submit a web form every time. And two: I notice that sometimes, when I am already logged in and accessing the Internet successfully, surfing the web, etc., and then temporarily turn off my iPod (a soft off, not a hard off) to visit the bathroom, for example, and then come back, sit down, turn it back on again, and try to access the internet, I find that it has de-authenticated me or otherwise forgotten that I had an authenticated session, and it redirects me to the login page and asks me to log in again. This is just stupid. They should be able to use cookies and timeout functionality to ensure that I continue to be authenticated and allowed for up to X minutes since I last logged in and/or issued a request on their network. Simply putting my iPod to sleep for a minute or two should not cause it to throw away my status. It feels almost as if some hard signal or explicit behavior was coded to make it do this. It's bad for users, so don't do it. If the justification was, "Well, since you turned off the device and then turned it back on again, the device doesn't know who is using it; it may be a different person than the one who originally typed in the login credentials," then that argument is bunk, because it didn't prevent the Mail app from having a totally automated and trusting auth process.
Plus, worst case, you could make the iPod, at the device level, require the user to re-login to the device itself, as a whole, if you were concerned about that. But even that should be a user option. If there's some subtle underlying technical reason why my device must lose its auth status with respect to my WiFi provider, then at least give me a way to automate the submission of my login credentials. I realize that part of the issue is the fact that it's a web-based auth system. But I'm pretty sure that's not a showstopper constraint. It's just software. There IS a way to make it work. A thumbnail solution is to automate that form submit: it's a simple HTTP request to a particular address (which could be communicated from the WiFi provider, or grabbed from user settings), using a username/password also stored in the iPod's user settings. Not hard to do.
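To show how thin that thumbnail solution is, here's a sketch using only Python's standard library. The URL and form field names are hypothetical; a real client would first scrape them from the redirect page the portal serves, and pull the credentials from the device's stored settings instead of making the user retype them:

```python
import urllib.parse
import urllib.request

def build_login_request(login_url, username, password):
    """Build the POST request a captive portal's login form would submit.

    The field names 'username' and 'password' are assumptions for the
    sake of illustration; each portal names its form fields differently.
    """
    form = urllib.parse.urlencode({
        "username": username,
        "password": password,
    }).encode("ascii")
    return urllib.request.Request(login_url, data=form, method="POST")

# Sending it is one call; the portal sets its session cookie in response:
#   urllib.request.urlopen(build_login_request(url, user, pw))
```

That's the entire "automation" the device would need: replay a stored form submission whenever the portal bounces you to its login page.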

Tuesday, January 8, 2008

Forkers

A forker is someone who unnecessarily forks a conversation or otherwise throws a kind of fork in the road. Forking is annoying and wasteful because it causes makework to be performed and imposes an additional burden of energy expenditure and brain gymnastics upon the victim of the forking. And it wastes time as well. Here's an example of a conversation in which a smart person named Joe encounters a forker named Bob. Put yourself in Joe's shoes as you read it, and I think you'll understand.

Joe: I'd like a cheeseburger to go, and that's it. Nothing else.
Bob: Ok. Would you like fries with that?
Joe: No. Just a cheeseburger.
Bob: Do you want onions on the cheeseburger?
Joe: I don't know. I don't care about onions either way. Whatever it is by default.
Bob: Ok. But onions will cost you extra. They're not on the burger by default. So are you really sure you want onions on it?
Joe: I never said to put onions on it. I said I didn't care. Just do whatever it is by default.
Bob: Onions it is. Will there be anything else?
Joe: I said that in the very first statement I made to you. It hasn't changed since.
Bob: Sheesh! Sorry for asking. Don't be so prickly......Oh, and will that be for here or to go?
Joe: Nevermind. Cancel it. I'm leaving. *walks off*
Bob: (to himself) What a crazy customer.


That's a bit of an extreme case, but I think it illustrates the idea. Poor Joe has a brain and clearly said what he wanted at the very beginning, and idiot Bob, for whatever reason, kept unnecessarily forking and extending the conversation much longer than needed, wasting both of their time.

As a counter-example, let me replay the encounter above but this time replace Bob with a smarter, non-forking colleague.

Joe: I'd like a cheeseburger to go, and that's it. Nothing else.
Steve: One cheeseburger to go, that's it. The total is $1.57.
Joe hands him the money.

Do you see how much smarter and more efficient that was, for everybody involved? Joe wins. Steve wins. The burger restaurant wins. The economy wins. Civilization wins. Both Joe and Steve have more time and energy to deal with other things in life, and are also less stressed after the encounter. And I believe that engaging in an intelligent, efficient exchange reinforces in both their brains the execution of that behavior, so it's more likely to happen in the future as well. An ideal situation. The key difference is that a forker was replaced with a non-forker.

To be fair, I don't know for sure that Bob was an idiot. It may just have been that he had some inane company rules or "customer interface script" he was required to follow, and that was the cause of his seemingly inane questions and cluelessness. Or perhaps the background noise level was too loud and he didn't hear Joe's statement. (But if that was true, Bob should have immediately told Joe, and Joe could have repeated it once, and no subsequent forking would have been justified.) If so, that sucks for him. But regardless, it was forking, and it's bad, and makes the world a worse place, so he should not have done it. I wanted to specifically point this out because this article isn't about saying "dumb is bad", which is rather obvious, but that forking is bad. And forking might be caused by seemingly smart (or at least, non-idiotic) people and processes.

Here's an example of forking that didn't involve a Bob-type person.

Joe is driving along a road one day.
Up ahead he sees that the road he's on forks into two distinct paths, one to the left and one to the right, though they both continue heading generally in the same direction as before (north, in this case). He's forced to choose which path to take, so as not to run off the road entirely. He quickly chooses the right path. (Ha.) As he drives along the right path, he glances off to his left to try to see exactly what it was that necessitated having the fork in the road. He sees no reason for it. There's just a patch of empty ground in between the two forked routes. Eventually, both forked routes converge back together again.

Why would every driver be forced to choose which fork to take in the above example? What purpose did it serve? There didn't seem to be any. All the "designers" of that road have done is cause a lot more time/energy/thought to be spent by drivers (maybe only a little per driver, but multiply it by millions of drivers on that road segment each year!), and they have probably also introduced a new source of accidents that would not have existed otherwise. Therefore, this fork is stupid.

Forking can happen in everyday, casual conversation between people as well. It's not limited to situations where one person is a prisoner of some business process or rulebook. That said, there are probably exceptional cases where forking is not entirely bad, where forking may have a beneficial side effect which offsets its inherent negative aspect. For example, imagine a hostage situation where a police negotiator is talking to a kidnapper and wants to stretch out the conversation as long as possible, to help keep the kidnapper cool, thinking, and somewhat under control, possibly while a SWAT team moves in behind the building to launch a surprise assault. In that case, it would be smart for the negotiator to fork as much as possible, to draw out the conversation. The good outweighs the bad. But this is an exceptional case, not the norm. As a rule, don't do it.

Don't be a forker.

Sunday, January 6, 2008

The Principle of Normal Use

I've developed an idea over the years that I call The Principle of Normal Use, or PONU for short. It's an idea I use in evaluating the quality or appropriateness of the design of something. Particularly of things you come into contact with every day, like furniture, appliances, tools, containers, doors, architecture, and vehicles.

The principle is that a thing should work well in normal use by a normal, reasonable person. To work well means that it shouldn't do things like leak on the floor, cut you, hit you, be too heavy or too difficult to grasp, be too low (close to the ground/floor) or too high (out of reach of a person of average height and arm length), or be too hot or too cold; it should have reasonable default states or forms, and so on. The list of qualities it should have is fairly long and depends on the exact thing in question. A door has certain ideal qualities that are irrelevant or don't make sense for a hand tool, and vice versa. It takes experience and a little insight to learn what works and what doesn't, what is smart and what is dumb with respect to any given category of thing, but when a thing violates PONU you will know it: you'll bang your head, get wet, slip, trip, type twice as much as you really needed to, be annoyed, exhausted or inconvenienced for no good reason, and you might end up cursing its so-called designer.

If a thing violates PONU due to a fundamental decision made during the thing's design phase, I call it a design-time PONU failure, or just DTPF or DTF.

Sometimes a thing seems to have been designed correctly, but when it came time to implement, build or install it, the people who did it made a mistake and deviated from the design. They may have done so because they were less intelligent, or weren't aware of the reasons for the original design, or were rushed, or chose to optimize away certain qualities in favor of reducing cost, for example. But regardless of the excuse, the result violates PONU, and thus they have created a worse user experience, and worsened the human condition.

Here are some examples of everyday things that clearly violate PONU:

A pipe that leaks when water passes through it.

A computer that crashes after a few hours of simple word processing tasks. (MS Windows?)

A high-traffic public building in a cold windy city that has a single door to the outside (rather than a double door or revolving door), and it's winter, and the door-closing-spring-mechanism thing is broken. And so every time a person passes through it, a gust of cold winter air rushes into the building, hitting the poor folks sitting inside trying to stay warm, until somebody gets up and manually closes the door again. And the cycle repeats every minute or so with each new person to pass through. (This is an example of both a design-time failure and a maintenance-time failure. But regardless, it violates PONU.) There is a door in a downtown Chicago train station that's been doing this for the entire time I've lived in the area, which has been 2+ years. So clearly either the designer or the maintainer of the facility is an idiot, in my judgment. It would be trivial to fix.

I believe all good ideas are eventually discovered by many people independently, and so I'm sure the principle I've described here has one or more "official" terms, probably in the areas of architecture or usability design. I wouldn't be surprised. But from my perspective I did develop this idea and term a long time ago, through personal observation and analysis, so I thought I'd blog about it here to share with others who may not have stumbled upon it before.

Saturday, January 5, 2008

Focus Ambush

Software in a GUI desktop mode should not leap into the foreground.

I'm in the middle of typing something and SUDDENLY MY INPUT EVENTS ARE SENT to a completely different app than intended.
This causes bad things to happen, at worst. (Delete something? Agree to something I did not want to agree to?) And at best, it causes me to waste time, and to have to redo work I've already done. Or fix things that I should not have to fix.

Or....I'm in the middle of looking at something or reading something and SUDDENLY IT DISAPPEARS OR IS COVERED OVER by something completely different. Now I've lost my place, my focus, my rhythm. And you've increased my level of stress or discomfort. And I'm going to have to waste time restoring the desktop to its previous state. And get a little older and closer to death, and have nothing positive to show for that extra chunk of time+energy you've caused me to throw away. Dumbasses.

There may be situations where it is absolutely and critically important that you do that. An emergency. Life or death is on the line, for example. If that is the situation, then yes, by all means, do it. But if that is not the case, then do not do that. Will my computer EXPLODE IN 5 SECONDS UNLESS I CLICK A CERTAIN BUTTON BEFORE IT'S TOO LATE?!?!?! No? Then don't leap into the foreground. Dumbasses.

I'm looking at you, Microsoft. You are the Poster Boy for this behavior. There are others that do it too, but your company's software has done it to me many times over the years, much more so than any other software or company. And due to the number of installations you have, I can conclude you've been doing it to millions of other people, probably billions of times, all together. All I can say is that, gee, it's 2007 now folks. There may have been an excuse for ignorance back in 1980. But it's 2007 now. Dumbasses.

If the hardware, operating system, desktop GUI and end-user applications are architected such that it's possible for the user to multi-task, then by all means, yes, allow them to multi-task. And do not unnecessarily disrupt or annoy them. Let me give one particular example of a real-world use case where I have a legitimate desire to do something on the computer without being bitten by Focus Ambush.

To make good use of my time, I may tell the OS/GUI to launch an application that I know I will want to use or take a look at in the near future. But I also know that it will take a while for it to make itself ready for my use. So, in the meantime, rather than wait around sitting on my thumbs and wasting time, it would be smart if I did something else useful. Like read something. Use a different tool or application. BUT THEN SUDDENLY your app LEAPS into the foreground, stealing my visual focus and input events. There was no excuse for that. And the problem is worse when I have several apps running or in the middle of launching.

Yes, I know that I launched your app. I told it to start. But you know what? Your OS/GUI also knows that I am doing something else or looking at something else on the screen in the meantime. If you (by "you" I mean the OS/GUI) know that you are showing me something on the screen right now, and/or that I'm in the middle of typing something (because, say, you have a record of input events arriving from the keyboard/mouse within the last X seconds), then it is reasonable for you to assume that if you were to push a new app/window into the I/O foreground, you could cause Focus Ambush. That's a reasonable prediction for you to make, as the OS/GUI (or rather, as the writer of that OS/GUI).

Therefore, a smarter alternative is to provide some other, less intrusive indicator of the fact that your app/window is "ready" or wants my attention. There are lots of ways of doing this; just pick one and go with it. Make a little "hopping" icon near the bottom of the screen, for example. Or play a gentle chirping sound. Or flash. Once, repeating, whatever; even make it user-configurable. Something. Anything like that would be better than leaping onto the screen, covering some pre-existing application or document, and stealing input focus. That is so obviously a bad thing.
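Here's a tiny sketch, in Python, of the decision logic I'm describing. (The class and method names here are purely illustrative, not any real windowing API; the 5-second window is just a stand-in for the "last X seconds" above.)

```python
import time

RECENT_INPUT_WINDOW = 5.0  # seconds; the "X" from the discussion above

class FocusPolicy:
    """Decide whether a newly-ready app may grab the foreground."""

    def __init__(self):
        self.last_input_time = 0.0

    def on_user_input(self, now=None):
        """Record keyboard/mouse activity as it arrives."""
        self.last_input_time = now if now is not None else time.time()

    def app_ready(self, now=None):
        """An app finished launching; how should the GUI announce it?"""
        now = now if now is not None else time.time()
        if now - self.last_input_time < RECENT_INPUT_WINDOW:
            return "notify"  # user is busy: flash taskbar, hop icon, or chirp
        return "raise"       # user is idle: safe to bring to the foreground
```

The whole policy is a couple of timestamps and one comparison, which is part of my point: there's no technical excuse for not doing it.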
At least the good news is that I have seen cases in modern apps and desktops where the GUI did do The Right Thing, sometimes. (I've gotten apps on Windows to flash themselves in the taskbar at the bottom of the screen, and had apps chirp and hop on the Mac. Kudos to both. But it's not consistent, and not everybody does it, all the time.)

When I say obvious, I mean it is obvious to anyone who has actually used a computer, as an end-user or consumer. If all you did was design/build/code your computer/OS/GUI/apps WITHOUT having actually USED it, then yes, you may not have learned that this makes for a bad user experience. But it's like having a factory which produces cartons of some beverage, and the "mouth" of the carton, that a person would put their lips against to drink from, has razor blades in it. Ouch. If all you did was design the factory and the carton, manufacture them, ship them, give a little CEO puff speech, advertise, do the accounting, etc., you may never realize just what a bad user experience you've caused. But the instant you try to drink from your carton the first time, ouch, you learn. All I expect is that you actually USE the thing you make. If you make food, eat it. If you make guns, shoot them. Chairs, sit on them. If you don't, you're a dumbass.

This behavior is understandable and acceptable in cats. They're just animals. If you're typing on the keyboard and suddenly a cat leaps onto it, disrupting your work, you have to accept it as The Cost of Having A Cat, because they're just "dumb" animals. We make the software. We did not make the cats. The cats are not our fault. The software is our fault. Humans are not (or rather, don't have to be) "dumb" animals. And software is just a set of rules or steps, written by humans, so we don't have to make it in such a way that it is obviously behaving like a dumb animal. If we do (if you do, whoever you are that made the software do this), then you, my friend, are a dumb animal.
A dumbass. Again, excusable back in say 1980 (to pick a year in the past at random, but it's arbitrary). But it's 2007 now, folks. Wake up. Read the memo. The memo arrived QUITE a while back. It's there in your Inbox, if you only care to open your eyes. Dumbasses.

By the way, one of the reasons why I prefer to use the CLI/Terminal mode when Doing Real Work, rather than the GUI/Desktop mode, is that I know that I'll be almost entirely immune to Focus Ambush. It's not completely impossible to cause it in CLI/Terminal mode, but in practice, in most situations, it just doesn't happen. And that's great. You can work with a sort of serial, deterministic frame of mind. Or, you can also do things "in parallel" (by firing off background tasks, or working with several terminals or shells open concurrently, spread across your desktop, each doing or showing something different, but all still useful and relevant), but even when you do tasks in parallel, in a CLI/Terminal mode, it's more controllable, predictable, flexible, and focus-friendly than in the GUI/Desktop paradigm. Not all GUI/Desktop experiences are equal, though, and some are more prone to Focus Ambush than others (Microsoft is more prone than Apple; older OS's/GUI's/apps by anybody are more likely to do it than more modern ones), but in general, choosing CLI/Terminal over GUI/Desktop is the smarter bet if you want to avoid Focus Ambush.

One drawback to CLI/Terminal mode, however, is that the type of work you can do with it is usually better suited to, or more biased towards, the work of a programmer or other computer techie. If you're an artist, and therefore you need to see and interact with a 2D or 3D graphical representation of your subject or medium, then you may be stuck having to be in a GUI/Desktop mode most or even all of the time. But other than those two cases, which are probably the extreme opposites, there are many other cases nearer the middle of the spectrum, in terms of their needs, that might be decently served using the CLI/Terminal mode. For tasks like accounting, heck, it might even be better!
(I imagine using a combination of say Terminal, bash, vi and python to see, edit, calculate, re-calculate numbers and data sets, and perform data administration or management tasks. Could you do everything you needed to do in that environment with that tool set and paradigm? I dunno. Maybe not everything, it may be a compromise, but I bet you could do a lot. And at least you'd be insulated from Focus Ambush and the other ills of the GUI/Desktop approach. Like all the cognitive distractions and visual junk food that fills the screen. Think meaningless symbolic icons up the wazoo. Think advertisements. Purty colors. Animations for the sake of animations. But these latter things are not the subject of this article. Maybe another day.)
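To give a small taste of the Terminal+Python accounting workflow imagined above, here's a sketch that totals up a ledger kept as plain CSV text, the kind of file you could happily edit in vi. (The date,category,amount format is just something I made up for the example, not any standard ledger format.)

```python
import csv
import io

def total_by_category(ledger_text):
    """Return {category: total} for a CSV ledger of date,category,amount rows."""
    totals = {}
    for date, category, amount in csv.reader(io.StringIO(ledger_text)):
        totals[category] = totals.get(category, 0.0) + float(amount)
    return totals
```

Pipe a file into a script like that from bash, and you've got re-calculation on demand with no windows leaping anywhere.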

Focus Ambush is bad.
Don't do it unless absolutely necessary.
Where necessary means something like a case where someone will die, or something will explode, if you do not do Focus Ambush.

Grograma Delta: Purpose

I've started this new blog to serve as a place to post content that is both NOT related to my computer game startup, and NOT utter fiction or random silliness. Which basically means essays or posts about serious subjects. I had been posting a few items of the latter type on my fiction blog, Grograma Illusions, and it felt a little wrong to continue doing that.

Mike