Biddly Boop Technology

I relate this tale whenever I want to explain to someone how technology is viewed by the masses. It all started at a Safeway supermarket in Rose Hill.

I was in line at an "Express" checkout. I forget what I was buying, but my son was very young and antsy, and I was anxious to get moving. Following Murphy's Theorem that the line you choose will always be the slowest, I ended up in a line behind an elderly lady paying by check. Safeway, as many food stores are wont to do, issues "Check Cashing Cards": you apply for one, and they mail you a card that lets you write checks at their supermarket. It's a security measure for them, and you can't write a check without it.

This elderly lady wrote her check, and like too many people, she didn't start writing it until the sale was done. Proper etiquette in check writing is to fill in everything you already know in advance, so that when the total comes up, you just write it in and go. Well, not only was this lady slow in writing her check, she also ignored the cashier's request for her check cashing card. When the lady was finished, the cashier asked for the card again, and a frustrating dialogue ensued.

"Do you have a check cashing card?" asked the cashier for about the third time.

The lady just smiled, and held the check out for the cashier. The cashier took the check, and repeated the question. The lady smiled, and nodded her head. The cashier asked for the card again, and the lady just smiled. An awkward pause settled in. The cashier realized that they were not communicating.

"In order to accept your check, you need to have a Safeway Check Cashing Card. Do you have one?"

The lady just pointed to the check, and nodded. The woman in line ahead of me let out a frustrated growl. The cashier said she could not accept the check without a card, and the lady said a phrase I will never forget:

"Can't you just look me up in that biddly-boop computer there?"
"Can't you just look me up in that biddly-boop computer there?" she asked, pointing to the cash register, and moving her fingers.

The Biddly-Boop Computer. That name has stuck with me all these years. I began to understand how people like her view technology: mysterious and magical devices from unknown crafters. An oracle of God. I take for granted many of the technological wonders around me, and many of you readers probably do, too. Imagine how wondrous and/or frightening the world must seem to those who do not understand technology. To this lady, computers were as mysterious as a crystal ball, capable of all-knowing, all-seeing prophecies about anyone's identity.

And how she described it was perfect. The Biddly-Boop Computer. To this lady, the technology had surpassed her ability to define it in any terms but the most basic, guttural sounds such a mystical object would make. Her world of technological devices was lumped into the realm of science fiction, where omnipotent computers, as displayed on TV, are huge cubical beings with dozens of flashing lights that speak in haunting monotone voices. To this lady, computers not only think, but talk, and know everything with their biddly-boop electronic brains, created by scientists in basement labs behind giant vault-like doors. I am sure she has long since regarded the electronic beep, no matter the device, as the language of such beasts.

And how does she view the people who use such devices? The masters of the slaves, slaves which were still more powerful than herself? She pointed to the register, and mimicked the actions of such computer ringmasters as best she could. The keyboard is not regarded as a mundane input device as you and I might see it, but as a mysterious apparatus used by the wizards to craft magic from the biddly-boop technology. Why, if a giant robot made of silver cylinders reminiscent of 1950's science fiction serials walked stiffly from a hidden panel in the wall, and fixed her problem right on the spot with a giant wrench made of flashing lights, she probably would have thought nothing of it.

And who is to blame for this view? Is she to blame for not keeping up with the times, for not realizing that new standards and procedures for check writing cannot be fixed by biddly-boop technology? Possibly. Is the media to blame, for portraying computers and robots as masterful beasts instead of the dumb slaves that they really are? Possibly. Are we to blame, as a society, for not keeping people like this educated, for leaving them behind on the road of technology while we drive a sports car and she is still in a horse and buggy? Possibly that, too. I say it's a combination of factors, some I have mentioned, and some I probably won't even recognize until I reach such an age myself.

Is this to say all old folks are helpless and stupid when it comes to technology, that they are forever doomed to have VCRs that blink 12:00 all the time? Certainly not! I have known many people over the age of 70 who could run rings around me with technical knowledge and prowess I can only hope to achieve someday. I am sure older people are as tired of being called inept fools as young people are of being known as slackers. I think it is all mindset.

When I teach people how to use a computer, I have to be ever mindful of what I call "the wall." The wall is a glass shield that goes up when you can't absorb any more information. Everyone has one. In some people it's a faraway look, a glazed stare, or a panicked and frustrated expression. Some people's walls go up the second you start to teach them technology; it's like a phobia. I have had several exchanges that go like this:

Me: So, this is your desktop. It is the wide green space you see on the screen.
He: Uh huh…
Me: The desktop is like a desktop at work. You put all your most used stuff on it.
He: Okay…
Me: The Windows desktop has several icons that let you get to your stuff. The first one is "My Computer." It shows what is on the computer you are sitting at. The next one is "Network Neighborhood." It shows other computers you can connect to. Then there is your "Recycle Bin," which is where you put things you want to get rid of. Then there is "The Internet," which connects you to the Internet. You follow so far?
He: Uh, whatever…
Me: What don't you understand so far?
He: I don't know…
Me: Where did I lose you?
He: It's just that… what desktop? I don't see it, and these pictures, why is "Network Neighborhood" two TV sets, and the Internet spelled with a giant blue "E"?? Why do they have to make this so stupid, anyhow?

The wall has gone up. Depending on the student's self-confidence, he or she may have different reactions. Some become angry; some just become quiet. Either way, the wall has gone up because they believe that computing is truly hard and reserved for mystical gurus. What we as teachers must understand is that "computer common sense" to us, like scrollbars, windows, and double-clicking, is not common sense to computer beginners. And those who feel they have been left behind in the information age may need to be reassured that they can learn this if they believe they can. I often allay people's fears with this statement before I begin teaching:

"Remember, computers are dumb. Really dumb..."
"Remember, computers are dumb. Really dumb. They will do nothing unless you tell them to. They have no intuition, they don't know what you want. You have to tell them, and they don't speak people speak, so you have dumb yourself down to speak to them. This class is going to teach you to talk stupid to machines."

Thank God for GUIs (Graphical User Interfaces). The icon/windows/mouse idea that Xerox came up with (yes, Xerox, not Microsoft, and not Apple, see below) in 1974 has been a virtual boon for computer users. It is one step closer to having a computer that actually understands the complexity of a human. We have had millions and millions of years of evolution in a relatively complex environment, and we are fine-tuned to it. Digital computers have only been around for about half a century (not counting mechanical aids like the slide rule or abacus), and they work in a simplified, logical, yes/no, true/false kind of environment. The advantage is that you can simplify everything. The disadvantage is that simplification multiplies the number of instructions. So a simple command for us like "get apple off of table" would have to be stated as "locate table, identify top, scan objects on top of table, identify objects and compare them to a known apple, when apple is identified, define position, determine distance from position, travel distance, extend arm, keep balance with arm movement, retrieve apple." Actually, each one of those steps has hundreds of sub-steps; just extending your arm involves dozens of muscles, hand-to-eye coordination, body balance, adjustments to heart rate and breathing, and so on. If we had to consciously spell out everything just to get an apple off the table, we'd never have evolved past DNA replication. That's why we don't have walking, talking computers that are indistinguishable from you or me. Yet.
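
If you want to see just how much spelling-out that takes, here is a toy sketch in Python, purely my own illustration (the room, the tabletop, and every name in it are invented for the example), of what "get apple off of table" starts to look like once each step has to be made explicit:

    # A toy sketch: "get apple off of table" spelled out as the kind of
    # explicit, step-by-step instructions a computer actually needs.
    # Everything here (the room, the names) is invented for illustration.

    room = {
        "table": ["book", "apple", "coffee cup"],  # objects sitting on the tabletop
        "chair": [],
    }

    def get_apple_off_table(room):
        # Step 1: locate the table.
        if "table" not in room:
            raise LookupError("no table in this room")
        tabletop = room["table"]

        # Step 2: scan the objects on top and compare each one to a known apple.
        for position, thing in enumerate(tabletop):
            if thing == "apple":
                # Step 3: note the position, "travel" there, and retrieve it.
                return tabletop.pop(position)

        raise LookupError("no apple on the table")

    print(get_apple_off_table(room))  # prints: apple

And that is the friendly, oversimplified version; a real robot would still need the hundreds of sub-steps for balance, vision, and grip hiding behind each of those comments.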

"In the 1970's, in order to just talk to a computer, you had to possess a certain degree of skills which needed accurate typing, an understanding of logic, patience, and a lot of disappointment."
When I started learning about computers in the 1970's, it was a different world. It was a world of geeks and people who thought logically, which, as anyone will tell you, is a social handicap. If you told someone you were a computer engineer, it was a mystical profession that few understood. Many people assumed that you worked with equipment that beeped, flashed lights, spun reels, and spit out paper from noisy printers. And for the most part, they were right, although unlike the computers on TV, ours did not talk to us in monotone female voices. Pity. In the 1970's, just to talk to a computer you had to possess a certain set of skills: accurate typing, an understanding of logic, patience, and a high tolerance for disappointment. You really had to be smart, or at least have the capability to memorize weird commands like "cd home slash et-see, more file pipe grep data greater than endfile." And woe befell those who typed a capital letter when they really wanted a lowercase one. There was a time when we didn't even have screens; we sat at a printer that churned out about 10 lines a minute to see if what we typed was what we wanted. And how furious we were when the system did what we asked instead of what we wanted! User friendly was not even an option. You could ruin years of data with one mistyped command and never know it until someone asked, "Hey, what happened to 1972's satellite research analysis?" Data lived on flimsy paper cards, or on fragile magnetic tape with reels the size of tires.

But things started to change. As circuits were designed smarter, computers could be made smaller. Then home computers started popping up everywhere. Before that, a computer in your home was a rarity left to strange hobbyists, who often built their own. There were few standards, everything was proprietary, and a computer built by one company rarely, if ever, could talk to a computer built by someone else (see below). But now there were a few standards, like ASCII characters (so what one computer typed could be read by another) and standardized data formats (like floppy disks). Apple broke new ground, and soon a lot of other companies started in on the home market. But really, until the late 1980's, a computer was something few people had. For one, there were too many companies producing too many things that only worked with other things made by the same company. There was no open architecture. Then came the 8088 XT, which paved the way for open architecture, standardized data, and upgradability. You could now swap out parts, even parts made by a different company than the one that made your computer. But the operating system on these machines was still a pile of cryptic commands that took several college classes just to use at a basic level.

Remember that icon/windows/mouse idea I mentioned Xerox developed in 1974? They pretty much dropped it, because it took too much memory and other resources. Apple snapped it up in the 1980's and created the Macintosh. Now there was an operating system geared not to the programmer with years of college, but to the average user who knew how to fit tab A into slot B. Instead of memorizing commands to move a file, like "copy C:\files\text\foo.txt C:\files\backup\foobak.txt" followed by "del C:\files\text\foo.txt," you could simply click and drag the file from one folder to another. This changed everything, and Apple might have dominated the market, but they failed on three major points: open architecture, the rights to clone the technology, and adequate developer support. Then they failed to innovate properly, and while they weren't looking, the AT market snuck up behind them with the systems we now call "PCs." The PC makers realized that the DOS idea was getting old, so they copied the Mac's system… sort of, and created Windows. It was really nothing more than a shell that typed commands for you, and it took three major revisions to become stable enough for common use, but it was cheap. Okay, cheaper than a Mac at least, which by now had so many models that when you needed to upgrade, you had to buy a whole new system instead of swapping peripheral cards. Because the PC's hardware was non-proprietary, you could customize your system to your own needs. Salesmen could sell different models, with the ability to upgrade a system when the customer's needs changed without trashing all the old stuff. This made things cheaper and more accessible to the customer, who really didn't have to know much to operate one.

So nowadays, people who just 20 years ago wouldn't have dreamed of sitting in front of a computer doing anything useful are now creating web pages and generally increasing their knowledge with the global database we call the Internet. But I hear the old programmers whine and bitch. They liked being gurus. They liked being able to assume that the guy who sent them e-mail knew what a diode was, and why reversing one on a circuit board was a bad idea. They liked the wizard robes they donned, and the mysterious powers they held over the biddly-boop technology. It reminds me of two old Jewish guys in a retirement home:

"These kids today don't appreciate the Holocaust…" says Saul. "They go about their daily business without a care that millions of Jews were killed." His friend Herschel turns to him with a confused stare. "They shouldn't have to, Saul. That's what we fought for, so they wouldn't know first hand the horrors of genocide. Are you saying we should expose children to the horrors of the Holocaust first hand, so they can jolt upright from a bad dream of being persecuted, hiding in attics, and burned alive?"

I hear a lot of computer programmers complaining about the simplification of technology, making fun of Windows, and expressing their general disgruntlement at people who don't know how to grep in Unix. What I want to ask is: why should they? What's wrong with technology that grandma can master? Isn't that what we have fought for? Technology now is better than it has ever been. It's easier, it's cheaper, and it has certainly made my life easier. Computers are finally becoming tools we can actually use and understand. The computer is becoming as common in households as the television, and it promises a better and brighter future for everyone. The mystery of biddly-boop technology is fading away as we become the ringmasters of the circuits.

But some people will always be resistant to change. When I think of the poor old lady in line, who trusted the technology to fix whatever was going wrong, or of someone born in the 1980's asking me why floppies are drive A, hard drives are C, and whatever happened to B (see note below), I know mysteries will always lurk, whether because of missed opportunity, lack of resources, or just plain stubbornness. I think it is our responsibility to help those who can be helped, and to learn patience while teaching them.


Footnotes

[Photo: The Alto - the world's first WYSIWYG editor, commercial mouse, graphical user interface (GUI), and bit-mapped display.]

Xerox's Loss: The GUI had its roots in the 1950s but was not developed until the early 1970's at Xerox's labs at PARC. And just what does Xerox have to say about the interface Apple stole from them? Well, they didn't say much; their research department is famous for developing new technology and then abandoning it. But they did have a good chuckle when Microsoft stole the idea from Apple. When Steve Jobs of Apple accused Bill Gates of Microsoft of stealing the GUI from Apple and using it in Windows 1.0, Gates fired back: "No, Steve, I think it's more like we both have a rich neighbor named Xerox, and you broke in to steal the TV set, and you found out I'd been there first, and you said, 'Hey, that's no fair! I wanted to steal the TV set!'" (And while we're at it, AT&T owns the patent on the flashing cursor, but they haven't seen a dime of royalties on that, either.)

[Photo: The wonderful world of proprietary machines.]

First Home Computers: Oh, it was awful. Some computers used cartridges, some used floppy drives, and some used cassette decks to save things, if they saved them at all! Most of the companies that made computers were marketing departments or game companies. And frankly, if it wasn't a computer game, you needed a degree to figure out how to get things to work. Processing speed was so slow, you'd be better off doing your finances on paper. Many programs came on more than one disk, so you constantly had to swap floppies around to do different things. Technical support? Hah! Good luck, buddy. That's where user groups came into being, but that only paid off in major cities, where, if you were lucky, enough people used the same software you did AND knew user groups existed to be of any help. This was in the days before BBS's, and the Internet was only being used by the military and some universities. And NOTHING could speak to another system; in some cases, machines couldn't even speak to their own kind unless the second system was an exact duplicate of the one you had.

[Photo: Drives and their letters.]

Drive Letters: Just so you know, I do get asked this a lot. When the floppy drive first came out, it was the only game in town. Then some systems had two floppy drives for copying purposes, so one was named A and the other was named B. Then after a few years came the hard drive, which became C, then the CD-ROM, which became D, and so on… but people don't need two floppy drives anymore, so you don't see B drives much these days. In the Mac OS, they are simply named Hard Drive and Floppy Drive. To programmers, the floppy is /dev/fd0 (/dev/fd1 for a second drive), and it only gets worse from there, trust me.

