Comments from Frankston, Reed, and Friends

Friday, March 21, 2003

DanB at 3:55 PM:

Quality of Service is in the ear of the beholder

The other day I was driving along, talking on my cell phone, when I went through the main cellular dead zone on the way to work. "I assumed you lost signal" was the nonchalant response when I called back. Dropped calls and poor coverage are a fact of life with cell phones in the USA.

This got me to thinking about "Quality of Service" (QOS). QOS is often given as a reason why Voice Over IP (VoIP) isn't ready for widespread use. It's also given as the reason all sorts of "special" features have to be added to Internet connectivity, breaking the End-to-End Argument. Surely, I hear, you must treat voice traffic as "special". People won't accept it without guaranteeing the "quality". Or will they?

Bob Frankston has been using the Vonage VoIP system for the last month or two. He usually uses it from his home, connected through his cable modem (along with all his other Internet traffic). He's also used it from conferences far away, through the free Internet connectivity provided to attendees. I know from my own experience that cable Internet connectivity has its ups and downs, with uneven response at times. Despite this, talking to Bob on the phone when he's connected through Vonage has been fine. The sound quality is great, and the dropouts are no more bothersome than talking on a cell phone while moving around (he does get one-second and ten-second silences sometimes). That's over today's run-of-the-mill IP connectivity, with no QOS. When I remember what IP connectivity was like from Bob's home a few years ago, with slow ISDN through slow routers, using Mosaic for browsing, I know how much better it will be in the near future. But that doesn't matter: it's already good enough to meet the "standards" of cell phones. The dropouts and delays are no worse than those of a system that has drawn a large portion of the population away from depending upon wired phones and become one of the bright spots in US telecom. (Sitting on a subway train in Atlanta, I see that people don't even comment when they redial after dropping a call in a short tunnel.)

So, I guess QOS as measured by many telecom people isn't so important after all. At least it's not so important as to be worth messing with the simple design of IP connectivity and hobbling future advances and experimentation.
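To make "QOS as measured" concrete: telecom engineers often boil delay and packet loss down to a single "R-factor" and then a Mean Opinion Score (MOS) using ITU-T's E-model. Here's a rough sketch in Python, following the simplified form published by Cole and Rosenbluth; the coefficients and the sample delay/loss figures are illustrative, not measurements of Vonage or of anyone's cable modem:

    import math

    def r_factor(one_way_delay_ms, packet_loss):
        """Estimate the E-model R-factor from one-way delay (ms) and loss (0..1)."""
        r = 93.2                       # rating with no impairments
        r -= 0.024 * one_way_delay_ms  # delay impairment, gentle at first...
        if one_way_delay_ms > 177.3:   # ...steeper beyond roughly 177 ms
            r -= 0.11 * (one_way_delay_ms - 177.3)
        r -= 30.0 * math.log(1.0 + 15.0 * packet_loss)  # G.711-like loss term
        return max(0.0, min(100.0, r))

    def mos(r):
        """Map an R-factor to a 1-to-4.5 Mean Opinion Score (per G.107)."""
        return 1.0 + 0.035 * r + 7e-6 * r * (r - 60.0) * (100.0 - r)

    for delay_ms, loss in [(50, 0.00), (150, 0.01), (300, 0.05)]:
        r = r_factor(delay_ms, loss)
        print("delay=%dms loss=%.0f%%: R=%.1f MOS=%.2f" % (delay_ms, loss * 100, r, mos(r)))

A path with modest delay and occasional loss still scores above 4 on a 4.5-point scale; only the deliberately bad last case drops toward "call you back when I'm out of the tunnel" territory. The point is that these formulas describe quality; nothing in them says the network itself must enforce it.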

There are many examples from the past showing that the work of creating complex special cases to serve a narrow view of "quality" can be misplaced.

Back in the 1970s, we went to much trouble to create printers that were "letter quality". Many early computer printers just didn't produce output that looked as good as a good typewriter's. Much engineering went into "letter quality" printers that used spinning plastic daisy wheels and, later, lasers. We eventually got output that was virtually indistinguishable from a good typewriter's. Finally, we thought, word-processed computer output wouldn't be second class and could be used for the most important of documents.

Then along came the fax machine. With the push of a button, you could send a copy of a typewritten page over the telephone to another machine anywhere. It revolutionized business. And after all that work making computer printouts look "real" rather than like output from some basic dot-matrix printer, all sorts of important documents were being read, signed, and used in commerce with a "quality" worse than that of the most basic printers of the early PC years. Even when they could use the "high quality" fax resolution setting, most people didn't. The advantages of instant delivery were too great, and the content was acceptable in this different form. Innovation was not stopped by a need to meet the quality standards of the past.

For years telephone industry people worked to improve the "quality" and "reliability" of the voice network, yet from the 1920s to the 1970s one measure of quality and reliability barely budged. Once the penetration of telephones into homes and offices hit a certain level, the chance of actually talking to the person you were trying to reach at the moment you called did not improve much. If they weren't there, you couldn't communicate. The "call" went through and the instrument "rang" (a success by the telephone management's measure), but there was no communication (by the user's measure). Then the Hush-A-Phone (acoustic coupling, decided in 1956) and Carterfone (electrical connection, decided in 1968, with standard, inexpensive interconnect by 1978) decisions brought the ability to connect "foreign" devices to the phone network (see these articles: one, two, and three). Now anybody could walk into a Radio Shack, buy an answering machine, connect it to their phone line, and have "communications" when they weren't there. Soon millions of people, at the "ends", improved communications without any change in the transport, other than the "permission" to make improvements on their end. It wasn't until many years later that centralized voicemail was provided by the phone companies. As with fax machines, user-funded technology "at the ends" was the means of advancement.

So, back to VoIP and QOS. Trying to simulate the old system, and holding back the new in the mistaken belief that innovation won't be accepted without meeting the "standards" of the old, is probably the wrong thing. We don't know how people will eventually make best use of VoIP, but if history is a guide, it will be better in many important ways than what went before, and not just in cost.



Thursday, March 20, 2003

BobF at 4:18 PM:

768x1280 and beyond

I'm sitting on a plane using my new laptop. (Or at least, I was when I originally wrote this in February.) Actually it's my old laptop and old screen, but I've turned the screen into writing position--vertical--rather than movie mode, which is 1280 wide and 768 high (a 5:3 shape, wider than the old 4:3 standard though not quite 16:9). Text is best viewed in a column so our eyes can easily track the lines of text.
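A quick back-of-the-envelope check of that geometry (the 1280x768 figure is from the paragraph above; the comparisons and the snippet are mine):

    # The panel described above, in landscape ("movie mode") and rotated.
    from math import gcd

    w, h = 1280, 768
    g = gcd(w, h)
    print("%dx%d reduces to %d:%d" % (w, h, w // g, h // g))  # 5:3
    print("width/height = %.3f (4:3 = %.3f, 16:9 = %.3f)" % (w / h, 4 / 3, 16 / 9))
    # Rotated 90 degrees the same panel is 768x1280, a 3:5 portrait column,
    # roughly the proportions of a printed page, which is why it suits text.

So 5:3 (1.667) sits between 4:3 (1.333) and 16:9 (1.778), and the rotated 3:5 column is close to the shape of a sheet of paper.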

I'm using a separate keyboard (IBM's small TrackPoint keyboard with a USB connection). I brought it with me to get a real keyboard instead of the built-in one, which is compromised to fit the laptop. I discovered that it lets me redefine my laptop, since I can now position the screen separately from the keyboard. It's a bit awkward, as I worry about the laptop falling from its perch on the tray, and I'm closer to the keyboard than I'd like. Still, it's wonderful to be able to really type and to view text "normally".

The widescreen 16:9 aspect ratio is a by-product of the assumption that the market for screens is driven by the needs of the television industry. Yet it is the computer industry that has been the initial market for such screens, because of its willingness to pay early-adopter prices.

One of the advantages of the tablet PC is the ability to rotate the screen. I hope that capability becomes the norm, with the screen eventually becoming detachable. As I've written, we shouldn't confuse the tablet PC with the idea of interacting with our computers using the ancient method of forming letters by hand with a stylus (pen-based computing). That's just slow and awkward. The real value of the tablet comes from flexibility. Solidly attaching the screen to the keyboard was a clever idea, but it defines portable machines as laptops rather than as small, flexible computing platforms.

Small desktop computers are interesting as components since they are easy to deploy. The newer Shuttle computers are much quieter than the old ones thanks to their heat-pipe cooling systems. My Cappuccino computer has been joined by the Latté, which boasts a 2.4 GHz Intel processor, gigabit Ethernet, and a gigabyte of RAM, though its internal disk drive is slow because it uses laptop technology.

Replacing the laptop with components would give users the ability to mix and match and discover new value. I should be able to use the appropriate screen along with my favorite keyboard and computing platform. Instead of accepting the compromises of an all-in-one design, I, along with millions of others, can create systems that meet our needs.

We are in an economic transition, not just a slump. What the standard statistics do not measure are the many ways we can be productive outside of our jobs. Assembling our own devices is just one example. Doing so also creates demand, because we can build not only combinations that meet our own needs but also combinations that meet new needs and the needs of others. Instead of just one laptop, we can start to deploy computing engines in many more ways and places.

Form is function. And new forms allow us to experiment with new functions. I'm sitting on an older plane now, with shared screens for the movies. We are going to start to expect a screen at each seat, and that trend will be accelerated if we can use that screen for more than canned entertainment. If the airplane provides me a large screen for movies, I should be able to use it in place of my small laptop screen. Conversely, my computing platform should be able to tap into the information system on the plane, courtesy of the new networks that Boeing is starting to deploy.

Having to run my laptop on its side is awkward, but that's more than made up for by the satisfaction of discovery as well as the convenience of the new form factor. As powerful building blocks become available, we are going to begin to expect to discover new possibilities. Think of an educational system built around discovery rather than rote learning.





© Copyright 2002-2008 by Daniel Bricklin, Bob Frankston, and David P. Reed
All Rights Reserved.

Comments to: webmaster at satn.org, danb at satn.org, bobf at satn.org, or dpreed at satn.org.

The weblog part of this web site is authored with Blogger.