The world is becoming one giant computer.
The speed of that computer, or of your part of it (or the part that works in your language), is the speed of the slowest link in the chain between you and your work.
Long story short, the better your broadband connection, the better your computing experience.
This has been true for some time: the speed of our broadband defines our computing experience. The goal of broadband activists is to eliminate this bottleneck between you and the vast worldwide computer you want to use.
The barriers to that are the Bells and the cable companies, who are hoarding all the bits and calling them "services." TV "service." Telephone "service." These are superfluous. For your computer, telephony is a low-bandwidth exercise. For your computer, TV is a higher-bandwidth exercise, but new software has cut that threshold dramatically.
Despite this, Bellheads get away with calling speeds of 200 Kbps "broadband" and hoarding all the bits beyond that figure. Maybe they’ll sell you ADSL and let you have 20% of the available bits. Maybe they’ll sell you cable modem service and give you one channel for your Internet (out of hundreds available).
We don’t really need fiber. All we need to do is liberate more of the existing bandwidth to dramatically improve our computing experience.
There are several ways in which this can be done:
- Competition from governments or entrepreneurs dedicated to delivering bits would force these companies to do the same.
- Regulation that required these companies to deliver bits would accomplish the same thing.
- Nationalization, a takeover of the bit hoarders’ networks, would also do it.
It’s really up to us, not the Bell companies, not the cable companies,
which path we choose. These networks were built under compulsion, in a
regulated environment. They do not "belong" to the Bells or cable
operators the way a private enterprise owns its assets. Regardless of
the claims made by the Bells or cable operators, this is a plain fact.
(Pictured: the 1998 Wall of Macs at the University of Washington.)
In getting to where we want to go, it’s vital that we consider our
goal. The goal of broadband is to link everyone to the grid, as fast as
possible. On a grid, you can parcel big jobs among many systems, and
when you’re connected you’re equally close to every other system.
The grid idea is 20 years old now. It originated with parallel
processing, the breaking of the Von Neumann architecture at Sandia Labs
in New Mexico. It was first proven, in fact, using networked computers.
Today it’s being applied on chips, like Intel’s dual-core processors (watch for
quad- and eight-core parts next).
But once you’re joined to the Internet, you should be able to use
all the computing power you can find there. You should be able to
parcel out your biggest jobs among Internet-linked computers and get
them all done.
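To make the idea of parcelling out a job concrete, here is a minimal sketch in Python. The names crunch and parcel are made up for illustration, and the "grid" is simulated with a pool of local processes rather than Internet-linked machines, but the shape of the program is the same: split, farm out, gather.

```python
from concurrent.futures import ProcessPoolExecutor

def crunch(chunk):
    """Stand-in for one parcel of a big job (here: summing squares)."""
    return sum(n * n for n in chunk)

def parcel(data, chunk_size):
    """Cut the big job into independent parcels."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

if __name__ == "__main__":
    # The "grid" here is a pool of local worker processes; on a real grid
    # the workers would be other machines reached over the network.
    big_job = list(range(1_000_000))
    chunks = parcel(big_job, 100_000)
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(crunch, chunks))
    print(sum(partials))
```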
This is bound to be controversial. Imagine bad people running bad jobs.
But such fears are just an argument for Luddism.
This should be the ultimate demand of Open Source Politics: the
tools we need to change the world, unencumbered by
monopolists, governments or any other gatekeepers.
The nation which reaches this goal fastest empowers its people and captures the future.
The question, then, is whether Americans wish to be part of that future, or let fear make us part of the past.
"On a grid, you can parcel big jobs among many systems, and when you’re connected you’re equally close to every other system."
The thing I love about Open Source politics is that you can literally make stuff up, invoke Open Source, and it’s like the new national imperative even if it isn’t true!
Your enemy here is latency. For many jobs you might wish to distribute on the grid (or over a LAN or between multiple cores on the same chip), upstream and downstream bandwidth is not the limiting factor. Please get your head around that before you nationalize the communications infrastructure. It will save you a lifetime’s worth of embarrassment.
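To put rough numbers on that point, here is a back-of-envelope sketch: the time to ship one small parcel of work is roughly the round-trip latency plus the transfer time, and for small parcels the latency term dominates. The latency and bandwidth figures below are illustrative assumptions, not measurements.

```python
def parcel_time(latency_s, bandwidth_bps, parcel_bits):
    """Rough time to ship one parcel: fixed round trip plus transfer."""
    return latency_s + parcel_bits / bandwidth_bps

parcel_bits = 8 * 10_000   # a 10 KB parcel of work, expressed in bits

for label, latency_s, bandwidth_bps in [
    ("LAN, 100 Mbps", 0.0005, 100e6),   # ~0.5 ms round trip
    ("WAN, 10 Mbps",  0.08,   10e6),    # ~80 ms round trip
    ("WAN, 1 Gbps",   0.08,   1e9),     # same trip, 100x the bandwidth
]:
    ms = parcel_time(latency_s, bandwidth_bps, parcel_bits) * 1000
    print(f"{label}: {ms:.1f} ms per parcel")

# The two WAN rows differ by a factor of 100 in bandwidth, yet the total
# only drops from about 88 ms to about 80 ms: the round trip dominates.
```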
Hey Brad, partially right!
Yes, latency has a lot to do with parcelling out portions of a big task, but little to do with it when the job is simply to get bits from here to there.
I make medical imaging equipment and can produce a gigabit picture of your insides just by putting you through an MRI or a CAT scan. Right now, hospitals are doing an adequate (not superior) job of letting folks within their four walls look at what’s wrong inside your skin — but not beyond. If you suddenly doubled over with a life-threatening malady, wouldn’t you like the doctor who knows your history to be able to pause his cast for bass on the Chattahoochee long enough to suggest a cure that will keep you paying his bills?
One single transaction. Hell, you or I could spend the rest of the day writing down uses for terabit access speeds and we would be about as prescient as IBM’s Thomas Watson, with his judgement that five computers should be enough for the world.
So, although I agree that there are certain classes of opportunities where speed of response is important, I also agree with Dana that raw petabit speeds will open our minds to new ways of using them.
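To put rough numbers on that single transaction, here is a sketch of the raw transfer time for a gigabit-scale image at a few access speeds. The speeds are assumptions for the sake of arithmetic and ignore protocol overhead.

```python
image_bits = 1e9   # a "gigabit picture", as described above

for label, bits_per_second in [
    ("200 Kbps 'broadband'", 200e3),
    ("20 Mbps cable/ADSL",   20e6),
    ("1 Gbps fiber",         1e9),
]:
    seconds = image_bits / bits_per_second
    print(f"{label}: about {seconds:,.0f} seconds")

# 200 Kbps: roughly 5,000 seconds (well over an hour) for a single image;
# 1 Gbps: about one second. That is the gap between "adequate within the
# hospital's four walls" and a scan a distant doctor can actually pull up.
```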
Yes, I was initially confused by “open” because what it originally meant to me was “a system written under Unix,” but it seems to have taken on a wider meaning.