I was chatting up a Washington liberal today, and it was depressing.
The subject was computing. The liberal bemoaned the power of corporations to wreck a great, highly functional government project.
The project was starved for funds, its developers allowed to leave, and now its bones were being picked by lobbyists, all pitching their "best of breed" systems as replacements for bits and pieces of what had once been a magnificent computing edifice.
Even if Democrats are elected this fall, he said, they don’t understand these technical arguments about open source vs. proprietary. They’ll be bought off just like the current crop.
Which is when it hit me: the frame he could use to tear down all those vendors and bring back what was lost, or what is in the process of being lost.
Open source is parallel processing. (Shown is the parallel processing lab at the University of Utah.)
No matter how big a vendor might be, it's still one system. Like the von Neumann architectures that dominated computing for its first 40 years, it has a bottleneck. The only way to speed up the search for a solution is to speed up the whole machine, to get more GHz. It's this kind of thinking that led, by the 1980s, to so-called "supercomputers" like the Cray.
Massively parallel processing was pioneered in the 1980s at Sandia National Laboratories in New Mexico. The idea was simple: break jobs into parts, move the parts onto many systems, and then put the solutions back together on the back end.
In the 20 years since, parallel processing has come to dominate computing, relegating von Neumann to a Wikipedia entry. First people stacked Macs into clusters to beat a Cray. Then they used parallel processing on the Internet itself, creating distributed computing projects like SETI@home. Today parallel processing happens inside the chips themselves; the latest AMD and Intel silicon does its work across multiple cores. From two to four to eight, who knows how far it can go.
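To make the break-apart-and-recombine idea concrete, here is a minimal sketch in Python (my choice; the post itself has no code). The job, the chunk size, and the function names are all hypothetical, picked only to show the shape of the pattern: split the work, hand the pieces to many workers, then gather the partial answers.

```python
# A minimal sketch of the parallel-processing pattern described above:
# split a job into parts, farm the parts out to many workers,
# then put the partial results back together.
# The job here (summing a big list of numbers) is purely illustrative.
from multiprocessing import Pool

def work_on_part(part):
    """Do the real work on one piece of the job."""
    return sum(part)

def split_into_parts(job, n_parts):
    """Break the job into roughly equal chunks, one per worker."""
    chunk = (len(job) + n_parts - 1) // n_parts
    return [job[i:i + chunk] for i in range(0, len(job), chunk)]

if __name__ == "__main__":
    job = list(range(1_000_000))          # the whole problem
    parts = split_into_parts(job, 8)      # break it into pieces

    with Pool(processes=8) as pool:       # many "systems" (here, processes)
        partial_results = pool.map(work_on_part, parts)

    answer = sum(partial_results)         # put the solutions back together
    print(answer)
```

Swap the eight local processes for eight machines, or eight million volunteers' PCs, and you have the same architecture SETI@home and today's multi-core chips rely on.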
That’s sort of how open source works. Only on steroids.
Because with open source not only do you parcel out pieces of a project to different companies, or different developers, but their work can cross-pollinate. Not only can you build systems in parallel, but you can also use a vast community of users to find bugs, and another vast army of developers to stamp them out.
The genius of Linus Torvalds lies in his ability to constantly re-engineer Linux's development process, first farming out all the work, then finding new ways to coordinate the massively parallel architecture that develops in response. And the design of Linux itself responds well to this parallel processing impulse, since it consists of central functions in a kernel, ancillary functions surrounding it, and a host of distribution providers who can build working systems from all the pieces, sometimes using just parts of the kernel for a mobile system, or embracing optional features like virtualization for a server.
In the case at issue, the folks who developed the Veterans Administration's VistA software were ages ahead of their time. Their weakness was that they were doing all their parallel processing, all their open source (actually public domain) work, within a single organization.
By infiltrating that organization, seizing control of it, then choking off its air supply, the Bush Administration's Nomenklatura were able to kill that innovation, and they are now in the process of carving up the agency's computing functions into big chunks for their proprietary, lobbying clients.
The frame they used to accomplish this was simple, yet devious: security. With security as the only imperative, all development had to be centralized.
This will prove a Pyrrhic victory. Already, within the health care arena, we're seeing a rebirth of open source. The founder of the Eclipse project is now running Open Health Tools, aimed at using global collaboration to build open source medical software. Tolven Inc., founded by former Oracle executives, is working with Medsphere, which bases its software on the VistA code, to extend it, not just in commercial versions but in true open source .org systems as well.
All this is similar to what the Dean and (now) the Obama campaigns have done within the political niche itself. By embracing the future of technology, they are becoming the future of policy.
Assuming Obama is elected (and that's no longer a major assumption), I hope he understands that what his computer people have done is just as important as what he and his policy wonks, his press people, his spin room, and his volunteers have done.
Apply that same process to all of government computing, and you'll have not just change we can believe in, but change we know works.