How The GPL Can Save Your Ass

It is time to get serious about this multi-core thing. No, for real, this time.

For years, I have dreaded the day when the computing world hits the parallel wall. As I have said many times in the past, multi-core is parallel computing, and parallel programming is hard, expensive, and in some cases non-portable. It adds another dimension of complexity to writing software. There is no quick fix and no solution on the horizon that addresses this issue. The computer industry is now facing a huge challenge: how to transition software to multi-core platforms. No amount of marketing or wishful thinking will help. Trust me on this one; I have been neck deep in parallel computing for 20 years. The parallel software issue is real, and it is standing right in front of us. Before you throw me on the pile of doomsday lunatics, consider that the polite experts are saying the same thing.


Of course there are methods to program parallel computers, but none of them really address the issue from a higher level. Indeed, they often drag the programmer down into the minutiae of managing data and temporal issues that do not exist in the single-core paradigm. If we don’t come up with high-level methods to address this problem, writing parallel software will be an excruciatingly expensive process that will stifle much of the computer industry. In the absence of a real solution, non-portable ad-hoc approaches will become the norm. In an industry where past decisions seem to outlive their life expectancy, this is a dangerous proposition.
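To make the point concrete, here is a minimal sketch in C with POSIX threads (my own illustrative example; the array size and thread count are arbitrary). Summing an array serially is a three-line loop. Doing the same job with explicit threads means partitioning the data, carrying per-thread state, and combining the results by hand. None of that bookkeeping has anything to do with the problem being solved.

    /* Illustrative sketch only: serial vs. hand-threaded array sum.
       Compile with: cc -O2 -pthread sum.c */
    #include <pthread.h>
    #include <stdio.h>

    #define N        1000000
    #define NTHREADS 4

    static double data[N];

    /* Serial version: the entire logic is one loop. */
    static double serial_sum(void)
    {
        double total = 0.0;
        for (long i = 0; i < N; i++)
            total += data[i];
        return total;
    }

    /* Per-thread work description: which slice of the array to sum,
       and where to leave the partial result. */
    struct chunk { long start, end; double partial; };

    static void *sum_chunk(void *arg)
    {
        struct chunk *c = arg;
        c->partial = 0.0;
        for (long i = c->start; i < c->end; i++)
            c->partial += data[i];
        return NULL;
    }

    /* Parallel version: the same one-line idea now needs explicit
       data partitioning, thread creation, a join step, and a final
       combine -- the minutiae that do not exist in the serial code. */
    static double parallel_sum(void)
    {
        pthread_t threads[NTHREADS];
        struct chunk chunks[NTHREADS];
        long step = N / NTHREADS;
        double total = 0.0;

        for (int t = 0; t < NTHREADS; t++) {
            chunks[t].start = t * step;
            chunks[t].end   = (t == NTHREADS - 1) ? N : (t + 1) * step;
            pthread_create(&threads[t], NULL, sum_chunk, &chunks[t]);
        }
        for (int t = 0; t < NTHREADS; t++) {
            pthread_join(threads[t], NULL);
            total += chunks[t].partial;
        }
        return total;
    }

    int main(void)
    {
        for (long i = 0; i < N; i++)
            data[i] = 1.0;
        printf("serial:   %f\n", serial_sum());
        printf("parallel: %f\n", parallel_sum());
        return 0;
    }

Directive-based tools such as OpenMP can hide some of this boilerplate for simple loops like the one above, but they fall well short of the high-level, domain-oriented methods I am arguing for here.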


I have always bemoaned the “not my problem” attitude of all the major computing companies that are touched by parallel computing, which is now pretty much everybody. Since bemoaning seems a little underpowered given the urgency of the current situation, I feel the need to elevate my position to a rant. But first, the all-important car analogy will help set the stage.


Picture yourself running a car company. You have this new kind of super fast (yet safe) car. These cars can run on the existing roads, but only at a fraction of their top speeds. To run at full speed, they need better roads. Without better roads, customers have no reason to buy those new super cars you are building. Of course, it is not your problem; you make the best damn car the world has ever seen, so someone else should build the roads. As a captain of industry, you have a choice: either help build the roads for your new car, or just push ahead, make as many cars as you can, and hope that people buy them. What would you do?


If you are the multi-billion dollar IT industry, you stick your head in the sand and just keep making cars. It is, after all, not your problem. That seems to be the attitude of almost every company with a vested interest in the computing market. There was a recent announcement indicating that Intel and Microsoft have put up $10 million to fund research in parallel software. Hah! I’m going to laugh harder this time: HAH, HAH! Ever hear the phrase “pissing in the ocean”? Well, this is more like throwing a match into the sun. We need more, much more.


I have a different proposal — one that I believe has a chance of producing results in a reasonable time frame. First and foremost, I am not naive enough to think there is a silver bullet solution that works for everyone. Solutions will probably be tailored to specific domains and allow practitioners to focus more on their applications and less on the underlying hardware.


Second, the entire industry must cooperate and be involved. We need everyone working on this problem. The best minds in high performance computing have been at it for quite a while, and it is time to turn up the volume. Fantasies of telling your R&D guys to get on it are not enough. Trying to corral your Intellectual Property (IP) with trade secrets and patents is wishful thinking. The rocket scientists (and plenty of other smart people) have been working on this issue for a long time. You don’t have time to waste trying to expand your IP fiefdom. Instead, start thinking about what happens when the next generation of products is of absolutely no interest to your customers.


Third, we need to respond quickly. There is no time for IP agreements, posturing, and NIH (Not Invented Here) ego trips. We need leaders to recognize the scope and magnitude of this challenge and act. Before too long, it will not be unreasonable to have four or even eight cores in a desktop. A workstation or server may have double that amount. It would sure be nice if my software could effectively use all those cores.


Finally, we need everyone to crack open their wallets. The solution will not fall from the sky. The Intel/Microsoft investment is a good start, but we need more.


There is only one way I can envision solving this problem before it is too late. An independent foundation or other such organization, funded by the industry, needs to be created. The foundation would be in charge of soliciting, reviewing, and funding proposals from companies, educational institutions, groups, organizations, and even individuals. As with other research programs, there could be various levels of funding (feasibility, implementation, education, etc.). Milestone funding could be used so that projects stay on task and on schedule. Basic stuff, really. The catalyst, however, is that all those receiving funding must agree to release all their work under the GPL (preferably version 3).


Using the GPL would immediately remove issues that would normally choke such an important undertaking. First, any IP barriers get pushed aside and everyone can cooperate openly. There is no need for companies to spend months and even years hammering out IP agreements; with the GPL, everyone shares in the spoils. As I see it, there is a hole in the dike, and by the time you patent your special finger and then cut a deal so that other people can use it, you’re standing neck deep in water. Second, it sets up a shareable framework on which the possible solutions can be used, tested, and modified by everyone right away. Let those who actually code “vote” with their time on which solution works best; there is untapped wisdom in those crowds of programmers. Open access means no beta test agreements, no non-disclosure agreements, and no encumbrances on usage. The more people thinking about and trying solutions, the better. Third, future work has an open, unencumbered base on which others can build, and no one can hijack a promising technology. Finally, it offers an established framework in which to cooperate (i.e., the GPL already works; we don’t have to sit around figuring out how to share).


Of course there will be those who start doing the not-invented-here-duty-to-our-shareholders-IP song and dance. To them I say, “Mr. Nero, your shareholders are your main concern, are they not? If you don’t get a handle on this obvious problem, or if you spend all your time fiddling with cooperative IP agreements, your future product sales will suffer. As I understand it, shareholders like products that sell.”


If you really sit down and think about it, a shared, open approach is the most time- and cost-efficient way to attack this problem, because it is everybody’s problem.


And by the way, this idea is not new. The GNU/Linux operating system and its associated software are the perfect example of how international cooperation can create a solid product and increase sales. There are few, if any, companies that can afford to design and build an industrial-strength operating system from scratch today. The Linux kernel and the associated GNU software are essentially cooperatively built products that serve everyone. Everybody helps build it and everybody gets to use it: cost sharing at its best. And, in case you did not notice, open, freely available software increases sales of hardware, software support, and services. Not a bad economic model: instead of each of us building little software gizmos to help sell our hardware, let’s work together, share the cost, and build the right solutions. The parallel software challenge can be solved in the same way. All it takes is a few of the major players realizing that the fastest and most economical way to ensure future demand for their products is to use a cost-sharing development model that already works. That part, at least, is not rocket science.
