How The GPL Can Save Your Ass

It is time to get serious about this multi-core thing. No, for real, this time.

It is time to get serious about this multi-core thing. For years, I have dreaded the day when the computing world hits the parallel wall. As I have said many times in the past, multi-core is parallel computing, and parallel programming is hard, expensive, and in some cases non-portable. It adds another dimension of complexity to writing software. There is no quick fix and no solution on the horizon that addresses this issue. The computer industry is now facing a huge challenge: how to transition software to multi-core platforms. No amount of marketing or wishful thinking will help. Trust me on this one. I have been neck deep in parallel computing for 20 years. The parallel software issue is real, and it is standing right in front of us. Before you throw me on the pile of doomsday lunatics, note that the polite experts are saying the same thing.


Of course, there are methods to program parallel computers, but none of them really address the issue from a higher level. Indeed, they often drag the programmer down into the minutiae of managing data and temporal issues that do not exist in the single-core paradigm. If we don’t come up with high-level methods to address this problem, writing parallel software will be an excruciatingly expensive process that will stifle much of the computer industry. In the absence of a real solution, non-portable ad-hoc approaches will be the norm. In an industry where past decisions seem to outlive their life expectancy, this is a dangerous proposition.
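To make that minutiae concrete, here is a minimal sketch of my own (plain C++ with raw threads, not taken from any particular toolkit) that does nothing more than sum an array. Notice how much of the code is bookkeeping (thread counts, chunk boundaries, a lock) that simply does not exist in the serial version:

    // Summing an array with raw threads: the programmer picks a thread
    // count, partitions the data, and synchronizes the partial results
    // by hand. (Illustrative sketch only.)
    #include <algorithm>
    #include <cstddef>
    #include <iostream>
    #include <mutex>
    #include <thread>
    #include <vector>

    int main() {
        std::vector<double> data(1000000, 1.0);
        const unsigned nthreads =
            std::max(1u, std::thread::hardware_concurrency());
        double total = 0.0;
        std::mutex total_lock;            // guards the shared sum
        std::vector<std::thread> pool;

        for (unsigned t = 0; t < nthreads; ++t) {
            pool.emplace_back([&, t] {
                // Each thread must work out its own chunk boundaries.
                const std::size_t chunk = data.size() / nthreads;
                const std::size_t begin = t * chunk;
                const std::size_t end =
                    (t == nthreads - 1) ? data.size() : begin + chunk;
                double partial = 0.0;
                for (std::size_t i = begin; i < end; ++i)
                    partial += data[i];
                std::lock_guard<std::mutex> guard(total_lock);
                total += partial;         // skip the lock and you have a race
            });
        }
        for (auto& th : pool)
            th.join();                    // skip the join and you have a crash
        std::cout << total << "\n";
        return 0;
    }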


I have always bemoaned the “not my problem” attitude of all the major computing companies that are touched by parallel computing, which is now pretty much everybody. Since bemoaning seems a little underpowered given the urgency of the current situation, I feel the need to elevate my position to a rant. But first, the all-important car analogy will help set the stage.


Picture yourself running a car company. You have this new kind of super fast (yet safe) car. These cars can run on the existing roads, but only at a fraction of their top speed. In order to run at full speed, they need better roads. Without better roads, customers have no reason to buy those new super cars you are building. Of course, it is not your problem; you make the best damn car the world has ever seen, so someone else should build the roads. As a captain of industry, you have a choice: either help build the roads for your new car, or just push ahead, make as many cars as you can, and hope that people buy them. What would you do?


If you are the multi-billion dollar IT industry, you stick your head in the sand and just keep making cars. It is, after all, not your problem. That seems to be the attitude of almost every company with a vested interest in the computing market. There was a recent announcement indicating Intel and Microsoft have put up $10 million to fund research in parallel software. Hah! I’m going to laugh harder this time: HAH, HAH! Ever hear the phrase “pissing in the ocean”? Well, this is more like throwing a match into the sun. We need more, much more.


I have a different proposal, one that I believe has a chance of producing results in a reasonable time frame. First and foremost, I am not naive enough to think there is a silver-bullet solution that works for everyone. Solutions will probably be tailored to specific domains and allow practitioners to focus more on their applications and less on the underlying hardware.


Second, the entire industry must cooperate and be involved. We need everyone working on this problem. The best minds in high-performance computing have been at it for quite a while, and it is time to turn up the volume. Fantasies of telling your R&D guys to get on it are not enough. Trying to corral your Intellectual Property (IP) with trade secrets and patents is wishful thinking. The rocket scientists (and plenty of other smart people) have been working on this issue for a long time. You don’t have time to waste trying to expand your IP fiefdom. Instead, start thinking about what happens when the next generation of products is of absolutely no interest to your customers.


Third, we need to respond quickly. There is no time for IP agreements, posturing, and NIH (Not Invented Here) ego trips. We need leaders to recognize the scope and magnitude of this challenge and act. Before too long, it will not be unreasonable to have four or even eight cores in a desktop. A workstation or server may have double that amount. It would sure be nice if my software could effectively use all those cores.


Finally, we need everyone to crack open their wallets. The solution will not fall from the sky. The Intel/Microsoft investment is a good start, and we need more.


There is only one way I can envision solving this problem before it is too late: an independent foundation or other such organization, funded by the industry, needs to be created. The foundation will be in charge of soliciting, reviewing, and funding proposals from companies, educational institutions, groups, organizations, and even individuals. As with other research programs, there can be various levels of funding: feasibility, implementation, education, etc. Milestone funding can be used so that projects stay on task and on schedule. Basic stuff, really. The catalyst, however, is that all those receiving funding must agree to release all their work under a GPL license (preferably version 3).


Using the GPL will immediately remove issues that would normally choke such an important undertaking. First, IP barriers get pushed aside and everyone can cooperate openly. There is no need for companies to spend months or even years hammering out IP agreements. With the GPL, everyone shares in the spoils. As I see it, there is a hole in the dike; by the time you patent your special finger and then cut a deal so that other people can use it, you are standing neck deep in water. Second, it sets up a shareable framework on which possible solutions can be used, tested, and modified by everyone right away. Let those who actually code “vote” with their time on which solution works best. There is untapped wisdom in those crowds of programmers. Open access means no beta-test agreements, no non-disclosures, and no encumbrances on usage. The more people thinking about and trying solutions, the better. Third, future work has an open, unencumbered base on which others can build. No one can hijack a promising technology. Finally, it offers an established framework in which to cooperate (i.e., the GPL already works; we don’t have to sit around figuring out how to share).


Of course, there will be those who will start doing the not-invented-here-duty-to-our-shareholders-IP song and dance. To them I say, “Mr. Nero, your shareholders are your main concern, are they not? If you don’t get a handle on this obvious problem, or if you spend all your time fiddling with cooperative IP agreements, your future product sales will suffer. As I understand it, shareholders like products that sell.”


If you really sit down and think about it, a shared, open approach is the most time- and cost-efficient way to attack this problem, because it is everybody’s problem.


And by the way, this idea is not new. The GNU/Linux operating system and its associated software are the perfect example of how international cooperation can create a solid product and increase sales. Few if any companies today can afford to design and build an industrial-strength operating system from scratch. The Linux kernel and the associated GNU software are essentially cooperatively built products that serve everyone. Everybody helps build them, and everybody gets to use them. Cost sharing at its best. And, in case you did not notice, open, freely available software increases sales of hardware, software support, and services. Not a bad economic model: instead of each of us building little software gizmos to help sell our hardware, let’s work together, share the cost, and build the right solutions. The parallel software challenge can be solved in the same way. All it takes is a few of the major players realizing that the fastest and most economical way to ensure future demand for their products is to use a cost-sharing development model that already works. That part, at least, is not rocket science.

Comments on "How The GPL Can Save Your Ass"

cdsteinkuehler

Um…so what do you think about projects like Intel’s TBB: http://threadingbuildingblocks.org/

It’s GPL, and it provides a high-level abstraction for implementing parallelism. Or am I missing something in your requirements (perhaps you despise C++?!?) :)
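For readers who have not seen TBB, here is roughly what that abstraction looks like. This is a sketch of mine, assuming a C++11-era TBB with the lambda-friendly parallel_for, not code taken from the project’s own docs:

    // TBB hides thread creation, chunking, and scheduling behind one call.
    #include <tbb/blocked_range.h>
    #include <tbb/parallel_for.h>
    #include <cstddef>
    #include <vector>

    void scale(std::vector<float>& v, float factor) {
        tbb::parallel_for(
            tbb::blocked_range<std::size_t>(0, v.size()),
            [&](const tbb::blocked_range<std::size_t>& r) {
                for (std::size_t i = r.begin(); i != r.end(); ++i)
                    v[i] *= factor;   // TBB picks the grain size and threads
            });
    }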

Reply
wambamboozle

I agree we should be using the GPL, especially for research.

Although I prefer declarative languages like Erlang for parallel programming, there is a widely deployed alternative.

Sun has been making multicore machines for many years now. The designers of J2EE provided a framework so that you can write multithreaded software without really thinking about it much. I’ve been on machines with hundreds of cores serving up webapps without a problem. A lot of this stuff has been worked out. You can’t argue with working code.

Reply
pahosler

This would have been an interesting read if not for the irritating spelling and punctuation errors! Please don’t give the “English is my second language” excuse, you’re a SENIOR EDITOR!!! If English is not your second language, please stop relying on MS Office to grammar check for you, it’s getting it wrong and making your article appear to be written by a 5th grader. Sorry, I really just expect more from folks that have Senior Editor as a title. Then again, I guess it doesn’t mean much anymore these days.

Reply
sjinsjca

“…multi-core is parallel computing and parallel programming is hard, expensive, and in some cases non-portable…”

Aside from the GPL issues you raise… ever hear of LabVIEW? It’s the leading programming environment for instrumentation. For more than twenty years it has made programming parallel processes easy, and it has done a fine job of simulating parallelism on the single-CPU architectures typical until recently. Then it was extended to programming FPGAs (field-programmable gate arrays, basically build-your-own-silicon devices), where a typical user can easily create many dozens of truly parallel processes running simultaneously. From there it was a small matter to support multicore processors, and that was folded into the current release of LabVIEW last year.

And LabVIEW runs on Windows, Mac OSX, Linux, Solaris… your code is entirely portable, and if you have more than one processor core, it’ll take advantage of it.

It’s marvelous, marvelous stuff. I really hope it represents the future of programming, because it puts tremendous power in the hands of ordinary folks like you and me. FPGAs are a good example– before, it took special training, skills and costly tools (and lots and lots of engineering and test time) to design an FPGA. LabVIEW makes it easy, and turns Field Programmable Gate Arrays into USER-Programmable Gate Arrays. That’s astonishing new functionality, and anyone who uses LabVIEW can handle it.

Reply
lry198010

So, how about Erlang? What’s Erlang’s role in this?

Reply
quickening

Folks here are mentioning programming solutions, and I think that misses the big picture. Handling multiple cores should be no different than handling multiples of any other hardware resource: a task best suited to the OS. Multiple cores should be considered an opportunity to implement advances in computer intelligence, such as the community-of-specialized-agents concept.

Reply
bradlepc

The point of the article is that the solution needs to be broader and easier to retrofit than what existing options offer. There are billions of dollars’ worth of software out there that needs to stay on the performance curve, but for which “rewrite it in J2EE or LabVIEW” is clearly not a viable answer. It would be great if the OS could just handle it, but the OS has little visibility into the inherent parallelism of an application. TBB is closer to the right idea, but it is still closely identified with Intel, and its scope is relatively narrow at present.

Reply
junnufunky

I agree with the editor that we programmers are really facing a challenge with the parallel programming paradigm, and we really need new tools and new thinking about how to use the resources the new architectures provide.
But is the GPL really the answer to the problem? My personal experience is that when, for example, libraries and tools are licensed under an LGPL-type license, the resulting business model covers much more ground than plain GPL licensing does. There are numerous examples of high-quality tools and libraries produced this way, with remarkable development input from commercial users.
To consider the GPL the silver bullet that solves the problem… not! Much more can be gained
if the interests of the commercial and non-commercial communities can be unified. What results is useful for parties developing both commercial and non-commercial software. Is there something wrong with this scenario?

Reply
hhemken

No doubt this proposal will be derided as hare-brained commie pinko bullshit of no relevance to civilized capitalist society, but I think it is pretty much on the mark.

Parallelism needs to be a low-level functionality that is reused by developers as if it were just another library. Standardized use-cases should be supported, as well as the flexibility to design more specialized software mechanisms.

Even Aunt Tilly and Grandma will make full use of their multicore personal supercomputers once everyday software arrives making use of open-source parallelism libraries and techniques. Making videos, music, renderings of various kinds, and a variety of other heavy-duty simulation functionality will be subsumed into industrial, professional, and even consumer software.
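One existing approximation of parallelism “reused as if it were just another library” is OpenMP, where a single pragma parallelizes a loop and the runtime handles the rest. A minimal sketch, assuming an OpenMP-capable compiler (build with a flag such as -fopenmp; details vary by compiler):

    #include <omp.h>
    #include <cstdio>
    #include <vector>

    int main() {
        const int n = 1000000;
        std::vector<float> a(n), b(n, 1.0f);
        #pragma omp parallel for     // runtime decides threads and chunking
        for (int i = 0; i < n; ++i)
            a[i] = 2.0f * b[i];
        std::printf("max threads: %d\n", omp_get_max_threads());
        return 0;
    }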

Reply
nihil75

I think saying we need a solution before we are “knee deep in water” is an exaggeration.

The big corps don’t mind waiting a few years;
they will profit a lot more from having patented the solution as their own.

The only ones who are losing right now are the end users, who can’t utilize this technology.

Reply
pmpope

Wow! I guess if I make as many coding mistakes as you do grammatically I could be some sort of senior and blame it on the fact that I’m a writer and not a coder. WOW! God bless Amerika!

Reply
grndrush

I see nothing “new” or “innovative” in this article, but one thing you need as badly as top-notch parallel programmers is a top-notch technical editor. Both the lack of substance AND the copy, so bad it hurts to read (if you’re a technical copy-editor, anyway), indicate you started writing this about 2 hours before deadline.

Reply
hacklinuxdude

Microsoft does not like this and is trying to make the GPL non-free through patent threats.

http://www.digitalmajority.org/forum/t-49513/brad-smith-continues-its-fud-spreading-wants-to-tax-redhat

I don’t know how long the GPL can stand against the $250 billion Godzilla that Microsoft is.

Reply
davrusso

By and large, it appears to me that the vast majority of users use their computers for nothing more than word processing, email, and accessing databases. At this point, the average PC strikes me as satisfactory with a single-core CPU. Even my parametric solid-modeling CAD and small engineering simulations work well with a single CPU. The extra CPUs are helpful, though, for parallel-processing a CFD problem or whatever other science you’re into. Perhaps this should really be the direction of the industry: developing parallel processing for those who need it, using a generic MPI or something.
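For a sense of what a “generic MPI” approach looks like in practice, here is a hedged sketch using standard MPI calls (compile with mpic++ or similar, launch with mpirun): each rank does a piece of the work and rank 0 collects the total.

    #include <mpi.h>
    #include <cstdio>

    int main(int argc, char** argv) {
        MPI_Init(&argc, &argv);
        int rank = 0, size = 1;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        double local = rank + 1.0;   // stand-in for real per-rank work
        double total = 0.0;
        MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0,
                   MPI_COMM_WORLD);

        if (rank == 0)
            std::printf("sum over %d ranks: %g\n", size, total);
        MPI_Finalize();
        return 0;
    }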

Reply
