Anandeep just wrote a short blog post questioning the long-term link between High Performance Computing (HPC) and open source. His logic boils down to this:
1. HPC, like many other technologies, starts with the need for a small set of people to do something new.
2. Those people build the tools that they need.
3. When they let others use those tools, they assume the other people have the domain knowledge to properly deploy and use them.
5. Power, flexibility, and control are key at this stage, so open source has a strong hold.
5. Over time, more and more people use the tools, so ease-of-use becomes more important.
6. As ease-of-use becomes the priority, whether a platform or application is open source matters less.
I do not necessarily agree with this logic, at least not entirely. It is true that ease-of-use will become more and more important over time (in fact, it already is). However, HPC is a very special beast. For one thing, hardware is still the ruling champion in HPC, and will be for some time to come. The issue therefore becomes how the owner of an HPC environment maximizes their hardware investment. Generally, they have to tweak the software to best fit that environment. Hence the need for open source.
Now, I can see that there is convergence, even in HPC, toward a few specific hardware platforms and models of computing. Software vendors will be able to narrow their scope, and the need for flexibility and control will diminish over time. However, that is going to take quite a while.