Microsoft released a second beta version of its high performance computing (HPC) server software (HPC Server 2008 R2) this April.
As well as adding in features to its package to run Excel spreadsheet data on large supercomputer hardware, Microsoft also introduced the option to cluster unused Windows 7 workstation processor capacity and use that for HPC.
Computing talked to Microsoft's director for HPC, Jeff Wierer, about HPC Server 2008 R2 and how the software giant plans to increase its market share for HPC.
Computing: When will the final version of HPC Server 2008 R2 ship?
Jeff Wierer: It will definitely ship this summer, with general availability being in late summer or early autumn.
If I look at the 'TOP500 Supercomputer Sites' website, Microsoft's HPC system is installed on less than one per cent of those sites. Why do you think that is?
A lot of people ask me about that website. A lot of those sites are public-sector research labs and universities, and there aren't as many commercial customers on there.
I'd argue there's a lot of incentive for those people to get presence on the top 500, because that's how they get additional government funding for their research through the higher visibility.
Why is it different in the private sector?
I don't think there's much financial incentive for private sector firms to get visibility on that website.
For example, if you're a large investment bank, the time required to take down that system and run the benchmarks to get into the top 500 is prohibitive, especially in the current economic climate.
We have a customer running 32,000 servers in a cluster. Running the benchmark on that would make them number one, but as I said there's no financial incentive for them to do that.
How do you think Microsoft will drive uptake of Windows HPC?
I think the HPC market will be driven by that next-generation workforce that's coming out of university.
Everybody knows young people today are technology savvy, and not just at browsing the web or using Facebook. A lot of graduates are coming out with Windows skills, and I think you'll see industry demand for those Windows skills.
People think HPC is very technically demanding, and that you need to be a rocket scientist to make these packages run. We want to make HPC easier to use, and we think we can take HPC mass market – like we did for personal computing.
How much HPC are financial services firms using?
Some of our biggest customers are in the City of London. We have one bank that has been adding between 5,000 and 7,000 Windows Servers to its HPC infrastructure per year.
I would say that in the financial industry, at least, it's equivalent to the Cold War arms race of the eighties.
The investment banks are building up HPC arsenals to the point that you can't get any more power in Canary Wharf.
They're having to look to build their datacentres elsewhere.
What's the reasoning behind allowing firms to cluster Windows 7 systems together?
The feature you're talking about here is COW, short for Cluster of Workstations. It's aimed at firms that want to expand their supercomputer clusters and also want to be part of a green initiative.
Imagine you're a small department within an engineering firm. The Windows 7 systems you're running are likely to be high-end workstations with 32GB of system memory and processors with four or more cores.
A lot of the time during design work, that power sits idle, and you've probably made a significant investment in that hardware. If you just add a single server running HPC Server 2008 R2, you can cluster those workstations and maximise the utilisation of that investment.
Any system that is not doing anything out-of-hours could be used.
In fact, if I walk around the halls of Microsoft, I think we could create the world's biggest supercomputer, because every employee has three PCs!