With almost 25,000 students and over 4,000 staff relying on its IT infrastructure, the University of Southampton knew it had to weigh up its options carefully when it put out a tender for a brand new data centre. And being one of the Russell Group of elite British research universities, it was aware that any mistakes could damage its reputation as a seat of learning.
So it’s hardly surprising that, of the 52 vendors that expressed interest in the contract, Mike Powell, the university’s data centre manager, focused only on the specialists.
“I only wanted a data centre delivered by someone who does it for a living, I didn’t want a general construction firm coming in and thinking they could do a data centre. So the percentage of their business had to be fairly high, a record of over 70 per cent building or fitting data centres,” he said.
While Emerson Network Power was responsible for kitting out the data centre with UPS and providing consultancy throughout the process, this wasn't the original plan, with a deal initially signed off with Schneider Electric.
“We were basically a Schneider house,” Powell said, but following a presentation by Chloride Group – which Emerson acquired in 2010 – he changed his mind after being “absolutely convinced by what they offered”.
Changing contractor after the original choice had already begun work incurred a financial penalty, but the move will save the university money in the long run and has brought other benefits, as Powell explained.
“What Emerson brought which Schneider never did – along with energy efficiency – was scalability. What we wanted was seamless expansion of our UPS estate to maximise our corporate IT load as well as our high performance cluster, and what we had with Emerson’s Trinergy UPS was a seamless expansion of cores that could be bolted onto the existing deployment,” he said.
Being a research-led university, Southampton regularly needs to prepare network capabilities for projects that require high-powered computing functionality. However, only a small number of proposed projects actually get funded, with the money sometimes allocated at very short notice. That means that the data centre has to be prepared to cope with relatively little turnaround time, something Emerson offers thanks to its agile, easy-to-deploy cores, said Powell.
“You might have had a conversation with a researcher eight months ago who says they need a petabyte of storage and we say it costs X amount,” he explained. “It all goes quiet for eight months before they come back and say we need to do this and I’ve got two months to do it. We have these things coming from all over the place, so we have to be very agile.”
The Emerson network allows Powell and his team to efficiently manage the short notice requests made by researchers and academics.
“With the old UPS it was very hard to be agile because they came in bulky lumps of 400 kVA at a time, whereas the Trinergy comes in much smaller denominations that are easily deployable,” he said.
As well as the ability to scale up resources at short notice, another key requirement was energy efficiency.
“We have chosen the most energy efficient UPS around,” said Powell. “I don’t think anything else rivals it. So we’ve got some big ticks in the boxes on our carbon reduction. That’s the ultimate reason we chose them, the energy efficiency.
“All universities are under huge pressure at the moment to reduce their carbon footprint so getting an energy efficient item in the power chain like a UPS, which has huge amounts of electricity running though it, is a massive tick in the box,” Powell added.
As well as building a new data centre the university has also been revamping its networks.
“The number one item on the university’s risk register was the data centre. Now we are looking at the network architecture, how that information gets to the user’s desktops,” said Powell, who explained how the university’s networks are being revamped in order to bring more flexibility.
“We’ve embarked on a network architecture programme of works that has seen the complete replacement of our fibre infrastructure on the campus. The trouble is that at the moment the existing buildings are connected in series, so if you lose one building at the head of the chain you potentially lose three more downstream.”
When the project is completed, every building will be individually connected to the network rather than needing to rely on a chain of other buildings.
“The new architecture of air-blown fibre is delivering dual diverse connections into every building, and every building will have its own dedicated feed rather than being hung off other buildings. That’s started and has an 18 month programme to deliver,” said Powell.
However, it isn’t just the University of Southampton’s main campus that is benefiting from improvements to IT infrastructure: the medical teaching campus at Southampton General Hospital and the Oceanography Department at the docks are also receiving improved services, with the aim of giving students and staff the best opportunities possible.
“The university operates a lot of remote campuses, such as Southampton General Hospital, where we deliver services to all university users there and that has a single WAN link. But what we’re doing is doubling up on all of our WAN links to remote sites to improve connectivity.”
The university is also in the process of updating much of its data network equipment as part of what Powell described as a “massive infrastructure programme”, which is being completed with the aid of solutions provider Cisco.
“On top of the dual diverse WAN links we’re replacing our entire core and edge switching data networking equipment and we’ve completed our core upgrade – a Cisco solution – and we’re in the process of replacing 600 eight-port gigabit switches with Cisco variants.”
Another major project is the rollout of a new wireless network spanning the university’s entire main campus, including student residences.
“What we’re doing is rolling out a wireless solution that hits the future features we want the students to benefit from, such as location services, voice over Wi-Fi, and we’re starting to put that infrastructure in place in the halls of residence and other buildings on campus,” Powell said.
In the background to all these projects has been what Powell described as an “aggressive virtualisation” programme. “That saw 78 per cent of our IT services virtualised in a big bang approach and we enlisted the help of IBM for the transition,” he told Computing.
“What that effectively meant for us in a data centre is we went from a 100-rack data centre to our new data centre of only 32 racks, which is an absolutely huge saving for us.”
Looking to the immediate future, Powell said his main goal was to ensure that the Emerson Network Power data centre migration is completed by the project’s Easter deadline. Beyond that, Powell’s next big task looks like being the commissioning of another new data centre.
“We’re looking to procure another secondary data centre because the existing secondary data centre is at the end of its life and is in an old crumbling sixties tower block,” he said. And with the apparent success of the main data centre, Powell told Computing that Emerson Network Power is likely to come away with a deal to build a new secondary unit.
“Going for the Emerson Trinergy in this data centre has seen a switch in UPS manufacturer for the university. So much so that one of our brand new engineering buildings has got Emerson UPSs in it, and I can see that trend continuing,” he said.
“It’s actually stated in our ICT specification for new builds that it must be an Emerson UPS product,” Powell concluded.