
HP intros RAM- and CPU-packed blade for cloud computing, HPC

HP has introduced a new blade server with more CPU and RAM than its other blade server models for more intensive workloads and the cloud computing space.

Hewlett-Packard Co. announced the new HP ProLiant BL2x220c G5 server blade that combines two independent servers in a single blade.

These new blades offer significant possibilities for dense high-performance computing (HPC) and cloud computing environments with their more intensive workloads. Putting two independent servers, or nodes, into a single blade form factor delivers double the performance and greater energy efficiency while cutting data center space requirements in half, said Jim Ganthier, the director of HP's BladeSystem marketing.

Each blade holds up to two dual-core or quad-core Intel Corp. Xeon 5400- or 5200-series processors per server node and up to 16 GB of RAM, allowing 32 server nodes per 10U c7000 enclosure. The BL2x220c scales up to 1,024 cores and 2 TB of RAM per 42U rack.
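Those density figures can be sanity-checked with quick arithmetic. The sketch below assumes a 42U rack holds four 10U c7000 enclosures of 16 half-height blades each, with each node running two quad-core processors at its maximum RAM configuration (the enclosure and blade counts are inferred from the stated 32 nodes per 10U, not spelled out by HP here):

```python
# Back-of-the-envelope check of the BL2x220c G5 density claims.
# Assumptions: four 10U c7000 enclosures per 42U rack, 16 half-height
# blades per enclosure, two quad-core CPUs and 16 GB RAM per node.

NODES_PER_BLADE = 2        # two independent servers in one blade
BLADES_PER_ENCLOSURE = 16  # half-height blades in a c7000
ENCLOSURES_PER_RACK = 4    # four 10U enclosures fit a 42U rack
CORES_PER_NODE = 8         # two quad-core Xeon 5400-series CPUs
RAM_PER_NODE_GB = 16       # maximum memory per server node

nodes_per_enclosure = NODES_PER_BLADE * BLADES_PER_ENCLOSURE
nodes_per_rack = nodes_per_enclosure * ENCLOSURES_PER_RACK
cores_per_rack = nodes_per_rack * CORES_PER_NODE
ram_per_rack_tb = nodes_per_rack * RAM_PER_NODE_GB / 1024

print(nodes_per_enclosure)  # 32 nodes per 10U enclosure
print(cores_per_rack)       # 1,024 cores per rack
print(ram_per_rack_tb)      # 2.0 TB of RAM per rack
```

Under those assumptions the numbers line up with HP's claims: 32 nodes per enclosure, 1,024 cores and 2 TB of RAM per rack.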

Two Gigabit Ethernet ports are provided per server node, with the option to upgrade to 10 Gigabit Ethernet or InfiniBand for high-performance, low-latency interconnects. The HP ProLiant BL2x220c G5 is available today at a starting price of $6,349.

Though HP calls its new blade a two-in-one, other blade server providers pack in comparable CPU cores and RAM. San Diego, Calif.-based Verari Systems' BladeRack 2 XL, introduced in February, supports two quad-core processors from Intel or Advanced Micro Devices Inc. (AMD) and up to 672 TB of storage in a full 72-blade chassis. Sun Microsystems Inc. also announced a blade that month, the Sun Blade X8450, which has a total of 16 Intel Xeon processing cores in four sockets and 32 dual-in-line memory module (DIMM) slots holding up to 256 GB of memory.

Entering the cloud
HP's new blade, part of its Scalable Computing & Infrastructure (SCI) portfolio, is geared toward scale-out environments such as cloud computing, Web 2.0 businesses, financial modeling, and oil and gas companies.

"We see the opportunity and growth of this model toward 'Everything as a Service,' not limited only to big Web 2.0 companies that come to mind, but as a fundamental shift in the way IT services will be procured moving forward," Ganthier said. "We see cloud environments gaining traction beyond social networks and search engines to segments such as financial analytics and media and entertainment."

Gordon Haff, an analyst at Illuminata Inc., said the cloud computing market covers a wide gamut and includes most types of software being delivered over the network.

"There's a lot of computing happening in the network and, more broadly, a huge amount of computing happening on scale-out infrastructures," Haff said. "From the 100,000-foot level, it's clear that there is a huge opportunity for very large, scale-out infrastructures."

HP also recently announced the HP StorageWorks 9100 Extreme Data Storage System (ExDS9100) to address the scale-out market.

Watch your heat with power-packed blades
There have been concerns about the amount of heat generated by tightly packed blade chassis, and stuffing twice the RAM and CPU power into a single-blade form factor raises the question of how much heat a chassis full of BL2x220c G5 blades will produce.

To keep heat to a minimum, HP engineered the blades with Thermal Logic active power management and cooling fan technology to keep heat and power levels in check. The blades also use lower-wattage Intel 5400-series quad-core processors, and the Intel chipset supports low-power ECC DDR2 memory to further reduce heat and power draw.

"Heat and power isn't an issue," Ganthier said. Despite that assurance, Haff said that, as with any blade server rack, users should consider heat issues when placing a chassis of these two-in-one blades in the data center.

"There's a lot of concentrated power that has to be delivered and heat that has to be removed. That's why a lot of thermal and mechanical engineering work has to go into these dense systems, and into the data center infrastructures that support them. You don't throw systems this dense into a supply closet someplace," Haff said.

Let us know what you think about the story; email Bridget Botelho, News Writer.
