Data Center Cooling Methods

In recent years, as organizations load up on big data and accelerate their move to the cloud, more data centers are coming online, and cooling may become one of the biggest problems to overcome. It is thus natural that data center cooling has attracted ever more attention. This article describes the current state of data center cooling and presents a variety of cooling methods.

The Current State of Data Center Cooling

It is never an easy task to keep a data center cool. Data centers are packed with processors, each generating enough heat to fry an egg, all pumping that heat out within a small space. That may explain why so many data centers waste money on electricity and cooling. Cooling remains the biggest drain on energy in most data centers, second only to powering the machines themselves. This has driven organizations to try some innovative methods that seem to work well, even if a few of the techniques are a little extreme. Fortunately, there are several measures data centers can take to help lower cooling costs. Let’s take a look at the most common data center cooling methods.

Cold or Hot Aisle Air Containment

This trend has lasted for at least five or six years. The approach physically isolates the hot and cold air streams so they cannot mix, driving the air directly to and from the CRAC (Computer Room Air Conditioning) unit.

Aisle containment

It actually performs pretty well and substantially reduces the issues with “hot spots” and air mixing. However, the downsides are that you still need to control the pressures in your plenums (and everything that goes with it) and that you’re cooling or heating large areas that you really don’t need to.

In-Rack Heat Extraction

Things begin to get more creative with this method. This rack-level cooling approach focuses on extracting the heat generated inside the rack before it can spill into the server room.

In-rack heat extraction

A similar method puts the actual compressors and chillers inside the rack itself, carrying the heat directly to the exterior of the data center. This can make for a nice, neat server room. The drawback is that you still cannot achieve very high computational density per rack; moreover, the setup is complex and hard to maintain, so it is only worthwhile if it brings a substantial improvement in your Power Usage Effectiveness (PUE).

Liquid Immersion Cooling

Liquid immersion cooling generally uses a dielectric coolant to absorb heat from server components; because the fluid does not conduct electricity, it can be placed in direct contact with electrical components. The coolant runs across the hot components of a server and carries the heat away to a heat exchanger. This is reported to be hundreds of times more efficient than relying on massive CRAC (Computer Room Air Conditioning) units. By adopting this data center cooling method, you have a greater chance of achieving an unprecedented PUE.
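The efficiency gap between liquid and air comes down to volumetric heat capacity: a liquid carries far more heat per unit volume. As a back-of-the-envelope illustration (not from the original article), the sketch below compares the volumetric flow needed to remove the same heat load with air versus water, used here as a rough stand-in for a dielectric coolant; the 10 kW rack load and 10 K temperature rise are assumed values.

```python
# Back-of-the-envelope: volumetric flow needed to carry away a heat load.
# Q = rho * c_p * V_dot * dT  =>  V_dot = Q / (rho * c_p * dT)
# Fluid properties are standard textbook values near room temperature;
# the 10 kW rack load and 10 K coolant temperature rise are assumptions.

def flow_m3_per_s(q_watts, rho, c_p, delta_t):
    """Volumetric flow rate (m^3/s) required to remove q_watts at a given delta-T."""
    return q_watts / (rho * c_p * delta_t)

Q = 10_000.0   # 10 kW per rack (assumed)
DT = 10.0      # 10 K temperature rise across the coolant (assumed)

air   = flow_m3_per_s(Q, rho=1.2,    c_p=1005.0, delta_t=DT)   # air
water = flow_m3_per_s(Q, rho=1000.0, c_p=4186.0, delta_t=DT)   # water, as a liquid-coolant proxy

print(f"air:   {air:.3f} m^3/s")
print(f"water: {water:.6f} m^3/s")
print(f"ratio: ~{air / water:,.0f}x less volume for the liquid")
```

Under these assumptions the liquid needs thousands of times less volume per second than air, which is why a thin loop of coolant can do the work of a roaring wall of CRAC-driven airflow.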

Modeling the Infrastructure

Modeling the infrastructure helps achieve efficient data center cooling: find the hot spots by looking hard at every nook and corner. Perhaps adding a curtain or moving a server from one rack to another can result in a much more efficient operation. To reduce the costs associated with that kind of monitoring and modeling, consider some of the small data sensors that can help track temperature throughout the data center.
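As a minimal sketch of what such sensor-driven monitoring might look like, the snippet below scans a set of rack inlet-temperature readings and flags hot spots. The rack names and readings are invented for illustration; the 27 °C ceiling follows ASHRAE's recommended inlet range for data center equipment.

```python
# Sketch: flag "hot spots" from rack inlet-temperature readings.
# Rack names and readings are made up for illustration; the 27 C
# ceiling follows ASHRAE's recommended inlet range (18-27 C).

ASHRAE_MAX_INLET_C = 27.0

def find_hot_spots(readings, limit=ASHRAE_MAX_INLET_C):
    """Return (rack, temperature) pairs above the inlet limit, hottest first."""
    hot = [(rack, t) for rack, t in readings.items() if t > limit]
    return sorted(hot, key=lambda pair: pair[1], reverse=True)

readings = {
    "rack-A1": 24.5,
    "rack-A2": 29.1,   # likely hot spot: recirculating exhaust air?
    "rack-B1": 26.8,
    "rack-B2": 31.4,   # likely hot spot
}

for rack, temp in find_hot_spots(readings):
    print(f"{rack}: {temp:.1f} C exceeds the {ASHRAE_MAX_INLET_C:.0f} C inlet limit")
```

A real deployment would pull readings from networked sensors rather than a hard-coded dictionary, but the triage logic stays the same: rank the exceptions, then investigate the worst offenders first.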

Conclusion

From what we have discussed above, you should now have some basic knowledge of data center cooling. Big steps can radically improve cooling, whereas smaller steps may be a good interim solution with a high return on investment. Working efficiently with the tools at hand is always sound advice. Just take these ideas into the data center and start making a difference.

How to Clean the Data Center?

Dust and dirt in a data center can be a nightmare for telecom engineers. Whenever they touch a distribution cabinet or a patch panel in a data center, their fingers come away stained with dust or dirt. This annoying situation is all too common for engineers working in telecommunications. Some of them may have realized the importance of cleanliness in the data center, yet they seldom take action to remove the dust and dirt; in other words, people simply attach too little importance to keeping the data center clean. Some contaminants can easily be seen or felt, but myriads of others lurk inside the equipment, where they can lead to disastrous consequences such as overheating and various network failures if no proper cleaning action is taken.

The Importance of Cleaning the Data Center

Imagine what would happen if there were no regular cleaning in the data center. As mentioned above, the most direct result of contamination is overheating. Dust and pollutants in the data center are usually lightweight, so wherever there is airflow, dust and dirt move with it. The data center's cooling depends largely on server fans, which can draw dust and dirt into the cooling system. The accumulation of these contaminants can cause fan failure or static discharge inside equipment; heat dissipation then takes longer and heat-removal efficiency suffers. The following picture shows the contaminant at a server fan air intake, which illustrates this phenomenon.

Contaminant at a server fan air intake

As the cooling system is degraded by dust and dirt, the risk to the data center rises considerably. Contaminants will settle in every place in the data center they can reach. In addition, data centers nowadays rely heavily on electronic equipment and fiber optic components such as fiber optic connectors, which are very sensitive to contaminants. Problems like power failures, data loss and short circuits can occur if the contaminants are not removed completely. Worse still, a short circuit can cause a fire in the data center, leading to irreparable damage. The following picture shows a data center after a fire. It is truly a disaster for the data center managers.

A data center after a fire

Dust and dirt can also shorten the life span of data center equipment and disrupt its operation. The uptime of a data center may fall if there are too many contaminants. Cleaning the data center regularly helps reduce downtime and extends the life span of the infrastructure equipment. It proves cost-efficient and energy-saving compared with restarting the data center or repairing the equipment.

Furthermore, data center cleanliness offers the aesthetic appeal of a clean, dust-free environment. Although this is not the main purpose, a clean data center presents a more pleasant working environment for telecom engineers, especially those who need to install cable under a raised floor or work overhead among racks and cabinets. No one would turn down a clean data center.

Contaminant Sources in the Data Center

There is no doubt that data center cleanliness is necessary. But how do you keep the data center clean? Before taking action, consider where the contaminants come from. Generally, there are two main sources: inside the data center and outside it. Internal contaminants typically include particles from air conditioning unit fan belt wear, toner dust, packaging and construction materials, human hair and clothing fibers, and zinc whiskers from electroplated steel floor plates. External sources of contamination include vehicle exhaust, emissions from electricity generation, sea salt, natural and artificial fibers, plant pollen and wind-blown dust.

Data Center Cleaning and Contaminants Prevention

Now that we know where the dust and dirt come from, here are some suggestions and tips to reduce contaminants.

  • Reduce data center access. Limiting entry to necessary personnel only helps keep external contaminants out.
  • Place sticky mats at the entrances to the raised floor; they remove most of the contaminants carried in on shoes.
  • Never unpack new equipment inside the data center; establish a staging area outside it for unpacking and assembling equipment.
  • No food, drink or smoking in the data center.
  • Most sites are required to have fresh-air make-up for the data center; remember to replace its filters on a regular basis.
  • Cleaning frequency depends on activity in the data center; vacuum the floor more often as traffic increases.
  • Inspect and clean fiber optic components regularly, especially fiber optic connectors and the interfaces of switches and transceivers.
  • Clean both the inside and outside of racks and cabinets.

Conclusion

The data center operates like an information factory nowadays, processing countless pieces of data and information. The cleanliness of the data center is therefore increasingly important. If this essential “factory” is polluted by dust and dirt, it will eventually fail to provide reliable, high-quality services. Moreover, a clean data center ensures a much longer life span for equipment and applications, effectively saving a great deal of money on maintenance.

Prefabricated Modular Data Center – an Extremely Agile and Cost-efficient Option

Whether it’s an enterprise or multi-tenant (co-location) facility, the data center is fast becoming an organization’s most valuable strategic asset. Its ability to handle immense volumes of data, provide highly reliable IT services for users and quickly adapt to the increasing demands of a dynamic environment can make or break a business. As a result, IT and facilities professionals are constantly looking to make their data centers more agile and efficient.

A typical data center is a traditional brick-and-mortar facility that can range from a few thousand to a million square feet or more. A data center of this type is often pre-designed to house all of the necessary racks, power distribution, cooling, cabling, fire suppression, and physical security systems needed to support IT services over the next 10 to 15 years. These facilities can come with a hefty price tag in the hundreds of millions of dollars and take two to three years to plan, design and build.

So, what’s the problem? As IT technologies rapidly evolve and virtualization and cloud computing complicates traditional capacity planning, a brick-and-mortar data center designed today could conceivably become obsolete before it is ever deployed. Its power, cooling and IT “white space” requirements might have been specified by the business at a certain point in time, but by the time the facility actually goes live, the needs of the business—and the data center technologies available to support the business—may have irrevocably changed. This constant demand for change results in stranded or inadequate space, power, or cooling, and leaves traditional data center owners trapped in a perpetual and expensive retrofit cycle while attempting to save their initial capital investment.

As a result, many data center owners and operators are exploring alternatives to traditional data center design-and-build.

Within the last several years, the market has warmed to the concept of the prefabricated modular data center. A modular data center is a concept that uses prefabricated modules—built and tested at a factory, disassembled, shipped to a site and then reassembled to deliver data center white space, power and cooling infrastructure. A modular data center can be set up and operational within 14 to 20 weeks instead of two to three years. Also, as business capacity needs or technologies change, new modules addressing the change can be quickly and cost-efficiently added or existing solutions pre-engineered for upgrading can be seamlessly modified.

This approach enables businesses to focus on meeting their current and very near term data center capacity needs, rather than attempting to project and build for their anticipated long-term demand. It creates a purpose-built data center infrastructure that’s built to fit from the start.

The benefits of a prefabricated modular data center include:

  • Significant capital expenditure savings in design, planning, construction and infrastructure
  • Lower power, cooling, and operational expenses due to infrastructure right sizing, engineering out complexity and the usage of hyper-efficient innovative cooling designs
  • The ability to future-proof the data center by easily upgrading whenever more capacity is needed

These benefits are why the prefabricated modular data center is an extremely agile and cost-efficient option for data center owners and operators looking beyond traditional approaches to address rapid changes in business and technology needs today and tomorrow.

Challenges and Innovations: the Modern Data Center – Modular Data Center

The modern data center is a complex place. The proliferation of mobile devices, like tablets and smartphones, places ever-increasing pressure on IT departments and data centers. End-user and customer expectations have never been higher, and the demand for data shows no sign of slowing down. Data center managers must manage all of these elements while also remaining efficient and keeping costs under control. So where does the data center go from here?

One thing I have noticed in the evolution of the modern data center is that the facilities are gaining importance; improving energy efficiency and IT management have come to the forefront. Maximizing the organization’s resources is vital, and that means delivering more to facilities and equipment without expending more on staffing. IDC forecasts that during the next two years, 25 percent of all large and mid-sized businesses will address the power and cooling facility mismatches in their data centers with new IT systems and put a 75 percent cap on data center space used. So there again is the crucial challenge of doing more and innovating while keeping budgets and spend under control.

Another key part of the next generation data center mix is automation. Today’s data center manager is engaged in sourcing the right automation tools that will help them manage energy consumption and add new technology without disrupting normal operations. These are a few of the key challenges in the modern data center—so data center managers and IT departments must find ways to address them.

Where does the Data Center Go Next?

At the heart of data center evolution is the information technology sector’s rapid rate of change. Many new products and services must be implemented with much less time to value, and data centers need to be agile enough to assess and accommodate them all. If you examine enterprise data centers, you might observe the ways that cloud computing and hyperscale innovations are displacing traditional enterprise systems, with new paradigms pioneered by innovators like Amazon and Google. With new options being developed, enterprises now have to chart strategies for cloud computing, including public, private or hybrid cloud. Gauging where the technology will go next is difficult. Will the traditional vendors, such as Cisco and EMC, prevail, or will new paradigms from Nutanix or Simplivity disrupt and displace these traditional data center dominators?

The race is on to manage the rapid rate of change while also staying agile, meeting end-user expectations and managing costs. For example, data center managers must handle the level of capacity their data center requires while ensuring they don’t overspend on unused capacity. This is where the focus on data center design comes into play.

Taking the Data Center Forward

These specific needs and challenges of the modern data center require working with the right tools and solutions. Modular, purpose-built data center infrastructure allows organizations to develop data center services based on need: when capacity rises and where it is needed. For example, we’ve observed in Singapore that most data centers operate slightly above a Power Usage Effectiveness (PUE) of 2.1. This means that companies spend more on cooling their data center than on operating and powering the IT equipment. The challenge is simple to state: drive efficiency without impacting operations. You want to drive PUE down to approximately 1.06, regardless of where you need to operate, and reap huge energy savings while better serving customers. Done right, there is also a positive environmental impact.
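To make those PUE figures concrete: PUE is total facility power divided by IT equipment power, so everything above 1.0 is overhead such as cooling and power distribution. The sketch below, with an assumed 1 MW IT load and $0.10/kWh electricity price (both hypothetical, not from the article), compares the overhead at PUE 2.1 and 1.06.

```python
# Sketch: compare non-IT overhead energy at two PUE levels.
# PUE = total facility power / IT equipment power, so (PUE - 1) * IT load
# is the power spent on cooling, power distribution, lighting, etc.
# The 1 MW IT load and $0.10/kWh price are illustrative assumptions.

def overhead_kw(it_load_kw, pue):
    """Power spent on everything except the IT equipment itself."""
    return it_load_kw * (pue - 1.0)

IT_LOAD_KW = 1000.0      # assume a 1 MW IT load
PRICE_PER_KWH = 0.10     # assumed electricity price, USD
HOURS_PER_YEAR = 8760

for pue in (2.1, 1.06):
    oh = overhead_kw(IT_LOAD_KW, pue)
    yearly_cost = oh * HOURS_PER_YEAR * PRICE_PER_KWH
    print(f"PUE {pue}: {oh:.0f} kW overhead, ~${yearly_cost:,.0f}/year")
```

Under these assumptions, at PUE 2.1 the overhead power actually exceeds the IT load itself, which matches the observation above that companies can spend more on cooling than on running the equipment.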

Changing the paradigm of the traditional data center enables organizations to reap these rewards. Assessing and establishing business objectives that reflect what is possible, rather than what has always been done or what is easier and more comfortable, has led to innovative services and new business models that reset the competitive standards for everyone. Better PUE is a mandatory step in this process. The PUE journey continues, as evidenced by Amazon, which has recently taken to harnessing wind power for its data centers. Modular data centers will play a major part in this PUE journey, thanks to more efficient use of energy and more flexible support for resiliency and compute density.