4 Data Center Use Cases You Don't Consider

Nov 16

Big Data

Data centers have been around for as long as there have been computers, but data center design and use cases continue to evolve with new network architectures. Migration to the cloud and technologies such as virtualization have dramatically changed the layout and applications of today's data center.

Worldwide spending on data centers is expected to reach $185 billion by 2020, and most of that spending will go to new technologies that enable virtualization and cloud integration. As cloud migration accelerates, Gartner, Inc. predicts that public cloud spending will reach $204 billion in 2016, up 16.5 percent from 2015. Many emerging data center use cases aim to provide more computing power and greater versatility for integrating data centers with cloud resources.

Here are just four of the latest data center trends that you and your data center customers may not have considered.

1. More cooling for higher-density computing

Even with the boom in cloud adoption, high-density computing is becoming part of data center design. IDC estimates that, between 2013 and 2020, stored data will increase tenfold, from 4 zettabytes to 40 zettabytes. Not all of that data will reside in the cloud, so data centers will have to accommodate more storage hardware, and denser hardware demands more cooling.

There are a variety of emerging strategies for data center cooling. One is cold aisle containment, in which hardware is physically enclosed in a compact pod: an aisle with racks on either side and doors at each end. This contained space makes cooling more efficient and more predictable because it avoids obstructions such as cable trays. Another approach growing in popularity is liquid-cooled racks, which use circulating water to carry heat away.
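To see why density drives cooling, it helps to run the numbers. The sketch below estimates the heat load and airflow for a hypothetical containment pod; the rack count, per-rack power, and temperature rise are illustrative assumptions, and the airflow constant is a common rule of thumb, not a vendor figure.

```python
# Back-of-the-envelope cooling estimate for a contained cold aisle.
# All figures here are illustrative assumptions, not vendor specs.

RACKS_PER_POD = 12     # racks enclosed by the containment pod
KW_PER_RACK = 15.0     # power draw per high-density rack (kW)
DELTA_T_C = 12.0       # target air temperature rise across servers (deg C)

def required_airflow_cfm(heat_kw: float, delta_t_c: float) -> float:
    """Airflow needed to remove heat_kw, using the common rule of
    thumb CFM ~= 3140 * kW / delta-T(deg F)."""
    delta_t_f = delta_t_c * 9 / 5
    return 3140 * heat_kw / delta_t_f

total_kw = RACKS_PER_POD * KW_PER_RACK
print(f"Pod heat load: {total_kw:.0f} kW")
print(f"Required airflow: {required_airflow_cfm(total_kw, DELTA_T_C):,.0f} CFM")
```

Doubling per-rack power doubles the required airflow, which is exactly the pressure that pushes operators toward containment pods and liquid cooling.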

2. Complementing the cloud

Early cloud advocates predicted that cloud computing would eventually kill the data center. Cloud adoption has indeed surged (the RightScale State of the Cloud report says that 88 percent of businesses use a public cloud and 63 percent use a private cloud), yet the need for data centers is stronger than ever, because the data center is the tether for cloud computing.

More businesses are adopting colocation services as part of their cloud computing strategy, using the cloud for extensible data storage. IDC predicts that, by 2020, 40 percent of all data will be touched by the cloud, meaning it will be stored or processed in the cloud in some fashion. For data center administrators, the objective has to be closer integration with cloud resources, complementing cloud computing with local data services.
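A common form of that integration is storage tiering: hot data stays on local colocation hardware while cold data moves to cloud object storage. Here is a minimal sketch of that pattern using the AWS S3 API via boto3; the bucket name, local path, and 30-day threshold are hypothetical.

```python
# Sketch: keep hot data on local (colocation) storage, tier cold data
# out to cloud object storage. Bucket and paths are hypothetical.
import os
import time
import boto3

S3_BUCKET = "example-colo-archive"   # hypothetical bucket name
LOCAL_DIR = "/data/hot"              # hypothetical local storage path
COLD_AGE_SECONDS = 30 * 24 * 3600    # tier out files untouched for 30 days

s3 = boto3.client("s3")

def tier_cold_files() -> None:
    """Upload files not accessed recently to S3, then free local space."""
    now = time.time()
    for name in os.listdir(LOCAL_DIR):
        path = os.path.join(LOCAL_DIR, name)
        if os.path.isfile(path) and now - os.path.getatime(path) > COLD_AGE_SECONDS:
            s3.upload_file(path, S3_BUCKET, name)  # standard boto3 upload call
            os.remove(path)

if __name__ == "__main__":
    tier_cold_files()
```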

3. Edge and fog computing

Where data processing actually occurs is shifting along with cloud computing, and that shift will affect data center design. Edge computing, for example, is growing in popularity, pushing data processing closer to the data source. Rather than shipping every raw reading to centralized servers, devices preprocess data where it is generated before sending it on. Programmable automation controllers handle the processing and communication of data in transit. Edge computing is becoming increasingly important for mobile users and applications where people need fast access to more content.

Fog computing brings the same idea down to the local area network (LAN). Cisco introduced the fog computing concept as a strategy to bring cloud computing closer to connected devices: data is preprocessed at the LAN level before being passed between the cloud and the devices that need it. As the amount of data generated by the Internet of Things continues to escalate, fog computing will grow as an effective means of handling the additional processing.
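The core pattern in both cases is the same: aggregate raw data near its source and send only a compact summary upstream. The sketch below illustrates this with simulated sensor readings; the endpoint URL, window size, and summary fields are illustrative assumptions, not part of any particular edge platform.

```python
# Sketch: aggregate raw sensor readings at the edge and forward only
# a compact summary upstream. Endpoint and field names are hypothetical.
import json
import statistics
import urllib.request

UPSTREAM_URL = "https://example.com/ingest"  # hypothetical collector endpoint

def summarize(readings: list[float]) -> dict:
    """Reduce a window of raw readings to a few summary statistics."""
    return {
        "count": len(readings),
        "mean": statistics.fmean(readings),
        "max": max(readings),
        "min": min(readings),
    }

def forward(summary: dict) -> None:
    """POST the summary upstream instead of every raw reading."""
    req = urllib.request.Request(
        UPSTREAM_URL,
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

if __name__ == "__main__":
    window = [21.4, 21.6, 22.1, 21.9, 22.3]  # e.g., one minute of temperatures
    forward(summarize(window))
```

Here five raw readings collapse into four summary numbers; at IoT scale, that reduction is what keeps upstream links and central data centers from drowning in raw telemetry.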

4. The software-defined data center

Virtualization has created new software-defined computing environments that decouple workloads from the physical servers beneath them. Now the data center is programmable, and the entire infrastructure can be delivered as a service or as on-demand software. Welcome to the software-defined data center (SDDC).

The concept of the SDDC is to abstract all the data center components so that data center activities are easier to integrate and automate. The advantage is being able to change configurations, such as servers and storage, on demand. The SDDC also opens the door to hyperscale data centers, and because an SDDC looks like a cloud infrastructure, it gives you a higher level of management integration.

At the heart of the SDDC is virtualization, which enables a dynamic infrastructure by decoupling logical connections from physical ones. Virtual connections across virtualized environments can promote open software-defined networking.

An SDDC can also improve networking efficiency. Rather than dedicating hardware to each workload, you can divide resources as needed, separating databases from application data, for example, and migrating computing tasks to wherever they are required. Experts predict that the SDDC will continue to thrive in the years to come.
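What "programmable" means in practice is declarative configuration: you describe the resources you want, and software reconciles the data center's actual state against that description. The resource model and reconcile loop below are a hypothetical sketch of that idea, not the API of any particular SDDC product.

```python
# Sketch: a declarative, software-defined view of data center resources.
# The resource model and reconcile loop are illustrative, not a real SDDC API.
from dataclasses import dataclass

@dataclass(frozen=True)
class Resource:
    kind: str    # e.g., "vm", "volume", "network"
    name: str
    spec: str    # simplified: size or flavor as a string

def reconcile(desired: set[Resource], actual: set[Resource]) -> None:
    """Create whatever is declared but missing; remove whatever is
    running but no longer declared."""
    for r in desired - actual:
        print(f"create {r.kind} {r.name} ({r.spec})")
    for r in actual - desired:
        print(f"delete {r.kind} {r.name}")

desired = {
    Resource("vm", "web-1", "4vcpu/8GB"),
    Resource("volume", "db-data", "500GB"),
    Resource("network", "frontend", "vlan-10"),
}
actual = {
    Resource("vm", "web-1", "4vcpu/8GB"),
    Resource("vm", "web-old", "2vcpu/4GB"),
}
reconcile(desired, actual)
```

Running the sketch prints one create action for each missing resource and one delete action for the undeclared leftover, which is the kind of on-demand reconfiguration the SDDC promises.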

These are just four new use cases to consider in next-generation data center design. There are countless others, but, as these examples show, new use cases tend to have a domino effect. Demand for more data storage drives data center density; at the same time, it drives virtualization to access that data, which in turn drives SDDC adoption. As you plan data center expansion for your customers, keep this ripple effect in mind and be ready for the next big thing in data center design.

Topics: Data Center
