5 Big Data Examples Where Components Can Shine


Sep 13


When it comes to big data infrastructure design, value-added resellers (VARs) and integrators are being called on by businesses across many industries to design highly scalable data centers. It is the data center's IT infrastructure, at the component level, that enables the storage and interpretation of the big data that transforms industries and markets.

Of course, one of the biggest drawbacks of the term "big data" is how it abstracts away the very real, specific, and varied needs of accessing and using extremely large data sets in an era of cloud, social, mobile, and the Internet of Things. Because VARs and integrators look at the data center at the component level, as well as through the lens of a specific end-user scenario when realizing a data center design, it's possible to see how the components shine in specific big data examples.

Real-Time Real-World Big Data


Big data in the real world is about speed of data access, analysis, and the ability to act upon that analysis for real-time (or near-real-time) decision-making and business responsiveness. Incorporating flash and solid-state drives (SSDs) for high-performance big data analytics is an important step in increasing the speed of the infrastructure. In addition, server infrastructures with more processing cores, faster memory, and additional cache, among other important features, are critical to big data across industries.

Flash memory, all-flash arrays, Ethernet and PCIe switches and adapters, PCIe cards, in-server SSDs, and the emergence of rack-scale flash are increasingly driving real-world big data examples across almost every industry and sector. These industries all share the reality of data-intensive workloads that require the low latency and fast I/O needed for real-time analysis and decision-making. These industries include:

  • Finance
  • Defense and manufacturing
  • Healthcare and research
  • Telecommunications
  • Retail
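The latency gap behind these workloads is easy to quantify with a back-of-envelope sketch. The IOPS figures below are typical order-of-magnitude assumptions, not vendor specifications:

```python
# Rough sketch (assumed, order-of-magnitude figures): why random-read-heavy
# analytics workloads favor flash over spinning disk.

RANDOM_READS = 1_000_000     # 4 KB random reads in a scan-heavy query
HDD_IOPS = 150               # assumed for a typical 7,200 RPM drive
SSD_IOPS = 90_000            # assumed for a SATA SSD; NVMe is higher still

hdd_seconds = RANDOM_READS / HDD_IOPS
ssd_seconds = RANDOM_READS / SSD_IOPS

print(round(hdd_seconds / 60), "min on HDD vs",
      round(ssd_seconds, 1), "s on SSD")
```

Under these assumptions the same query waits on storage for roughly two hours on disk versus seconds on flash, which is the gap the components above are deployed to close.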

Big Data Examples in the Real World


Enterprises within these sectors are being driven by high-performance custom and analytic applications such as SAS. As the line between artificial intelligence and big data continues to blur, it is fueling new market approaches to analytics and technology for a variety of industries and uses in the real world:

Example 1: Big data applications in sales and marketing help identify which customers are likely to buy, renew, or churn by crunching large amounts of internal and external data, increasingly in real time. Customer service applications help personalize service, and HR applications help organizations attract and retain the best employees.

Example 2: Big data in finance is enabling many banks to partner, build, or buy systems that let them make credit decisions on loans within five minutes, as well as analyze transactional encounters via their own mobile apps to improve service.

Example 3: Healthcare big data examples abound and show how the components that make up these data architectures shine brightly. Evidence-based medicine is changing how patients are diagnosed and often demonstrates that alternative treatments are more effective (and cost-effective) than conventional care.

Researchers and epidemiologists map disease outbreaks and break new pathogens down into their genetic components to help contain outbreaks. Everything from smarter screening and more powerful population management to reduced claims fraud, health plans that incentivize healthier behaviors, and more efficient surgery center operations is already a reality with big data in healthcare.

Example 4: Communication service provider (CSP) big data examples show how these companies are using big data at the component level to increase market growth and improve customer service. Network capacity planning and optimization is just one of the many areas where the intersection of speed, capacity, and throughput is delivered at the component level of the data center.

By correlating network usage and subscriber density, along with traffic and location data, CSPs can more accurately monitor, manage, and forecast network capacity and plan effectively for:

  • Potential outages
  • Capacity expansion rollout
  • Plugging and minimizing revenue leakage
  • Managing network and cyber security
  • Driving down order-to-activation lead times
  • Proactively identifying and fixing customer issues in order to minimize truck rolls
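As an illustration of the capacity-planning idea above, here is a minimal sketch assuming hypothetical monthly peak-utilization data and a simple least-squares trend; real CSP models correlate many more signals (subscriber density, traffic, location) than this toy does:

```python
# Minimal sketch (hypothetical data): forecast when a site's peak
# utilization will cross a planning threshold, using a least-squares
# linear trend as a stand-in for a full capacity-planning model.

def linear_fit(ys):
    """Fit y = a + b*x for x = 0..n-1 by ordinary least squares."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var
    return mean_y - b * mean_x, b

def months_until_threshold(utilization_history, threshold=0.8):
    """Months until the fitted trend crosses `threshold`,
    or None if utilization is flat or declining."""
    a, b = linear_fit(utilization_history)
    if b <= 0:
        return None
    x_cross = (threshold - a) / b               # solve a + b*x = threshold
    return max(0.0, x_cross - (len(utilization_history) - 1))

# Twelve months of peak-hour utilization for one hypothetical site.
history = [0.52, 0.54, 0.57, 0.58, 0.61, 0.63,
           0.66, 0.67, 0.70, 0.72, 0.74, 0.77]
print(round(months_until_threshold(history), 1))
```

A forecast like this is what lets a CSP schedule capacity-expansion rollouts before an outage, rather than after one.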

Example 5: In the retail sector, big data is about applying trend-forecasting algorithms, recommendation engine technology, and machine learning to the data retailers gather in order to:

  • Track consumer/customer buying habits, preferences, and demand
  • Track inventory levels and competitor activity and automatically respond to market changes in real time, allowing action to be taken, based on insights, in a matter of minutes
  • Forecast demand for individual geographic areas based on customer demographics data for faster order fulfillment and supply chain management
  • Gather and analyze data on individual customer interactions online and within store locations to drive product and promotion outreach at the point of sale as well as across preferred channels
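The regional demand forecasting mentioned above can be sketched in miniature. The regions, figures, and the simple exponential-smoothing model are all illustrative assumptions:

```python
# Minimal sketch (hypothetical data): per-region demand forecasting with
# simple exponential smoothing, a toy stand-in for the trend-forecasting
# models described in the text.

def exp_smooth_forecast(demand, alpha=0.5):
    """One-step-ahead forecast via simple exponential smoothing."""
    level = demand[0]
    for d in demand[1:]:
        level = alpha * d + (1 - alpha) * level
    return level

# Weekly units sold per hypothetical region.
weekly_demand = {
    "northeast": [120, 135, 128, 150, 162],
    "southwest": [80, 78, 85, 90, 88],
}

forecast = {region: round(exp_smooth_forecast(series), 1)
            for region, series in weekly_demand.items()}
print(forecast)
```

Per-region forecasts like these are what feed the faster order fulfillment and supply chain decisions the bullet list describes.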

In order to make all of these big data examples a reality, enterprises must meet the growing demand for faster access to more data, from both existing and new applications, through the right storage and throughput options. Enterprises are gearing up to create modern data centers capable of effectively handling big data solutions such as Hadoop and, more recently, Spark, among others. Consequently, components are critical to eliminating the I/O bottlenecks and bandwidth and capacity issues that throttle these solutions.

Solving the Big Data I/O Bottleneck Dilemma


Increasingly, business data centers are tackling the I/O bottleneck created by the gap between memory and storage performance by using flash-memory PCIe cards or in-server SSDs that put storage closer to memory, right inside an individual server. Another strategy is to move to all-flash arrays. An emerging solution seeing growing adoption is a class of storage called "rack-scale flash."

Rack-scale flash promises a shared pool of storage accessible by a number of servers over a high-speed connection, either within a single rack or among neighboring racks. These flash modules communicate using NVMe and proprietary protocols over 10, 25, and 40 GbE or PCIe, via adapters that promise easy and affordable scale-out performance storage. Of course, different solutions target different workloads and use cases.
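To give a feel for the sharing trade-off, here is a back-of-envelope sketch of per-server bandwidth when several servers split one 40 GbE fabric link. The server count and encoding-efficiency figure are assumptions for illustration:

```python
# Back-of-envelope sketch (assumed figures): network bandwidth available
# to each server when a rack-scale flash pool is shared over one link.

LINK_GBPS = 40          # 40 GbE fabric link (from the text)
ENCODING_EFF = 0.97     # assumed usable fraction after line coding
SERVERS = 16            # assumed number of servers sharing the pool

usable_gbps = LINK_GBPS * ENCODING_EFF
per_server_gbps = usable_gbps / SERVERS
per_server_mb_s = per_server_gbps * 1000 / 8   # Gb/s -> MB/s (decimal)

print(round(per_server_gbps, 2), "Gb/s per server,",
      round(per_server_mb_s), "MB/s")
```

A calculation like this is one reason different solutions target different workloads: a fair share of one link may be ample for capacity-tier workloads but thin for latency-sensitive analytics.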

Regardless of the industry, VARs and integrators are in an unprecedented position to support big data initiatives that have profound real-world, bottom-line benefits for their clients. It's about more than just understanding big data at the component level: in order to steer big data projects to success, VARs must have the skill sets that enable them to support and contribute to a project and work toward a defined goal.
