The Internet of Things (IoT) is generating a demand for the management of Big Data. Data from specialized devices and applications needs to be made available to business applications to increase corporate value. Businesses are integrating these specialized devices and applications with core business processes for analytics and advanced business processing. This data is a mix of varied types and has now become corporate data, and its complexity creates the demand for new technology.
Hadoop has arrived to answer the challenge. Global Hadoop market revenue, estimated at $2.0 billion in 2013, is expanding rapidly and may reach a staggering $50.2 billion by 2020.
While Hadoop offers answers to the challenge of handling Big Data, it requires skills and training, and finding those human resources can be difficult. Another issue is being able to back up Big Data to meet regulatory requirements or to recover from a catastrophic event.
A Big Data implementation using Hadoop demands even more focus on the ability to recover quickly from a catastrophic event. However, the SMB is not often staffed or tooled to design and execute a backup strategy of this level of complexity. A further consideration is that, since much of Hadoop's attractiveness lies in using local servers, there is a need for a data backup and recovery strategy that can be managed remotely without requiring the live data to be transferred to, or run in, a cloud environment.
There are Data Backup/Recovery Managed Service Providers (DB/R MSPs) that provide remote management of the backup process, along with professional disaster backup and recovery consultation. Contracting a DB/R MSP under the remote DB/R management model allows the SMB to keep its data local without hiring new staff or training existing staff in sophisticated data backup and recovery processes. The SMB can thus have a comprehensive data backup and recovery strategy while housing its Big Data locally.
There is an ongoing conversation about the effects climate change and fracking are having on our environment. However, there seems to be no disputing the increase in natural disasters in the past few years compared to periods in the previous century.
As stated in the Iron Mountain Blog post What’s Shaking: Preparing for the Next Big One “According to a recent U.S. Geological Survey report, the number of major domestic earthquakes (magnitude 7 or greater) in the first quarter of 2014 was more than double what the national average has been since 1979. And that doesn’t even include hundreds of smaller, man-made temblors caused by fracking and oil drilling; incidences of these events quadrupled in Oklahoma this year.”
In the world of enterprise data, other trends are shaping how businesses prioritize their data backup and resilience programs.
Customers using data backup and recovery solutions are reevaluating their present backup and recovery strategies, motivated by the increase in data complexity driven by data variety, velocity, and volume.
To help businesses of all sizes manage Big Data, there is Hadoop. The Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using a simple programming model.
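That "simple programming model" is MapReduce: a mapper emits key/value pairs, the framework groups them by key, and a reducer aggregates each group. A minimal single-process sketch in plain Python illustrates the idea (the function names here are illustrative, not Hadoop's actual API; on a real cluster these phases run distributed across many nodes):

```python
from collections import defaultdict


def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the input line.
    for word in line.split():
        yield word.lower(), 1


def shuffle(pairs):
    # Shuffle phase: group all values by key, as Hadoop does
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups.items()


def reducer(key, values):
    # Reduce phase: aggregate the grouped values for one key.
    return key, sum(values)


def word_count(lines):
    mapped = (pair for line in lines for pair in mapper(line))
    return dict(reducer(key, values) for key, values in shuffle(mapped))


counts = word_count(["big data big value", "big clusters"])
# counts -> {"big": 3, "data": 1, "value": 1, "clusters": 1}
```

Hadoop's value is that the same mapper and reducer logic, written once, can be applied in parallel to data sets far too large for a single machine.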
The Internet of Things
Businesses are integrating specialized devices and applications with core business processes for analytics and advanced business processing. The data from these specialized devices, in cars, water meters, and the like, has now become corporate data.
These new data types demand even more focus on the ability to recover quickly from a catastrophic event. However, the SMB is not often staffed or tooled to design and execute a backup strategy of this level of complexity. The solution is a Data Backup/Recovery Managed Service Provider (DB/R MSP) that provides remote management of the backup process, along with professional disaster backup and recovery consultation. Contracting a data backup consultant under the remote DB/R management model allows the business to maintain its data locally without hiring new staff or training existing staff in sophisticated data backup and recovery processes.
On November 20, 2014 IBM announced that it is bringing a greater level of control, security and flexibility to cloud-based application development and delivery with a single-tenant version of Bluemix, IBM’s platform-as-a-service. As IBM states “The new platform enables developers to build applications around their most sensitive data and deploy them in a dedicated cloud environment to help them capture the benefits of cloud while avoiding the compliance, regulatory and performance issues that are presented with public clouds.”
Bluemix is an implementation of IBM’s Open Cloud Architecture, leveraging Cloud Foundry to enable developers to rapidly build, deploy, and manage their cloud applications. According to IBM, this means that cloud applications built on Bluemix will:
- Reduce time for application/infrastructure provisioning
- Allow for flexible capacity
- Help to address any lack of internal tech resources
- Reduce Total Cost of Ownership (TCO)
- Accelerate exploration of new workloads – social, mobile, big data
However, IBM is moving fast in the cloud, but is it fast enough to matter?
“Still, Amazon with an 8-year head start in public cloud, shows no sign of slowing down and Google and Microsoft — both of which with money to burn — show a willingness to burn it on their clouds. It’s unclear if IBM has that luxury.”
In any case, small and medium-sized businesses' hunger for more data and more complex data types is creating the need for alternative solutions. All of this increases the need to understand how that data is to be backed up and recovered, and experts in data backup and recovery are critical to the overall solution. If you would like to discuss backup and recovery procedures for these new requirements, contact Salvus Data consultants.