Large enterprises are taking steps toward a total cloud data center.
In its post on the survey results, Survey: What the Enterprise Cloud Needs to Become Business-Critical, GIGAOM states, “Many organizations are now progressing beyond these workloads, putting cloud computing to work in support of business-critical applications and workloads.” The post goes on to say, “Sixty-six percent of respondents consider one or more Software-as-a-Service (SaaS) applications to be business-critical today, and a significant number also support critical workloads with public Database-as-a-Service (DBaaS) or Infrastructure-as-a-Service (IaaS) compute and storage offerings.”
However, adoption of a total cloud environment may continue to be slowed by concerns over security, regulatory compliance, network bandwidth, and transition costs.
In response to these issues, an enterprise will typically implement non-critical applications first. As those succeed, it will gradually migrate mission-critical applications until the entire data center is a cloud implementation. Cloud applications will interact with in-house applications for a period of time as the remaining concerns are worked out or accepted.
Data backup and recovery can follow a similar progression. When the term “cloud” is mentioned, we tend to visualize an all-or-nothing scenario: we assume the data must be replicated to its backup site, and that both the data and the management of the disaster recovery (DR) processes must reside in the cloud. But alternative models are possible. The data can remain within the customer’s network and be backed up off site to a backup location of the customer’s choice, while the backup and recovery processes are managed in the cloud. In this model of remote backup administration, the data itself may or may not be in the cloud.
There are data backup and recovery managed service providers that offer remote management of the backup process, along with professional disaster backup and recovery consultation. To further discuss cloud data backup and recovery management, contact Salvus Data Consultants at 903-201-7233.
When referencing the Internet of Things (IoT), it is important to understand that it is really about the data being transmitted, not the devices or the applications. If these devices or applications communicated only with their own platform, they would not be part of the IoT; they must interact with traditional applications to play a part in it.
To facilitate the expansion of the IoT, APIs are being built at an accelerated pace.
In the GIGAOM Research report Building an API-driven Ecosystem for the Internet of Things, the key findings include:
- IoT hardware is not the end game: The profits, margins, and innovations will come from the new products and services built on open, flexible APIs.
- Technical excellence is not enough. Successful IoT developers must properly onboard, support, monetize, secure, and evolve their platforms in order to compete.
- Modern APIs enable service composition rather than individual functions, creating an IoT supply chain.
- Ecosystem architects should build on practices established in mobile-app development for their foundation, customizing only when necessary.
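To make the data-centric point concrete, here is a minimal sketch in Python of the kind of normalization step that lets a traditional business application consume a raw device reading. The device ID, payload fields, and record schema are all hypothetical, invented for illustration; they are not from any particular IoT platform or API.

```python
import json
from datetime import datetime, timezone

def to_corporate_record(device_id: str, payload: dict) -> dict:
    """Normalize a raw IoT device reading into a record a business
    application can consume (hypothetical schema for illustration)."""
    return {
        "source": device_id,
        "metric": payload["metric"],
        # Device firmware often sends numbers as strings; coerce here.
        "value": float(payload["value"]),
        # Fall back to ingestion time if the device sent no timestamp.
        "received_at": payload.get("timestamp")
            or datetime.now(timezone.utc).isoformat(),
    }

# A raw reading, as a device might POST it to an ingestion API.
raw = json.loads(
    '{"metric": "temperature_c", "value": "21.5",'
    ' "timestamp": "2014-06-01T12:00:00Z"}'
)
record = to_corporate_record("sensor-42", raw)
print(record["value"])  # 21.5
```

Once readings pass through a step like this, they land in the same data stores as other business records, which is exactly why they become corporate data with backup implications.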
The IT staffs of medium-sized companies will be seeing more requests to integrate the IoT with their mission-critical applications. The data from these devices and applications needs to be made available to business applications for increased corporate value. Businesses are integrating specialized devices and applications with core business processes for analytics and advanced business processing. This data has now become corporate data.
The result is more complex data types within the corporate business data environment. This places an additional burden on the IT staff, who must understand the implications of developing backup procedures for these diverse data types.
To discuss this trend in further detail, contact Salvus Data, a data backup and recovery MSP with consultants experienced in backing up complex data types.
Data recovery time is a critical part of passing the FFIEC IT examination for financial institutions.
The Business Impact Analysis was a section added to the FFIEC (Federal Financial Institutions Examination Council) Business Continuity Planning Booklet in 2008. The Business Continuity Planning Booklet is one of 12 that, in total, comprise the FFIEC IT Examination Handbook.
According to the FFIEC, a business impact analysis (BIA) is the first step in the business continuity planning process and should include the:
- Assessment and prioritization of all business functions and processes, including their interdependencies, as part of a work flow analysis;
- Identification of the potential impact of business disruptions resulting from uncontrolled, non-specific events on the institution’s business functions and processes;
- Identification of the legal and regulatory requirements for the institution’s business functions and processes;
- Estimation of maximum allowable downtime, as well as the acceptable level of losses, associated with the institution’s business functions and processes; and
- Estimation of recovery time objectives (RTOs), recovery point objectives (RPOs), and recovery of the critical path.
The last two points are of special importance. Being able to recover your data is not the whole issue. Being able to recover your data in a time frame that meets business objectives is critical.
As we have stated in our post Don’t Forget These Things When Data Backup And Recovery Processes Are Being Developed, a major part of the backup and recovery process is the physical network. Just a few of the factors that impact the infrastructure design are the frequency of the backups, the time within which a restore must complete to be effective, the medium on which the data resides, and the proximity of the backup location to the original site. Networks may be under-powered to meet data backup and recovery requirements.
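A rough back-of-the-envelope calculation shows how an under-powered network can quietly break an RTO. The sketch below uses made-up numbers: the 2 TB data set, the 100 Mbps link, the 24-hour RTO, and the 70% usable-bandwidth factor are all illustrative assumptions, not measurements from any real environment.

```python
def restore_hours(data_gb: float, link_mbps: float,
                  efficiency: float = 0.7) -> float:
    """Estimate wall-clock hours to restore data_gb of data over a
    link_mbps network link, assuming only `efficiency` of the nominal
    bandwidth is usable (protocol overhead, contention)."""
    usable_mbps = link_mbps * efficiency
    seconds = (data_gb * 8 * 1000) / usable_mbps  # GB -> megabits
    return seconds / 3600

# Example: restoring 2 TB over a 100 Mbps link at 70% efficiency.
hours = restore_hours(2000, 100)
print(round(hours, 1))                            # 63.5
rto_hours = 24
print("meets RTO" if hours <= rto_hours else "exceeds RTO")
```

Here the restore would take roughly 63 hours against a 24-hour RTO, which is the kind of gap that only surfaces if recovery time is estimated and tested in advance rather than discovered during an actual outage.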
Recovery depends on more issues than just recovering from a catastrophic event. Data backup and recovery strategies must also meet company policies regarding regulatory requirements, data breaches, ability to respond to court orders, and more. This requires coordinated strategies and testing. Data Backup strategies must be planned and tested to assure all company requirements regarding data retention and recovery are met.
Outsourcing data backup processes is one way to gain expert guidance from data backup specialists who know their field. Outsourcing to an American managed service provider is often the preferred choice, especially if the data can remain within the control of the company and only the backup and recovery procedures are performed remotely by the data backup and recovery MSP.
To discuss data backup and recovery processes further, as they apply to regulatory requirements, contact Salvus Data Consultants. Salvus uses Tivoli Storage Manager (TSM) remotely to manage data backup and recovery while you maintain control of your data.