APIs Are Being Developed to Keep Up With the Expansion of the Internet of Things

When referencing the Internet of Things, it is important to understand that it is really about the data being transmitted, not the devices or applications themselves. If these devices or applications communicated only with their own platforms, they would not be part of the IoT. These devices must interact with traditional applications to play a part in the IoT.

To facilitate the expansion of the IoT, APIs are being built at an accelerated pace.

Key findings from the Gigaom Research report Building an API-driven Ecosystem for the Internet of Things include:

  • IoT hardware is not the end game: The profits, margins, and innovations will come from the new products and services built on open, flexible APIs.
  • Technical excellence is not enough. Successful IoT developers must properly onboard, support, monetize, secure, and evolve their platforms in order to compete.
  • Modern APIs enable service composition rather than individual functions, creating an IoT supply chain.
  • Ecosystem architects should build on practices established in mobile-app development for their foundation, customizing only when necessary.

The IT staffs of medium-sized companies will see more requests to integrate the IoT with their mission-critical applications. The data from these devices and applications needs to be made available to business applications for increased corporate value. Businesses are integrating specialized devices and applications with core business processes for analytics and advanced business processing. This data has now become corporate data.
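As a hypothetical illustration of this kind of integration (the payload shape and field names below are assumptions for the sketch, not any particular vendor's API), a small routine might normalize raw device telemetry into a flat record that a business application can store and back up like any other corporate data:

```python
import json
from datetime import datetime, timezone

def normalize_reading(payload: dict) -> dict:
    """Map a raw device payload onto a flat corporate record.
    The field names here are illustrative assumptions, not a standard."""
    return {
        "device_id": payload["id"],
        "metric": payload["sensor"]["type"],
        "value": float(payload["sensor"]["value"]),
        "recorded_at": datetime.fromtimestamp(
            payload["ts"], tz=timezone.utc
        ).isoformat(),
    }

# A raw JSON document as a device API might return it (hypothetical shape).
raw = json.loads(
    '{"id": "thermostat-7", "ts": 1416441600,'
    ' "sensor": {"type": "temperature", "value": "21.5"}}'
)
record = normalize_reading(raw)
print(record["metric"], record["value"])  # temperature 21.5
```

Once device readings are reshaped like this, they flow into the same databases, analytics, and backup procedures as the rest of the company's data, which is exactly why IT staffs end up responsible for them.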

The result is more complex data types within the corporate business data environment. This places an additional burden on the IT staff to understand the implications of developing backup procedures for these diverse data types.

To discuss this trend in further detail, contact Salvus Data, a data backup and recovery MSP with consultants experienced in backing up complex data types.

Recovery Time is a Critical Element of a Financial Institution’s Business Continuity Plan

Data recovery time is a critical part of meeting FFIEC IT examination requirements for financial institutions.

The Business Impact Analysis was a section added to the FFIEC (Federal Financial Institutions Examination Council) Business Continuity Planning Booklet in 2008. The Business Continuity Planning Booklet is one of 12 that, in total, comprise the FFIEC IT Examination Handbook.


According to the FFIEC, a business impact analysis (BIA) is the first step in the business continuity planning process and should include the:

  • Assessment and prioritization of all business functions and processes, including their interdependencies, as part of a work flow analysis;
  • Identification of the potential impact of business disruptions resulting from uncontrolled, non-specific events on the institution’s business functions and processes;
  • Identification of the legal and regulatory requirements for the institution’s business functions and processes;
  • Estimation of maximum allowable downtime, as well as the acceptable level of losses, associated with the institution’s business functions and processes; and
  • Estimation of recovery time objectives (RTOs), recovery point objectives (RPOs), and recovery of the critical path.

The last two points are of special importance. Being able to recover your data is not the whole issue. Being able to recover your data in a time frame that meets business objectives is critical.
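To make the RPO side of this concrete: the worst-case data loss is roughly the time elapsed since the last good backup, so the backup interval itself must fit within the recovery point objective. A minimal sketch (the intervals and objectives below are hypothetical examples, not regulatory guidance):

```python
def meets_rpo(backup_interval_hours: float, rpo_hours: float) -> bool:
    """Worst-case data loss is approximately the interval between backups,
    so that interval must not exceed the recovery point objective (RPO)."""
    return backup_interval_hours <= rpo_hours

# Nightly backups measured against two hypothetical RPOs.
print(meets_rpo(24, 4))   # False: a 4-hour RPO needs more frequent backups
print(meets_rpo(24, 24))  # True: a 24-hour RPO is satisfied by nightly backups
```

The same reasoning drives backup frequency decisions for each business function identified in the BIA: the tighter the RPO, the more often that function's data must be captured.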

As we have stated in our post Don’t Forget These Things When Data Backup And Recovery Processes Are Being Developed, a major part of the backup and recovery process is the physical network. Factors that impact the infrastructure design include the frequency of backups, the time required to complete a restore, the medium on which the data resides, and the proximity of the backup location to the original site. Networks may be under-powered to meet data backup and recovery requirements.
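A back-of-the-envelope sketch shows how quickly the network becomes the bottleneck, assuming restore throughput is limited by the link (the link speed, efficiency factor, and RTO below are hypothetical assumptions for illustration):

```python
def restore_hours(data_gb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Rough restore-time estimate: data size over effective network throughput.
    `efficiency` discounts protocol overhead and contention (an assumption)."""
    effective_mbps = link_mbps * efficiency
    seconds = (data_gb * 8 * 1000) / effective_mbps  # GB -> megabits
    return seconds / 3600

# Example: 2 TB restored over a 1 Gbps link at 70% efficiency,
# against a hypothetical 4-hour RTO.
hours = restore_hours(2000, 1000)  # roughly 6.3 hours
rto_hours = 4
print("meets RTO" if hours <= rto_hours else "network is under-powered")
```

In this example the restore takes over six hours, so a 4-hour RTO cannot be met: either the link must be upgraded or the recovery architecture changed (local copies, staged restores), which is precisely the kind of trade-off the BIA should surface.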

Recovery involves more than bouncing back from a catastrophic event. Data backup and recovery strategies must also meet company policies regarding regulatory requirements, data breaches, the ability to respond to court orders, and more. This requires coordinated strategies and testing: backup strategies must be planned and tested to ensure all company requirements regarding data retention and recovery are met.

Outsourcing data backup processes is one way to gain expert guidance from data backup specialists who know their field. Outsourcing to an American managed service provider is often the preferred choice, especially when the data remains within the control of the company and only the backup and recovery procedures are performed remotely by the data backup and recovery MSP.

To discuss data backup and recovery processes further, as they apply to regulatory requirements, contact Salvus Data Consultants. Salvus uses Tivoli Storage Manager (TSM) remotely to manage data backup and recovery while you maintain control of your data.

IBM Continues to Expand Its Cloud Offering

On November 20, 2014, IBM announced that it is bringing a greater level of control, security, and flexibility to cloud-based application development and delivery with a single-tenant version of Bluemix, IBM’s platform-as-a-service. As IBM states, “The new platform enables developers to build applications around their most sensitive data and deploy them in a dedicated cloud environment to help them capture the benefits of cloud while avoiding the compliance, regulatory and performance issues that are presented with public clouds.”

Bluemix is an implementation of IBM’s Open Cloud Architecture, leveraging Cloud Foundry to enable developers to rapidly build, deploy, and manage their cloud applications. According to IBM, this means that cloud applications built on Bluemix will:

  1. Reduce time for application/infrastructure provisioning
  2. Allow for flexible capacity
  3. Help to address any lack of internal tech resources
  4. Reduce Total Cost of Ownership (TCO)
  5. Accelerate exploration of new workloads – social, mobile, big data

However, as one industry analyst asks in her post IBM is moving fast in cloud but is it fast enough to matter?:

“Still, Amazon, with an 8-year head start in public cloud, shows no sign of slowing down, and Google and Microsoft — both of which have money to burn — show a willingness to burn it on their clouds. It’s unclear if IBM has that luxury.”

In any case, small and medium-sized businesses’ hunger for more data and more complex data types is creating the need for alternative solutions. All of this increases the need to understand how the data is to be backed up and recovered, and experts in data backup and recovery are critical to the overall solution. If you would like to speak further about backup and recovery procedures for these new requirements, contact Salvus Data Consultants.