Showing posts with label IBM Cloud. Show all posts

9 July 2019

Data Protection for the Data Fabric

Last week I had the pleasure of attending the NetApp UK partner academies in Manchester and London. I was pleasantly surprised by the venues, the presenters, the content presented and the number of attendees. 

One thing that was crystal clear is that NetApp are fully focused on enabling customers to build their Data Fabric, regardless of the vendors involved. They have recognised that data is distributed, dynamic and diverse and that each customer is unique, but all are looking to build and enhance a Data Fabric to enable digital transformation. 


Whilst NetApp understand they have the vehicles to store and serve that data quickly and efficiently with their AFF arrays, HCI solutions and Cloud Volumes, they are embracing partners such as AWS, Microsoft Azure, Google Cloud, IBM Cloud, Equinix and Catalogic (of course!) to deliver solutions alongside NetApp as part of a customer's Data Fabric. 

One customer who has built a Data Fabric based on NetApp is Ducati. Data is a key driver for Ducati and helps determine the construction of the bikes, the performance of the riders and strategy of the team. 
To maximise the impact of Data on the track, Ducati has partnered with NetApp to optimise the team's Data Fabric and meet its unique and expanding data management needs. 
Here's a short video that's well worth a watch with Piergiorgio Grossi explaining how Ducati are transforming the world of racing with data. 
The only spanner in the works for Ducati at the moment seems to be Marc Marquez!
Stick Jack Miller on that Red bike for 2020 and I think they have a good chance of gaining that elusive MotoGP world championship, it's been 12 years since Casey Stoner won it on the Ducati in 2007 (now I feel old!) 

But enough about MotoGP!

In terms of customers' Data Fabrics, data protection is a key element, and it's one Catalogic DPX can deliver: our patented block-level incremental backups enable quicker and more efficient data protection and data recovery.
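To give a feel for the idea (this is a minimal, illustrative Python sketch of changed-block tracking in general, not DPX's actual patented implementation), a block-level incremental backup hashes fixed-size blocks and transfers only the blocks whose hashes differ from the previous backup:

```python
import hashlib

BLOCK_SIZE = 4  # tiny block size for illustration; real products use far larger blocks


def split_blocks(data: bytes) -> list[bytes]:
    """Split a byte stream into fixed-size blocks."""
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]


def full_backup(data: bytes) -> dict[int, str]:
    """Record a hash per block; later backups compare against this baseline."""
    return {i: hashlib.sha256(b).hexdigest() for i, b in enumerate(split_blocks(data))}


def incremental_backup(data: bytes, baseline: dict[int, str]) -> dict[int, bytes]:
    """Return only the blocks whose hashes changed since the baseline."""
    changed = {}
    for i, b in enumerate(split_blocks(data)):
        if baseline.get(i) != hashlib.sha256(b).hexdigest():
            changed[i] = b
    return changed


# The first backup captures everything; the next run ships only changed blocks.
v1 = b"AAAABBBBCCCC"
v2 = b"AAAAXXXXCCCC"  # only the middle block changed
baseline = full_backup(v1)
delta = incremental_backup(v2, baseline)
print(delta)  # only block 1 is transferred
```

Because unchanged blocks are never re-read or re-sent, both backup windows and network traffic shrink, which is what makes the recurring backups "low-impact".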

DPX extends data protection for the Data Fabric to create a universal data protection, DR and copy services solution. 

By delivering rapid, low-impact backup and near instant recovery, DPX provides modern data protection for the data centre, remote office or cloud. 
DPX can send backup data directly to ONTAP-based devices (physical or cloud), or, via the vStor software-defined backup repository, make use of any block-based storage. 
DPX agents can also replace end-of-life OSSV agents and support the latest versions of ONTAP; this is still a problem many NetApp users face when moving from 7-Mode to cDOT, as OSSV is not supported on cDOT.  

An overview of the DPX architecture is shown below.

Pretty neat, eh? With DPX, block-level, file-level, NDMP and agentless VMware backups combine to support almost every operating system.
Heterogeneous application-aware support means the vast majority of enterprise applications are covered, including Oracle, SQL, Exchange, SharePoint, Db2, Notes, SAP and more - see details here.
So as you can see most bases are covered with DPX.

Our recovery capabilities are second to none, with instant access and both instant and full virtualization, meaning that in the event of a disaster we can get the Data Fabric back online as quickly as possible, ensuring even the most stringent RTOs are achievable. 
We also support bare metal recovery to dissimilar hardware, i.e. physical-to-physical or physical-to-virtual, so if your mission-critical physical system fails you can quickly get back online as a virtual system. 

DPX 4.5.5 has recently been released and there's never been a better time to check out the most comprehensive and affordable data protection solution available for the Data Fabric.
The latest release includes:
✅vSphere plugin for agentless backup
✅Rapid Return to Production (RRP) wizard for agentless
✅Exchange 2019 support
✅SQL Always On Cluster support
Full details on the updates in the release are available here 

If you want to know anything in relation to data protection for the Data Fabric, or would like a demo or a free trial of DPX set up, please do feel free to ping me at wbush@catalogicsoftware.com

30 March 2018

Spectrum Protect Plus 10.1.1

On Friday the 23rd of March, Spectrum Protect Plus 10.1.1 was released. The release represents a big step forward for the product, with some significant new features. Let's look under the hood and discuss some of them and what they mean to me, and hopefully to you as well.
The biggest feature in the release for me is vSnap replication for Spectrum Protect Plus. Why? Because it means Spectrum Protect Plus 10.1.1 can be implemented as a standalone product to protect virtual environments. For longer-term retention and offload to tape it still makes sense to offload to Spectrum Protect. Hopefully the offload function is made more elegant over time and integrated into Spectrum Protect Plus as expected. 

The replication is ZFS-based asynchronous replication. It's very simple to configure via the Spectrum Protect Plus GUI: you set up the storage partnership under backup storage and define the replication in the SLA. You can define the frequency of the replication and, if you wish, set the retention policy to match the source selection here as well. 
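Conceptually, snapshot-based asynchronous replication works by periodically taking a snapshot on the source and shipping only the delta between the last replicated snapshot and the new one. Here's a toy Python sketch of that pattern (purely illustrative; it is not vSnap's or ZFS's actual implementation):

```python
import copy


class Volume:
    """Toy volume: maps file names to contents, with named point-in-time snapshots."""

    def __init__(self):
        self.live: dict[str, str] = {}
        self.snapshots: dict[str, dict[str, str]] = {}

    def snapshot(self, name: str) -> None:
        self.snapshots[name] = copy.deepcopy(self.live)


def diff(old: dict[str, str], new: dict[str, str]) -> dict:
    """Changes needed to turn `old` into `new` (None marks a deletion)."""
    changes = {k: v for k, v in new.items() if old.get(k) != v}
    changes.update({k: None for k in old if k not in new})
    return changes


def replicate(src: Volume, dst: Volume, prev_snap, snap: str) -> None:
    """Asynchronously ship only the delta between two source snapshots."""
    base = src.snapshots[prev_snap] if prev_snap else {}
    for k, v in diff(base, src.snapshots[snap]).items():
        if v is None:
            dst.live.pop(k, None)
        else:
            dst.live[k] = v
    dst.snapshot(snap)  # the target keeps the snapshot as the base for the next delta


src, dst = Volume(), Volume()
src.live["vm1.vmdk"] = "v1"
src.snapshot("s1")
replicate(src, dst, None, "s1")   # initial sync ships everything
src.live["vm1.vmdk"] = "v2"
src.snapshot("s2")
replicate(src, dst, "s1", "s2")   # later runs ship only the delta
```

The key property, as with the SLA-driven replication in the product, is that each scheduled run transfers only what changed since the previous run, so frequency can be tuned without re-sending full copies.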

One of the other key enhancements in Spectrum Protect Plus is the introduction of application-aware backups. The two applications gaining support in 10.1.1 are SQL Server and Oracle. For both SQL Server and Oracle Database, full restore and Instant Access scenarios are supported. In addition, for SQL Server, you can use the log backup feature for continual backup of transaction logs to a specified destination. Support for additional applications will be introduced in subsequent releases. 

On first login following an upgrade, the enhancements are obvious: the menu contains significantly more options. The interface is slick, but it still needs some work to let you interactively drill down into the summaries; hopefully that comes in the next release. 

22 March 2018

Hybrid Backup and Recovery Solutions


As the evolution of the software-defined datacentre continues, application support for integration with cloud solutions is improving.
One area you all know I'm very passionate about is backup, and more importantly recovery, and in this blog I'm going to cover the significant steps forward that have been made in enabling hybrid backup and recovery solutions.

Tectrade’s Helix Protect appliances began the hybrid journey back in 2015, when Spectrum Protect version 7.1.3 introduced a new type of storage pool: the cloud container storage pool.

A container pool combines multiple inline data reduction technologies: data deduplication and compression work together to provide effective overall storage savings. Because of this inline data reduction, container storage pool I/O is dramatically reduced, enabling the use of lower-cost disk technology. Inline deduplication and compression eliminate the need for post-processing tasks, improve daily client ingest capacity, and simplify the management of the IBM Spectrum Protect server.
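The data-reduction pipeline can be sketched roughly like this (a toy Python illustration of inline dedup-then-compress in general, not Spectrum Protect's actual implementation):

```python
import hashlib
import zlib


class ContainerPool:
    """Toy container pool: dedupes chunks by hash, then compresses unique chunks."""

    def __init__(self):
        self.store: dict[str, bytes] = {}  # chunk hash -> compressed chunk
        self.raw_bytes = 0                 # bytes presented for ingest
        self.stored_bytes = 0              # bytes actually written to disk

    def ingest(self, chunks: list[bytes]) -> None:
        for chunk in chunks:
            self.raw_bytes += len(chunk)
            digest = hashlib.sha256(chunk).hexdigest()
            if digest in self.store:
                continue                   # inline dedup: chunk already stored, no I/O
            compressed = zlib.compress(chunk)  # inline compression before writing
            self.store[digest] = compressed
            self.stored_bytes += len(compressed)


pool = ContainerPool()
chunk = b"the same operating system files appear on every client" * 10
pool.ingest([chunk, chunk, chunk, b"unique data"])
print(f"ingested {pool.raw_bytes} bytes, stored {pool.stored_bytes} bytes")
```

Because duplicate chunks are dropped and the rest are compressed before they ever hit disk, far fewer bytes are written per ingest, which is why container pools cut I/O and remove the need for post-processing reclamation.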

Support has been enhanced, and as of Spectrum Protect 8.1.4, cloud container storage pools can store data in the following cloud environments.
Off-premise:
Microsoft Azure Blob storage
Amazon Web Services (AWS) using the Simple Storage Service (S3) API
IBM Cloud Object Storage using the S3 API (IBM Cloud)
IBM Cloud Object Storage using the Swift API (IBM Cloud)
OpenStack Swift with Keystone Version 2, and OpenStack Swift with Keystone Version 1

On-premise:
IBM Cloud Object Storage using the S3 API
OpenStack Swift with Keystone Version 2, and OpenStack Swift with Keystone Version 1

So why are cloud container pools interesting?
Because they give you an offsite storage tier without the investment required in a dedicated DC or co-lo space: you simply purchase storage on which to keep a secondary copy of your backup data.

A lot of conversations our Technical Sales team are having are focused around Hybrid adoption and how to reduce the footprint of infrastructure in DCs and ultimately reduce cost.
Our goal is always to focus on outcomes and ensure the correct solution is architected for clients, with data stored in the most efficient and cost-effective manner while still delivering the required level of performance and recoverability. We enable customers to start on the journey of re-architecting their environments via the creation of a service catalogue. If you're interested in a sample catalogue and a discussion, do let us know.
Once the catalogue is clearly defined we then get to work on the realisation of the benefits defined in the service catalogue. This involves ensuring the solution in place is fit for purpose, the correct service levels and cost are associated with the defined tiers and ensuring the data is aligned to the correct tier. 
The solution we use to ensure a best of breed and fit for purpose solution is in place for data protection is our Helix Protect appliance. They fully integrate with cloud environments utilising Spectrum Protect. A sample architecture is pictured below.