Data management: Where radical and back-to-basics meet

When a data system is giving you problems and budgets are tight, the temptation is to replace it with something only marginally different: another product that performs broadly the same task, hopefully without the flaws of the old one. The thinking is flawed, though. The new product will inevitably bring challenges of its own, which you can only hope are less severe than the last, and a change like this delivers purely incremental improvement; it never provides something fundamentally different.

You could take this as an argument against doing anything at all, but of course it isn’t. Replacing a physical server with a virtual one, for example, leaves you with much the same server functionality, yet brings plenty of operational benefits and efficiencies. Transformational changes like this cost money, but they save far more through better utilization and can open up possibilities that simply were not available before.

Only the IT projects with the strongest justification will get the nod, and these will often be the ones that also deliver assured savings. A few key initiatives are effective on their own and even more worthwhile in combination; two in particular stand out: data and information management, and shared services.

The first of these is to implement a data and information management strategy. This may sound very basic to some or a massive job to others, but very few organizations see data as a river running through the business. It is usually seen as “piles of stuff” scattered around, categorized by owner or function and treated accordingly. This scattered approach often suits data management vendors, who are happy to sell a separate product for every pile.

Stepping back and taking the river view lets you plot where data should sit throughout its life, automating its travel and its protection along the way. Combined backup, disaster recovery (DR) and archive strategies can significantly reduce the amount of tier-one storage required and massively cut the cost of DR without necessarily pooling resources. Handing control of data management to a software layer breaks the link between features and hardware, allowing a free choice of disk, with a number of knock-on effects. Firstly, disk prices drop when you are not tied to a specific vendor, and innovations such as drives that spin down can be introduced to support other key initiatives, such as carbon reduction, without continually pumping more disk into tier-one arrays. Secondly, a single point of control drastically reduces integration and reporting headaches, with far less time spent on customization or on what should be simple configuration changes.
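To make the “river” idea a little more concrete, here is a minimal sketch, in Python, of the kind of age-based lifecycle policy a data management software layer might apply to move data off tier-one storage. The tier names and idle-time thresholds are purely illustrative assumptions, not a description of any particular product.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataItem:
    name: str
    last_accessed: datetime
    tier: str = "tier-1"   # Hypothetical tiers: tier-1, nearline, archive.

def apply_lifecycle_policy(items, now=None):
    """Move each item down the tiers based on how long it has sat untouched.

    Thresholds are illustrative only: 30 days idle -> nearline,
    365 days idle -> archive.
    """
    now = now or datetime.utcnow()
    for item in items:
        idle = now - item.last_accessed
        if idle > timedelta(days=365):
            item.tier = "archive"
        elif idle > timedelta(days=30):
            item.tier = "nearline"
        # Otherwise the item stays on tier-one (primary) storage.
    return items

if __name__ == "__main__":
    docs = [
        DataItem("live-report.xlsx", datetime.utcnow()),
        DataItem("last-quarter.xlsx", datetime.utcnow() - timedelta(days=90)),
        DataItem("2005-audit.pdf", datetime.utcnow() - timedelta(days=2000)),
    ]
    for d in apply_lifecycle_policy(docs):
        print(f"{d.name}: {d.tier}")
```

In a real deployment the policy would of course be driven by business rules and protection requirements rather than a single idle-time check, but the principle is the same: the software layer, not the hardware, decides where data lives at each stage of its journey.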

The other key change that comes from treating information as something that flows is the opportunity to dip into one place and see everything that runs past. Right now many organizations struggle with freedom of information (FOI) requests. Even those that introduce enterprise search technology often find gaps in coverage and end up with two or more search tools, one for the compliance archive and one for live data, for example. Combining and correlating results from two or more systems is a tiresome task and, worst of all, there will still be huge “black holes” where key information can hide. Having one place to search live data, archive and backup provides real benefits, and once it is in place you can design automated searches so that data required on a regular basis is pre-searched and collated ready for use.
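As a rough illustration of searching “one place” rather than several, the sketch below (again Python, with invented index names and record fields) fans a single query out across live, archive and backup sources, and shows how a saved search could be run on a schedule so the results are already collated when a request arrives.

```python
# A minimal sketch of unified search across live data, archive and backup.
# The in-memory "indexes" and field names are purely illustrative.

INDEXES = {
    "live":    [{"id": 1, "text": "contract renewal for supplier A"}],
    "archive": [{"id": 2, "text": "2019 supplier A contract, signed"}],
    "backup":  [{"id": 3, "text": "draft supplier A contract email"}],
}

def unified_search(term):
    """Run one query against every data source and merge the results."""
    hits = []
    for source, documents in INDEXES.items():
        for doc in documents:
            if term.lower() in doc["text"].lower():
                hits.append({"source": source, **doc})
    return hits

# Saved searches that a scheduler (cron, for instance) could run regularly,
# so the results are already collated when an FOI request arrives.
SAVED_SEARCHES = {"supplier-a-contracts": "supplier A contract"}

def run_saved_searches():
    return {name: unified_search(term) for name, term in SAVED_SEARCHES.items()}

if __name__ == "__main__":
    for name, results in run_saved_searches().items():
        print(name, "->", [(r["source"], r["id"]) for r in results])
```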

The second key initiative is shared services. Combining data and information management into a shared service may seem alien, but it is a fundamental shift that sets cost savings in stone for many years. It is easier now thanks to smarter, more customizable and more scalable technologies delivered via a cloud model. Where data security is a concern, organizations can build their own cloud of IT services, or of data and information management specifically: backup, DR, archive and eDiscovery.

In the past the barriers to sharing IT were technological, especially around data protection and eDiscovery. One party being able to access, or change the protection levels of, another party’s privileged or secure information is not an option; nor is running a search against documents containing personal data without the authority to do so.

Thankfully, choosing the right technology means these challenges simply are not there anymore, which paves the way for shared data and information services that are genuinely transformational. Control can be central, yet it can also be confidently delegated or made self-service, with full auditing of who did what, and when, for extra peace of mind. The flip side of this level of management is that data protection functions can be merged into fault-tolerant, load-balanced processes that reduce the server and media footprint in much the same way that virtualization helps with the production server estate.
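The combination of delegated, role-based control and full auditing can be pictured with a small sketch like the one below; the roles, scopes and log format are assumptions made for illustration and do not reflect how any specific product implements them.

```python
from datetime import datetime, timezone

# Hypothetical role-based scopes: which data sources each role may search.
ROLE_SCOPES = {
    "hr-investigator": {"live", "archive"},
    "legal-discovery": {"live", "archive", "backup"},
    "service-desk":    {"live"},
}

AUDIT_LOG = []  # In practice this would be a tamper-evident store.

def audited_search(user, role, source, term):
    """Allow a search only within the role's scope, recording who did what and when."""
    allowed = source in ROLE_SCOPES.get(role, set())
    AUDIT_LOG.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "who": user,
        "role": role,
        "action": f"search '{term}' in {source}",
        "allowed": allowed,
    })
    if not allowed:
        raise PermissionError(f"{role} may not search {source}")
    return f"results for '{term}' from {source}"  # Placeholder for a real query.

if __name__ == "__main__":
    print(audited_search("jsmith", "service-desk", "live", "laptop asset 4411"))
    try:
        audited_search("jsmith", "service-desk", "backup", "payroll 2020")
    except PermissionError as err:
        print("blocked:", err)
    for entry in AUDIT_LOG:
        print(entry)
```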

Choosing the right system means that archive can be rolled right into the same infrastructure for even greater savings. Deduplication is one of the hottest technologies in the data protection space and can be distributed effectively via the data management software layer, so off-site DR and remote protection suddenly become viable and automated from one place, without expensive hardware lock-in or restore penalties.
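Deduplication itself is straightforward to picture: data is split into chunks, each chunk is fingerprinted, and only chunks not already in the store are written, so repeated backups of largely unchanged data consume very little new space. The sketch below uses fixed-size chunks and SHA-256 purely as an assumption for illustration; real products vary the chunking and hashing considerably.

```python
import hashlib

CHUNK_SIZE = 4096   # Fixed-size chunking, for illustration only.
chunk_store = {}    # hash -> chunk bytes (the single shared copy).

def dedup_store(data: bytes):
    """Split data into chunks, store each unique chunk once, return the recipe."""
    recipe = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in chunk_store:   # New data: write it once.
            chunk_store[digest] = chunk
        recipe.append(digest)           # Duplicate data: just reference it.
    return recipe

def restore(recipe):
    """Rebuild the original data from its chunk references."""
    return b"".join(chunk_store[digest] for digest in recipe)

if __name__ == "__main__":
    monday = b"A" * 8192 + b"report body"
    tuesday = b"A" * 8192 + b"report body, one new line"  # Mostly unchanged.
    r1, r2 = dedup_store(monday), dedup_store(tuesday)
    assert restore(r1) == monday and restore(r2) == tuesday
    print(f"unique chunks stored: {len(chunk_store)} for {len(r1) + len(r2)} references")
```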

While all of this is transformational from an ICT perspective, beyond the budget savings and the ability to raise the quality of service with less of everything, simply gaining access to the right information can be the biggest benefit of all. Legal, HR or regulatory searches become quicker and easier, without the black holes; they can even be spread across the user base with secure, role-based limits on scope. Data and information management seems anything but radical until you really do get back to basics, which is when fundamental improvements can be achieved.

Steve Bailey is the Regional Operations Director of CommVault Systems.
