News Column

Adapting to change

August 6, 2014

ArabianBusiness.com Staff



Equally challenging will be providing secure access to, and protection of, potentially sensitive but undoubtedly valuable information generated on a growing number of mobile devices.



With approximately one third of the global workforce predicted to be mobile by 2015, it's clear that the bring-your-own-device (BYOD) movement has created massive market demand for end users to be able to quickly access data from any device and to benefit from collaboration and knowledge sharing, regardless of their location.



The mountain of data now residing on mobile devices, however, is forcing companies to rethink how they securely capture, store and retrieve data in order to derive more value from it whilst remaining compliant. Indeed, Gartner recently suggested in its report, 'BYOD is an Applications Strategy, Not Just a Purchasing Policy' (November 2013), that what is needed is 'global class' computing: "an approach to designing systems and architectures that extends computing processes outside the enterprise and into the cultures of the consumer, mobile worker and business partners."



That said, it also appears that too few organisations have devised or implemented comprehensive strategies to support these mobile workers, and many are yet to put in place measures to ensure, as a minimum, regular backups from mobile devices. Protecting access to sensitive data, and providing an alternative to storing company data on local or removable devices, is critical and needs to be addressed.



Many organisations rely on consumer cloud services such as Box and Dropbox to gather and share information, which itself presents a number of IT headaches. Trying to make a consumer product work for the enterprise inevitably means that mobile data management remains outside the IT department's awareness and control, leaving valuable corporate data unprotected. From another perspective, that data is also effectively "lost" to all employees across the organisation for the purposes of collaboration and knowledge sharing.



So while the emphasis in 2013 may have been on creating modern data management strategies, the priority for 2014 now has to be the synchronisation of files and preparation for automated retention methods to reduce the risk of data loss.



The biggest single benefit of automated, continuous file and folder synchronisation is that it makes data fluid rather than siloed. It supports different environments and operating systems and guarantees access both to old documents indefinitely and to the most recent files instantly, regardless of the device on which they were created. However, because synchronisation differs from traditional backup, encouraging widespread adoption of a central corporate repository also requires centrally defined yet automated processes that improve mobile data protection without any involvement from the user. IT needs to regain sight and control of all remote offices and mobile devices, and to be able to provide a backup service that reaches all areas of the network in an instant. This is, after all, perhaps the only way to ensure that IT has the latest version of every user's files at any given time.
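To make the idea concrete, the sketch below shows, in miniature, the kind of automated, user-transparent synchronisation described above: it mirrors new or changed files from a device folder into a central repository by comparing modification times. The function and directory names are illustrative only, not any vendor's product, and a real enterprise tool would add scheduling, encryption, versioning and conflict handling.

```python
import shutil
from pathlib import Path

def sync_to_repository(device_dir: str, repo_dir: str) -> list[str]:
    """Copy new or modified files from a device folder into the central
    repository, preserving the relative folder structure.
    Returns the repository-relative paths of the files copied."""
    device = Path(device_dir)
    repo = Path(repo_dir)
    copied = []
    for src in device.rglob("*"):
        if not src.is_file():
            continue
        dest = repo / src.relative_to(device)
        # Copy only when the file is new or newer than the repo copy,
        # so repeated runs touch nothing the user has not changed.
        if not dest.exists() or src.stat().st_mtime > dest.stat().st_mtime:
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # copy2 preserves the timestamp
            copied.append(str(dest.relative_to(repo)))
    return copied
```

Run on a schedule or from a file-system watcher, a loop like this keeps the central repository current without the user lifting a finger, which is exactly the "no involvement from the user" property the paragraph above calls for.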



Just as important in this data-focused age is the delivery of self-service, 24/7 access to protected files and e-mail. Fast restore times and intuitive, role-based search capabilities across all enterprise data are now expected by users as standard in order to simplify search and eDiscovery. Submitting a ticket and waiting for IT support is no longer acceptable: any delay in access only increases the likelihood that employees will return to high-risk consumer file-sharing tools and ad-hoc cloud backup services.



It's also important to note, however, that enterprises are becoming increasingly comfortable with the security and scale of public cloud resources from the big providers. Rather than burden their own networks, they are opting for the public cloud when running heavyweight applications such as SharePoint, Oracle or SAP. This means we are likely to see private cloud growth coming from industries such as finance and healthcare, where there are significant compliance or security issues and a need to know that data hosted onsite is treated as mission critical.



Organisations are beginning to investigate the benefits of integrating their data management solutions with cloud platforms, enabling them to move data seamlessly from an on-premises data centre to infrastructure as a service (IaaS) from leading cloud providers such as Amazon Web Services (AWS). The challenge is to harness automated, content-indexed data collection, protection, access and retrieval from a central console, combined with low-cost cloud storage for data archiving. This has the potential to let organisations add storage capacity that keeps pace with data growth without placing an additional burden on the IT department. It also makes it possible for organisations and service providers to leverage cloud services efficiently to meet SLAs and budgetary requirements.



Whilst this approach is best suited to long-term archiving, where data is not expected to be restored frequently and can be stored in deduplicated form to minimise long-term capacity growth, integrating data management with cloud platforms does give organisations a much-needed degree of certainty about their ability to meet complex eDiscovery requirements. Content indexing combined with policy-based, integrated alerting, tracking and verification of data copies in the cloud is proving to be the best way to ensure that secondary or tertiary data can be retrieved successfully from long-term storage. In essence, this means that newer deduplicated backups can be prioritised for storage on higher-performing, more immediately retrievable disks, and that companies can tier storage costs according to the SLA profile of the data, application or service, without running the risk of losing any data along the way.
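The deduplicated storage mentioned above can be sketched as content-addressed storage: every file's contents are hashed, identical contents are kept only once however many backups reference them, and each backup records just the hashes it needs. The class and method names below are illustrative, not any vendor's API.

```python
import hashlib

class DedupStore:
    """Minimal content-addressed store: identical payloads are kept once,
    and each backup records only the digests of the contents it references."""

    def __init__(self):
        self.blobs = {}    # sha256 hex digest -> bytes (stored once)
        self.backups = {}  # backup name -> {file path: digest}

    def add_backup(self, name: str, files: dict) -> None:
        manifest = {}
        for path, data in files.items():
            digest = hashlib.sha256(data).hexdigest()
            # Store the payload only if this content is new to the store;
            # duplicates cost one manifest entry, not a second copy.
            self.blobs.setdefault(digest, data)
            manifest[path] = digest
        self.backups[name] = manifest

    def restore(self, name: str) -> dict:
        """Rebuild a backup by resolving its manifest against the blobs."""
        return {path: self.blobs[d] for path, d in self.backups[name].items()}
```

Because unchanged files hash to the same digest across successive backups, capacity grows with the volume of *new* content rather than with the number of backups taken, which is what keeps long-term archive growth in check.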



That said, given the application-centric nature of IT, we are also likely to see organisations trying to balance the benefits of virtualisation against the costs associated with virtual machine (VM) sprawl. The promise of reduced running costs and network infrastructure, application flexibility and scalability will undoubtedly continue to encourage organisations to deploy more critical applications within virtual machines. However, these critical applications come with the most demanding SLAs for application uptime, granular recovery points and rapid recovery times. With limited resources for even routine IT management tasks such as backup and recovery, storage and backup administrators could find it even harder to manage and protect critical VM application data, leaving it at risk.
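One simple way to contain that sprawl is a policy rule that flags VMs idle beyond a threshold as candidates for migration to cheaper storage, while exempting the critical workloads whose SLAs demand instant recovery. The sketch below is a hypothetical illustration of such a rule; the field names and the 90-day threshold are assumptions, not any product's defaults.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class VirtualMachine:
    name: str
    last_active: datetime
    critical: bool = False  # critical apps keep their demanding SLAs

def vms_to_archive(vms, now, idle_threshold=timedelta(days=90)):
    """Return the names of non-critical VMs idle past the threshold:
    candidates for migration to cost-effective storage."""
    return [vm.name for vm in vms
            if not vm.critical and now - vm.last_active > idle_threshold]
```

A rule like this is only useful if the archived VMs can be recovered on demand; the point, as the surrounding discussion argues, is tiering cost to activity rather than deleting anything.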



Enterprises clearly need to rethink traditional data protection techniques. In the face of growing storage virtualisation in 2014, the blurring of the lines between cloud and on-premises, and the impact of BYOD on the adoption of virtualisation and the cloud, what's needed is the ability to manage the data management strategy automatically from a single console. Companies need to be able to move unused VMs to cost-effective storage, with the ability to recover them instantly for increased utilisation, efficiency and savings. If organisations don't tackle these issues, they risk paying more in the long term to store data that will never be used again, as well as struggling to regain control of a sprawl that places an even greater burden on an already stretched network.







Source: ITP.net (United Arab Emirates)

