How to optimise storage infrastructure

BY DAVID DZIENCIOL


The recent economic crisis injected a disconcerting amount of uncertainty into the business climate. And while there are strong signs that better times may be ahead, organisations are still reluctant to spend any more money than is absolutely necessary. New initiatives are being shelved and as Symantec’s recently released 2008 State of the Data Center Report observes, data centre managers everywhere are under tremendous pressure to “do more with less.”


If there is a silver lining here, perhaps it is that IT departments can use this opportunity to consolidate existing projects and focus on optimising existing systems to extend their useful life. After all, the State of the Data Center Report found that, in 2008, data centre servers were operating at just 53% of capacity, while data centre storage utilisation was even lower at 50%.


There are four key areas where IT departments can take action to reap significant near-term benefits from their storage systems.


As the State of the Data Center Report demonstrates, years of robust economic growth and storage over-provisioning have resulted in a significant amount of underutilised storage today. But having an array only 50% utilised is like paying twice as much for the storage you need. Idle capacity also consumes power, increases cooling costs, and unnecessarily takes up floor space, which is often at a premium.


A storage resource management (SRM) tool can help storage managers rectify this situation. It’s been said that managing storage without an SRM tool is like going on a journey without a map. With an SRM tool, managers can assess their situation and gain an enterprise-wide view of the storage environment.


They will then be in a position to answer three key questions: What is the average utilisation rate? What is the utilisation rate by application? Which applications are growing fastest (or slowest)? With answers to these questions, managers can identify problem areas and consolidation opportunities, and create a priority list of solutions.
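The three questions above boil down to simple arithmetic over an inventory of allocated versus consumed capacity. The sketch below shows the kind of summary an SRM tool produces; the application names and figures are purely illustrative, not drawn from any real environment.

```python
# Hypothetical SRM-style utilisation report over illustrative data.
# Each row: (application, allocated_gb, used_gb, growth_gb_per_month)
arrays = [
    ("ERP",        2000, 1400, 30),
    ("Email",      1500,  600, 55),
    ("File share", 1000,  450, 10),
]

total_alloc = sum(alloc for _, alloc, _, _ in arrays)
total_used = sum(used for _, _, used, _ in arrays)
print(f"Average utilisation: {total_used / total_alloc:.0%}")

# Per-application utilisation, fastest-growing first.
for name, alloc, used, growth in sorted(arrays, key=lambda row: -row[3]):
    print(f"{name}: {used / alloc:.0%} utilised, growing {growth} GB/month")
```

With even this much visibility, the underutilised arrays and the fastest-growing applications stand out immediately, which is what drives the priority list of consolidation candidates.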


As IT departments are called on to do more with existing or even fewer personnel, storage administrators need to simplify both the management and the provisioning of storage to become more productive and efficient. IT managers are also aiming to deploy less storage overall, both to use capacity more effectively and to contain power and environmental costs. Thin provisioning is a relatively new technology that is gaining mainstream acceptance. It allows administrators to allocate storage capacity to applications and hosts from a shared pool, oversubscribing that pool so each consumer sees more capacity than is physically committed.


Thin provisioning challenges the long-standing storage approach of dedicating capacity up front based on allocation. The result is higher capacity utilisation, less guesswork when provisioning new applications, and lower capital and operating costs.
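The mechanism can be summed up in a few lines: volumes are allocated logically up front, but physical capacity is consumed only when data is actually written. This is a conceptual sketch only, not any vendor's array API.

```python
# Minimal sketch of the thin-provisioning idea (illustrative only):
# logical allocation may oversubscribe the pool; physical capacity
# is committed on write, not on allocation.
class ThinPool:
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb   # real capacity in the pool
        self.committed_gb = 0            # capacity actually written
        self.volumes = {}                # volume name -> logical size

    def create_volume(self, name, logical_gb):
        # Logical allocation is free: it can exceed physical capacity.
        self.volumes[name] = logical_gb

    def write(self, name, gb):
        # Only writes consume real capacity from the shared pool.
        if self.committed_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted: add physical capacity")
        self.committed_gb += gb

pool = ThinPool(physical_gb=1000)
pool.create_volume("erp", 800)     # applications see generous volumes...
pool.create_volume("email", 800)   # ...1600 GB allocated against 1000 GB
pool.write("erp", 200)             # ...but only 200 GB is really consumed
print(pool.committed_gb, "/", pool.physical_gb, "GB committed")
```

The trade-off is visible in the `write` method: oversubscription works only as long as the pool is monitored and grown before actual writes exhaust it, which is why thin provisioning and capacity reporting go hand in hand.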


While most major array manufacturers support thin provisioning, the management tools from these manufacturers are highly vendor-specific. IT departments should explore the latest release of Symantec’s Veritas Storage Foundation solution, which is “thin provisioning aware” and supports all thin provisioning architectures currently available, including those from 3PAR, EMC, Hewlett-Packard, Hitachi Data Systems, IBM, NetApp, and Sun.


Storage Foundation SmartMove enables hardware-independent online migrations of application information from traditional or “thick” volumes to thin provisioned volumes. This capability enables organisations to move quickly into thin provisioned environments without incurring wasted storage capacity.
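The essence of a thin-aware migration is that only blocks holding data are copied, so the thin target commits no capacity for space the application never used. The sketch below is conceptual only; SmartMove itself works from file-system knowledge of which blocks are allocated.

```python
# Illustrative sketch of a thin-aware "thick to thin" migration:
# never-written blocks (modelled here as None) are simply skipped,
# so the thin target commits capacity only for real data.
def migrate_thin(source_blocks, write_block):
    written = 0
    for index, block in enumerate(source_blocks):
        if block is not None:           # skip empty blocks entirely
            write_block(index, block)
            written += 1
    return written

source = ["data1", None, "data2", None]   # a half-empty thick volume
target = {}
copied = migrate_thin(source, lambda i, b: target.__setitem__(i, b))
print(copied, "of", len(source), "blocks copied")
```

A naive block-for-block copy would have written all four blocks and immediately re-inflated the thin volume; skipping the empty ones is what preserves the thin target's savings.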


Data deduplication is another recent technology that enables companies to eliminate duplicate backup data and significantly decrease their storage consumption. For example, if a Microsoft PowerPoint presentation is stored on different file servers multiple times, deduplication ensures that only one copy is stored, no matter how many full or incremental backups occur. The built-in data deduplication technology of Veritas NetBackup PureDisk reduces data backup volume by as much as 90% and reduces bandwidth needed by 97%. Microsoft Exchange backup can be reduced by as much as 98%. And deduplication is entirely transparent to the application.
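The core idea is content-addressed storage: identical data is detected by hashing its contents and stored only once. The sketch below illustrates the principle with whole-file hashing; it is not NetBackup PureDisk's actual implementation, which operates on backup streams at a finer granularity.

```python
# Illustrative content-based deduplication: blocks are keyed by a
# hash of their contents, so identical data is stored exactly once.
import hashlib

class DedupStore:
    def __init__(self):
        self.blocks = {}          # sha256 digest -> block data

    def store(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        # Keep the block only if this content has never been seen.
        self.blocks.setdefault(digest, data)
        return digest             # callers keep the digest as a reference

store = DedupStore()
slide_deck = b"quarterly results presentation"
# The same file backed up from three different servers...
refs = [store.store(slide_deck) for _ in range(3)]
# ...occupies storage only once.
print(len(store.blocks))
```

Every backup job still receives a valid reference to its copy; only the physical storage behind those references is shared, which is why deduplication is transparent to the application.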


Not all content is of equal value: confidential business data, personal email, junk mail, and spam all warrant different treatment. Companies should therefore take a hard look at ways to control their archive storage costs. Intelligent archiving is a content-aware classification methodology in Symantec Enterprise Vault that helps companies shape the archive and store only content of business value, along with its context. This is accomplished through multiple classification options: automated classification, user-driven classification, or third-party classification technologies.
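At its simplest, automated classification is a set of content rules that map each item to a class. The rules and labels below are purely hypothetical, to show the shape of the idea; Enterprise Vault's real engine is policy-driven and far richer.

```python
# Hypothetical sketch of automated, content-aware classification:
# the first matching rule assigns the class (illustrative rules only).
RULES = [
    ("contract",    "business-record"),
    ("invoice",     "business-record"),
    ("unsubscribe", "junk"),
]

def classify(text: str) -> str:
    lowered = text.lower()
    for keyword, label in RULES:
        if keyword in lowered:
            return label
    return "personal"   # default class for unmatched content

print(classify("Signed contract attached"))      # business-record
print(classify("Click unsubscribe to opt out"))  # junk
```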


Once data is classified, policy enforcement technology applies retention and expiry rules across different classes of information to ensure it is kept only as long as it is needed. In the current economic climate, IT departments are being tasked to do more with less.
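The retention step described above amounts to a policy table keyed by class. The sketch below uses illustrative retention periods, not Enterprise Vault defaults.

```python
# Hypothetical class-based retention: each class keeps its items
# for a different period, after which they are eligible for expiry.
from datetime import date, timedelta

RETENTION = {                                   # illustrative values
    "business-record": timedelta(days=7 * 365), # keep seven years
    "personal-email":  timedelta(days=90),
    "junk":            timedelta(days=0),       # expire immediately
}

def is_expired(item_class: str, archived_on: date, today: date) -> bool:
    """An item expires once its class's retention period has elapsed."""
    return today - archived_on > RETENTION[item_class]

today = date(2009, 6, 1)
print(is_expired("junk", date(2009, 5, 1), today))             # True
print(is_expired("business-record", date(2009, 5, 1), today))  # False
```

The payoff is that expiry decisions follow the value of the content rather than a single blanket rule, so low-value data stops consuming archive storage while records that must be kept are kept.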


And that means taking steps now to optimise existing storage assets. The good news is that the situation presents an opportunity to implement processes, procedures, and simple technologies to significantly reduce storage costs. Symantec is in a unique position to help IT departments optimise their existing storage infrastructure and ensure that optimisation persists over time.






David Dzienciol is the Senior Director, Enterprise Sales and Partners for Symantec in the Pacific region. In this role, Dzienciol is responsible for driving sales and operations in the enterprise and partner business across Australia and New Zealand. He serves as one of the senior leaders for the overall Symantec business in this region.