
The deployment of new and advanced technologies, support for an increasing number of devices, growing virtualization and the general expansion of business have dramatically altered the volume and nature of the workloads being handled by data storage systems.


The need for a strategy to deal with growing storage requirements resonates with IT managers and CIOs across the region. According to Gartner, the Middle East & Africa IT infrastructure market, comprising servers, storage and networking equipment, is forecast to reach US$ 3.9 billion in 2013, a 4 percent increase over 2012. In the race to accommodate this deluge of data, enterprise IT teams are all too often forced to react by creating storage silos, each with its own IT operations model.

 

As data centers have been scaled up over time, new storage systems have all too often been added without sufficient consideration for what was previously deployed. This leads to a scenario in which a storage silo exists for each application workload: a silo for database data, a silo for shared file data, a silo for web object data, and so on. This reactive approach not only increases the capex for storage but also weighs heavily on ongoing operational expenses, owing to the different management tools, provisioning tools and skill sets each silo requires. Given the size and rapid growth of data, and the prohibitive cost of copying large data sets around the enterprise, organizations simply cannot afford to build dedicated storage silos.

 

Virtualization is now widely employed by enterprises both to reduce infrastructure costs through better utilization and to pave the way toward cloud deployments. Because the success of these virtualization deployments depends on shared, network-enabled storage infrastructure capable of eliminating the disparate silos associated with various applications and workloads, data storage and data management are now top priorities for IT managers. At the same time, a centralized approach to data management is no longer feasible in the age of big data: data sets are too large, WAN bandwidth is too limited, and the consequences of a single point of failure are too costly. A big data storage platform must instead be able to manage data as a single, unified pool distributed across the global enterprise. Managing storage through such a unified platform not only simplifies data access and management but also greatly improves operational aspects such as power, cooling and space utilization in data centers.
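To make the idea of a distributed pool with no single point of failure concrete, the following is a minimal, purely illustrative Python sketch: a handful of sites hold software-managed replicas of each object, so losing any one site does not make the data unavailable. The site names, replication factor and in-memory storage are hypothetical simplifications, not a description of any specific product.

```python
# Conceptual sketch only: an in-memory model of a unified pool whose data is
# replicated across several sites, so no single site is a point of failure.
# Site names and the replication factor are illustrative assumptions.

from typing import Dict, List, Optional

class Site:
    """One location in the distributed pool; may go offline at any time."""
    def __init__(self, name: str) -> None:
        self.name = name
        self.online = True
        self.objects: Dict[str, bytes] = {}

class UnifiedPool:
    """A single namespace spread across sites, with software-level replication."""
    def __init__(self, sites: List[Site], replicas: int = 2) -> None:
        self.sites = sites
        self.replicas = replicas

    def put(self, key: str, data: bytes) -> None:
        # Write to the first `replicas` online sites; the caller sees one pool.
        written = 0
        for site in self.sites:
            if site.online:
                site.objects[key] = data
                written += 1
                if written == self.replicas:
                    return
        raise IOError("not enough online sites to satisfy the replica count")

    def get(self, key: str) -> Optional[bytes]:
        # Read from any online site that holds the key; tolerate site failures.
        for site in self.sites:
            if site.online and key in site.objects:
                return site.objects[key]
        return None

# Usage: a single site outage does not make the data unavailable.
pool = UnifiedPool([Site("dubai"), Site("riyadh"), Site("cairo")], replicas=2)
pool.put("reports/q1.csv", b"revenue,region\n")
pool.sites[0].online = False          # simulate losing one site
assert pool.get("reports/q1.csv") == b"revenue,region\n"
```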

 

The Unified Storage Approach

The ideal approach to data storage is to have all data reside in a general enterprise storage pool and to make that data accessible to many enterprise workloads. This provides a unified platform for the procurement, provisioning and management of enterprise storage. Furthermore, this approach is agnostic to the type of data, whether files, objects, or semi-structured and unstructured data. A number of unified-platform storage solutions are available in the market, and by implementing one, organizations can realize significant benefits in reduced operating expenses and improved service levels for end users. CIOs should also bear in mind that, rather than attempting to protect against failure through proprietary, enterprise-grade hardware, they can opt for an open, unified big data storage platform that assumes hardware failure is inevitable and delivers reliable data availability and integrity through intelligent software. Accomplishing this requires a different approach from storage software vendors: one based on community-driven innovation.
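As a hedged illustration of what "one pool for many workloads" can look like in practice, the sketch below uses Python's boto3 client against an S3-compatible object gateway. The endpoint, bucket name and credentials are hypothetical placeholders, and the article does not prescribe this particular interface; the point is simply that database backups, shared files and web objects can all flow through one namespace and one set of tools instead of separate silos.

```python
# Illustrative sketch, assuming an S3-compatible gateway in front of the
# shared pool. Endpoint, bucket and credentials below are hypothetical.

import boto3

pool = boto3.client(
    "s3",
    endpoint_url="http://storage-gateway.example.local:8080",  # hypothetical
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

BUCKET = "enterprise-pool"  # one shared namespace for every workload

# A database backup, a shared document and a web asset all land in the same
# pool; only the key prefix distinguishes the workloads.
pool.put_object(Bucket=BUCKET, Key="db-backups/crm-backup.dump", Body=b"...")
pool.put_object(Bucket=BUCKET, Key="shared-files/policy.pdf", Body=b"...")
pool.put_object(Bucket=BUCKET, Key="web-objects/logo.png", Body=b"...")

# Any application reads back through the same interface and access model.
backup = pool.get_object(Bucket=BUCKET, Key="db-backups/crm-backup.dump")
print(backup["Body"].read())
```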

 

Community-driven innovation is the hallmark of a true open source approach to solving enterprise storage problems. The emerging area of big data alone has more than 100 distinct open source projects, with thousands of software developers contributing code, enhancing features and improving stability. It is hard to match this pace of innovation when software is written within a single vendor's four walls. At a time when IT can serve as a competitive differentiator, and flexibility, availability and scalability are watchwords in any deployment, enterprises need to identify the areas that play the most vital role in the smooth operation of the business. Time and time again, storage has been earmarked as a pain point, and the advent of big data is only set to complicate the problem further. Adopting a unified storage approach eliminates the unnecessary overhead of managing disparate silos and makes data available whenever and wherever it is required.



By George DeBono

General Manager, Middle East & Africa at Red Hat

 
