June 16, 2017

Archiving: Safe port in the sea of data

Marc Andreessen declared “Software Is Eating the World” in 2011. Today, we find the enterprise awash in a sea of data, often from disparate systems cobbled together as organizations respond to the ever-churning tide of consumer demands. Data is being generated in staggering volumes, created more rapidly, and retained longer than ever before. In fact, more data has been created in the past three years than in the previous 20 years, and 80% of it is unstructured content, e.g., PDFs, emails, SharePoint sites, web pages, clinical records, production management files, and other proprietary formats.

Even as it ages, data rarely becomes disposable. This puts pressure on IT departments to maintain obsolete technology and disparate systems for data that may or may not ever be accessed. In this storm of competing business needs, demanding users, and the expectation that accessing old data should be as easy as online shopping, an Enterprise Archiving system can be the proverbial lighthouse, guiding organizations out of the hurricane of demands to the safe shores of real solutions.

An Enterprise Archiving system allows users to access data quickly, easily, and from disparate data silos. Enterprise Archiving applications provide a management layer above the storage, which tracks the data and characterizes it with metadata to support indexing, rapid recall, and, most importantly, enforcement of retention and deletion policies. The best of these systems support data and content management under one unified system, where structured data and unstructured content are managed as a single record. For example, images of scanned invoices are tied to the customer’s records, so the user has access to everything through a single interface.
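The management layer described above can be illustrated with a minimal sketch. All names here are hypothetical (real enterprise archiving products expose this through their own APIs): each archived item carries metadata for indexing and recall, plus a retention policy that the system, not the user, enforces.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class ArchiveRecord:
    """A single archived item: data and content share one metadata envelope."""
    record_id: str
    source_system: str                            # the silo the item came from
    metadata: dict = field(default_factory=dict)  # indexed for rapid recall
    archived_on: date = field(default_factory=date.today)
    retention_days: int = 365 * 7                 # retention policy, e.g. 7 years

    def deletion_due(self, today: date) -> bool:
        """Retention enforcement: is this record past its hold period?"""
        return today >= self.archived_on + timedelta(days=self.retention_days)

def search(records, **criteria):
    """Metadata search across disparate silos through one interface."""
    return [r for r in records
            if all(r.metadata.get(k) == v for k, v in criteria.items())]

# A scanned invoice image tied to a customer record, per the example above:
invoice = ArchiveRecord("inv-001", "ERP-legacy",
                        metadata={"customer": "ACME", "type": "invoice-scan"})
```

The key design point is that retention lives on the record itself, so a policy change is a data update rather than a change to every source system.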

As data regulations multiply, organizations find themselves caught in an endless cycle of shoring up legal defenses through a perpetual review-and-revise approach to internal policies. Enforcement of these policies becomes increasingly difficult and expensive on aging systems. The challenge of regulatory compliance and internal policy enforcement can be greatly mitigated by rationalizing applications and data centers and by using an archiving system.

Organizations tend to give archiving systems only a passing glance when faced with a system replatform or upgrade; corporate acquisitions or mergers prompt a more robust review of redundant systems. Regardless of the reason for consideration, the decision to move to an archiving system comes down to dollars. Migrating legacy data is time-consuming and can be quite costly, and system vendors actively discourage migrating data beyond a 6-month to 2-year window, citing the prohibitive cost. Large, complex system upgrades can require the same effort as replatforming. All of this can be daunting; more often than not, the organization decides to keep the legacy system alive (at significant cost) because it needs access to the siloed data. Maintaining legacy systems that have been drydocked is expensive and becomes more challenging as manufacturers stop supporting the systems and internal subject-matter experts retire or leave the company.

Archiving systems can provide an organization with significant ROI by reducing costs associated with storage, maintenance, and licensing. The first step is determining which systems are viable candidates for archiving. While some organizations prefer to run a federated operating model, cost and complexity typically force a rationalization effort. Rationalization identifies a short list of retirement candidates; in many cases, the resulting savings more than cover the cost of the overall rationalization effort.

Data center rationalization offers similar opportunities for savings. Critical but non-operational data can be retained and safely moved to lower-cost storage. An enterprise archiving system can simplify ingestion and management of both relational data and unstructured content, reducing the effort (time and cost) of the migration. If the archiving system provides real-time access, even operational data can be managed this way.
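The tiering decision described above can be sketched simply. The file names and the two-year threshold below are illustrative assumptions, not any vendor's API: non-operational data is identified by last access time and flagged for migration to lower-cost storage.

```python
from datetime import datetime, timedelta

def tier_candidates(items, today, cold_after_days=730):
    """Split items into (keep-hot, move-to-cold) by last access time.

    `items` is a list of (name, last_accessed) pairs; anything untouched
    for `cold_after_days` (two years here, an illustrative threshold)
    is a candidate for lower-cost archive storage.
    """
    cutoff = today - timedelta(days=cold_after_days)
    hot = [(n, t) for n, t in items if t >= cutoff]
    cold = [(n, t) for n, t in items if t < cutoff]
    return hot, cold

# Hypothetical inventory: one recently used dataset, one dormant since 2013.
items = [("q3_invoices.db", datetime(2016, 11, 2)),
         ("plant_logs_2012.csv", datetime(2013, 1, 15))]
hot, cold = tier_candidates(items, today=datetime(2017, 6, 16))
```

In practice the access-time inventory would come from the archiving system's own metadata, which is exactly what makes the migration cheaper than hand-auditing each silo.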

To address the exponential growth of data and content, and the coming tsunami of IoT data, start by prioritizing the effort to bring order to the data explosion. Addressing the issue today will be less painful and far less costly than procrastinating another year (or two, or three). Take a serious look at the true cost of inertia and the increased productivity and profits that may be hiding in the “it’s good enough for right now” approach to data storage.

Can archiving and backup systems be used interchangeably? Although almost anything can be rigged to “get by for the moment,” archiving and backup systems differ considerably in purpose and functionality. The purpose of a backup system is primarily disaster recovery. Backup systems are not designed to handle issues surrounding e-discovery and record retention (and deletion), whereas archiving systems are designed specifically to address them. Accessing data through a backup system is a multi-step, often arduous, and always slow process, whereas archiving systems are designed for fast, direct access, even across disparate and antiquated technologies.

Now is the time to get hold of your data; an IDC Digital Universe study estimates the amount of digital data created per year will be 35 zettabytes by 2020. A zettabyte is 1 billion terabytes. It is time to get ahead of the impending storm.

January 6, 2017

What to Consider when Choosing a Cloud Solution

Leveraging technology in the cloud allows companies to lower their total cost of ownership and stay current with the best solutions in the marketplace. It works for large enterprises and certainly for small to medium-sized businesses.

Many implementation partners for cloud services simply roll out the solution, then abandon their customers or provide them with poor ongoing support. Worse, they don’t bring the expertise of innovative thinking to integration efforts or to alignment with proprietary processes and applications. Be prepared by taking a few steps to ensure the right choice.

When looking to introduce new cloud technology to your enterprise, consider a few things to ensure maximum business impact: 1) choose a cloud solution with a proven track record, 2) ensure the cloud application is open enough to allow integration points with key internal applications, and 3) align key business processes so the configuration or implementation partner can be more effective for you.

  1. Choose a Proven Cloud Solution – There are many cloud applications on the market for every key business function. However, many are still experimental, unproven in your industry, or lack the functionality or near-term roadmap to ensure the right fit for your business. Ask the provider for references in your industry, request a glimpse into their roadmap, and have them demonstrate functionality aligned to your business needs.
  2. Get an Open Application – As your business advances and needs change, it is important to have an application that can grow with you. At the very least, choose a platform open enough to allow integration with proprietary systems or new innovations that separate your business from the competition.
  3. Align Key Business Processes – One of the biggest mistakes a company can make is not thinking through existing and best practice business processes when choosing a cloud product. Go through the activity of documenting current and then ideal business processes. If you don’t have the bandwidth or expertise within the organization to accomplish that, leverage a good partner to help you identify key business needs.
