Following is a guest post by Jim FitzGerald, Exterro's Sr. Product Marketing Manager. Jim attended the ARMA Conference this week. This is the first in a series of posts we'll be running on highlights from the show.
At this week's ARMA conference I heard a lot of emphasis on applying analytics to the challenges of information governance (IG). Many archive and ECM vendors spoke about their efforts to make the data within their repositories smarter – easier to find, correlate, and manage. This is definitely a step in the right direction and key to shining a light on the enormous pile of “Dark Data” that lurks in most organizations.
The most common use cases cited as beneficiaries of applying analytics to IG were:
- Reducing risk – being able to quickly and authoritatively locate and act on data for compliance reporting and e-discovery requests, and to identify potentially problematic data, such as files containing personally identifiable information (like credit card data) or subject to privacy/confidentiality requirements.
- Access to valuable information – improving the classification and retrieval of information to boost productivity and speed time to market.
- Reducing costs – whether by cutting storage costs or the management overhead of unnecessarily large data volumes, there was a lot of emphasis on Defensible Disposition of data deemed Redundant, Obsolete, or Trivial.
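To make the first use case concrete: a common first pass at flagging credit-card data is to combine a digit-sequence pattern with the Luhn checksum to weed out false positives. The sketch below is my own minimal illustration, not a tool any vendor demonstrated at the show; the pattern and function names are assumptions for the example.

```python
import re

# Candidate 13-19 digit runs, allowing spaces or dashes between groups.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){12,18}\d\b")

def luhn_valid(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_candidate_cards(text: str) -> list[str]:
    """Return substrings that look like card numbers and pass Luhn."""
    hits = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 19 and luhn_valid(digits):
            hits.append(match.group())
    return hits

sample = "Invoice paid with card 4111 1111 1111 1111; ref 1234567890123."
print(find_candidate_cards(sample))  # only the Luhn-valid number is flagged
```

A real IG deployment would layer this kind of check under document extraction, sampling, and human review rather than trusting a regex alone, but it shows why even simple analytics beat manual inspection at petabyte scale.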
Brent Gatewood, CRM, of ConsultIG gave one of the better educational sessions on the topic of analytics. He did a good job describing the guiding principles of IG analytics – the Four Vs of Volume, Variety, Velocity, and Value – that IG professionals have to keep in mind when conducting analyses.
Brent shared some sobering forecasts for the growth of information that's adding to the pressure to make our data smarter. The number of files or containers is expected to grow 75-fold over the next 15 years, but IT and governance staff will grow by at most 50%. Further, a growing portion – perhaps a majority – of that data will live outside the organization's firewall in cloud or mobile solutions.
Brent also shared some recent analysis work he did with a client on 1.2 petabytes of storage. Thirty percent of it – 360 TB – ended up classified as redundant, trivial, or past its retention schedule.
Sobering stuff, but Brent did leave us with the hope that applying appropriate technology and processes can really help tame IG problems. Perhaps that's one of the reasons he's seeing Big Data analytics on the march in the organizations he consults with.