Data Integrity, and Why It’s Important

April 14, 2021

The following is the fifth post in a new blog series from Exterro CEO Bobby Balachandran, where he shares his thoughts on the issues legal leaders care about and his vision for addressing them. Read Bobby's last blog here.

When Exterro’s executive team was first considering acquiring digital forensics firm AccessData, we thought a lot about the concept of data integrity and its increasing importance in today’s enterprise data management processes. For corporations collecting and processing data, validating that a data set is complete and unaltered helps ensure defensibility by proving the data’s integrity. Questionable data quality can also represent a significant challenge in criminal cases, as law enforcement officers in the forensics space can attest.

For this blog, I’ll focus on answering two main questions regarding data integrity:

  1. How does an investigator prove that collected endpoint data hasn’t been modified since collection?
  2. How do we ensure defensible and repeatable collection, processing, review, and production?

I’ll also discuss how the addition of industry-leading forensics capabilities to the Exterro suite helps ensure the quality of organizational data during investigations and litigation.

What Is Data Integrity, and Why Is It Important?

Generally, lawyers think of data integrity in terms of the trustworthiness and accuracy of data throughout its lifecycle. In the legal space, data integrity is most often an issue when we are discussing compliance and privacy protocols or evaluating the reliability of information in investigations or litigation.

Law firms and corporate legal departments are concerned with data integrity from a couple of angles: First, it’s necessary for litigation and internal investigations because these processes require that data be recoverable, searchable, traceable and connectable. Second, it’s an important element of a company’s privacy practices regarding their client or customer data. Put plainly, accurate data is essential to meeting compliance requirements or mandates.

So, how do we prove that collected endpoint data hasn’t been modified—and therefore has maintained its integrity?

When data is requested for, say, an HR or internal investigation, forensic experts must ensure that they’re maintaining and documenting chain of custody, and that they’re collecting data remotely in a secure manner. Our software creates a forensic copy of the data and places it in a secure forensic container (such as an AD1 file), which helps ensure data integrity during the transfer from the endpoint to the collection server. And there’s a bonus! Our software also offers encryption mechanisms to secure the data even before the transfer begins. The transmission channel is encrypted as well, helping ensure that the data maintains its integrity from the moment collection starts.
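The standard technique behind this kind of integrity proof is to record a cryptographic hash of the evidence at acquisition time and re-hash it whenever verification is needed: if the digests match, the bytes are unchanged. Here is a minimal, vendor-neutral sketch of that idea; the `record_hash`/`verify_hash` helpers and the manifest format are illustrative assumptions, not Exterro’s actual implementation.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash the file in 1 MB chunks so large evidence files fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def record_hash(evidence: Path, manifest: Path) -> None:
    """At collection time, record the acquisition hash alongside the evidence."""
    manifest.write_text(f"{sha256_of(evidence)}  {evidence.name}\n")

def verify_hash(evidence: Path, manifest: Path) -> bool:
    """Later, re-hash and compare: a match shows the bytes are unchanged."""
    recorded = manifest.read_text().split()[0]
    return sha256_of(evidence) == recorded
```

In practice a forensic container pairs the hash with chain-of-custody metadata, but the core check is exactly this comparison: any single-bit modification to the evidence produces a different digest and fails verification.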

When processing a forensic image, our platform ensures that the software only reads from it and never writes to it (which would affect its integrity). This is especially important for law enforcement, because it means the evidence remains accurate throughout processing. If there’s a need to reprocess the same data, you must be able to get the same result every single time. An inability to replicate results means the data can’t be validated, which could affect defensibility in court.
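That reprocessing guarantee can be demonstrated in miniature: run the same deterministic processing function over the same image bytes twice, and show that the results, and a fingerprint of them, are identical. The string-extraction routine below is a toy stand-in for a real processing engine, not FTK’s implementation.

```python
import hashlib

def process_image(image_bytes: bytes) -> list[str]:
    """Toy, deterministic processing step: extract runs of 4+ printable
    ASCII characters ("strings") from a forensic image."""
    out, current = [], bytearray()
    for b in image_bytes:
        if 32 <= b < 127:          # printable ASCII range
            current.append(b)
        else:
            if len(current) >= 4:
                out.append(current.decode("ascii"))
            current = bytearray()
    if len(current) >= 4:
        out.append(current.decode("ascii"))
    return out

def result_fingerprint(results: list[str]) -> str:
    """Hash the processing output so two runs can be compared exactly."""
    digest = hashlib.sha256()
    for item in results:
        digest.update(item.encode("utf-8"))
        digest.update(b"\x00")     # separator so ["ab","c"] != ["a","bc"]
    return digest.hexdigest()
```

Because the function never mutates its input and contains no randomness, two runs over the same image always yield the same results and the same fingerprint, which is precisely what makes the processing step validatable.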

We’re now one of the few vendors that customers turn to when their data validation fails, because they’re able to use our FTK® portfolio of products to prove repeatability, defensibility, and data integrity.

And how do we ensure defensible and repeatable collection, processing, review, and production?

Robust, reliable, and repeatable: These may sound like buzzwords, but they are also the best terms to describe our forensics capabilities. Exterro lays the foundations for data quality by helping clients establish robust, repeatable and reliable collections of evidence. For now, I’ll focus on “repeatable,” because as I noted above, repeatability ensures that processing certain data sets produces the same results, which is a cornerstone of data integrity.

One way to ensure repeatability is to use effective and reliable tools. A unique feature of the Exterro platform is that our clients can use it to prove and validate the effectiveness of our own software. In fact, we’ve incorporated a “tool validation” process in our FTK portfolio that allows users to test not just the effectiveness of our own forensics tool, but any forensics tool. In doing so, we are giving our customers the opportunity to test our tool’s ability to provide repeatable data processing against that of other vendors.
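A tool-validation check of the kind described above can be sketched generically: feed identical input to a tool several times and require byte-identical output on every run. This harness is a hypothetical illustration of the concept, not the FTK validation feature itself.

```python
import hashlib
from typing import Callable

def validate_repeatability(tool: Callable[[bytes], bytes],
                           evidence: bytes,
                           runs: int = 3) -> bool:
    """Run `tool` repeatedly on the same input.

    The tool passes only if every run yields byte-identical output,
    i.e. exactly one distinct output hash is observed across all runs."""
    hashes = {hashlib.sha256(tool(evidence)).hexdigest() for _ in range(runs)}
    return len(hashes) == 1
```

Because the harness only looks at inputs and outputs, it can validate any tool that can be invoked programmatically, which mirrors the point above: the same test that proves our software repeatable can be pointed at any vendor’s.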

Repeatable collection, processing, review, and production have always been shared goals of the forensics and e-discovery worlds, and including industry-leading forensics capabilities in an enterprise legal platform helps supercharge both the validation of data and its traceability. Having the capabilities to identify, recover, and connect the forensic dots within a data set not only strengthens legal oversight of sensitive organizational information, but is an important part of the larger picture of enterprise data management.

“Data integrity” and “data validation” will continue to be major buzzwords throughout 2021 and beyond. It’s time for legal and compliance teams to turn their attention to verifying the quality of their data to help ensure greater organizational defensibility.
