The Basics of E-Discovery, Chapter 7A: Document Review, Analysis & Production


Review: The E-Discovery Stretch Run

When most legal professionals think "e-discovery," their minds immediately jump to attorney document review. There are a couple of key reasons for this that we'll discuss throughout this section, but first a quick overview. The e-discovery review process involves the review and analysis of collected documents to determine which ones are relevant to the case, which ones are not relevant, and which ones contain privileged or other protected information.


Where Analysis and Production Fit In

In the traditional Electronic Discovery Reference Model (EDRM), review, analysis, and production are listed as independent e-discovery phases. For the purposes of this guide, we are combining all three into one section since they are interconnected.

Even the creators of the ubiquitous EDRM model acknowledge that 'analysis' can be deployed in many phases of e-discovery, including during early case assessment (ECA), before collection. Analysis is the process of evaluating data sets to determine key information that will help guide the review process. This might include the key topics, custodians, and specific vocabulary that reviewers will need to look out for when conducting their more detailed assessment in the review phase.

The output of the review effort feeds into the production stage of e-discovery, described in greater detail at the end of this section.

Who is Involved in Review

For corporate litigants, document review is typically handled by outside resources, such as law firms or review service vendors. Review is time consuming and demanding of resources, and internal legal departments are usually ill-equipped to handle the process in-house. Moreover, review is heavily associated with case strategy and piecing together the overall litigation "story." These considerations usually fall to a company's outside counsel, who represent the corporation during the course of litigation and at trial, so having them lead the review effort usually makes the most sense.


There are companies that conduct some review activities in house. In what is often referred to as a "first pass" review, corporate attorneys may scan documents and make a quick determination on whether each is relevant, helping guide the subsequent steps and winnow down the document set. Rarely does this type of review dive deep into the contents of each document to explore the actual degree of relevance, how it fits into the overall case strategy, and whether there is confidential company or client information, such as intellectual property (IP), that must be withheld from the eventual production. That being said, companies are increasingly handling review demands for small cases internally to avoid paying exorbitant fees.

Why Review is Important

Review is often viewed as the culmination of e-discovery efforts and the most critical step prior to actually producing documents to the other side. It's the point at which legal teams traditionally gain a greater understanding of their case and are able to develop legal strategies based on the information that is uncovered. However, advanced analytics technologies now allow legal teams to gain greater factual insights before the review phase, a point that was addressed in the ECA section of this guide.

Cases that involve a broad set of issues and many relevant players (custodians) will invariably result in larger, more expensive review projects. But review is also heavily influenced by earlier e-discovery stages. Sloppy preservation practices or overly broad collections can create more ESI to review than the matter actually requires, a problem that many companies experience.

Review is often discussed in the context of two critical e-discovery topics: costs and technology.


Review is expensive. Most estimates have review accounting for between 70% and 80% of total e-discovery costs for the average case. Indeed, large-scale review projects can cost hundreds of thousands of dollars. Reviewers are often paid on a per-hour basis, which can also make project budgets hard to predict. In response, many companies are pressuring law firms to adopt alternative fee arrangements that move away from hourly billing.



Because review is so expensive, many of the most significant advancements in e-discovery technology target this stage of the process. New review platforms have greatly streamlined the traditional, linear review model, and they can also drive significant savings earlier in the process, as discussed in the early case assessment section of this guide.

The E-Discovery Review Process

Review is often conducted by a group of people, commonly referred to as a review team, comprising different levels of attorneys and other legal professionals. Most legal experts advocate taking a systematic approach to document review that accounts for the following considerations:


Underlying Case Information

It goes without saying that it's impossible to conduct an effective review without a deep understanding of the issues underlying the case. This includes keywords, key names, and date ranges to look out for when examining the contents of each document.


Case Strategy

Besides the factual background, reviewers also have to understand what the legal team hopes to accomplish. What is the desired outcome? Ultimately, the facts will tell the story, but the review may reveal patterns that impel a change of course toward a more aggressive defensive approach or, conversely, a quicker settlement.


Document Tagging

Consistency is critically important for any review project. Reviewers need a common set of tags or labels to apply to documents, or making sense of the final reviewed document set will be next to impossible. There isn't one way to tag documents. Some review teams may opt for a simplistic approach (e.g., relevant, irrelevant, privileged), while others may use more descriptive labels (smoking gun, unfavorable, neutral, favorable, etc.). The key is that all reviewers understand the system and are held accountable for following it.
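A shared tag vocabulary is easy to enforce in software. The sketch below is purely illustrative (it is not drawn from any specific review platform); the `ALLOWED_TAGS` set and `tag_document` function are assumptions chosen to show how rejecting off-list labels keeps a team consistent.

```python
# Illustrative sketch: enforcing an agreed tag vocabulary so every
# reviewer uses the same labels. Names here are hypothetical.
ALLOWED_TAGS = {"relevant", "irrelevant", "privileged", "confidential"}

def tag_document(doc_tags: dict, doc_id: str, tag: str) -> None:
    """Record a tag for a document, rejecting labels outside the agreed set."""
    if tag not in ALLOWED_TAGS:
        raise ValueError(f"Unknown tag {tag!r}; use one of {sorted(ALLOWED_TAGS)}")
    doc_tags.setdefault(doc_id, set()).add(tag)

tags = {}
tag_document(tags, "DOC-001", "relevant")
tag_document(tags, "DOC-001", "privileged")
```

Rejecting unknown labels at entry time is cheaper than reconciling ad hoc tags after the review closes.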



Speed and Productivity

Accuracy and thoroughness are obviously important, but so is speed: reviewers often get paid on a per-hour basis, and large review projects can cost companies hundreds of thousands, if not millions, of dollars. A strong review team will set expectations for how many documents each reviewer should get through in a given period of time and track those numbers to address productivity issues.


Quality Control

Even the most experienced and thorough reviewers are going to make mistakes. Relevant documents might be unintentionally marked as irrelevant, or a document filled with confidential information might slip through the cracks and make it into the production set. Quality control (QC) measures to prevent these types of mistakes might involve multiple levels of review, in which senior team members verify the accuracy of other reviewers. Another approach is sampling certain categories, for example reviewing one document for every 10 deemed irrelevant and looking for mistakes, which may expose larger issues such as a consistently overlooked keyword.

Privilege Review

Besides assessing relevancy, the review process involves identifying documents that contain privileged information. Privileged documents are those that fall under the protections of attorney-client privilege, a legal concept that protects certain communications between a client and his or her attorney and keeps those communications confidential, even when they contain information relevant to the legal matter. While attorneys can take measures to protect privileged documents in the face of inadvertent disclosure (e.g., FRE 502(d) orders), once disclosed, the information can't be "unlearned" by the opposing party's legal team.

For this reason, privilege reviews are handled with the utmost care and are usually conducted by more senior attorneys, who have the most experience reviewing documents and know how to spot protected information and log it correctly for legal purposes.

E-Discovery Review Checklist

As we've established, review accounts for a majority of e-discovery costs. That expense has driven many experts and practitioners to look for ways to improve review speed and accuracy. Technology plays a big part in that discussion (as we'll soon discuss), but there are other means to improve the review process, including:

Ways to Improve the Review Process

Narrow Data Volumes Prior to Review

It may not technically be a review best practice, but controlling the amount of data that is collected is a sure-fire way to ease the burdens of document reviews. There are a number of ways to narrow the review funnel, many of which were addressed in both the collection and early case assessment sections of this guide. You can also read Exterro's white paper, "Eliminating E-Discovery Over Collection," for best practices.

Leverage Proportionality

The principle of proportionality dictates that the costs involved in e-discovery should be in line with the value of a case. For example, a case valued at $25,000 shouldn't impose six-figure e-discovery costs on either party. Unfortunately, non-cooperative litigants can lead to spiraling e-discovery requests and the cost increases that follow. In the context of review, it's critical to look at potential data volumes and estimate review costs based on historical metrics around hourly fees and productivity rates. Having that information in hand prior to review will help expose disproportionate discovery requests and place the onus on the other side to demonstrate why the review is necessary.
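The cost estimate described above is simple arithmetic: document count divided by reviewer throughput gives hours, and hours times the blended rate gives a budget. The sketch below is illustrative only; the 50 docs/hour and $60/hour figures are placeholder assumptions, and real estimates should use a team's own historical metrics.

```python
def estimate_review_cost(doc_count, docs_per_hour=50, hourly_rate=60.0):
    """Rough review budget: hours needed at an assumed productivity rate,
    multiplied by an assumed blended hourly fee."""
    hours = doc_count / docs_per_hour
    return hours * hourly_rate

# 100,000 documents at 50 docs/hour and $60/hour -> 2,000 hours -> $120,000
print(f"${estimate_review_cost(100_000):,.0f}")
```

A back-of-envelope number like this, produced before review begins, is often enough to support a proportionality argument against an overbroad request.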

Privilege Non-Waiver

Privilege reviews require very strong attention to detail and take time to complete. They are also conducted by more experienced attorneys, meaning they are much more expensive than a typical "first pass" review. There is a seldom-used federal rule that allows parties to protect against the inadvertent disclosure of privileged documents. Federal Rule of Evidence 502(d) allows parties to enter into "clawback agreements," whereby each party agrees to return inadvertently produced privileged documents to the opposing side. FRE 502(d) is by no means a license to ignore privilege reviews, but it does give litigants a safety net should something slip through the cracks. Amazingly, few attorneys even know of this rule, let alone take advantage of it, a point that was underscored in Exterro's recent 4th Annual Federal Judges Survey.

Collaborate with Review Team

All too often, corporate legal teams hand off review projects to their outside review team and simply wait for the bill to arrive. Document review should be a collaborative effort between inside and outside counsel, with frequent communication and status checks. The more collaborative the relationship, with information consistently shared back and forth, the easier it is to negotiate alternative fee arrangements that are equitable for both sides and help enormously with budgeting.

E-Discovery Review Software

So far in this section, we've described some of the key tenets of traditional, linear review. But review is rapidly evolving. Technology advancements are moving review away from the manual, document-by-document inspection described above to a more analytics-based approach, where software can largely determine whether a document is relevant with little human intervention.

The first generation of technology assisted review (TAR) consisted primarily of predictive coding tools. While predictive coding is still in use today, subsequent generations of TAR have incorporated advances in artificial intelligence (AI) that we'll address in a dedicated section of this guide.

For now, we'll go over some of the more basic e-discovery review software features, which include:

Search Filtering

Once documents are uploaded into an e-discovery review software platform, users can run searches using specific keywords, combinations of words, or even concepts to home in on relevant documents much more quickly.
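At its simplest, keyword search filtering is a case-insensitive substring match across the document set. The sketch below is a minimal stand-in for what review platforms do (real platforms add indexing, stemming, Boolean operators, and concept search); the function name and document structure are assumptions for the example.

```python
def keyword_filter(documents, keywords):
    """Return documents whose text contains any of the given keywords,
    matched case-insensitively."""
    needles = [k.lower() for k in keywords]
    return [d for d in documents
            if any(n in d["text"].lower() for n in needles)]

docs = [
    {"id": "DOC-1", "text": "Q3 merger discussion with counsel"},
    {"id": "DOC-2", "text": "Lunch order for Friday"},
]
hits = keyword_filter(docs, ["merger", "acquisition"])
print([d["id"] for d in hits])  # ['DOC-1']
```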

Document Tagging

As described earlier, consistent tagging is critical during review. Review platforms give users the ability to apply pre-defined tags, so that all reviewers are using the same labels as they go through each document.


Redaction

Depending on the type of legal or regulatory matter, and the issues involved, certain information may need to be redacted (obscured or removed) before a document is produced to the other side. Many e-discovery review software applications allow users to select words or passages in-text and have them blacked out to expedite the redaction process. Auto-redaction features will automatically redact all instances of a word or phrase.
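Auto-redaction of text content can be sketched with a regular-expression substitution. This is an illustration only, not how any particular platform implements the feature (production tools also burn redactions into images and scrub metadata); the function name and mask string are assumptions.

```python
import re

def auto_redact(text, phrases, mask="XXXXX"):
    """Replace every occurrence of each phrase, case-insensitively,
    with a redaction mask."""
    for phrase in phrases:
        # re.escape prevents characters in the phrase from being
        # interpreted as regex syntax
        text = re.sub(re.escape(phrase), mask, text, flags=re.IGNORECASE)
    return text

original = "Contact Jane Doe about Project Falcon. jane doe approved."
print(auto_redact(original, ["Jane Doe", "Project Falcon"]))
# Contact XXXXX about XXXXX. XXXXX approved.
```

Note that text-layer substitution alone is not a safe redaction for produced files; the underlying content must also be removed from the delivered format, not merely hidden.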


Batching

As discussed earlier, review projects are usually performed by teams, not single individuals. This means that review sets need to be broken into groups, or batches, one for each reviewer. Many review platforms can do this automatically once a review set is imported into the application.
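Automatic batching can be as simple as a round-robin split of the review set across the team. The sketch below is illustrative (platforms typically also balance by document size or family groupings); the function name and reviewer names are assumptions.

```python
def batch_documents(doc_ids, reviewers):
    """Split a review set into one batch per reviewer, round-robin,
    so batch sizes differ by at most one document."""
    batches = {r: [] for r in reviewers}
    for i, doc in enumerate(doc_ids):
        batches[reviewers[i % len(reviewers)]].append(doc)
    return batches

docs = [f"DOC-{n}" for n in range(1, 8)]  # seven documents
b = batch_documents(docs, ["alice", "bob", "carol"])
print({r: len(ids) for r, ids in b.items()})  # {'alice': 3, 'bob': 2, 'carol': 2}
```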

Bates Numbering / Stamping

As cases progress, it's important that each side of the dispute has an easy way to reference specific documents in the review/production set. Many review platforms offer a Bates numbering feature (named after the 19th-century inventor Edwin G. Bates) that automatically attaches a unique identifier to each document to expedite document identification and retrieval.
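A Bates number is just a party prefix followed by a zero-padded sequence number, assigned once and never reused. The sketch below shows the idea; the `ABC` prefix, six-digit width, and function name are assumptions for illustration.

```python
def bates_stamp(doc_ids, prefix="ABC", start=1, width=6):
    """Assign a sequential, zero-padded Bates number (e.g. ABC000001)
    to each document, in order."""
    return {doc: f"{prefix}{n:0{width}d}"
            for n, doc in enumerate(doc_ids, start=start)}

stamps = bates_stamp(["memo.docx", "email.msg"])
print(stamps)  # {'memo.docx': 'ABC000001', 'email.msg': 'ABC000002'}
```

In practice the stamp is also burned onto each produced page image so the identifier travels with the document.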

Production: The E-Discovery Finish Line

The final output of the review, the set of documents deemed to be relevant to the legal matter, must be produced to the other side as the final piece of the e-discovery puzzle. In the paper age, production was fairly straightforward. It literally involved sending large boxes of documents to the opposing side. Today, the process of production is more complicated. Cases involve more documents (or ESI) and those documents come in a variety of forms. Even if it was feasible to print everything out and deliver it in paper form, doing so would neglect an important component of digital evidence: the metadata that underlies the visual or text content of the ESI and describes key contextual information about each document, like when it was created or modified.

Due to the complexities of producing documents in the digital age, the Federal Rules of Civil Procedure (FRCP) were amended in 2006 to address the topic. Several state courts have followed suit and amended their rules. According to the FRCP, litigants are required to produce electronically stored information (ESI) in the form in which it is ordinarily maintained (often referred to as "native format") or in a "reasonably usable form." Standard non-native production formats include the Tagged Image File Format (TIFF) and Portable Document Format (PDF).

Next Section

Now that you've had a primer on review, our next section looks specifically at applications of artificial intelligence in the review phase of e-discovery.