Traditional discovery was an endeavor fraught with no small amount of peril: what if something was lost, concealed, destroyed or (even if produced) just plain missed? The lurking danger of a damaging “needle in the haystack” was a constant concern.

At the dawn of the age of electronic discovery, Electronically Stored Information (“ESI”) offered a new hope: the opportunity to “freeze” the evidence in time (in the form of a single computer hard drive) and methodically sift through all of the information on the hard drive.

Because data collection for a single machine (or each machine at issue) could be documented and the results recorded and repeated, the field of electronic discovery began to take on the look of a computer forensics operation. Relevant information (even if “deleted”) was difficult to conceal and as a result, even the location of data on a hard drive could give counsel a valuable window into what the data meant to the opposing party. Forensic soundness mattered. Forensically sound data would hold a wealth of information which could be used to tell an enlightening story about the data and what had been done to it, unlocking revealing answers, or at the very least, better questions.

One key caveat in computer forensics is that it can be difficult, if not impossible, to prove who was at a computer keyboard at any given time. Even with this limitation in mind, the application of computer forensics principles to the collection and review of ESI provided an unprecedented level of information, insight and evidence to litigators.

At last, there was cause to rejoice: it was now harder than ever before for a party to actively conceal the sort of information that could blow a case wide open. Computers were poised to provide counsel with incontrovertible evidence, as long as a human could review all the information and find the needle hiding in the haystack.

These miraculous gains, however, unleashed a new beast to tame: the sheer volume of data.

As most people around the world have access to electronic devices, the amount of ESI that might fit squarely into a discovery request has increased geometrically, with no end in sight.

As attractive as the computer forensic discovery model was for electronic discovery where an adversary’s ESI might consist of a few desktop computers, each with a 40 GB hard drive, that same model becomes logistically and fiscally crushing when the corpus of data exceeds several terabytes of information stored on a company’s server(s), computers, laptops, smart phones and tablets.

While an attorney might once have argued that the safest and best practice was to image the entire corpus of an adversary’s data, the prospect of imaging all of that data in 2012 has ceased to make sense. The “image it all” strategy was arguably a defense against claims of legal malpractice, but the era of this sort of “defensive discovery” has ended under the sheer volume of data involved.

In short, computer forensics as we knew it in the first decade of the 21st Century is eyeing extinction. It is equally an inexorable truth that electronic discovery based on this same forensic model will not long survive.

So What Does it All Mean?

First, the good news: electronic discovery, overall, is likely to get cheaper. The bad news is that the cost reduction is a direct result of the loss of traditional notions of forensic soundness.
It is, however, important to note that forensic soundness is not synonymous with evidentiary quality; the two concepts are distinct, and evidence may retain its quality without a forensically sound acquisition behind it.

Arguably, the application of the strictest standards of forensic soundness did not add to the quality of the data recovered, even though those standards always increased the price per gigabyte for all of the data recovered, processed and reviewed. Forensic acquisition was a “one size fits all” approach to discovery that provided uniform standards and safeguards, but did so at a significant, and often unnecessary, cost to the client.

What the current state of ESI really means is a return to “old school” discovery, a form much more akin to what discovery was before.

The days of isolating gigabytes of information and then sifting through all of the data with a search tool are over. The amount of data at issue, even in small commercial cases, is often simply too large to justify a forensic collection of absolutely all of it.

Is Paradise Lost?

In a word, “No,” paradise is not lost. While a forensically sound and complete hard disk image provided a wealth of information, alternate paths to some of the same results do still exist. The methodology is the same as that which was employed before electronic discovery, back when production was done on paper.

Don’t Image Everything

With very few exceptions, forensic imaging of an entire corpus of a client’s data is unnecessary and prohibitively expensive, often running into the hundreds of dollars per gigabyte for imaging, processing (like de-duping and near de-duping) and conversion. An adequate response to discovery can be achieved by means of a close review of a client’s systems along with an agreed upon strategy formulated with the client’s IT department/computer consultant.

Does Remote Access make things “Cloudy?”

Recall the traditional computer forensics acquisition paradigm: one of the key processes for forensic soundness involved isolating the target computer, powering it down (or, if it was off, not turning it on) and removing the hard drive from the computer in order to make a forensic image with a read-only device. The dates, times, number, kind and character of the files were thereby preserved, and the discovery process could then shift to reviewing the results of the acquisition and determining “what you had” as a result of the operation. The hard drive was, at first, a black box, but once imaged, it revealed its secrets from a point which the forensic technician had frozen in time.
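The integrity of such a static acquisition is typically demonstrated with cryptographic hashes: the examiner computes a digest of the source drive and of the image, and a match shows the copy is bit-for-bit identical. A minimal sketch in Python, assuming the drive and image are readable as ordinary files (the paths shown are hypothetical):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, reading it in chunks
    so even a multi-gigabyte image never has to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the raw source device and the forensic image
# made from it. A matching digest demonstrates a bit-for-bit copy.
# source_digest = sha256_of("/dev/sdb")
# image_digest = sha256_of("evidence/drive01.dd")
# assert source_digest == image_digest
```

In practice examiners record these digests in the chain-of-custody documentation, so any later alteration of the image can be detected by re-hashing.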

This static acquisition is as quaint a notion as a stroll down a main street full of small shops and Model T Fords. Remote access, in the form of cloud computing, has eliminated static acquisitions, but it has not destroyed the evidence necessary for successful litigation; it has simply altered the way we must acquire it.

Does remote access make things “cloudy?”

The answer to this question is the same as the answer to most clients’ questions: it depends.

Cloud Computing

A brief word about cloud computing: it’s here and it’s not going away. The convenience of remotely accessing data, for work, or for entertainment, is eminently appealing. Cloud computing provides the ease and portability that have come to define the Internet era.

In support of this new computing frontier, service providers have built a “brick and mortar” (or “aluminum and plastic”) physical infrastructure to support cloud computing. While the user can travel light, the provider has a large, heavy footprint.

Modern server farms and datacenters are constructed as secure facilities with redundant electronic assets and fire suppression/safety systems. In essence, a server farm is run like a bank, but instead of money, the facility safeguards data.

It is this bank comparison which has finally allowed some attorneys to counsel their clients (or allow themselves) to utilize cloud computing. The traditional concerns about not having control of one’s data are allayed when the comparison is made to placing one’s money in the bank, as opposed to stashing it in one’s mattress. A loss can occur in either scenario, but the valuable asset is better protected and insured by the professional custodian as opposed to a data owner’s often inadequate do-it-yourself approach.

Bearing all this in mind, cloud computing necessarily translates into a certain level of uncertainty when it comes to the provenance of a quantum of data. Unlike a file pulled from a forensically isolated hard drive, the possibility of alteration, and the question of authenticity, does loom. Securing adequate answers as to the provenance of data depends heavily on the system in use. Some cloud based systems maintain detailed logs regarding access, as well as copies of subsequent versions of a file/document. In that case, the provenance of the data is as good and as reliable as the logs.

If the Cloud Service Provider (“CSP”) is reputable, the logs will be both complete and accurate. While a litigant might argue that the logs could be manipulated, this sort of conspiracy theory seems a remote possibility in light of the larger issues at stake. After all, why would a service provider endanger its reputation with current and potential clients for the limited benefit of one client in one case? The need for digital integrity and the threat of negative consequences in the absence of digital integrity provide ample incentive for CSPs to be as reliable as any other custodian of records or valuables.
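One common way a custodian can make such logs tamper-evident is to chain them cryptographically, so that altering any earlier entry invalidates every hash that follows. The sketch below is an illustration of that general technique, not a description of any particular CSP’s scheme:

```python
import hashlib
import json

def chain_logs(entries):
    """Link log entries so each record carries a hash over its own
    content plus the previous record's hash; changing any entry
    breaks every hash downstream of it."""
    chained, prev_hash = [], "0" * 64
    for entry in entries:
        payload = json.dumps(entry, sort_keys=True) + prev_hash
        prev_hash = hashlib.sha256(payload.encode()).hexdigest()
        chained.append({"entry": entry, "hash": prev_hash})
    return chained

def verify_chain(chained):
    """Recompute every hash in order; True only if no entry was altered."""
    prev_hash = "0" * 64
    for record in chained:
        payload = json.dumps(record["entry"], sort_keys=True) + prev_hash
        if hashlib.sha256(payload.encode()).hexdigest() != record["hash"]:
            return False
        prev_hash = record["hash"]
    return True
```

A reviewing expert who can re-run such a verification has a concrete basis for asserting that the access history offered in discovery is the access history that was originally recorded.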

Discovery Methodology under the Cloud

ESI produced from cloud based storage may (or may not) contain the same metadata it had when it was created and then uploaded to the cloud. The key factors in determining the state of metadata are the system and software being used and the access records maintained by the CSP.
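Whether those attributes survive a cloud round trip can be checked directly: a file downloaded back from cloud storage often carries the download time as its modification time rather than the original. A short Python sketch of the comparison (the filenames are hypothetical):

```python
import os
from datetime import datetime, timezone

def file_times(path):
    """Return the filesystem modification and access times for a file."""
    st = os.stat(path)
    return {
        "modified": datetime.fromtimestamp(st.st_mtime, tz=timezone.utc),
        "accessed": datetime.fromtimestamp(st.st_atime, tz=timezone.utc),
    }

# Hypothetical comparison: the original file versus the same file
# after a round trip through cloud storage. If the returned copy
# shows the download time instead of the original modification time,
# the filesystem metadata was not preserved by the provider.
# original = file_times("contract_v1.docx")
# returned = file_times("downloads/contract_v1.docx")
# metadata_preserved = original["modified"] == returned["modified"]
```

When filesystem timestamps are lost this way, the provider’s own access records may be the only remaining source of “when,” which is why the CSP’s logging practices matter so much to counsel.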

Depending on the type of cloud, public or private, and the configuration of the backend server (or servers), metadata in the traditional sense may not exist. However, it is the job of the technical experts to determine what “metadata” does in fact exist. Big players have begun to respond: Google has introduced Google Apps Vault, an integrated electronic discovery (ediscovery) layer, and Microsoft Office 365 has had ediscovery-centric features since its introduction. Still, this is a far cry from what we traditionally understand as metadata, and there are literally tens of thousands of cloud based storage options, SaaS (Software as a Service) offerings and web applications in use by businesses today. These are the unknowns that experts are accustomed to and must handle every day.

From a planning perspective, it would be wise for organizations to fully understand the capabilities of cloud providers and find out what is, and what is not, available from both a security and ediscovery perspective, so they are prepared to facilitate producing documents when required.

Applying Old Lessons to a New World

Perhaps one of the greatest tributes to our legal system is the timeless quality of some of our basic rules and practices: perjury is so corrosive to the fact-finding process that its prohibition is one of the Ten Commandments, and hearsay is inherently unreliable because it does not allow for cross examination. Some concepts, such as these, work in any time and any season. In this same spirit, working with ESI from the cloud should follow the same sound principles of evidence that have always applied. In the absence of a forensically “frozen” corpus of data, the validity, admissibility and evidentiary quality of data can still be asserted, or assailed, based upon a cogent analysis of the facts and circumstances surrounding that data. This is the same calculus in which lawyers have engaged, quite literally, for centuries. In this regard, everything old is new again, and that is a grip from which even cloud computing cannot escape. The trick is finding the right experts with the right expertise to help you successfully negotiate the many potential perils of ediscovery in the cloud.


This article written and submitted July 17, 2012, by:

Michael P. Reynolds, JD – President of Michael P. Reynolds, PC

Contact Information:
(347) 433-6068 (v)
(516) 750-9028 (f)