Published: March 25, 2010 09:22 AM
Last Build Date: November 4, 2009 10:45 AM
Copyright 2010
November 4, 2009
InsideCounsel offers some tips for those interested in bringing eDiscovery in-house while avoiding pitfalls, including some comments from yours truly. Check out "Inside Job" in InsideCounsel's November issue, published in the monthly Technology section.
It's truly a challenging time for companies, but it's doable with the appropriate vision and approach. Many GCs and AGCs are under significant pressure to reduce their litigation and eDiscovery spend. Increasingly, this includes looking inward to insource and automate repeatable, defensible processes, as well as gaining greater control over information management. Greater efficiency and effectiveness in identifying, preserving, and culling data in-house at earlier stages should translate to lower review and hosting costs, and hopefully shorter review cycles thanks to the decrease in volume.
However, it's not just about bringing in technological solutions. I see technology as enabling processes and improving efficiency when done right. But before that can happen, companies need to discern the impact that various policy and technical choices will have on their ability to manage, identify, and cost-effectively work with their corporate data in eDiscovery, investigative, and compliance contexts. There's also the question of scale, as many small to mid-sized companies may not have the volume or types of litigation, or perhaps the internal human capital, to justify some of these investments. For larger companies, the concerns typically fall at the other end of the spectrum, such as whether their insourced solutions will scale appropriately and cover the desired data types through all the hand-off points. Thus I think it's safe to say that for most companies, insourcing will be a multi-year effort, with iterative cycles of designing effective and defensible workflows to connect all the dots.
October 27, 2009
LinkedIn. Facebook. Twitter. Blogs. Bob Ambrogi, always on the forefront of web technologies and their impact, recently published two helpful "Top 10" articles - one each for attorneys and experts, with some great tips for those navigating online communities for networking and socializing.
One such tip is to separate professional and personal contacts into different networks. However, don't fall prey to the myth of anonymity, or to the assumption that restricted social networks will necessarily protect you. It isn't always clear which content is restricted to just your approved network contacts, and others have been known to seek invitations or sign up for accounts solely to get at the "good stuff". As always, be ever mindful of what you post online.
Not surprisingly, the best and number one tip is to use good old-fashioned common sense. However, given some of the gaffes Bob used as examples, it's easy to agree with his observation that it "sometimes seems to be in short supply these days".
Definitely good fodder for any law school ethics curriculum, since these are among the modern day challenges lawyers face while building both their practices and professional reputation online.
August 18, 2009
From the many presentations and discussions at the AHIMA-sponsored Legal EHR Summit in Chicago, it's clear that healthcare records and records management in the U.S. are changing. (In case you were wondering, "EHR" = Electronic Health Record.) In his keynote, George Paul (Lewis & Roca) shared how the U.S. government is pouring money into healthcare records via incentives in the ARRA and HITECH acts. Several presenters referred to these changes as the biggest change to healthcare privacy and security rules since HIPAA was enacted. Indeed, even as we discussed these developments, new security breach notification rules were due out yesterday. As discussed in several sessions, these new laws will likely require many business associate contracts to be renegotiated.
It's also interesting to note that as much as some think of U.S. healthcare as high-quality and high-tech, the underlying HIT and records management systems and professionals are struggling to address these new changes, challenges, and ramifications, especially the legal aspects. For instance, many HIT systems are not geared toward the legal aspects of preservation (think databases that change dynamically on a daily basis) and production. Not surprisingly, their focus is on enabling healthcare professionals and organizations in the provision of their services. Several cases were mentioned where the plaintiff's attorney wanted to see the data and screens the doctor saw when he/she was treating the patient. The response I heard throughout was that this wasn't possible due to the constantly changing nature of the data in these systems. It doesn't take much imagination to sense how well this goes over in litigation, and the need for creative solutions. Much discussion also centered on records management, creating and refining document retention policies, and, just as importantly, complying with them.
There were also some pretty scary stories relating to e-Iatrogenesis, or patient harm caused by the use of computer systems, and the lack of transparency and sharing of those problems by the software vendors.
There's also the issue of creating the necessary interoperability and sharing of information across different HIEs (Health Information Exchanges) - from local to regional to state to national levels. So there's a fair amount of catching up and transformation that needs to happen in this industry. The good news is that these issues are being discussed in depth across multiple disciplines - IT (HIT), Records Management, Legal, Risk Management, and Compliance, just to name a few.
With respect to the summit itself, this was the first time I attended an AHIMA conference. It's been well organized and everyone at AHIMA has been very helpful and friendly. There is definitely a spirit of cooperation and collaboration among everyone here, including the attending HIT, records and risk managers, consultants, and attorneys. Indeed, there is a high degree of interest in addressing and resolving these issues through better understanding of the legal issues by health information professionals, better definition of standards (for instance, what constitutes the "Legal Electronic Health Record"?), and transforming the records management systems and processes.
August 13, 2009
What do e-Iatrogenesis, HIT, CPOE, EHR, and eDiscovery all have in common? They're just some of the many medicolegal and technological terms and issues being discussed next week at the Legal EHR Summit at the Chicago Marriott Downtown. The summit is organized by AHIMA, the American Health Information Management Association.
As our nation's healthcare industry becomes even more computerized and integrated, partly due to ARRA (the American Recovery and Reinvestment Act of 2009), the intersection of healthcare, electronic records, records management, and legal issues (including litigation and eDiscovery) will likely explode as well.
I'll be attending and blogging as time and Wi-Fi access permit. Please feel free to look me up, as I enjoy the many opportunities for discussions at these events. For the uninitiated, I've put together a quick cheat sheet for a few select terms below, along with their sources on the Web for more in-depth definitions:
HIT: Health Information Technology
Think of HIT as IT for healthcare-related systems, along with ARRA's goal of establishing a nationwide interoperable Health IT infrastructure.
CPOE: Computerized Physician/Provider Order Entry - An electronic system that healthcare professionals can use to enter drug prescriptions and diagnostic orders, among other things.
EHR (aka Legal EHR): Electronic Health Record
The Electronic Health Record (EHR) is a longitudinal electronic record of patient health information generated by one or more encounters in any care delivery setting. Included in this information are patient demographics, progress notes, problems, medications, vital signs, past medical history, immunizations, laboratory data and radiology reports. The EHR automates and streamlines the clinician's workflow. The EHR has the ability to generate a complete record of a clinical patient encounter - as well as supporting other care-related activities directly or indirectly via interface - including evidence-based decision support, quality management, and outcomes reporting.
e-Iatrogenesis: "Patient harm caused at least in part by the application of health information technology."
Think of e-Iatrogenic events as things that can go wrong for patients when CPOE systems are implemented and involved. For example, consider errors relating to patient drugs or diagnostic tests, including errors of commission or omission.
This succinct discussion of e-Iatrogenesis contains a nice explanation of the types of "unintended adverse events":
These unintended adverse events may fall into technical, human-machine interface or organizational domains. Some e-iatrogenic events will represent the electronic version of "traditional" errors, such as a patient receiving the wrong drug dosage due to a human click-error. But other HIT precipitated or enabled errors may have no exact analog in the non-electronic context. For example, a clinical decision support system (CDSS) embedded within an electronic health record might contribute to a clinician's incorrect diagnosis or treatment plan; this could represent either a "type-one" or "two" error (e.g., making a diagnosis that was not present or missing one that was).
Stay tuned for more blog posts on these topics . . .
August 5, 2009
Yet another twitter post, this one by a Chicago tenant referring to an allegedly moldy apartment, draws a $50,000 lawsuit against the Twitterer for defamation. As the tweet was reposted within Twitter and around the world, it provides a wealth of evidence as to not only the post itself, but its far reach across the Internet.
Both sides are going to lose in this suit, though. According to the article, the original poster could very well lose the suit. Even if she ultimately prevails, it's going to cost her dearly in defense fees. Likewise, the realty management firm's statement to the Chicago Sun-Times that "We're a sue-first, ask-questions-later kind of an organization" resulted in a "firestorm of criticism." It's a harsh lesson that companies sometimes learn the hard way in responding to customer complaints in the online arena. "This could generate bad press for them for years, and that wasn't (Bonnen's) doing," said Sarah Milstein, co-author of the just-released "The Twitter Book." Who's going to want to rent from or otherwise do business with a self-admitted "sue-first" company?
There are lessons to be learned from both sides. First, don't make posts on public or social networking sites that are intended for a particular individual, especially when you are peeved at something or otherwise under emotional stress. Public postings on social networking sites amplify the dangers of bad e-mail decisions by several orders of magnitude. Far too many people are either unaware of their privacy settings or forget to change them so that only their intended audience can see the post. You might as well be shouting it to the Washington Post, the New York Times, your adversary, and their counsel. There is some very good advice in the SFGate article cautioning posters about this.
Likewise, companies also need to be very mindful of their reactions and public responses to such incidents. A company often does itself far worse damage in the public's eye through its response than the original posting ever caused in the first place. Sometimes lawsuits can be avoided, and sometimes they can't. Regardless, it's important for businesses to avoid kneejerk responses that only serve to reinforce public opinion that they are the villains. They may win the suit, but lose even more business in the process by generating additional reputational harm, whether they realize it or not. So which was the better business decision?
July 24, 2009
Karthik Kannan, VP of Marketing and Business Development at Kazeon, just published a very helpful article on SC Magazine's site discussing the convergence of eDiscovery and eCompliance. As you'd expect, it's a marketing and business development article, so let's get that out of the way early. But regardless of which technology and process solutions one may prefer, I found the following to be an excellent summary of the issues and requirements one is likely to encounter when addressing the litigation readiness, information management, and compliance challenges in many organizations:

Due to the confluence of legal and compliance regulations and IT management issues, the perfect ESI storm has emerged -- and with it, the confluence of both eDiscovery and eCompliance. Looking forward, these features are necessary for any workable amalgamation of practical eDiscovery and eCompliance initiatives:

- Enterprise-class scalability and performance - An eDiscovery/eCompliance suite must be scalable to search across hundreds to thousands of terabytes of electronically stored information, as well as scale into the billions of documents, and have the performance to process that data to keep pace with today's information growth.
- Auto-discovery of data sources - An eDiscovery/eCompliance suite must have the capability to auto-discover informational sources anywhere on the network, since critical data may reside on an enterprise file or storage server, or a laptop in Shanghai.
- Holistic and dynamic organizational information map - Since network topology can change rapidly, having a dynamic and active continuous auto-discovery capability is critical for information indexing, internal investigations, litigation procedures and information capacity planning.
- Agent-less and agent information management - Organizations have enough critical data running on servers, laptops and desktops today. Having the option to run agent-less or agent searches is a critical capability. Agent-less search has a low impact on the IT infrastructure and is easier to deploy. Search with an agent takes longer to deploy, but can deliver effectively on active data sets.
- Robust search, analysis, and classification - Searching, analyzing and classifying information is a complex challenge. Having a strong analysis and auto-classification capability that can sort large data sets based on metadata, document content, file type, and so forth is necessary to accurately and quickly reduce the volume of data to a relevant and manageable set.
- Tagging - Automating the tagging of individual content or grouping content into relevant virtual categories with a robust policy-based engine enables administrators to simplify the review and reporting process by delivering a virtualized organizational information overview.
- Workflow management - After gaining insight into and classifying critical information, bring the "in the wild" data into the ECM platform for workflow management and preservation. With the ability to automate move, copy, encrypt, and delete actions, an automated policy-based methodology accelerates the manual processes for processing all the enterprise data.
- Unified management - With billions of documents and petabytes of storage, corporations can easily be overwhelmed by the volume of data. A robust eDiscovery/eCompliance suite must have a unified management view across the entire network and the ECM platform, to simplify operational management.
- In-place record hold - Being able to tag and hold potentially critical information at the source, i.e., server or laptop, is a capability that separates the efficient eDiscovery/eCompliance suite from an unusable one. It is not reasonable to move all potentially critical data back to a repository before review. The in-place hold and review and subsequent collection process streamlines and accelerates the process.
- Enterprise-wide critical information capture - With 80 percent o[...]
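To make the auto-classification idea above a bit more concrete, here is a minimal Python sketch of grouping files into virtual categories by one piece of metadata (the file extension). The category names and extension mapping are my own illustrative assumptions - real eDiscovery/eCompliance suites apply far richer policies based on content, custodians, and dates:

```python
import os
from collections import defaultdict

# Hypothetical policy: map file extensions to virtual categories.
# (Illustrative only - not any vendor's actual classification rules.)
CATEGORIES = {
    ".doc": "document", ".docx": "document", ".pdf": "document",
    ".pst": "email-archive", ".msg": "email",
    ".xls": "spreadsheet", ".xlsx": "spreadsheet",
}

def classify(paths):
    """Group file paths into virtual categories by extension metadata."""
    groups = defaultdict(list)
    for path in paths:
        ext = os.path.splitext(path)[1].lower()
        groups[CATEGORIES.get(ext, "other")].append(path)
    return dict(groups)

# Example: a small, made-up collection of files.
files = ["memo.docx", "budget.xlsx", "mailbox.pst", "backup.zip"]
print(classify(files))
```

Even a toy rule set like this shows why auto-classification matters: sorting millions of files into buckets up front is what makes it feasible to cull data down to a relevant, reviewable set.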
July 9, 2009
Setting aside the Mac vs. PC debate for a minute, how about Chrome OS as your next OS choice? News.com reports that Google is moving beyond its Chrome web browser and Android smartphone operating system, and is actively developing a lightweight operating system for personal computers based on Linux, Web standards, and the Chrome browser itself. More info is available in this earlier report at News.com - that "lower-end PCs called Netbooks from unnamed manufacturers will include it in the second half of 2010. Linux will run under the covers of the open-source project, but the applications will run on the Web itself. In other words, Google's cloud-computing ambitions just got a lot bigger." (my emphasis added)

There is certainly a lot of buzz surrounding Netbooks and cloud computing, and tech pundits have been talking about the return to thin clients for years. For better or worse, Netbooks have been the first real manifestation of that prediction for mainstream users, or at least the first commercially successful one. A lot obviously depends on what Google ultimately delivers in Chrome OS and its integration with Google's online apps. As it is based upon Linux, I can see where Chrome OS could also end up as an alternative OS on mainstream PCs, probably set up by users in a multi-boot fashion much like Ubuntu, another Linux OS that's been designed to be more user friendly. Given the reduced computing ability of Netbooks and the likely phasing out of Windows XP, a lightweight OS such as Chrome could be a compelling Netbook successor - if it offers the right mix of what Netbook users are looking for.

The Netbook market is a great focus for Google for several reasons. First, since Netbooks currently lack sufficient computing power to run heavier applications, they are best used as web clients, aligning with Google's online world and business model.

I also think Chrome OS already has too much competition on the mainstream OS front, from Microsoft, Apple, and even other Linux variations. So far, I have yet to see the Android OS take off as a serious smartphone contender (especially in light of the iPhone, the Palm Pre, and new BlackBerry offerings coming from RIM). I still see the Chrome browser as somewhat of a tech curiosity rather than a mainstream browser, as most people are still using some flavor of IE or Firefox as their main browser. That's not to say that Chrome hasn't introduced some nice features, such as tear-away tabs and better stability resulting from improved memory management. But it's been uniformly criticized as having too few features to compete head-on with leading browsers, an observation with which I tend to agree.

So, given the "less is more" approach of the Chrome browser, I expect the same philosophy in Chrome OS, particularly as it will be based around its browser namesake. And which computing platforms have capitalized on and appealed to us as "less is more"? That's right - Netbooks and cloud computing. Thus I see Google sensing a critical opportunity in the Netbook OS market in the interim between the aging Windows XP Home and whatever is next from Microsoft. It is an opportunity to tie together two emerging markets that are heavily steeped in the Web - Netbooks and cloud computing - in a way that Google couldn't do as effectively by relying upon others' operating systems. Google has a long history of making great applications that are particularly easy to use, whether PC or web-based, including Google Desktop, Picasa, Google Maps, Google Earth, and Gmail. It will be interesting to see how Google approaches their OS design, particularly with Linux, as it can be daunting for non-techie users under the hood. However, they've certainly had ample dress rehearsal with it in d[...]
July 7, 2009
I've been meaning to blog about this handy tool for some time now. Unless you bought a Netbook or custom-ordered your PC online, just about every Windows PC sold within the past two years came preloaded with some version of Windows Vista. While I prefer Vista's interface and built-in search features to the aging XP platform, Vista definitely leaves something to be desired in the performance department. I often found my hard drive thrashing at the most inopportune moments, slowing my system down when I needed it the most. There was seemingly no rhyme or reason for it - until I looked at the system processes and found that the indexer was running amok.
Having indexed files really speeds up performance when you're searching for content on your hard drive - especially when searching within Outlook. But how often do we do that over the course of a day? Perhaps just a few minutes or seconds at a time when we're looking for a specific document or e-mail. The problem is that the rest of the time, we want our apps to launch and run as quickly as possible.
The common cure is to turn off Windows' indexer service altogether, but then you won't be able to search for newly added content since it won't be indexed - which completely defeats the purpose of having an indexer installed in the first place. Sure, one could use an alternate desktop search tool, but if you're otherwise happy with the built-in Windows search service, it's nice to have it run on your terms.
Enter a fantastic Vista gadget that has significantly reclaimed my system performance and my sanity when using Vista:
BrandonTools offers the Windows Indexer Status gadget, which lets you monitor the indexer's status and start or stop the indexing service on demand.
Just so you're aware, if you're running as a standard user account in Vista (which you should for security reasons), you'll need to enter an administrator password when starting or stopping the indexer service (via the "play" and "pause" buttons on the gadget). This is a very small annoyance to eliminate a much bigger one. The nice thing is you only need to run the Windows Sidebar when you want to start or stop the indexer service. Otherwise, you can close the Sidebar to free up memory and increase your CPU performance even further.
It works with Windows Search versions 3.0 and 4.0. Windows Search 3.0 comes built-in with Vista, and version 4.0 is available as a free download from Microsoft.
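For those comfortable at the command line, the start/stop control the gadget provides corresponds to managing the Windows Search service itself. Here is a minimal Python sketch of building the relevant `sc` service-control commands (assuming the Vista-era service name `WSearch`; like the gadget, stopping or starting the service requires an elevated administrator prompt):

```python
import subprocess
import sys

SERVICE = "WSearch"  # the Windows Search indexer service on Vista

def indexer_command(action):
    """Build an `sc` command line to query, stop, or start the indexer."""
    if action not in ("query", "stop", "start"):
        raise ValueError("unknown action: %s" % action)
    return ["sc", action, SERVICE]

if __name__ == "__main__" and sys.platform == "win32":
    # Only meaningful on Windows; "query" just reports service status.
    subprocess.run(indexer_command("query"), check=False)
```

This is essentially what the gadget's "play" and "pause" buttons do for you, minus the convenience of seeing the indexer's progress at a glance.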
June 8, 2009
I'm honored that ILTA asked me to contribute a white paper on best practices for legal holds. It's a topic near and dear to my heart, as I advise companies seeking to implement more effective hold policies and procedures. The legal hold process is a critical stage in eDiscovery. Implementing and executing a well-designed legal hold process can significantly reduce the risks and costs associated with eDiscovery and other compliance requirements.
Crafting, adopting and implementing legal hold best practices often raises a number of important questions.
You can download a PDF reprint here at LTG, which answers these increasingly important questions along with examples from recent key eDiscovery case decisions.
I also recommend downloading and reading the full white paper collection, made possible by the combined efforts of ILTA's Litigation Support, Records Management and Law Department Peer Groups. There are a number of great contributions on the subject which many should find quite helpful:
Litigation Support: Document Forensics and Legal Holds
Articles included in this white paper:
- Overcoming Data Encryption for Forensic Imaging and Collections
- When is Full-Blown Forensic Collection Necessary?
- When "Deleted" Doesn't Mean "Gone"
- Disaster Recovery or Discovery Disaster?
- Legal Hold and Subpoena Compliance Coordination
- Best Practices for Legal Hold Processes
- The Effects of Litigation Holds on the Corporate Lawyer
I frequently hear that what keeps GCs and AGCs awake at night is their legal hold preservation and collection process, or lack thereof, along with the fear of sanctions for spoliation and other discovery violations. If your organization has issues with its legal hold and other discovery processes, or you'd like to know how you can improve their repeatability and defensibility while reducing cost and risk, please contact me via either the e-mail link on this blog or the e-mail address in the white paper. I'd be happy to discuss.
May 29, 2009
According to InformationWeek, the next version of Microsoft's e-mail server, Exchange 2010, "will include integrated archiving and multi-mailbox search capabilities at no extra cost, making it easier for companies to, for example, comply with e-discovery requirements. But Microsoft will have to be careful not to alienate third-party archiving vendors such as Symantec and Quest." "Until this version of Exchange, companies seeking to archive their e-mail centrally have had to rely on third-party software. That costly proposition has hurt adoption, and according to Osterman Research, only 28% of companies currently have central e-mail archives."

From this report, Exchange 2010 will also include the ability to view e-mail discussion threads, and a button to ignore those threads. It will also feature speech-to-text transcription of voicemails - something that lawyers have struggled with in advising companies that wanted to implement more convenient services such as unified messaging, where voicemails get sent to your inbox.

Another interesting Exchange 2010 feature for legal departments: "There's also new role-based administration, which means that Exchange administrators can delegate responsibility for some non-IT tasks to non-IT workers. For example, human resources managers could update employee information, the legal department could handle e-discovery and audits, and employees could create their own distribution lists." (emphasis added)

However, don't get overly excited at these new developments, at least not yet. Microsoft has a long history of building in dumbed-down versions of features from competitors' offerings. The mimicked features often haven't had nearly the same range or depth as a competitor's fuller offering. However, in some cases, companies have recognized that the built-in version was "good enough" for their immediate needs and later purchased additional capabilities from other solution providers to fill the gaps as they were identified.

A hat tip to ARMA for their post pointing this out: "Analysts note that Exchange 2010 will not provide such advanced features as content analytics and archiving of multiple content types commonly found in higher-end products geared toward e-discovery." (emphasis added)

Thus a key question will be: What will cash-strapped organizations lacking e-mail archiving systems opt for in their next round of e-mail management purchase decisions? Some might start off with Exchange 2010 to see if it's "good enough", particularly if their eDiscovery needs are relatively light. E-mail archiving vendors may also need to step up their game by offering enhanced value-added tools such as advanced search, deeper and more robust content analytics, and handling of diverse content types, as well as making it easier to identify and export data to other downstream eDiscovery systems for processing, analysis, review, and production. I tend to think that organizations with more diverse, complex, and/or higher-volume discovery tasks will still need additional tools and services beyond Exchange 2010. But it's good to see that Microsoft is recognizing the shifting role that e-mail is playing in organizations' compliance, discovery, and risk management programs and beginning to add more data management features.

Exchange 2010 is coming right around the corner, per InformationWeek: "The company plans to release Exchange Server 2010 in the second half of this year. The rest of Office is due in the first half of 2010, with limited test releases beginning the third quarter of this year. Outlook 2010 will come as part of the rest of the Office suite, though it's unclear when the next version of Outlook Mobile will be available."[...]
April 9, 2009
The ABA Journal ran this article about a juror who tweeted from his cell phone both during and after his jury service in a trial where the jury awarded a $12.6M verdict. Obviously this is cause for concern and consternation for the losing party and their attorneys, but the judge found that it didn't rise to the level of improper conduct. The lesson learned by one of the plaintiff lawyers is that he will ask potential jurors about cell phone and Internet use. The juror's response: "The courts are just going to have to catch up with the technology."
Bob Ambrogi over at Legal Blog Watch posted some of the juror's more inflammatory tweets. Definitely not so sweet.
April 1, 2009
It's all over Twitter and the web - how a Twitterer made a negative Tweet about her new job offer from Cisco. Naturally, someone who identified himself from Cisco saw it and responded. It's now an urban wegend (web legend), dubbed the "Cisco Fatty" incident, in reference to the "fatty paycheck" comment in her Tweet. There are already YouTube videos parodying and discussing it.
Covered in a DailyTech article, the Twitterer, identified as Connor Riley, explained her situation and why she turned down the job and sarcastically tweeted about it to her friends. But she didn't protect the tweet from others seeing it. She even authored a thoughtful blog post to explain, apologize, and add her thoughts on the subject of social media. But really, the damage to her professional and personal reputation is done. Not exactly how one wants to gain their 15 minutes of fame while transitioning from college into the workforce. The Chicago Tribune also ran an article, "'Cisco Fatty' incident provides cautionary tale to those who tweet about work".
The moral of the story: Think before you tweet.
March 30, 2009Consider this post as a public service announcement. I've recently been shopping online for a nice big capacity external hard drive, as well as a larger capacity notebook drive. Over the years, I've seen the major hard drive manufacturers go through major problems with quality control and drive failure issues. So naturally I headed on over to Amazon and Newegg to check out the feedback on various drives. It's good to know which zone they're in at the moment before buying. Since my last 3.5" drive was a Seagate that has performed exceptionally well in one of my desktops, I checked the Seagate drives first. However, after reading about their failure rates in both their external FreeAgent series as well as the internal drive models, I would recommend staying away from them for some time, especially in the 1 - 1.5TB range, and even their previously acclaimed Barracuda series. I also read some negative feedback on their 500GB notebook drives - that users have experienced serious performance issues with audio or video media stuttering while trying to play back from these hard drives. 
I thought I'd share my online findings as a "Buyer Beware" post, based on the following numerous sources: Slashdot: "Seagate Hard Drive Fiasco Grows" (Jan 16, 2009) "Seagate Firmware Update Bricks 500GB Barracudas" (Jan 21, 2009) Newegg User 1-Egg Reviews: Seagate FreeAgent Desk ST315005FDA2E1-RK 1.5TB 7200 RPM Silver External Hard Drive - Retail Seagate FreeAgent XTreme ST315005FPA2E3-RK 1.5TB 7200 RPM USB 2.0 / IEEE 1394a / eSATA Black External Hard Drive - Retail Seagate Barracuda 7200.11 ST31500341AS 1.5TB 7200 RPM 32MB Cache SATA 3.0Gb/s Hard Drive (bare drive) - OEM (This last one is an internal drive, so it seems to illustrate that the problems run across the Seagate 3.5" drive line) Amazon User 1-Star Reviews: Seagate FreeAgent Desk 1 TB USB 2.0 Port External Hard Drive-Silver ST310005FDA2E1-RK Seagate FreeAgent Xtreme 1.5 TB External Hard Drive (Black) Seagate 1TB Barracuda 7200.11 Bulk/OEM Hard Drive ST31000340AS (this is an internal drive) In my book, when the 1-star reviews (the worst rating) constitute the first or second highest category of customer feedback for each drive on multiple sites, this indicates a serious problem, which is backed up by the Slashdot articles and postings above. You see, a number of the 3.5" Seagate drives were/are affected by a firmware issue that makes the drives inaccessible after a very short period of use. While Seagate has issued firmware updates, the feedback from users on their effectiveness is not encouraging at all. In fact, it's downright miserable out there, and I wouldn't be surprised in the least to hear of a class action in Seagate's near future. [3.31.09 - I figured I wasn't the only one, see this law firm's site.] Supposedly the data stored on the drive is still intact, it's just rendered inaccessible. Gee, just what I want to experience with a brand new drive! Others reported the dreaded "click of death" within just days or weeks of use - a sound that usually signals drive failure is imminent. 
So while Seagate's firmware recommendations page states this "affects a small number" of drives, the Slashdot coverage and negative user feedback above would seem to provide more insight into the true scope of the problem(s). Until we hear of users having more success with a firmware update, it's probably best to steer clear of those drives for a while. Even if Seagate releases an effective firmware update, the average purchaser probably won't know which dealer stock has the fix and which doesn't. To have to flash a hard drive right out of[...]
March 21, 2009
My latest InsideCounsel article, "Think Before You In-source" is now available online. While there has certainly been a trend to bring eDiscovery in-house, lately I've been hearing from a number of corporate legal and enterprise IT professionals regarding their frustration in this area. I'm not alone, having heard the same from colleagues at LegalTech NY and elsewhere.
As I shared in the article,
I have recently heard from a number of companies who have been dissatisfied that what they've brought in-house from software providers hasn't lived up to the hype, delivered the best results or integrated with all the necessary data systems to address their needs. Some of those acquisitions are even being shelved or curtailed prematurely, well before realizing their return on investment.
Thus I offer seven key factors and issues to consider before deciding to bring various e-discovery services and technology in-house. In addition, a number of difficulties can often be addressed through better process design, since technology isn't a broad-spectrum panacea. It's a tool to support and automate those processes, not the other way around, and it's important to keep things in the proper perspective:
Keep in mind, this discussion isn't advocating that various aspects of e-Discovery shouldn't be brought in-house. Obviously, many companies are doing just that with the goal of reducing costs, improving consistency, and gaining better control over their processes to improve compliance. Thus a better statement is that the decision on whether to bring eDiscovery tasks in-house shouldn't be made lightly or because you heard another company in your industry has done so. It needs to make sense and fit well with your particular company's abilities, goals, resources, culture, business processes, risk management, and more.
Like most things worth doing, it's important to consider a number of critical factors and issues before jumping on the bandwagon and throwing technology at the problems, some of which aren't even technological issues. The more you have done your homework, including having a good handle on the particular issues, gaps, costs, risks, and processes needing to be addressed, the better off you'll likely be when the smoke clears.
In addition, it's important that companies don't just explore the obvious if they want to make meaningful improvements and cost reductions. There are a number of concurrent or alternate cost-saving measures that can offer significant benefits, which should also be explored lest they be overlooked in all the hype.
March 11, 2009
In his latest LTN column, Ball in Your Court, Craig Ball debunks the long-held hard drive multi-pass erasure myth, which goes like this: "Top notch computer forensic examiners have special tools and techniques enabling them to recover overwritten data from a wiped hard drive so long as the drive was wiped less than 3 or 7 or 35 times." The myth also holds that someone using a magnetic force electron microscope could discern the trace magnetic signal left behind on a drive that wasn't wiped enough times, and somehow piece together the underlying wiped data. This is a leading reason why common file and disk wiping tools have included all kinds of multi-pass wiping options, ranging from the DOD-specified wipes to the massive 35-pass Gutmann wipe. One part of the myth also says that one can recover trace magnetic data from the spaces between the tracks, since the drive heads don't track exactly the same on each pass when writing data. (Think of this as the space between the grooves on a vinyl record, for those of us who fondly remember them.)

To which Craig says, "Nonsense!" and "[i]t's all a lot of hogwash, at least with respect to any drive made this century." He explains how the vastly increased "areal density" of modern hard drives leaves little room for wiped data to be resurrected, even if it's only wiped with a single pass. Areal density simply refers to how closely packed together all the data bits are, which is what allows manufacturers to place hundreds of GB on a single hard drive platter these days.

Like him, I've heard the myth for years and questioned the ability to use a magnetic force electron microscope to resurrect wiped data. For one thing, it would be incredibly expensive to do (though that factor only makes it impracticable). So it was interesting to hear that the results, as Craig related from several professionals who performed such an experiment, were less successful than a simple coin toss.
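The single-pass point is easy to sketch in a few lines of Python. This is purely a toy illustration of the concept - the single_pass_wipe helper, the file name, and the chunk size are my own inventions, and overwriting a file this way is no substitute for a proper whole-device wiping tool (which must also handle remapped sectors, as discussed below):

```python
import os

def single_pass_wipe(path: str, chunk_size: int = 1 << 20) -> None:
    """Overwrite every byte of a file with zeros in a single pass.

    Toy illustration only: real drive wiping must address the whole
    device, not just one file's allocated bytes.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        written = 0
        while written < size:
            n = min(chunk_size, size - written)
            f.write(b"\x00" * n)  # one pass of zeros is the whole job
            written += n
        f.flush()
        os.fsync(f.fileno())  # push the overwrite to stable storage

# Demo: after one pass, none of the original content remains.
with open("demo.bin", "wb") as f:
    f.write(b"privileged client data " * 1000)

single_pass_wipe("demo.bin")

with open("demo.bin", "rb") as f:
    data = f.read()
assert b"privileged" not in data and set(data) == {0}
```

Once every byte has been replaced, there is simply no remnant signal left for software to read back - which is Craig's point about modern drives.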
Thus he concludes: "You only need one complete pass to eviscerate the data (unless your work requires slavish compliance with obsolete parts of Department of Defense Directive 5220.22-M and you make two more passes for good measure). No tool and no technique extant today can recover overwritten data on 21st century hard drives. Nada. Zip. Zilch."

While fascinating from a technical perspective, the real take-away from Craig's article is the reminder that: "The most egregious is the assumption that formatting a hard drive is the same as wiping its contents. In fact, formatting obliterates almost none of a drive's contents. Any eBay purchaser of a formatted drive can easily restore its contents." If only I had a Google share for every time I advised someone about this danger and the resulting risk. If you are disposing of a hard drive or giving it to someone else to use, run a proper drive wiping tool first - a simple format command is not enough.

Another good take-away is Craig's discussion of the "G List" sectors on a hard drive, and why conventional wiping cannot touch that data. So what are those? In essence, modern hard drives can sense when a sector is going bad (i.e., no longer able to store information reliably). When that is detected, the drive automatically copies the contents of the ailing sector to an unused sector and remaps (points) to its new location on the drive. This map is kept in the G List on the drive, which stands for Growth List or Growing Defect List. This is a good thing, so you don't lose data to bad spots on the hard drive. However, when you use wiping software to wipe the drive's data, it can only [...]
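The format-versus-wipe distinction is easy to demonstrate in miniature. The sketch below (my own toy model, not from Craig's article) treats a "drive" as a byte array: a quick format clears only the metadata region at the front, leaving the file's bytes trivially recoverable, while a full single-pass overwrite actually removes them:

```python
# Toy model of a drive: 100 sectors of 512 bytes each.
SECTOR = 512
disk = bytearray(100 * SECTOR)

# "Write a file" somewhere in the data area of the drive.
secret = b"merger term sheet v3"
disk[40 * SECTOR : 40 * SECTOR + len(secret)] = secret

# A quick format zeros only the first few "metadata" sectors,
# much as a real format rewrites file-system structures.
disk[: 4 * SECTOR] = bytes(4 * SECTOR)

# The file's contents are still sitting on the "platter".
assert disk.find(secret) != -1  # recoverable by a simple scan

# A single-pass wipe of the entire device removes them for good.
disk[:] = bytes(len(disk))
assert disk.find(secret) == -1
```

This is exactly why a formatted drive sold on eBay can spill its prior owner's files: the format touched the bookkeeping, not the data.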