Lighthouse Blog
Read the latest insights from industry experts on the rapidly evolving legal and technology landscapes with topics including strategic and technology-driven approaches to eDiscovery, innovation in artificial intelligence and analytics, modern data challenges, and more.
Blog

Making the Case for Information Governance and Why You Should Address it Now
You know that cleaning out the garage is a good idea. You would have more storage space and would even be able to put the car into the garage, which is better for security, for keeping it clean, and for ensuring an easy start on a frozen winter morning. Even if you don't have a garage, you likely have an equivalent example such as a loft or that cupboard in the kitchen, yet somehow these tasks are often put off and rarely top the "to do" list. Information governance often falls in this category: a great idea that struggles to make it to the top ahead of competing corporate priorities.

For both the garage and information governance, the issue is the creation of a compelling business case. For the garage, the arrival of a new car or a spate of car thefts in the area is enough to push this task to the front. For information governance, the business case might be that a company is enlightened enough to realize that its data is an under-utilized asset, or it might be a question of time and effort being wasted in the struggle to find information when needed. However, these positive drivers might not be enough. Sometimes you need to look at the risk if nothing is done.

In our view, building a strong business case for information governance will combine both the carrot and the stick. This blog will focus on the stick because that is often the hardest factor to spell out in clear terms. We will take you on a journey through the GDPR fines that have been levied since the regulation came into force in May 2018, show how European regulators see information governance as an essential element of a company's data protection obligations, and give you the necessary background to prepare your business case.

Why address information governance now?
It is worth just pausing to ensure we are all talking about the same thing, so let's define information governance. You can see Gartner's definition here. For our purposes, we can talk in simpler terms and define information governance as "the people, processes, and technology involved in seeking to ensure the effective and efficient creation, storage, use, retention, and deletion of information."

Now, let's turn to the GDPR. The total of fines under the GDPR, since it came into force in May 2018, approaches €300m. The big fines usually relate to processing personal data without good reason or consent (e.g. Google – €50m), or to inadequate security leading to data breaches (e.g. British Airways – £20m). As a result, many organizations prioritize this type of work.

However, after a thorough trawl, we see a growing body of decisions where fines have been imposed by regulators for information governance failures. In our view, the top five reported "information governance" fines are:

€15m – Deutsche Wohnen (Berlin DPA) – set aside on procedural grounds
€2.25m – Carrefour (France)
€290,000 – HUF (Hungary)
€250,000 – Spartoo (France)
€160,000 – Taxa4x35 (Denmark)

GDPR fines, in detail
The largest fine is the Deutsche Wohnen matter. In 2017, the Berlin Data Protection Authority (DPA) investigated Deutsche Wohnen and found its data protection policies to be inadequate. Specifically, personal data was being stored without a necessary reason and some of it was being retained longer than necessary. In 2019, the DPA conducted a follow-up investigation and found these issues had not been sufficiently remedied, and thus issued a fine of €15m.
The Berlin DPA explained that Deutsche Wohnen could have readily complied by implementing an archiving system that separates data with different retention periods, thereby allowing differentiated deletion periods; such solutions are commercially available. In February 2021, Criminal Chamber 26 of the District Court of Berlin closed the proceedings on the basis that the decision was invalid and not sufficiently substantiated: the Berlin DPA had not specified the acts by the company's management that supposedly led to a violation of the GDPR. The Berlin DPA has announced it would ask the public prosecutor to file an appeal.

It would be a mistake to interpret the nullification of the fine as evidence that information governance and data retention are not important issues for DPAs. Such an interpretation would ignore the fact that there was no criticism of the substance of the Berlin DPA's findings in relation to Deutsche Wohnen's approach to data retention.

Holding data without a necessary purpose, or not actively deleting data, has been a theme of fines by other DPAs as well. In Denmark, the Data Protection Authority recommended fines for similar inadequacies as follows:

1.2m DKK (€160,000) on Taxa4x35. A DPA inspection discovered that although customer names were deleted after 2 years, their telephone numbers remained for 5 (as a key field in the CRM database).
1.1m DKK (€150,000) on Arp-Hansen Hotel Group. Personal data was being stored longer than was necessary and in breach of Arp-Hansen's own retention policies.
1.5m DKK (€200,000) on ID Design. A routine DPA inspection revealed old customer data not being adequately deleted. Although, like Deutsche Wohnen, this fine was subsequently reduced on technical grounds, the commentary on the corporate information governance policies still holds.

In France, three fines have been imposed relating to the holding of customer data well past what the regulators deemed necessary:

In the Carrefour matter, there was a fine of €2.25m for various infringements, including that Carrefour had retained the data of more than 28 million inactive customers, through its customer loyalty programme, for an excessive period.
In SERGIC, there was a fine of €400,000 for various infringements, including that SERGIC had stored the documents of unsuccessful rental candidates beyond the time necessary to achieve the purpose for which the data was collected and processed.
In Spartoo, there was a fine of €250,000 for reasons including that Spartoo retained data for longer than was necessary for more than 3 million customers. The regulators also called out that the company had not set up a retention period for customer and prospect data, did not regularly erase personal data, and retained names and passwords in a non-anonymised form for over 5 years.

Although the authorities in France and Denmark have been the most active, they are not alone. In Hungary, HUF was issued with a fine of approximately €290,000 based on the absence of a retention policy for a database containing personal data. And in Germany, Delivery Hero was fined €195,000 for failing to delete the accounts of former customers who had not been active on the company's delivery service platform for years.

Other authorities may not yet have imposed fines, but their attention is turning in the direction of information governance. A number of DPAs have issued guidance, the scope of which includes data retention (e.g.
the Irish DPA, in Sept 2020, on how long COVID contact details should be retained; the French DPA, in October 2020, on how long union-member files should be retained).

How to get started on your business case
There is a genuine risk for companies that stall on information governance, particularly around personal data. The decisions to date represent a small percentage of the activity in this area, as many violations are dealt with by regulators directly. We don't know what, if any, settlements have been agreed upon, but the decisions that we have located are helpful and instructive for building the business case for prioritizing this work.

The first thing to do is create an internal overview of why this area matters – use the above to show that there is risk and that regulators are paying attention. Hopefully, our overview will help you to identify the size of the stick. As to the carrot, that will be very company-specific, but our clients who have successfully made the case focus on the efficiency gains that can be made if information is properly governed, as well as the opportunity to mine their own information more effectively for its real business value. Next, take a look at your policies and the areas that may require adjustment based on the above in order to gain some insight into the scale of the activity. Now your business case should be taking shape. You might also consider looking wider than the GDPR, such as the increasing number of state data protection frameworks within the US.

We recognize this process is an oversimplification and each step requires a significant time investment by your organization, but spending time focusing on the necessity of retaining personal data, as well as the length of retention (and subsequent deletion), is a critical element in minimizing your risk.
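The regulators' expectation described above reduces to a concrete mechanism: records of different categories carry different retention periods and are deleted (or flagged for review) once those periods lapse. As a purely illustrative, hedged sketch of that idea, and not any vendor's archiving product, the categories and periods below are invented; a real schedule must come from your own legal analysis:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: data category -> maximum retention period.
# Illustrative values only; substitute periods from your own retention policy.
RETENTION_SCHEDULE = {
    "prospect_contact": timedelta(days=365 * 1),   # e.g. 1 year after last contact
    "customer_account": timedelta(days=365 * 3),   # e.g. 3 years after closure
    "invoice_record":   timedelta(days=365 * 10),  # e.g. statutory bookkeeping period
}

def records_due_for_deletion(records, today=None):
    """Return records whose retention period has lapsed.

    Each record is a dict with a 'category' and a 'retention_start' date
    (e.g. date of last activity or account closure).
    """
    today = today or date.today()
    due = []
    for record in records:
        period = RETENTION_SCHEDULE.get(record["category"])
        if period is None:
            # Unknown category: flag for review rather than silently keeping it.
            due.append({**record, "reason": "no retention rule defined"})
        elif record["retention_start"] + period < today:
            due.append({**record, "reason": "retention period lapsed"})
    return due

# Example usage with made-up data
sample = [
    {"id": 1, "category": "prospect_contact", "retention_start": date(2019, 1, 15)},
    {"id": 2, "category": "invoice_record", "retention_start": date(2020, 6, 1)},
]
for r in records_due_for_deletion(sample):
    print(r["id"], r["reason"])
```

The point is not the code itself but the design choice it reflects: deletion decisions keyed to data category and elapsed time, which is essentially what the Berlin DPA said was commercially achievable in the Deutsche Wohnen matter.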
Information Governance
Blog

Analytics and Predictive Coding Technology for Corporate Attorneys: Demystifying the Jargon
Below is a copy of a featured article written by Jennifer Swanton of Medtronic, Shannon Capone Kirk of Ropes & Gray, and John Del Piero of Lighthouse for Legaltech News.

Despite the traditional narrative that lawyers are hesitant to embrace technology, many in-house legal departments and their outside service providers are embracing the use of what is generally referred to as artificial intelligence (AI). In terms of litigation and internal investigations, this translates more specifically into conceptual analytics and predictive coding (also referred to as continuous active learning, or CAL), which are two of the more advanced technological innovations in the litigation space and corporate America.

This adoption, in part, seems to be driven by an expectation from corporate leaders that their in-house counsel must be able to identify and utilize the best available technology in order to drive cost efficiency, while also reducing risk and strengthening effective and defensible litigation positions. For instance, in a 2019 survey of 163 legal professionals conducted by ALM Intelligence and LexisNexis, 92% of attorneys surveyed planned to increase their use of legal analytics in the upcoming 12 months. The reasoning behind that expected increase in adoption was two-fold, with lawyers indicating that it was driven both by competitive pressure to win cases (57%) and by client expectation (56%).

Given that the above survey took place right before the COVID-19 pandemic hit, it stands to reason that the 92% of attorneys who expected to increase their use of analytics tools in 2020 may actually be even higher now. With a divisive election and a receding pandemic only recently behind us, and an already unpredictable market, many corporations are tightening budgets and looking to further reduce unnecessary spend. Conceptual analytics and CAL are easy (yes, really) and effective ways to manage ballooning datasets and significantly reduce discovery, litigation, and internal investigation costs.

With that in mind, we would like to help create a better relationship between corporate attorneys and advanced technology with the following two-step approach, which we will outline in a series of two articles. This first installment will help demystify the language technology providers tend to use around AI and analytics technology so that in-house teams feel more comfortable with adoption. In our second article, we will provide examples of some great use cases where corporate legal teams can easily leverage technology to help improve workflows. Together, we hope this approach can help in-house legal teams adopt technology that drives efficiency, lowers cost, and improves the quality of their work.

Demystifying AI Jargon
If you have ever discussed AI or analytics technology with a technology provider, you are probably more than aware that tech folks have a tendency to forget that the majority of their clients don't live in the world of developing and evaluating new technology, day in and day out. Thus, they may use terms that are often confusing to their legal counterparts (and sometimes use terms that don't match what the technology is capable of in the legal world).
For this reason, it is helpful to level set with some common terminology and definitions, so that in-house attorneys are prepared to have better, more practical real-world discussions with technology providers.

Analytics Technology: Within the eDiscovery and compliance space, analytics technology is the ability of a machine to recognize patterns, structures, concepts, terminology, and/or the people interacting within data, and then present that analysis in a visual representation so that attorneys have a better overview of their data. As with AI, not all analytics tools have the same capabilities. Vendors may label everything from email threading identification to more advanced technology that can identify complex concepts and human sentiment as "analytics" tools. Within these articles, when we reference this term, we are referring to the more advanced technology that can analyze not only the text within data but also the metadata and any previous coding applied by subject matter experts. This is an important distinction because this type of technology can greatly improve the accuracy of the analysis compared to older tools. For example, analytics technology that can analyze metadata as well as text is much better at identifying concepts like attorney-client privilege because it can analyze not only the language being used but who is using that language and the circumstances in which they use it.

Artificial Intelligence (AI): Probably the most broadly recognized term due to its prevalence outside of the eDiscovery space, AI is technically defined as the ability of a computer to complete tasks that usually would require human intelligence. Within the eDiscovery and compliance world, vendors often use the term broadly to refer to a variety of technologies that can perform tasks that previously would have required entirely human review. It is important to remember, though, that the term AI can refer to a broad range of technology with very different capabilities. "AI" in the legal world is currently being used as a generalized term, and legal consumers of such technologies should press for specifics—not all "AI" is the same, or, in several cases, even AI at all.

Machine Learning: Machine learning is a category of algorithms used in AI that can analyze statistics and find patterns in large volumes of data. The algorithms improve with experience—meaning that the more consistently documents are coded by humans, the better and more accurate the algorithms should become at identifying specific data types. Note here that there is a common misunderstanding that machine learning requires large amounts of data from which to learn. That is not necessarily true—all that is required for machine learning to work well is that the input it learns from (i.e., document coding for eDiscovery purposes) is consistent and accurate.

Natural Language Processing (NLP): NLP is a subset of AI that uses machine learning to process and analyze the natural language humans use within large amounts of data. The result is technology that can "understand" the contents of documents, including the context in which language is used within them. Within eDiscovery, NLP is used within more advanced forms of analytics technology to help identify specific content or sentiments within large datasets. For example, NLP can be used to more accurately identify sensitive information, like personally identifiable information (PII), within datasets.
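The next paragraph explains why a pattern-only approach falls short for this task. As a hedged, minimal illustration of that older approach, the sample text and the simplified card-number pattern below are made up for demonstration, this is what a bare pattern match looks like:

```python
import re

# Simplified illustration of a pattern-only ("regex") approach:
# match any 16-digit number beginning with 4, the basic shape of a VISA number.
# Real card-number detection is more involved (separators, Luhn checks, etc.).
CARD_PATTERN = re.compile(r"\b4\d{15}\b")

sample_text = """
Customer card on file: 4111111111111111.
Employee badge/ID number: 4000123456789012 (HR system).
"""

for match in CARD_PATTERN.finditer(sample_text):
    print("flagged:", match.group())

# Both numbers are flagged, although only the first is a card number.
# The pattern alone cannot use the surrounding words ("card on file" vs.
# "Employee badge/ID") to tell them apart; that context is what NLP-based
# approaches are designed to exploit.
```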
NLP is better at this task than older AI technology because older models relied on "regular expressions" (a sequence of characters that defines a search pattern) to identify information. When a "regular expression" (or regex) is used by an algorithm to find, for example, VISA account numbers, it will be able to identify the correct number pattern (i.e., any number that starts with the number 4 and has 16 digits) within a dataset, but it will be unable to differentiate other numbers that have the same pattern (for example, employee identification numbers). Thus, the results returned by legacy technology using regex may be overbroad and include false positives. NLP can return more accurate results for that same task because it is able to identify not only the number pattern but also the language used around the pattern. In this way, NLP will understand the context in which VISA account numbers are communicated within that dataset compared to how employee identification numbers are communicated, and only return the VISA numbers.

Predictive Coding (also referred to as Technology-Assisted Review or TAR): Predictive coding is not the same as conceptual analytics. Also, predictive coding is a bit of a misnomer, as the tools don't predict or code anything; a human reviewer is very much involved. Simply put, it refers to a form of machine learning wherein humans review documents and make binary coding calls: what is responsive and what is non-responsive. This is similar in concept to selecting thumbs up or down in Pandora to teach the app what songs you like and don't like. After some human coding and calibration between the human and the tool, the technology uses the human's coding selections to score how the remaining documents should be coded, enabling the human to review the highest-scored documents first.

In the most current versions of predictive coding, this technology continually improves and refreshes as the human reviews, which reduces or eliminates the need for surgical precision in coding at the start (a concern in earlier versions of predictive coding, and the reason providers and parties spent a considerable amount of time worrying about "seed sets"). This improved, self-improving prioritization of large document sets based on high-scored documents is usually a more efficient and organized way to review documents.

Because of this evolution in predictive coding, it is often referred to in a host of different ways, such as TAR 1.0 (which requires "seed sets" to learn from at the start) and TAR 2.0 (which is able to continually refresh as the human codes, and is thus also referred to as Continuous Active Learning, or CAL). Some providers continue to use the old terminology, or explain their advancements by walking through the differences between TAR 1.0 and TAR 2.0, and so on. But, speaking plainly, in this day and age, providers and legal teams should really only be concerned with the latest version of TAR, which utilizes CAL and significantly reduces or totally eliminates the previous concern with surgical precision in coding an initial "seed set." With our examples in the next installment, we hope to illustrate this point.
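To make the scoring loop described above concrete, here is a deliberately simplified sketch of a CAL-style workflow. It is not any vendor's implementation: it assumes Python with scikit-learn, uses a basic TF-IDF model in place of the far richer features commercial tools rely on, and invents a tiny document set purely for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny invented corpus; real matters involve vastly more documents and
# richer signals (metadata, prior coding, etc.).
docs = [
    "Q3 pricing agreement with distributor, confidential terms attached",
    "Lunch menu for the office party next Friday",
    "Email re: rebate structure and distributor margins",
    "Fantasy football league standings",
    "Draft amendment to the distributor pricing schedule",
    "Reminder: parking garage closed this weekend",
]
true_labels = [1, 0, 1, 0, 1, 0]  # 1 = responsive (stands in for reviewer decisions)

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(docs)

# Start with two documents already coded by a human reviewer.
coded = {0: 1, 1: 0}

# Continuous-active-learning-style loop: retrain, score, review top-scored next.
while len(coded) < len(docs):
    model = LogisticRegression()
    model.fit(X[list(coded)], [coded[i] for i in coded])

    uncoded = [i for i in range(len(docs)) if i not in coded]
    scores = model.predict_proba(X[uncoded])[:, 1]  # likelihood of responsiveness

    # "Review" the highest-scored uncoded document next (the human makes the call).
    next_doc = uncoded[scores.argmax()]
    coded[next_doc] = true_labels[next_doc]
    print(f"reviewed doc {next_doc} (score {scores.max():.2f}) -> {true_labels[next_doc]}")
```

Even this toy loop shows the shape of the workflow: every new human decision feeds the next round of scoring, so the likely-responsive documents surface early without a carefully curated seed set.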
In short, walking through the technological evolution around predictive coding and all of the associated terminology can cause unnecessary intimidation, and can cause confusion between providers, parties, and the court.

The key takeaway from these definitions is that even though all the technology described above may technically fall into the "AI" bucket, there is an important distinction between predictive coding/TAR technology and advanced analytics technology that uses AI and NLP. The distinction is that predictive coding/TAR is a much more technologically limited method of ranking documents based on binary human decisions, while advanced analytics technology is capable of analyzing the context of human language used within documents to accurately identify a wide variety of concepts and sentiments within a dataset. Both tools still require a good amount of interaction with human reviewers, and the two are not mutually exclusive. In fact, on many investigations in particular, it is often very efficient to employ both conceptual analytics and TAR simultaneously in a review.

Please stay tuned for our next installment in this series, "Analytics and Predictive Coding Technology for Corporate Attorneys: Six Use Cases," where we will outline six specific ways that corporate legal teams can put this type of technology to work in the eDiscovery and compliance space to improve costs, outcomes, and efficiency.
AI and Analytics
Blog

Biden Administration Executive Order on Promoting Competition: What Does it Mean and How to Prepare
On July 9, 2021, President Biden signed a sweeping new Executive Order ("the Order") with the stated goal of increasing competition in American markets. Like the recently issued Executive Order on Improving the Nation's Cybersecurity, the Executive Order on Promoting Competition in the American Economy is meant to establish a "whole-of-government" approach to tackle an issue that is typically handled by numerous federal agencies. As such, the Order includes 72 initiatives touching more than a dozen federal agencies and numerous industries, including healthcare, transportation, agriculture, internet service providers, technology, beer and wine manufacturing, and banking and consumer finance.

Notably, the Order calls on the Department of Justice (DOJ) and Federal Trade Commission (FTC) to "vigorously" enforce antitrust laws and "reaffirms" the government's authority to challenge past transactions that may have been in violation of antitrust laws and regulations (even if they were not challenged by previous Administrations). The remainder of this blog will broadly outline the contents of the Order and conclude with a brief summary of possible ramifications for organizations undergoing merger and acquisition activity (as well as the law firms that counsel them) and how to prepare for them.

What is in the Executive Order on Promoting Competition in the American Economy

Section 1: Policy
This section broadly outlines the benefits of "robust competition" to America's economy and asserts the U.S. policy of promoting "competition and innovation" as an answer to the rise of foreign monopolies and cartels. This section also announces the Administration's policy of supporting "aggressive legislative reforms" to lower prescription drug prices and supports the enactment of a public health insurance option.

Section 2: The Statutory Basis of a Whole-of-Government Competition Policy
This section outlines the antitrust laws that form the Administration's whole-of-government competition policy, including the Sherman Act, the Clayton Act, and the Federal Trade Commission Act, as well as fair competition and anti-monopolization laws, including the Packers and Stockyards Act, the Federal Alcohol Administration Act, the Bank Merger Act, and others.

Section 3: Agency Cooperation in Oversight, Investigation, and Remedies
This section outlines the Administration's policy of cooperation between agencies on competition issues, stating that when there is overlapping jurisdiction over anticompetitive conduct and mergers, the involved agencies should "endeavor to cooperate fully in the exercise of their oversight authority" to benefit from the respective expertise of the agencies and to improve Government efficiency.

Section 4: The White House Competition Council
This section establishes a White House Competition Council to "coordinate, promote, and advance" government efforts to address monopolies and unfair competition.
The section also mandates that the Council work across agencies to provide a coordinated response to monopolization and unfair competition, and outlines the Council's makeup and meeting cadence.

Section 5: Further Agency Responsibilities
This section mandates that the heads of all agencies must "consider using their authorities" to further the competition policies outlined within the Order, and "encourages" relevant positions and heads of agencies (including the Attorney General, the Chair of the Federal Trade Commission (FTC), the Secretary of Commerce, and others) to enforce existing antitrust laws "vigorously," as well as to review and consider revisions to other laws and powers, including encouragement to:

Enforce the Clayton Act and other antitrust laws "fairly and vigorously."
Review merger guidelines to consider whether they should be revised.
Revise positions on the intersection of intellectual property and antitrust laws.
Review current practices and adopt a plan for the revitalization of merger oversight under the Bank Merger Act and the Bank Holding Company Act of 1956.
Consider whether to revise the Antitrust Guidance for Human Resource Professionals of October 2016.
Consider curtailing the unfair use of non-compete clauses that may unfairly limit worker mobility.
Consider rulemaking in other areas, such as: unfair data collection and surveillance practices that may damage competition, consumer autonomy, and consumer privacy; unfair anticompetitive restrictions on third-party repair or self-repair of items (aimed at restrictions that prevent farmers from repairing their own equipment); unfair anticompetitive conduct or agreements in the prescription drug industries; unfair competition in major Internet marketplaces; unfair occupational licensing restrictions; unfair exclusionary practices in the brokerage or listing of real estate; and any other unfair industry-specific practices that substantially inhibit competition.

The section also calls upon the Secretary of Agriculture to address the unfair treatment of farmers and improve competition in the markets for farm products, and for the Secretary of the Treasury to assess the conditions of competition in the American markets for beer, wine, and spirits (including improving the market for smaller, independent operations).

Notably, this section also calls for the Chair of the Federal Communications Commission to consider adopting "Net Neutrality" rules and other avenues to promote competition and lower prices across the telecommunications ecosystem.

Finally, the section also calls for the Secretary of Transportation to protect consumers and improve competition in the aviation industry, including enhancing consumer access to airline flight information, providing consumers with more flight options at better prices, promoting rulemaking around requiring airlines to refund baggage fees, and addressing the failure of airlines to provide timely refunds for flight cancellations resulting from the COVID-19 pandemic.

Conclusion
As a whole, the result of this Order will be that organizations undergoing merger and acquisition activity can expect to face more scrutiny from the government – and law firms that provide counsel for those types of transactions can expect that government investigations of those activities (like HSR Second Requests) will be more in-depth and meticulous.
Accordingly, any law firms and organizations preparing for those types of investigations would do well to evaluate their eDiscovery technology now, in order to ensure that they are using the best and most up-to-date legal technology and workflows to help locate the data requested by the government more accurately and efficiently.
Antitrust & Regulatory Strategy
Blog

Cybersecurity Defense: Recommendations for Companies Impacted by the Biden Administration Executive Order
As summarized in the first installment of our two-part blog series, President Biden recently issued a sweeping Executive Order aimed at improving the nation's cybersecurity defense. The Order is a reaction to increased cybersecurity attacks that have severely impacted both the public and private sectors. These recent attacks have evolved to the point that industry solutions have a much more difficult time detecting encryption and file-state changes in a reasonable timeframe to prevent an actual compromise. The consequence is that new and evolving ransomware and malware attacks are now getting past even the biggest solution providers and leading scanners in the industry.

Thus, while on its face many of the new requirements within the Order are aimed at federal agencies and government subcontractors, the ultimate goal appears to be to create a more unified national cybersecurity defense across all sectors. In this installment of our blog series, I will outline recommended steps for private sector organizations to prepare for compliance with the Order, as well as general best-practice tips for adopting a more preemptive approach to cybersecurity.

1. Conduct a Third-Party Assessment
First and foremost, organizations must understand their current cybersecurity posture. Given the severity and volume of recent cyberattacks, third-party in-depth or red-team assessments should be conducted covering not only the organization's IT assets but also its solution providers, vendors, and suppliers. Red teaming is the process of providing a fact-driven adversary perspective as an input to solving or addressing a problem. In the cybersecurity space, it has become a best practice wherein the cyber resilience of an organization is challenged by an adversary's or a threat actor's perspective.[1] Red-team testing is very useful for testing organizational policies, procedures, and reactions against defined, intended standards.

A third-party assessment must include a comprehensive remote network scan and a comprehensive internal scan, with internal access provided or gained, with the intent to detect and expose potential vulnerabilities, exploits, and attack vectors for red-team testing. Internal comprehensive discovery includes scanning and running tools with the intent to detect deeper levels of vulnerabilities and areas of compromise. Physical intrusion tests during red-team testing should be conducted on the facility, networks, and systems to test readiness, defined policies, and procedures.

The assessment will evaluate the ability to preserve the confidentiality, integrity, and availability of the information maintained and used by the organization, and will test the security controls and procedures used to secure sensitive data.

2. Integrate Solution Providers and IT Service Companies into Plans to Address the Executive Order Steps Above
To accurately assess your organization's risk, you first have to know who your vendors, partners, and suppliers are with whom you share critical data. Many organizations rely on a complex and interconnected supply chain to provide solutions or share data. As noted above, this is exactly why the Order will eventually broadly impact the private sector.
While on its face the Order only seems to impact federal government and subcontractor entities, those entities' data infrastructures (like most today) are interconnected environments composed of many different organizations with complex layers of outsourcing partners, diverse distribution routes, and various technologies to provide products and services – all of whom will have to live up to the Order's cybersecurity standards. In short, the federal government is recognizing that its vendors', partners', and suppliers' cybersecurity vulnerabilities are also its own. The sooner all organizations realize this, the better.

According to recent NIST guidance, "Managing cyber supply chain risk requires ensuring the integrity, security, quality, and resilience of the supply chain and its products and services." NIST recommends focusing on foundational practices, enterprise-wide practices, risk management processes, and critical systems: "Cost-effective supply chain risk mitigation requires organizations to identify systems and components that are most vulnerable and will cause the largest organizational impact if compromised."[2]

In the recent attacks, hackers inserted malicious code into Orion software, and around 18,000 SolarWinds customers, including government and corporate entities, installed the tainted update onto their systems. The compromised update has had a sweeping impact, the scale of which keeps growing as new information emerges. Locking down your networks, systems, and data is just the beginning! Inquiring how your supply chain implements a Zero Trust strategy and secures its environment, as well as your shared data, is vitally important. A cyber-weak or compromised company can lead to exfiltration of data, which a bad actor can exploit or use to compromise your organization.

3. Develop a Plan to Address the Most Critical Vulnerabilities and Threats Right Away
Third-party assessors should deliver a comprehensive report of their findings that includes descriptions of the vulnerabilities and risks found in the environment and recommendations to properly secure the data center assets, which will help companies stay ahead of the Order's mandates. The reports typically include specific data obtained from the network, any information regarding exploitation of exposures, and the attempts to gain access to sensitive data.

A superior assessment report will contain documented and detailed findings as a result of performing the service and will convey the assessor's opinion of how best to remedy vulnerabilities. These will be prioritized for immediate action, depending upon the level of risk. Risks are often prioritized as critical, high, medium, and low risk to the environment, and a plan can be developed based upon these prioritizations for remediation.

4. Develop a Zero Trust Strategy
As outlined in Section 3 of the Order, a Zero Trust strategy is critical to addressing the above steps, and it must include establishing policy, training the organization, and assigning accountability for updating the policy. As defined in the National Security Agency (NSA)'s "Guidance on the Zero Trust Security Model": "The Zero Trust model eliminates trust in any one element, node, or service by assuming that a breach is inevitable or has already occurred.
The data-centric security model constantly limits access while also looking for anomalous or malicious activity."[3]

Properly implemented, Zero Trust is not a set of access controls to be "checked," but rather an assessment and implementation of security solutions that provide proper network and hardware segmentation as well as platform micro-segmentation, implemented at all layers of the OSI (Open Systems Interconnection) model. A good position to take is that Zero Trust should be implemented using a design where all of the solutions assume they exist in a hostile environment. The solutions operate as if other layers in a company's protections have been compromised. This allows isolation of the different layers to improve protection by combining Zero Trust principles throughout the environment, from perimeters to VPNs, remote access to web servers, and applications. For a true Zero Trust enabled environment, focus on cybersecurity solution providers that qualify as "Advanced" in the NSA's Zero Trust Maturity Model, as defined in the NSA's cybersecurity paper "Embracing a Zero Trust Security Model."[4] This means that these solution providers will be able to deploy advanced protections and controls with robust analytics and orchestration.

5. Evaluate Solutions that Preemptively Protect Through Defense-in-Depth
In order to further modernize your organization's cybersecurity protection, consider full integration and/or replacement of some existing cybersecurity systems with ones that understand the complete end-to-end threats across the network. How can an organization implement confidentiality and integrity for breach prevention? Leverage automated, preemptive cybersecurity solutions, as they possess the greatest potential for thwarting attacks and rapidly identifying any security breaches, reducing time and cost. Use a Defense-in-Depth blueprint for cybersecurity to establish outer and inner perimeters, enable a Zero Trust environment, establish proper security boundaries, provide confidentiality for proper access into the data center, and support capabilities that prevent data exfiltration inside sensitive networks. Implement a solution that continuously scans for and detects ransomware, malware, and unauthorized encryption without relying on API calls, file extensions, or signatures for data integrity (one illustrative, signature-free check is sketched after the notes at the end of this post). Solutions must have built-in protections leveraging multiple automated defense techniques, deep zero-day intelligence, revolutionary honeypot sensors, and revolutionary state technologies working together to preemptively protect the environment.

Conclusion
As noted above, Cyemptive recommends the above steps in order to take a preemptive, holistic approach to cybersecurity defense. Cyemptive recommends initiating this process as soon as possible – not only to comply with potential government mandates brought about by President Biden's Executive Order, but also to ensure that organizations are better prepared for the increased cybersecurity threat activity we are seeing throughout the private sector.

[1] "Red Teaming for Cybersecurity." ISACA Journal. October 18, 2018. https://www.isaca.org/resources/isaca-journal/issues/2018/volume-5/red-teaming-for-cybersecurity#1
[2] "NIST Cybersecurity & Privacy Program: Cyber Supply Chain Risk Management (C-SCRM)." May 2021. https://csrc.nist.gov/CSRC/media/Projects/cyber-supply-chain-risk-management/documents/C-SCRM_Fact_Sheet_Draft_May_10.pdf
[3] "NSA Issues Guidance on Zero Trust Security Model." NSA. February 25, 2021. https://www.nsa.gov/Press-Room/News-Highlights/Article/Article/2515176/nsa-issues-guidance-on-zero-trust-security-model/
[4] "Embracing a Zero Trust Security Model." NSA Cybersecurity Information. February 2021. https://media.defense.gov/2021/Feb/25/2002588479/-1/-1/0/CSI_EMBRACING_ZT_SECURITY_MODEL_UOO115131-21.PDF
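As a purely illustrative aside (not Cyemptive's method, and not a substitute for a real product): one well-known way to flag possible unauthorized encryption without relying on file extensions or signatures is to watch for files whose contents suddenly become high-entropy, since encrypted data looks statistically close to random. A minimal Python sketch of that idea, with an invented alert threshold, might look like this:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Illustrative threshold: plain text is typically ~4-5 bits/byte, while
# encrypted or well-compressed data approaches 8. Real tools combine this
# signal with many others to avoid false positives (e.g. ZIP or JPEG files).
ENTROPY_ALERT_THRESHOLD = 7.5

def looks_encrypted(path: str) -> bool:
    with open(path, "rb") as f:
        sample = f.read(64 * 1024)  # sample the first 64 KB
    return shannon_entropy(sample) >= ENTROPY_ALERT_THRESHOLD

# Example: flag files in the current directory whose contents appear encrypted.
if __name__ == "__main__":
    import pathlib
    for p in pathlib.Path(".").glob("*"):
        if p.is_file() and looks_encrypted(str(p)):
            print("possible unauthorized encryption:", p)
```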
Data Privacy
Information Governance
Blog

Cybersecurity Defense: Biden Administration Executive Order a Great Start Towards a More Robust National Framework
On May 12, President Biden issued a landmark Executive Order ("the Order") aimed at improving the country's cybersecurity threat defense. This Order is an attempt to create a "whole of government" response to increasingly frequent cybersecurity incidents that have wreaked havoc in the United States in recent months, affecting everything from energy supplies to healthcare systems to IT infrastructure. In addition to becoming more frequent, recent cyberattacks have also become increasingly sophisticated – and even somewhat professional. In response to these attacks, the Biden administration seeks to build a national security framework that aligns the Federal government with private sector businesses in order to "modernize our cyber defenses and enhance the nation's ability to quickly and effectively respond to significant cybersecurity incidents."

Prior to this Order, there was no unified system to report or respond to cybersecurity threats and breach incidents. Instead, there is currently a patchwork of state legislation and separate federal agency protocols, all with differing reporting, notification, and response requirements.

In the first of this two-part blog series, I will broadly outline the details of this Order and what it will mean for private sector companies in the coming years. In the second installment, Rob Pike (CEO and Founder of Cyemptive Technologies) will provide guidance on how to set up your organization for compliance with the Order, as well as general best-practice tips for adopting a preemptive cybersecurity approach.

What is in President Biden's Executive Order on Improving the Nation's Cybersecurity
There are nine main sections to the Order, which are summarized below.

Section 1: Policy
This section outlines the overall goal of the Order – namely that, with this Order, the Federal government is intent on making "bold changes and significant investments in order to defend the vital institutions that underpin the American way of life." To do so, the Order states that the government must improve its efforts to "identify, deter, protect against, detect, and respond to" cybersecurity attacks. While this may sound like a purely governmental task, the Order specifically states that this defense will require partnership with the private sector.

Section 2: Removing Barriers to Sharing Threat Information
As noted above, prior to this Order, there was no unified system for sharing information regarding threats and data breaches. In fact, separate agency procurement contract terms may actually prevent private companies from sharing that type of information with federal agencies, including the FBI. This section of the Order responds to those challenges by requiring the government to update federal contract language with IT service providers (including cloud service providers) to require the collection and sharing of threat information with the appropriate government agencies. While the Order currently only speaks to federal subcontractors, it is expected that this information-sharing requirement will have a trickle-down effect across the private sector, with purely private companies falling in line to share threat information once federal subcontractors are required to do so.
Section 3: Modernizing Federal Government Cybersecurity
This section calls for the federal government to adopt security best practices – it is specifically aimed at adopting Zero Trust Architecture and pushing a move to secure cloud services, including "Software as a Service (SaaS), Infrastructure as a Service (IaaS), and Platform as a Service (PaaS)." It requires each government agency to update plans to prioritize the adoption and use of cloud technology and to develop a plan to implement Zero Trust Architecture, in part by incorporating the migration steps outlined by the National Institute of Standards and Technology (NIST).

Section 4: Enhancing Software Supply Chain Security
This section deals with increasing the cybersecurity standards of software sold to the government. It specifically calls out the fact that the development of commercial software "often lacks transparency, sufficient focus on the ability of the software to resist attack, and adequate controls to prevent tampering by malicious actors." It therefore calls for "more rigorous and predictable mechanisms for ensuring that products function securely." Thus, this section calls for NIST to issue new security guidelines for software used by the government. These new guidelines will include encryption requirements, multi-factor and risk-based authentication requirements, vulnerability detection and disclosure programs, and trust relationship audits, among others.

Section 5: Establishing a Cyber Safety Review Board
This section establishes a federal Cyber Safety Review Board, which will convene following significant cyber incidents and provide recommendations to the Secretary of Homeland Security for improving cybersecurity and incident response practices. It will be made up of federal officials as well as representatives from private sector entities.

Section 6: Standardizing the Federal Government's Playbook for Responding to Cybersecurity Vulnerabilities and Incidents
This section again speaks to the patchwork of differing vulnerability and incident response procedures that currently exists across multiple federal agencies. The goal here is to create a standard set of operational procedures (or playbook) for cybersecurity vulnerability and incident response activity. The playbook will have to incorporate all appropriate NIST standards, be used by all Federal Civilian Executive Branch (FCEB) agencies, and spell out all phases of incident response.

Sections 7 and 8: Improving Detection, Investigation, and Remediation of Cybersecurity Vulnerabilities and Incidents on Federal Government Networks
These two sections focus on creating a unified approach to the detection, investigation, and remediation of cybersecurity vulnerabilities and incidents. Section 7 focuses on improving detection – mandating that all FCEB agencies deploy an "Endpoint Detection and Response (EDR)" initiative to support proactive detection of cybersecurity incidents, and establishing a procedure for the implementation of threat hunting and detection, as well as inter-agency information sharing around threat detection.
Section 8 is focused on improving the government's investigative and remediation capabilities – namely, by establishing requirements for agencies and their IT service providers to collect, maintain, and share specified information from Federal Information System network logs.

Section 9: National Security Systems
This section requires the Secretary of Defense to adopt National Security System requirements that are at least equivalent to the requirements spelled out in the sections of the Order above.

Who Will This Impact?
As noted above, while the Executive Order is aimed at shoring up the federal government's cybersecurity detection and response systems, its impacts will be felt throughout much of the private sector. That isn't a bad thing! A patchwork cybersecurity system is clearly not the best way to respond to the increasingly sophisticated cybersecurity incidents currently threatening both the United States government and the private sector. Responding to these threats requires a robust, unified national cybersecurity system, which in turn requires updated and unified cybersecurity standards across both government agencies and private sector companies. This Executive Order is a great stepping stone towards that goal.

As for the timing of private sector impacts: the first impacts will be felt by software companies and other organizations that directly contract with the federal government, as there are direct requirements and implications for those entities spelled out within the Order. Many of those requirements come into play within 60 days to a year after the date of the Order, so there may be a quick turnaround to comply with any new standards for those organizations. Impacts are then expected to trickle down to other private sector organizations: as government subcontractors update policies and systems to comply with the Order, they will in turn require the companies that they do business with to comply with the new cybersecurity standards. In this way, the Order actually creates an opportunity for the federal government to set a cybersecurity floor that most companies in the US will eventually have to meet.

Conclusion
Detecting and defending against cybersecurity threats is an increasingly difficult worldwide challenge – a challenge to which, currently, no perfect defense exists. However, with this Order, the United States is taking a step in the right direction by creating a more unified cybersecurity standard and network that will encourage better detection, investigation, and mitigation.

Check out the second installment of this blog series, where Rob Pike, CEO and Founder of Cyemptive Technologies, provides guidance on how to set up your organization for compliance with the Executive Order, as well as general best-practice tips for adopting a preemptive cybersecurity approach. If you would like to discuss this topic further, please reach out to me at erubenstein@lighthouseglobal.com.
Data Privacy
Information Governance
Blog

How to Get Started with TAR in eDiscovery
In a recent post, we discussed that requesting parties often demand more transparency with a Technology Assisted Review (TAR) process than they do with a process involving keyword search and manual review. So, how do you get started using (and understanding) TAR without having to defend it? A fairly simple approach: start with some use cases that don't require you to defend your use of TAR to outside parties.

Getting Comfortable with the TAR Workflow
It's difficult to use TAR for the first time in a case for which you have production deadlines and demands from requesting parties. One way to become comfortable with the TAR workflow is to conduct it on a case you've already completed, using the same document set with which you worked in that prior case. Doing so can accomplish two goals:

You develop a better understanding of how the TAR algorithm learns to identify potentially responsive documents: Based on documents that you classify as responsive (or non-responsive), you will see the algorithm begin to rank other documents in the collection as likely to be responsive as well. Assuming your review team was accurate in classifying responsive documents manually, you will see how those same documents are identified as likely to be responsive by the algorithm, which engenders confidence in the algorithm's ability to accurately classify documents.

You learn how the TAR algorithm may identify potentially responsive documents that were missed by the review team: Human reviewers are only human, and they sometimes misclassify documents. In fact, many studies would say they misclassify them regularly. Assuming that the TAR algorithm is properly trained, it will often classify documents (both responsive and non-responsive) more accurately than the human reviewers, enabling you to learn how the TAR algorithm can catch mistakes that your human reviewers have made.

Other Use Cases for TAR
Even if you don't have the time to use TAR on a case you've already completed, you can use TAR for other use cases that don't require a level of transparency with opposing counsel, such as:

Internal Investigations: When an internal investigation dictates review of a document set that is conducive to using TAR, this is a terrific opportunity to conduct and refine your TAR process without outside review or transparency requirements to uphold.

Review Data Produced to You: Turnabout is fair play, right? There is no reason you can't use TAR to save costs when reviewing the documents produced to you while determining whether the producing party engaged in a document dump.

Prioritizing Your Document Set for Review: Even if you plan to review the entire set of potentially responsive documents, using TAR can help you prioritize the set for review, pushing documents less likely to be responsive to the end of the queue. This can be useful in rolling production scenarios, or if you think that an eventual settlement could obviate the need to review the entire collection.

Combining TAR technology with efficient workflows that maximize the effectiveness of the technology takes time and expertise. Working with experts who understand how to get the most out of the TAR algorithm is important. But it can still be daunting to use TAR for the first time in a case where you must meet a stringent level of defensibility and transparency with opposing counsel.
Applying TAR first to use cases where that level of transparency is not required enables your company to get to that efficient and effective workflow—before you have to prove its efficacy to an outside party.
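For teams that try the "completed case" exercise described above, one simple way to quantify what you see is to compare the algorithm's ranking against the review team's original calls, for example by checking how many of the previously coded responsive documents land in the top-ranked slice. The sketch below is illustrative only (Python, with invented scores and labels), not a defensibility protocol:

```python
def recall_at_cutoff(scores, human_labels, review_fraction=0.2):
    """Fraction of documents humans coded responsive that fall in the
    top `review_fraction` of documents as ranked by the model."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    cutoff = max(1, int(len(ranked) * review_fraction))
    top = set(ranked[:cutoff])
    responsive = [i for i, label in enumerate(human_labels) if label == 1]
    if not responsive:
        return 0.0
    return sum(1 for i in responsive if i in top) / len(responsive)

# Invented example: model scores for 10 documents from a completed case,
# and the original reviewers' responsiveness calls (1 = responsive).
model_scores = [0.91, 0.12, 0.88, 0.07, 0.45, 0.95, 0.30, 0.81, 0.22, 0.15]
reviewer_calls = [1, 0, 1, 0, 0, 1, 0, 1, 0, 0]

print(f"recall in top 20%: {recall_at_cutoff(model_scores, reviewer_calls):.0%}")
```

Watching that number climb as more documents are coded is exactly the kind of intuition-building these low-stakes use cases are meant to provide.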
eDiscovery and Review
AI and Analytics
Blog

Productizing Your Corporate Legal Department’s Services: Making Build vs. Buy vs. Outsourcing Decisions
For years, general counsel have weighed the pros and cons of doing a task internally versus sending the work to outside counsel – this is not a new dichotomy. What is newer, however, is the proliferation of technology available for legal and the business savvy now being applied to internal legal departments. This has opened up more choices for legal departments. First, you have to figure out whether you can apply technology, then whether you should build or buy that technology, and finally whether you should outsource any portion of the process.

Before you start down the path of buy vs. build vs. outsource, I would recommend assessing your department's offerings. In the earlier parts of this series, I outline how you can do that. Once you understand your services and your gaps, you can better determine where you may need to apply build vs. buy decisions. Whether you are a general counsel or a legal operations professional, this blog will outline four key aspects to include in your framework as you make these decisions.

1. Problem/Solution List
Start with a list of services your company needs and possible solutions. If you followed the productization process, you will have a good list. If you have not yet done this, you can at least jot down a list of your company's legal needs, how pervasive and urgent they are, whether they further the company strategy, and any potential solutions. Next, order that list from most pervasive to least pervasive. Where there is a tie, look to the problem's relationship to company strategy.

Next, work through all of the items in box A. You want to be able to answer the following questions:

Is there an existing solution?
Is there a software solution that may apply?
What are the costs/benefits of all possible solutions?
Is there typically urgency around the request?
All other things being equal, do we have the expertise to handle this in house?

If you have gaps in A, B, or C, I would recommend addressing those before process improvement items.

2. Cost-Benefit Analysis
Next, for any change (either addressing a gap or a process improvement) you should do a cost-benefit/return-on-investment analysis. Note that if you are just trying to get a sense of which problem on your list to address, you can do a high-level analysis by categorizing the solutions into low, medium, or high financial impact. If, however, you are getting to the point of suggesting a change internally and asking for budget, you want to do a much more in-depth quantitative analysis. On the benefit side, you want to consider any revenue acceleration for the company (e.g., customers' revenue hits a quarter earlier) as well as costs reduced and avoided (e.g., outside counsel fees). If there are other quantifiable benefits, you should include them as well. On the expense side, make sure to consider licensing, annual maintenance, user fees, implementation, infrastructure, training, hourly support/expert charges, and any ongoing costs. You should project these benefits and costs over the next three years, as that is a common period over which to see whether there is a return on your investment. You can also prepare a version of this document showing the same cost/benefit of building the solution internally as well as outsourcing it to outside counsel.

3. Additional Factors: Urgency and Expertise
Once you have the cost-benefit analysis for the various solutions, you usually have a preferred direction. However, don't forget to account for time and expertise. You should then consider how urgent the requests are.
The more urgent a request, the more likely it should be handled by technology or outsourced, as those solutions typically can bring more resources to bear. You should then consider expertise. More specifically, does one need specific knowledge about the company to solve this problem, or will there be a lot of need to liaise internally? If so, the solution should likely stay with the internal corporate legal department. Conversely, does this require niche expertise, and is it better handled by outside counsel with that expertise? Make notes of these considerations alongside your cost-benefit analysis, as these factors can sway a decision in one direction or another.

4. Decision Time
Ultimately, making these decisions is more of an art than a science. They are also decisions that can and should be revisited as things change in your business and legal department. The above should give you the right information to make an informed decision. You will want to share your decision with others and get input before finalizing a direction.

By following the productization process, orienting your solutions towards your customers, streamlining how you deliver services, and applying the right sets of resources through build-versus-buy decisions, your legal department will operate more efficiently.
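As a rough, illustrative companion to the cost-benefit step above (step 2), the arithmetic can be as simple as netting projected benefits against projected costs over three years; every figure below is invented for demonstration and should be replaced with your own estimates.

```python
# Hypothetical three-year cost-benefit comparison for one solution option.
# All figures are made up; substitute your own estimates.
benefits = {
    "revenue_acceleration": [50_000, 75_000, 75_000],      # per year
    "outside_counsel_avoided": [120_000, 130_000, 140_000],
}
costs = {
    "licensing_and_maintenance": [60_000, 60_000, 60_000],
    "implementation_and_training": [40_000, 5_000, 5_000],
}

total_benefit = sum(sum(v) for v in benefits.values())
total_cost = sum(sum(v) for v in costs.values())
net = total_benefit - total_cost
roi = net / total_cost

print(f"3-year benefit: {total_benefit:,}  cost: {total_cost:,}")
print(f"net: {net:,}  simple ROI: {roi:.0%}")
```

Repeating the same table for the build, buy, and outsource options, as suggested above, is usually enough to make the preferred direction apparent before the softer factors (urgency, expertise) are layered on.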
Legal Operations
Blog

Productizing Your Corporate Legal Department’s Services: Internally Marketing Your Solutions
In my last two blogs, I discussed how your legal department can productize services to become more efficient and shared some tips for determining the legal needs within your organization. Now that you know the added benefits and understand the legal needs, the natural next step is to determine what legal service “products” to offer, as well as any gaps. However, if nobody knows what these repeatable solutions are, what good are they? This is where creating an internal marketing plan to get the word out about your department’s legal services is critically important. In this blog, we’ll talk about how to do that by answering who, what, when, where, and why.

Who?

When you create your internal plan, the first thing you need to do is understand who you are marketing to. The easiest way to do this is to create some simple “personas.” You can easily do this based on the interviews you conducted as part of your earlier research. You should build a persona for each distinct type of user coming to you – typically this aligns with internal departments. In detailing each persona, you should include the following:

Typical day-to-day work of your persona
Typical interaction with legal
Top-of-mind issues/challenges
Other notes

What?

Next, you will need to decide what you are going to market to these personas (i.e., repeatable workflows). Common ones in the legal arena are contract, litigation, HR investigation, and patent workflows. Once you have identified the workflows applicable to your company, detail the features of each workflow. For example: it is automated; has six common template documents, a clause library, and contract status; and leverages existing company technology.

Once you have your personas, workflows, and features, you’re ready to create a positioning document. You should create one document for every problem/solution set (i.e., workflow). This will form the basis of how you share the information with others. The goal of this document is to position your solution in a way that resonates with the internal users. Below is a format that I find helpful to follow, with an example based on a contract workflow.

PROBLEM: There is a problem in the company today. Contract negotiations are long, cumbersome, and not transparent. This can delay revenue opportunities. In addition, final contracts are difficult to locate and manage.

SOLUTION: The ideal solution to this problem is an easy-to-use process, with some contracts being able to avoid legal review. The solution would allow easy access to status for interested parties and would allow those, or other, interested parties to access the contractual information at a later date.

PRIMARY MESSAGE (SHORT – 1 SENTENCE): The Corporate Legal Department delivers a business-driven model for negotiating and managing contracts that accelerates, not hinders, company growth.

SERVICE DESCRIPTION (2-3 SENTENCES): By leveraging an intake form, employees are directed to a self-service portal for template contracts or put in touch with an attorney for more complex matters. The status of their request, as well as information about all finalized contracts, is displayed in our JIRA system, giving users full access to contract status as well as important contractual data of finalized contracts.

HIGHLIGHTS (THESE SHOULD BE PROBLEM-ORIENTED FEATURES):

Reduces contract turnaround by leveraging templated contracts and clauses
Allows users access to contract status anytime, anywhere
No new systems (i.e., leverages existing company tools)
Etc.
The above will create a lot of different worksheets and information. Since I like to keep things a little simpler, I also create a “cliff notes” version to show the all-up view of your corporate legal department’s services.

Once you have completed your positioning, don’t be afraid to run the messaging by some of the people you interviewed. You want to make sure that it is clear how legal will be helping them get their work done. I would suggest selecting people who are friendly to your department and with whom you have a good working relationship, since you are running draft information by them and not a final product.

Where, When, and Why?

Third, you need to think about where, when, and why you are getting the message out. The goal is to get it out wherever your users are, often, and in a way that they like to consume the information. At a minimum, I would suggest doing a launch of the updated services and including information about that launch on:

The company wiki page/internal site
Any internal ticketing tool
A company newsletter (or a company meeting if appropriate)
Any onboarding materials/presentations your company does for new hires
Or even a “roadshow,” where you present to each department within your organization what services the legal team offers

During any presentation, it is always helpful to inject some fun. I have heard of some legal departments doing humorous videos or skits to capture the attention of their employees. Partner with your internal marketing team, as they may have some great suggestions on how you can get the word out.

Finally, don’t forget about post-launch messaging. Though you may see an uptick in users after a launch, some people will have missed the information the first time around or will have forgotten it by the time they get to an issue that they want to bring to legal. To that end, make sure you have a plan for continued marketing. I like to showcase successes in follow-up marketing (e.g., a contract turnaround case study showing the reduced times or some metrics on impact). This information can be shared in an employee newsletter or as a quick email to leaders asking them to share it in their department meetings.

This is quite a robust process and you should expect it will take several weeks, or even months, to complete. You will also likely continue to refine this marketing plan as you address gaps by adding services and gathering feedback. The benefit of going through this process is that it brings clarity to what legal does, brings efficiency by advertising repeatable workflows, and gives everyone in legal visibility into the challenges in the business and how legal addresses those.
Legal Operations
Blog

An Introduction to Managing Microsoft 365 Updates that Present Legal and Compliance Considerations
Increasingly, opportunities for cloud-based collaboration and efficiencies, and challenges presented by the rapid proliferation of complex data, are incentivizing organizations to transform their corporate data governance and eDiscovery operations from traditional self-managed infrastructure to the Microsoft 365 (M365) Cloud. Benefits in terms of convenience, security, robust functionality, and native capabilities related to eDiscovery and compliance are the primary drivers of this move.

While there are many benefits to moving into the M365 ecosystem, it requires legal and compliance teams to take on new considerations regarding the constant evolution that characterizes cloud software. With continually changing applications, establishing static workflows for eDiscovery, legal holds, data dispositions, and other legal operations is not enough. As the M365 software and functionality changes, workflows must be constantly evaluated to ensure their validity, relevance, and defensibility.

Exacerbating this challenge is the reality that the traditional IT change management paradigm designed to preemptively address cross-organizational considerations (including impacts to legal, compliance, and eDiscovery operations) does not fit the Cloud/SaaS framework. Organizations must now rethink their change management approach as they modernize with M365.

This is the first in a series of blog posts devoted to highlighting key changes that have been released into the M365 production environments. One of the biggest challenges for organizations is identifying which of the myriad of updates pose potential risks to eDiscovery operations. Distinguishing the changes that do and do not pose a significant eDiscovery impact can be extremely difficult unless the reviewer has some level of subject-matter expertise and understands the specific workflows deployed within the organization. Here are some common scenarios with potential eDiscovery impact that could easily go unnoticed by the untrained eye:

Updates that create a new data source
Updates that change a backend data storage location
Updates altering the risk profile of features that were previously disabled due to legal / privacy risk
Updates that render an existing eDiscovery process obsolete

Each subsequent blog post in this series will highlight an example of a software update related to our key software scenarios, detailing the nature of the change, the potential impact, as well as when and why organizations should care.
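Until those posts arrive, one lightweight way to approach the triage problem described above is to screen incoming update announcements for language associated with each scenario and route any matches to a subject-matter expert for review. The sketch below is purely illustrative: the scenario labels, keyword lists, and the idea of scoring release-note text this way are our assumptions, not a description of Microsoft or Lighthouse tooling.

# Hypothetical triage of M365 release-note text against eDiscovery-sensitive scenarios.
# Scenario labels and keyword lists are illustrative assumptions, not an official taxonomy.

SCENARIO_KEYWORDS = {
    "new data source":         ["new app", "now stores", "new mailbox", "transcripts"],
    "storage location change": ["moved to", "now saved in", "storage location", "migrated"],
    "risk profile change":     ["enabled by default", "on by default", "external sharing"],
    "workflow obsolescence":   ["deprecated", "retired", "no longer supported", "replaces"],
}

def flag_update(note_text: str) -> list[str]:
    """Return the scenario labels whose keywords appear in a release note."""
    text = note_text.lower()
    return [label for label, words in SCENARIO_KEYWORDS.items()
            if any(word in text for word in words)]

note = "Meeting transcripts are now saved in a new location; the legacy export tool is deprecated."
print(flag_update(note))  # ['new data source', 'storage location change', 'workflow obsolescence']

A flagged note is only a prompt for expert review of the affected workflow, not a determination of impact.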
Microsoft 365
Chat and Collaboration Data
Information Governance
Blog

Productizing Your Corporate Legal Department’s Services: Understanding the Needs of the Business
Many law departments are reactionary. Someone comes to legal with a “legal” question and they help that person. Although this makes a lot of sense, as legal is a support department, it makes it very difficult to thematically explain the value legal is driving as well as understand the work the department is doing. As legal operations matures and legal departments look to be more efficient, productizing the services in the department is a natural progression. This approach was a central discussion at the 2021 CLOC conference and is the subject of this blog series. In order to productize something effectively, however, you need a very good understanding of your customers’ and prospective customers’ needs. In this article, I will give you an overview of how to get that.

A central theme in product management is building resonators – products that resonate with the buyers. You may have the best idea but, if it doesn’t meet a pervasive market need, nobody will buy it. There are many great examples of products that failed and dozens of lessons we can learn from those failures. Most of the lessons come back to misunderstanding the customer’s need and the nature of that need. For example, people may say they want a better mousetrap, but if you don’t ask how much they would pay for that mousetrap, whether they would replace any current mousetraps with a better one, and whether it matters if the new mousetrap gives off an odor of chemicals, you can see how you might not make a best seller.

To give an example in the legal services space: in my first general counsel role, I heard from many people how frustrating it was that they could never find contracts when they needed them. I immediately set upon a mission to create a contracts database. After investing a lot of time, we had a wonderfully organized database, and the only person who ever used it was the legal team. So what happened to all the frustrated employees from other departments? It turns out I didn’t ask them how often they needed to look up contracts and whether that need was part of another legal request (meaning that legal was the one actually looking up the contract anyway). In the end, the contract database was extremely helpful for the legal department, but I could have saved myself the time of making it self-service and figuring out permissions for different users had I asked some questions upfront.

To avoid the same fate, there are four principles you can use when asking your company about its legal needs.

1. Don’t rely on the users to define the needs.

Instead, be curious about their day-to-day and, in that curiosity, you will be able to see the legal needs. The theory is this: if you ask someone what they need from legal, they will overlay their belief system about what legal should provide before they answer. Instead, when you ask them about their role, their goals, how they are measured, and what their biggest challenges are, you are more likely to be able to understand them and see where legal may be able to help.

2. Create a template interview form and use it religiously with each person.

When you do 10-15 interviews, you want to be able to discern themes and compare interviews. When multiple people are conducting interviews, you want to be sure you are all hitting the same topics. This is much easier to do when you start from a template. For a 30-minute interview, I would suggest 3-5 template questions. Always get background information before the interview starts, including their name, title, department, and contact information.
Put this information at the top of your interview summary; do not include it in your 3-5 questions. Having this information clearly labeled and available allows you to easily follow up later. Next, move on to background and devote 2-3 questions to this area, such as their main goals for the year, how their department is measured, and their biggest pain points. Finally, go on to any specific areas you may want to ask about. For example, you may want to know how they have used the legal department in the past, how much they interact with overseas colleagues, etc. Here is a list of common questions:

What are your department’s goals for the year?
How is your department measured?
What are your biggest roadblocks in achieving your goals?
What are your biggest roadblocks in getting your job done?
If you had a magic wand and could change one thing about your job, what would it be?
What are your most common needs outside your department?
What is your perception of what the legal department does?
What kinds of things have you come to legal for?

3. Interview a diverse group.

It may seem obvious that you need a good sample size; however, you will be surprised at how varied the needs are at different levels and across different departments. If you are only interviewing one person to represent a specific level or department, you should ask them, “How representative do you think your pain points/goals are of the department?” This will give you a good idea of whether you can rely on this person’s interview as representative of the department or whether you will have to do some follow-up interviews with others.

4. Always ask follow-up questions.

The guidance above for limiting your template to 3-5 questions ensures you have time to follow up on each response. More specifically, you want to be sure you are really understanding the responses and quantifying the level and frequency of any relevant pain points. I would set a goal of asking two follow-up questions for every first response. For example, if your first question is “What are your goals for 2021?” then you should expect to ask two follow-up questions after your interviewee responds. If at any point the person you are interviewing mentions a challenge that you think legal can help to solve, this is your cue to follow up on the pain and pervasiveness. Here are some questions you can ask to get at how big a problem they are facing:

How often do you run into this roadblock: daily, weekly, monthly, quarterly?
When you run into this roadblock, how much time do you spend resolving it: 1-2 hours, 2-5 hours, 5-10 hours, 10+ hours?
Does this roadblock impact multiple people? If so, how many?
Does this roadblock (or a stoppage in your progress towards your goals) impact other departments?
Are there workarounds for this roadblock? If so, how cumbersome are they on a scale of 1-5?
If you had to reach out to another department and work with someone to remove this roadblock each time it came up, would you do that or would you continue with the workaround?
How long would you wait for an outside resource to help before you proceed with your current workaround?
Does the challenge have an impact on revenue?

Whether you are a general counsel just getting to know your organization, a legal operations professional tasked with making your department more efficient, or a lawyer who is interested in ensuring you are providing great services, the above should give you a good place to start to understand your customer.
Once you understand your customer, you can provide services that truly resonate and position your existing solutions effectively.
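For teams that want to keep the interview data comparable across 10-15 conversations, a structured record can make it easier to spot themes. The sketch below is one possible shape for that record; the field names, the 1-5 workaround scale, and the theme-counting helper are hypothetical choices, not a prescribed format.

# A minimal sketch of a reusable interview record so themes can be compared across interviews.
# Field names and scales mirror the example questions above but are otherwise arbitrary choices.
from dataclasses import dataclass, field

@dataclass
class Roadblock:
    description: str
    frequency: str          # "daily", "weekly", "monthly", or "quarterly"
    hours_to_resolve: str   # "1-2", "2-5", "5-10", or "10+"
    workaround_burden: int  # 1 (easy workaround) to 5 (very cumbersome)
    revenue_impact: bool

@dataclass
class Interview:
    name: str
    title: str
    department: str
    goals: list[str] = field(default_factory=list)
    measured_by: str = ""
    roadblocks: list[Roadblock] = field(default_factory=list)

def pervasive_themes(interviews: list[Interview]) -> dict[str, int]:
    """Count distinct departments mentioning each roadblock, to surface pervasive needs."""
    depts: dict[str, set[str]] = {}
    for iv in interviews:
        for rb in iv.roadblocks:
            depts.setdefault(rb.description, set()).add(iv.department)
    return {desc: len(d) for desc, d in depts.items()}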
Legal Operations
Blog

New Rules, New Tools: AI and Compliance
We live in the era of Big Data. The exponential pace of technological development continues to generate immense amounts of digital information that can be analyzed, sorted, and utilized in previously impossible ways. In this world of artificial intelligence (AI), machine learning, and other advanced technologies, questions of privacy, government regulations, and compliance have taken on a new prominence across industries of all kinds.

With this in mind, H5 recently convened a panel of experts to discuss the latest compliance challenges that organizations are facing today, as well as ways that AI can be used to address those challenges. Some key topics covered in the discussion included:

Understanding use cases involving technical approaches to data classification
Exploring emerging data classification methods and approaches
Setting expectations within your organization for the deployment of AI technology
Keeping an AI solution compliant
Avoiding the introduction of bias into your AI models

The panel included Timia Moore, strategic risk assessment manager for Wells Fargo; Kimberly Pack, associate general counsel of compliance for Anheuser-Busch; Alex Lakatos, partner at Mayer Brown; and Eric Pender, engagement manager at H5. The conversation was moderated by Doug Austin, editor of the eDiscovery Today blog.

Compliance Challenges Organizations Are Facing Today

The rapidly evolving regulatory landscape, vastly increased data volumes and sources, and stringent new privacy laws present unique new challenges to today’s businesses. Whereas in the recent past it may have seemed like regulatory bodies were often in a defensive position, forced to play catch-up as powerful new technologies took the field, these agencies are increasingly using their own tech to go on the offensive.

This is particularly true in the banking industry and broader financial sector. “With the advent of fintech and technology like AI, regulators are moving from this reactive mode into a more proactive mode,” said Timia Moore, strategic risk assessment manager for Wells Fargo. But the trend is not limited to banking and finance. “It’s not industry specific,” she said. “I think regulators are really looking to be more proactive and figure out how to identify and assess issues, because ultimately they’re concerned about the consumer, which all of our companies are and should be as well.”

Indeed, growing demand by consumers for increased privacy and better protection of their personal data is a key driver of new regulations around the world, including the General Data Protection Regulation (GDPR) in the European Union and the California Consumer Privacy Act (CCPA) and various similar laws in the United States. It’s also one of the biggest compliance challenges facing organizations today, as cyber attacks are now faster, more aggressive, and more sophisticated than ever before.

Other challenges highlighted by the panel included:

Siloed departments that limit communications and visibility within organizations
A dearth of subject matter expertise
The possibility of simultaneous AI requests from multiple regulatory agencies
A more remote and dispersed workforce due to the pandemic

Use Cases for AI and Compliance

In order to meet these challenges head on, companies are increasingly turning to AI to help them comply with new regulations. Some companies are partnering with technology specialists to meet their AI needs, while some are building their own systems. Anheuser-Busch is one such company that is using an AI system to meet compliance standards.
As Kimberly Pack, associate general counsel of compliance for Anheuser-Busch, described it: “One of the things that we’re super proud of is our proprietary AI data analyst system BrewRight. We use that data for Foreign Corrupt Practices Act compliance. We use it for investigations management. We use it for alcohol beverage law compliance.”

She also pointed out that the BrewRight AI system is useful for discovering internal malfeasance as well. “Just general employee credit card abuse…We can even identify those kinds of things,” Pack said. “We’re actively looking for outlier behavior, strange patterns or new activity. As companies, we have this data, and so the question is how are we using it, and artificial intelligence is a great way for us to start being able to identify and mitigate some risks that we have.”

Artificial intelligence can also play a key role in reducing the burden from alerts related to potential compliance issues or other kinds of wrongdoing. The trick, according to Alex Lakatos, partner at Mayer Brown, is tuning the system to the right level of sensitivity—and then letting it learn from there. “If you set it to be too sensitive, you’re going to be drowned in alerts and you can’t make sense of them,” Lakatos said. “You set it too far in the other direction, you only get the instances of the really, really bad conduct. But AI, because it is a learning tool, can become smarter about which alerts get triggered.”

Lakatos also pointed out that when it comes to the kind of explanations for illegal behaviors that regulators usually want to see, AI is not capable of providing those answers. “AI doesn’t work on a theory,” he said. “AI just works on correlation.” That’s where having some smart people working in tandem with your AI comes in handy. “Regulators get more comfortable with a little bit of theory behind it.”

H5 has identified at least a dozen areas related to compliance where AI can be of assistance, including: key document retention and categorization, personally identifiable information (PII) location and remediation, first-line level reviews of alerts, and policy applicability and risk identification.

Data Classification, Methods, and Approaches

There are various methods and approaches to data classification, including machine learning, linguistic modeling, sentiment analysis, name normalization, and personal data detection. Choosing the right one depends on what companies want their AI to do.

“That’s why it’s really important to have a holistic program management style approach to this,” said Eric Pender, engagement manager at H5. “Because there are so many different ways that you can approach a lot of these problems.”

Supervised machine learning models, for instance, ingest data that’s already been categorized, which makes them great at making predictions and predictive models. Unsupervised machine learning models, on the other hand, which take in unlabeled, uncategorized information, are really good at data pattern and structure recognition.

“Ultimately, I think this comes down to the question of what action you want to take on your data,” Pender said. “And what version of modeling is going to be best suited to getting you there.”

Setting Expectations for AI Deployment

Once you’ve determined the type of data classification that best suits your needs, it’s crucial to set expectations for the AI deployment within your company. This process includes third-party evaluation, procurement, testing, and data processing agreements.
Buying an off-the-shelf solution is a possibility, though some organizations—especially large ones—may have the resources to build their own. It’s also possible to create a solution that features elements of both. In either case, obtaining C-suite buy-in is a critical step that should not be overlooked. And to maintain trust, it’s important to properly notify workers throughout the organization and remain transparent throughout the process.

Allowing enough time for proper proof-of-concept evaluation is also key. When it comes to creating a timeline for deploying AI within an organization, “it’s really important for folks to be patient,” according to Pender. “People who are new to AI sometimes have this perception that they’re going to buy AI and they’re going to plug it in and it just works. But you really have to take time to train the models, especially if you’re talking about structured algorithms and you need to input classified data.”

Education, documentation, and training are also key aspects of setting expectations for AI deployment. Bear in mind, at its heart implementing an AI system is a form of change management.

“Think about your organization and the culture, and how well your employees or impacted team members receive change,” said Timia Moore of Wells Fargo. “Sometimes—if you are developing that change internally, if they’re at the table, if they have a voice, if they feel they’re a meaningful part of it—it’s a lot easier than if you just have some cowboy vendor come in and say, ‘We have the answer to your problems. Here it is, just do what we say.’”

Keeping AI Solutions Compliant and Avoiding Bias

When deploying an AI system, the last area of consideration discussed by the panel was how to keep the AI solution itself compliant and free of bias. Best practices include ongoing monitoring of the system, A/B testing, and mitigating attacks on the AI model.

It’s also important to always keep in mind that AI systems are inherently dependent on their own training data. In other words, these systems are only as good as their inputs, and it’s crucial to make sure biases aren’t baked into the AI from the beginning. And once the system is up and running—and learning—it’s important to check in on it regularly.

“There’s an old computer saying, ‘Garbage in, garbage out,’” said Lakatos. “The thing with AI is people have so much faith in it that it has become more of ‘garbage in, gospel out.’ If the AI says it, it must be true…and that’s something to be cautious of.”

In today’s digital world, AI systems are becoming more and more integral to compliance and a host of other business functions. Educating yourself and making sure your company has a plan for the future are essential steps to take right away.

The entire H5 webcast, “New Rules, New Tools: AI and Compliance,” can be viewed here.
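To ground the supervised vs. unsupervised distinction discussed in the data-classification section above, here is a minimal, generic sketch using scikit-learn. The documents, labels, and model choices are toy assumptions for illustration only and are not a description of any panelist’s system.

# Minimal illustration of supervised vs. unsupervised text classification.
# Documents, labels, and model choices are toy examples, not a production design.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

docs = [
    "expense report for client dinner",
    "wire transfer approval for vendor payment",
    "gift to government official before contract award",
    "routine travel reimbursement",
]
labels = [0, 0, 1, 0]  # 1 = escalate for compliance review (toy labels)

X = TfidfVectorizer().fit_transform(docs)

# Supervised: learns from labeled examples, then predicts on new, unseen text.
clf = LogisticRegression().fit(X, labels)

# Unsupervised: groups similar documents without labels, useful for surfacing patterns.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(clusters)

In practice, the supervised route needs a meaningful volume of reliably labeled examples, which is exactly the patience point Pender makes above.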
AI and Analytics
Data Privacy
Blog

Productizing Your Corporate Legal Department’s Services: Getting Started
The 2021 CLOC conference focused a lot on applying product principles to legal services. General Counsel are often in the position of having to show the value of their team’s services and why, as a cost center, it makes sense to continue to grow their department or to buy technology to support their department. In addition to showing that value, there is pressure to be more efficient while providing excellent customer services. By productizing services, you can provide repeatable, measurable solutions that address the needs above. There is also the great benefit of being connected to your client’s needs by providing the services that match the most pervasive and urgent needs. However, if you don’t have a background in product management, how does one go about productizing legal services, and what does that even mean? As someone who is Pragmatic Marketing Certified through the Pragmatic Institute, I am here to help. This blog, and the blog series to follow, will show you how to get started, interview people internally to understand the needs, position your existing solutions internally, and make build vs. buy vs. outsourcing decisions. Let’s start with a high-level overview of where to begin.

What does productizing legal services mean?

Productizing your legal services focuses on creating solutions that apply to multiple customers in a repeatable way. This means that you first have to understand your customers’ problems by listening, asking, and observing. It then means that you create several repeatable processes to address those problems. Finally, it means you market those solutions internally and show how they bring value to the business. Taking it one step further, it also means that you leverage technology to support these services and continue to develop and improve the services based on feedback.

So how does one go about creating these solutions inside a legal team? The first step is all about understanding the needs of the business. You can look internally at the requests the legal department receives to get an understanding of what the business is coming to the legal department for. Next, you want to speak to leaders from different groups in the business to understand what legal needs exist that are not coming into the legal department but should be addressed. Which leaders to speak to will depend a bit on your organization but I would recommend connecting with the following, at minimum: sales, finance, engineering (or product) as well as regional leaders in any key regions. More on this to come in my next blog on interviewing people internally to understand the organization’s needs.

Once you have the information, it is helpful to create a list. I like to use the format below:

Problems to Solve

Once you have a pretty solid list, you should brainstorm high-level recommended solutions (not the detailed how). This will include things like solving a certain need through documentation (e.g. a “how-to guide” or a template contract). It may include things like facilitating the intake of legal requests or facilitating access to contract information. Once you have your list of potential solutions, there are two next steps. For the set of existing solutions, you should group those into categories and make sure that you are adequately marketing and reporting on those (more on this in a future post). For the set of solutions that are future state, identify how you are going to address this need.
When looking at the gaps, I like to categorize them in the following ways so I can understand the budget impact and the division of work. Note that urgency speaks to how quickly the need must be addressed overall, and not necessarily the urgency of a specific request. For example, it speaks to how urgently people need a contract database as opposed to how quickly someone needs information about a specific contract. Pervasiveness addresses how many internal departments/employees have this need. Is it centered around just a small group within one department or is it a need expressed by multiple departments? The relationship to the company strategy should be focused on how much this need moves the business forward. Does it facilitate the company’s #1 strategy? When you complete this list, I recommend grouping it into like needs. If there are overlapping needs, you may want to create a consolidated item but make sure you capture the pervasiveness of it.

Recommendations for Filling the Gaps

By going through the above process you will have a good understanding of the various needs and solutions in your organization. In the next blog in the series, I will cover how to interview people internally to understand the organization’s needs.
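As a companion to the categorization described above, here is a minimal sketch of scoring gaps on urgency, pervasiveness, and relationship to strategy so they can be ranked. The 1-3 scales, the equal weighting, and the example gap names are hypothetical; adjust them to your own organization and priorities.

# Hypothetical scoring of legal-service gaps by urgency, pervasiveness, and strategy fit.
# The 1-3 scales and equal weighting are illustrative assumptions, not a methodology.

gaps = [
    {"name": "Contract intake and status tracking", "urgency": 3, "pervasiveness": 3, "strategy": 2},
    {"name": "Template NDA self-service",            "urgency": 2, "pervasiveness": 3, "strategy": 1},
    {"name": "Patent disclosure workflow",           "urgency": 1, "pervasiveness": 1, "strategy": 3},
]

def score(gap: dict) -> int:
    """Simple additive score; weight pervasiveness more heavily if it drives your ranking."""
    return gap["urgency"] + gap["pervasiveness"] + gap["strategy"]

for gap in sorted(gaps, key=score, reverse=True):
    print(f'{score(gap)}  {gap["name"]}')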
Legal Operations
Blog

Why do Lawyers Demand More Transparency with TAR?
Since Judge Andrew Peck’s ruling over nine years ago in Da Silva Moore v. Publicis Groupe & MSL Group, the use of Technology-Assisted Review (TAR) for managing review in eDiscovery has been court approved. Yet many lawyers and legal professionals still don’t use machine learning (which, for many, is synonymous with TAR) in litigation. In the eDiscovery Today 2021 State of the Industry report, only 31.1% of respondents said they use TAR in all or most of their cases; 32.8% of respondents said they use it in very few or none of their cases. So, why don’t more lawyers use TAR?

Transparency and TAR

One possible reason that lawyers avoid the use of TAR is that requesting parties often demand more transparency with a TAR process than they do with a process involving keyword search and manual review. Judge Peck (retired magistrate judge and now Senior Counsel with DLA Piper) stated in the eDiscovery Today State of the Industry report: “Part of the problem remains requesting parties that seek such extensive involvement in the process and overly complex verification that responding parties are discouraged from using TAR.”

In the article Predictive Coding: Can It Get A Break?, author Gareth Evans, a partner at Redgrave, states: “Probably the greatest impediment to the use of predictive coding has been the argument that the party seeking to use it should agree to share its coding decisions on the documents used to train the predictive coding model, including providing to the opposing party the irrelevant documents in the training sets.”

Lawyer training vs. “black box” technology

Why do lawyers expect that they are entitled to more transparency with TAR? Perhaps a better question might be: why do they demand less transparency for keyword search and manual review? One reason might lie in the education and training that they receive to become lawyers. Many lawyers cut their teeth on the keyword search used for resources like Westlaw and Lexis. Consequently, keyword search is part of their experience and they feel comfortable using it.

Those same lawyers see keyword search and manual review for discovery as an extension of what they learned in law school. But it’s not. Search (aka “information retrieval”) is its own area of expertise. Effective keyword search for discovery purposes is an iterative process that requires testing and verification of the search result set and the discard pile to confirm that the scope of the search wasn’t too narrowly focused. The end goal is to construct a search with both high recall and high precision; to identify those documents potentially responsive to a production request without also capturing non-responsive information, which can significantly increase review costs. This is very different from the goal of identifying a handful of documents that can support an argument about case precedent.

With regard to TAR, many lawyers still see the technology as a “black box” that they don’t understand. So, when the other side proposes using TAR, they want a lot more transparency about the particular TAR process to be used. It’s simply human nature to ask more questions about things we don’t understand.
But, truth be told, lawyers should probably be just as vigilant in seeking information about the opposing party’s use of keyword search as they are when TAR is the approach being proposed.

TAR technology in daily lives

What many lawyers may not realize is that they’re already using the type of technology associated with TAR elsewhere in their lives — albeit with a different goal and lower stakes than in a legal case. TAR is based on a supervised machine learning algorithm, where the algorithm learns to deliver similar content based on human feedback. Choices we make on Amazon, Spotify, and Netflix influence what those platforms deliver to us as other choices we might want to see in terms of items to buy, songs to listen to, or movies to watch. The process of “training” the algorithms that drive these platforms makes them more useful to us — just as the feedback we provide during a predictive coding process helps train the algorithm to identify documents most likely to be responsive to the case.

Conclusion

What should lawyers do when opposing counsel makes transparency demands regarding TAR processes to be used? Certainly, cooperation and discussion of the protocol as soon as possible — such as the Rule 26(f) “meet and confer” between the parties — can help everyone get “on the same page” about what information can or should be shared, no matter what approach is proposed.

However, if the parties can’t reach an accord regarding TAR transparency, perhaps another case ruling by Judge Peck — Hyles v. New York City — can be instructive here, where Judge Peck cited Sedona Principle 6. This principle states: “Responding parties are best situated to evaluate the procedures, methodologies, and technologies appropriate for preserving and producing their own electronically stored information.” Ironically, in Hyles, the requesting party was trying to force the responding party to use TAR, but Judge Peck, despite being an acknowledged “judicial advocate for the use of TAR in appropriate cases,” denied the requesting party’s motion in that case. Transparency demands from requesting parties shouldn’t deter you from realizing the potential efficiency gains and cost savings resulting from an effective TAR process.

For more information on H5 Litigation Services, including review for production with the H5 unique TAR as a Service, click here.
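For readers who want the recall and precision goal mentioned above in concrete terms, the short sketch below shows how the two metrics are computed from a validation sample of reviewed documents. The counts are hypothetical; in practice they would come from sampling both the retrieved set and the discard pile.

# Recall and precision from a hypothetical validation sample of reviewed documents.
# true_positives: responsive documents the search or TAR process retrieved
# false_negatives: responsive documents it missed (found in the discard-pile sample)
# false_positives: non-responsive documents it retrieved

true_positives = 800
false_negatives = 200
false_positives = 400

recall = true_positives / (true_positives + false_negatives)     # share of responsive documents found
precision = true_positives / (true_positives + false_positives)  # share of retrieved documents that are responsive

print(f"recall = {recall:.0%}, precision = {precision:.0%}")  # recall = 80%, precision = 67%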
eDiscovery and Review
Blog

Big Data Challenges in eDiscovery (and How AI-Based Analytics Can Help)
It’s no secret that big data can mean big challenges in the eDiscovery world. Data volumes and sources are exploding year after year, in part due to a global shift to digital forms of communication in working environments (think emails, chat messages, and cloud-based collaboration tools vs. phone calls, in-person meetings, and paper memorandums, etc.) as well as the rise of the Cloud (which provides cheaper, more flexible, and virtually limitless data storage capabilities).

This means that with every new litigation or investigation requiring discovery, counsel must collect massive amounts of potentially relevant digital evidence, host it, process it, identify the relevant information within it (as well as pinpoint any sensitive or protected information within that relevant data) and then produce that relevant data to the opposing side. Traditionally, this process then starts all over again with the next litigation – often beginning back at square one in a vacuum by collecting the exact same data for the new matter, without any of the insights or attorney work product gained from the previous matter.

This endless cycle is not sustainable as data volumes continue to grow exponentially. Fortunately, just as advances in technology have led to increasing data volumes, advances in artificial intelligence (AI) technology can help tackle big data challenges. Newer analytics technology can now use multiple algorithms to analyze millions of data points across an organization’s entire legal portfolio (including metadata, text, past attorney work product, etc.) and provide counsel with insights that can improve efficiency and curb the endless cycle of re-inventing the wheel on each new matter. In this post, I’ll outline the four main challenges big data can pose in an eDiscovery environment (also called “The Four Vs”) and explain how cutting-edge big data analytics tools can help tackle them.

The “Four Vs” of Big Data Challenges in eDiscovery

1. The volume, or scale of data

As noted above, a primary challenge in matters involving discovery is the sheer amount of data generated by employees and organizations as a whole. For reference, most companies in the U.S. currently have at least 100 terabytes of data stored, and it is estimated that by 2025, worldwide data will grow 61 percent to 175 zettabytes.

As organizations and individuals create more data, data volumes for even routine or small eDiscovery matters are exploding in correlation. Unfortunately, court discovery deadlines and opposing counsel production expectations rarely adjust to accommodate this ever-growing surge in data. This can put organizations and outside counsel in an impossible position if they don’t have a defensible and efficient method to cull irrelevant data and/or accurately identify important categories of data within large, complex data sets. Being forced to manually review vast amounts of information within an unrealistic time period can quickly become a pressure cooker for critical mistakes – where review teams miss important information within a dataset and thereby either produce damaging or sensitive information to the opposing side (e.g., attorney-client privilege, protected health information, trade secrets, non-relevant information, etc.) or, in the inverse, fail to find and produce requested relevant information.
To overcome this challenge, counsel (both in-house and outside counsel) need better ways to retain and analyze data – which is exactly where newer AI-enabled analytics technology (which can better manage large volumes of data) can help. The AI-based analytics technology being built right now is developed for scale, meaning new technology can handle large caseloads, easily add data, and create feedback loops that run in real time. Each document that is reviewed feeds into the algorithm to make the analysis even more precise moving forward. This differs from older analytics platforms, which were not engineered to meet the challenges of data volumes today – resulting in review delays or, worse, inaccurate output that leads to critical mistakes.

2. The variety, or different forms of data

In addition to the volume of data increasing today, the diversity of data sources is also increasing. This also presents significant challenges as technologists and attorneys continually work to learn how to process, search, and produce newer and increasingly complicated cloud-based data sources. The good news is that advanced analytics platforms can also help manage new data types in an efficient and cost-effective manner. Some newer AI-based analytics platforms can provide a holistic view of an organization’s entire legal data portfolio and identify broad trends and insights – inclusive of every variety of data present within it. These insights can help reduce cost and risk and sometimes enable organizations to upgrade their entire eDiscovery program. A holistic view of organizational data can also be helpful for outside counsel because it enables better and more strategic legal decisions for individual matters and investigations.

3. The velocity, or the speed of data

Within eDiscovery, the velocity of data not only refers to the speed at which new data is generated, but also the speed at which data can be processed and analyzed. With smaller data volumes, it was manageable to put all collected data into a database and analyze it later. However, as data volumes increase, this method is expensive, time consuming, and may lead to errors and data gaps. Once again, a big data analytics product can help overcome this challenge because it is capable of rapidly processing and analyzing iterative volumes of collected data on an ongoing basis. By processing data into a big data analytics platform at the outset of a matter, counsel can quickly gain insights into that data, identifying relevant information and potential data gaps much earlier in the process. In turn, this can mean lower data hosting costs as objectively non-responsive data can be jettisoned prior to data hosting. The ability of big data analytics platforms to support the velocity of data change also enables counsel and reviewers to be more agile and evolve alongside the constantly changing landscape of the discovery itself (e.g., changes in scope, custodians, responsive criteria, court deadlines).

4. The veracity, or uncertainty of data

Within the eDiscovery realm, the veracity of data refers to the quality of the data (i.e., whether the data that a party collects, processes, and produces is accurate and defensible and will satisfy a discovery request or subpoena). The veracity of the data produced to the opposing side in a litigation or investigation is therefore of the utmost importance, which is why data quality control steps are key at every discovery stage.
At the preservation and collection stages, counsel must verify which custodians and data sources may have relevant information. Once that data is collected and processed, the data must then be checked again for accuracy to ensure that the collection and processing were performed correctly and there is no missing data. Then, as data is culled, reviewed, and prepared for production, multiple quality control steps must take place to ensure that the data slated to be produced is relevant to the discovery request and categorized correctly, with all sensitive information appropriately identified and handled. As data volumes grow, ensuring the veracity of data only becomes more daunting.

Thankfully, big data analytics technology can also help safeguard the veracity of data. Cutting-edge AI technology can provide a big-picture view of an organization’s entire legal portfolio, enabling counsel to see which custodians and data sources contain data that is consistently produced as relevant (or, in the alternative, has never been produced as relevant) across all matters. It can also help identify missing data by providing counsel with a holistic view of what was collected in past matters from data sources. AI-based analytics tools can also help ensure data veracity on the review side within a single matter by identifying the inevitable inconsistencies that happen when humans review and categorize documents within large volumes of data (i.e., one reviewer may categorize a document differently than another reviewer who reviewed an identical or very similar document, leading to inconsistent work product). Newer analytics technology can more efficiently and accurately identify those inconsistencies during the review process so that they can be remedied early on before they cause problems.

Big Data Analytics-Based Methodologies

As shown above, AI-based big data analytics platforms can help counsel manage growing data volumes in eDiscovery. For a more in-depth look at how a cutting-edge analytics platform and big data methodology can be applied to every step of the eDiscovery process in a real-world environment, please see Lighthouse’s white paper titled “The Challenge with Big Data.” And, if you are interested in this topic or would like to talk about big data and analytics, feel free to reach out to me at KSobylak@lighthouseglobal.com.
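To make the review-consistency point above concrete, here is a minimal sketch of flagging documents with identical text that received different coding calls. Real analytics platforms use far more sophisticated near-duplicate detection and propagation logic; the exact-hash approach and field names below are illustrative assumptions only.

# Flag identical documents that were coded inconsistently by different reviewers.
# Exact-text hashing is a simplification; production tools also catch near-duplicates.
import hashlib
from collections import defaultdict

reviewed = [
    {"doc_id": "A-001", "text": "Re: Q3 pricing terms ...", "call": "responsive"},
    {"doc_id": "A-117", "text": "Re: Q3 pricing terms ...", "call": "non-responsive"},
    {"doc_id": "B-042", "text": "Lunch on Friday?",         "call": "non-responsive"},
]

calls_by_hash = defaultdict(set)
docs_by_hash = defaultdict(list)
for doc in reviewed:
    digest = hashlib.sha256(doc["text"].encode("utf-8")).hexdigest()
    calls_by_hash[digest].add(doc["call"])
    docs_by_hash[digest].append(doc["doc_id"])

for digest, calls in calls_by_hash.items():
    if len(calls) > 1:
        print("Inconsistent coding:", docs_by_hash[digest], sorted(calls))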
AI and Analytics
eDiscovery and Review
Blog

Managed Services for Law Firms: The Six Pillars of a Successful Managed Service Relationship
By Steven L. Clark, E-Discovery and Litigation Support Director, Dentons, and John Del Piero, Vice President, Lighthouse

Whether your firm is just beginning to consider a move to a managed service eDiscovery model or you’re a managed service veteran, it is imperative to understand what makes this type of eDiscovery program model successful. After all, if you don’t know how to measure success, it will be difficult to know what to look for when selecting a provider, and equally as hard to monitor the quality of the services provided once you have selected one.

However, measuring success can be complex. There are many different metrics that could be used to measure success and each may be of a varying level of importance to different firm stakeholders, as the priorities of these stakeholders will be determined by their particular role and focus. However, a successful managed service partnership can be based on a foundation of six core pillars. These pillars can be used as guideposts when evaluating whether a managed service partner will truly add value to a law firm’s eDiscovery process.

Pillar 1: Access to Best-of-Breed Technology and Teams of Experts to Help Leverage It

A managed service partnership should always make a law firm (and its clients) feel like the best eDiscovery technology is right at their fingertips. But more than that, a successful managed service relationship should enable a law firm to stay technologically agile, while lowering technology costs.

For example, if an eDiscovery tool or platform becomes obsolete or outdated, the firm’s managed service partner should be able to quickly move the firm to better technology, with little cost to the firm. In other words, in a successful managed service partnership, gone are the days where a litigation support team was stuck using an obsolete platform simply because the law firm purchased an enterprise license for that technology. Rather, the managed service partner should bear the cost burden of leveraging continuously evolving technology because the partner can easily spread that technological risk across its client base. In assuming this burden, the managed services partner ultimately provides law firms much greater flexibility in terms of leveraging the most appropriate technology to meet their clients’ needs.

In addition to simply providing access to the best technology, a successful managed service partnership should also provide teams of experts who are wholly dedicated to helping law firms leverage that technology for optimal impact. These experts should be continuously vetting new applications and technology upgrades, enabling litigation support teams to stay up to date on evolving applications and tools.
These teams will also be able to create and test customized workflows that enable law firms to handle how data flows through technically robust collaborative platforms like Microsoft Teams or Slack, as well as keep firms apprised of any updates to cloud-based platforms that may affect existing eDiscovery workflows. This type of devoted technological expertise and guidance can provide firms a significant competitive boost, as internal litigation support teams rarely have the resources available to devote staff solely to testing new technology and building customized workflows.

Pillar 2: A Scalable and More Diversified eDiscovery Team

In comparison to a traditional law firm litigation support team which, naturally, is somewhat static in size, a successful managed service relationship allows law firm teams to quickly and seamlessly scale up or down, depending on case needs. For example, when a large matter comes in, a managed service provider should have the ability to quickly pull a project manager in to help manage the case while the internal law firm team still retains day-to-day control of the matter. This alleviates the firm from having to choose between hiring additional staff (only to be faced with too big of a team once the larger matter ends) or outsourcing the case to an external, inflexible eDiscovery provider (where the firm may be unable to retain full control of the matter and will undoubtedly have to adapt to different processes and workflows).

A managed service partner’s bench should also be deep, allowing a law firm to pull from a diverse pool of expertise. Whether the law firm needs a review workflow expert or a processing expert, an analytics expert or a migration and normalization expert, a quality managed service provider should be able to swiftly provide someone who knows the teams involved and has the qualifications and technological background to ensure that all stakeholders trust their expertise and guidance.

Pillar 3: eDiscovery Expertise 24/7/365

A managed service provider should not only provide law firms with top-notch eDiscovery expertise but also provide access to that expertise whenever it is needed. Unfortunately, most litigation support teams are all too familiar with the fact that eDiscovery is almost never a 9 to 5 job. The nature of litigation today means that a Monday production deadline involving a terabyte of data may be doled out by a judge on a Friday morning, or that data for a pressing production may arrive at 9:00 p.m. The list of eDiscovery off-hour emergencies is somewhat endless.

Unfortunately, most internal litigation support teams at law firms are located in one geographic area (and therefore, one time zone), meaning that even when internal teams have the required expertise, they may not have those resources available when they’re needed. A quality managed service partner, however, will be able to provide resources whenever they are needed because it can structure its hiring and team assignments with team members located across multiple time zones. Access to full-time eDiscovery expertise and coverage enables law firms to swiftly handle any eDiscovery task with ease, with no permanent increase in staffing overhead.

Pillar 4: Less Talent Acquisition Risk

A successful managed service relationship should also significantly lower law firm risk related to talent acquisition and training.
While hiring in today’s job climate may seem like a simple task, the cost of sufficiently vetting candidates and then providing the appropriate training can be incredibly time consuming and expensive. If law firm vetting misses a candidate red flag, or even if a candidate just needs more training than expected, staffing costs and time expenses can skyrocket even further. For example, the task of having to substantially re-train a new hire from the ground up can take up the valuable time of other internal experts. In this way, even the most routine hire can often slow productivity and lower the morale of the entire internal team (at least in the short term) until the hire can be fully integrated into the department’s daily workflow.

In a successful managed service relationship, however, the law firm can transfer those types of hiring and training risks directly to the provider. The managed service provider is already continuously evaluating, vetting, and training talent across different geographies in order to hire the best eDiscovery experts. Law firms can simply reap the benefit of this process by partnering with the service provider and leveraging that talent once the vetting and training process has been completed.

Pillar 5: Lower Staffing Overhead

To put it simply, all of the above means that moving to a managed service model should allow a law firm to significantly lower its overhead costs related to staffing and management. In addition to taking on the hiring risks, a managed service provider should also take on much of the overhead related to maintaining staff. From payroll, to benefits, to overtime costs, a quality managed service provider handles those costs and time expenses for their own on-staff experts, leaving the law firm free to reap the benefits of on-demand expertise without the staffing overhead costs.

Pillar 6: Better Billing Mechanics

Most law firms are not set up to bill eDiscovery services efficiently. eDiscovery billing has evolved over the last few years, and a quality managed service provider should be following suit and offering simplified, predictable cost models in order for law firms to pass that predictability on to their clients. This kind of simplified pricing enables all parties to understand exactly how much they are going to spend for the eDiscovery services provided. However, this billing structure differs significantly from the way traditional legal work is billed out, and most law firms’ billing infrastructures have not evolved to offer the same level of predictability or cost certainty. This is where a quality managed service provider can provide another benefit, by heavily investing its own resources into building out automated reporting, ticketing, and billing systems that can generate proformas and integrate into the firm’s existing billing systems. If a managed service provider can take care of these billing tasks, law firm teams can spend more time in furtherance of client work, rather than devoting resources to eDiscovery billing metrics and workarounds.

Summary

Access to best-of-breed technology and the expertise to leverage it, scalable and diversified teams, around-the-clock coverage, reduced talent acquisition risk, lower staffing overhead, and simplified billing are the six pillars of a successful managed service partnership in a law firm setting. When all six of these pillars are in place, the managed service partnership will result in more satisfied internal and external law firm customers and an increasing caseload year after year.
For more information or to discuss this topic, reach out to us at info@lighthouseglobal.com.
Blog

Legal and Compliance Should Use Chatbots to Their Advantage
Most of you are pretty familiar with using website chatbots in your daily lives – whether to assist with your online banking or to help with a product issue. But what if you went to report sexual harassment at work and were greeted by a chatbot? That may seem a little unusual, but there are a couple of advantages to this approach, including a better customer service experience for internal customers and freeing compliance professionals to take on more complex work. For several years, legal and compliance industry discussions around chatbots have focused on how law firms can use them. In this blog, I will focus on three ways in-house legal and compliance departments should use chatbots to their advantage.

1. As a legal intake tool.

A common challenge for legal departments is how to intake matters and manage the work in the legal department. Legal operations teams are always looking for ways to understand what people are doing and how to make the process more efficient. There is a lot of discussion on how forms and/or workflow tools can be leveraged to solve this issue – and they are very helpful – but you can take this one step further with a chatbot. When someone inside your organization comes to the legal team, you can have a chatbot gather basic, or even more detailed, information about what they need. You can train a chatbot to understand the category of their need – advice, contract, patent, litigation, eDiscovery – and then take them through a series of questions to better understand the request. You can then even have the request routed through your workflow tool so it gets assigned to the right person (e.g., an attorney, a paralegal, or an eDiscovery project manager); a simple sketch of this kind of routing appears after this list. As your chatbot becomes familiar with the questions, you can have it ask deeper questions and take the request even further.

2. To answer common legal questions.

Legal departments tend to run lean. As a former general counsel who still speaks with a lot of legal department leaders, I know these leaders are always looking for ways to do more with less (or the same). They want their teams to spend time on substantive legal issues, not on common questions that could be handled differently. For example, questions about where to find the sexual harassment training, or how to send over or sign a standard NDA, routinely come into the legal department and consume lawyers’ time. These questions could easily be answered by a chatbot trained on common questions. This provides a better user experience, because the information is shared with the user instantaneously, and it frees legal resources to spend their time on more unique issues. Finally, legal team members also feel more productive and engaged because their time isn’t being spent on administrative tasks.

3. In place of a hotline.

This is one of the more unique use cases I have heard recently, but it makes a lot of sense. Compliance hotlines work well because of the anonymity they offer, but there is no opportunity to share information back with the person reporting. For example, the person reporting an incident may want to know what the next steps might be, where they can find a certain policy, or where they can find additional resources. None of that is available via a hotline or even a form. With a chatbot, however, you can keep the anonymity but mimic a more personal conversation where additional resources can be shared.
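To make the intake idea more concrete, here is a minimal, purely illustrative sketch of how a bot might categorize an incoming request and hand it to a workflow queue. It is not tied to any particular chatbot platform; the categories, keywords, queue names, and the route_request helper are all hypothetical examples, and a real deployment would rely on the intake and routing features of your own chatbot and workflow tools.

```python
# Hypothetical sketch of chatbot-style legal intake routing.
# Categories, keywords, and queue names are invented for illustration only.

CATEGORY_KEYWORDS = {
    "contract": ["nda", "contract", "agreement", "signature"],
    "litigation": ["lawsuit", "subpoena", "litigation", "dispute"],
    "ediscovery": ["ediscovery", "legal hold", "collection", "custodian"],
    "patent": ["patent", "invention", "ip filing"],
}

ASSIGNMENT_QUEUE = {
    "contract": "paralegal-queue",
    "litigation": "attorney-queue",
    "ediscovery": "ediscovery-pm-queue",
    "patent": "attorney-queue",
    "advice": "attorney-queue",  # default bucket for general questions
}

FOLLOW_UP_QUESTIONS = {
    "contract": ["Which counterparty is involved?", "Is this our standard template?"],
    "ediscovery": ["Which matter is this for?", "Do you need a legal hold issued?"],
}


def categorize(request_text: str) -> str:
    """Guess the category of a request using simple keyword matching."""
    text = request_text.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "advice"  # fall back to a general legal advice request


def route_request(request_text: str) -> dict:
    """Build a ticket a workflow tool could consume: category, queue, follow-ups."""
    category = categorize(request_text)
    return {
        "category": category,
        "assigned_queue": ASSIGNMENT_QUEUE[category],
        "follow_up_questions": FOLLOW_UP_QUESTIONS.get(category, []),
        "original_request": request_text,
    }


if __name__ == "__main__":
    ticket = route_request("I need a standard NDA signed for a new vendor")
    print(ticket["category"], "->", ticket["assigned_queue"])  # contract -> paralegal-queue
```

In practice, the keyword matching shown here would be replaced by the chatbot’s own language understanding, but the flow is the same: gather the request, classify it, ask category-specific follow-up questions, and pass a structured ticket to the workflow tool so it lands with the right person.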
As shared on the Women in Compliance podcast, one organization has trained chatbots to be its first line of intake and support on sexual harassment complaints. The internal response has been very positive.