Part II: 'Robo Lawyer'-Challenges to Adopting Legal Tech in Law Firms

Lalantika Arvind is a penultimate year law student at Jindal Global Law School. Her interests include Gender Justice, Intellectual Property, Tech Law and ADR. She is intrigued by and invested in the intersections of these fields and the way they can be used to transform the legal sector.
- Wed Jun 23 2021

The first part of the series provided a brief background on the emergence of and need for legal tech, before discussing the social and technical challenges law firms face in adopting it. Part II begins by discussing the ethical and legal challenges in adopting legal tech, and then examines the potential of legal tech to ensure access to justice. It concludes by discussing how A.I could be made more ethical, and the possible solutions to the challenges posed. 

Legal challenges

The ethical and legal challenges in adopting legal tech stem from security and data privacy concerns, as well as from the law's inability to accommodate legal tech. There is, of course, the legitimate concern that the more reliant one is on electronic information, the greater the risk, especially from a cybersecurity perspective.[1] Such a concern, particularly about data privacy, is raised by the European Commission’s white paper on Artificial Intelligence.[2] Lawyers deal with highly confidential and sensitive information; data breaches are therefore not only ethical violations, they also damage the firm’s reputation and expose it to lawsuits and penalties from regulatory authorities.[3] 

The first legal issue that arises is cross-jurisdictional compliance with data protection laws.[4] For firms with an international presence, legal tech that does not comply with a jurisdiction’s data protection laws becomes a huge deterrent, given that non-compliance would attract greater penalties. The second legal issue is the regulation of legal tech itself. In most jurisdictions, only authorized personnel can provide legal advice, and legal tech that is non-compliant with such regulations becomes a hindrance to its own adoption. In a German Regional Court decision,[5] the Court held that the service provided by a contract-generating software violated the German Legal Services Act. The software provided contract templates tailored to the specific information given by the user, rather than to hypothetical scenarios; however, it was not registered as a ‘service provider’ under the requisite law. The Court found it irrelevant that the software was not a ‘person’, since the nature of the service was such that it could only be provided with authorization. Engstrom and Gelbach[6] provide a similar analysis of the United States’ Model Rules of Professional Conduct, which outlaw the ‘unauthorized practice of law’. According to the authors, given how vaguely ‘practice of law’ is defined, there is scope for legal tech, especially that used in litigation, to be deemed violative of the law.[7]

The regulatory challenge of legal tech is not limited to unregistered services. The case study of the ‘Avvo’ legal technology in the United States,[8] for example, shows that even when a legal tech tool adheres to regulatory guidelines, there remains room for error: the service was held to be unethical by multiple state bar associations for not adhering to the guidelines governing payment of lawyers’ fees.[9] Thus, a general mistrust of legal tech, together with the complexities of jurisdictional and regulatory law, becomes an added challenge to the adoption of legal tech. 

Legal tech and accessibility

It is no secret that legal systems are incredibly inaccessible to the general public. This can be attributed to a variety of factors, inter alia, high legal fees, the exclusionary nature of a system that does not accommodate persons with disabilities, and the lack of legal services in remote areas. However, recent trends in legal tech are geared towards making the legal system more accessible. In a paper published in the Harvard Journal of Law and Technology,[10] the authors establish that the use of legal tech can indeed provide a more accessible legal system, especially if it adheres to simple provisions such as ensuring that the technology is usable by self-representing litigants, rural residents, persons with disabilities and persons with limited English proficiency. 

The obstacle to such adherence is not limited to regulatory implications: many lawyers also see legal tech as a real threat of losing their business to technology. In the wake of the COVID-19 pandemic, accessibility has become a key focus for many, given that the move of legal systems online has shown that accessibility can indeed be ensured. In this light, legislation that promotes the ethical and accessible use of legal tech is the need of the hour. The German Federal Court of Justice (hereinafter, “Court”) took a leap forward in 2019. In a recent case,[11] the Court held that Lexfox’s legal tech, which offered information on rent control laws so that tenants could check the legal validity of their rent, did not violate the Legal Services Act (hereinafter, “Act”) as it was correctly registered to provide ‘legal services’. Software, when registered properly, can thus provide informational legal services without violating the Act. The Court held that the objective of the Act is to protect consumers of legal services from unqualified providers, but given that the plaintiff was registered with the competent authority, its services were not illegal. 

Ethical AI and possible solutions

The European Commission, in its High Level Expert Group Report on Ethics Guidelines for Trustworthy A.I,[12] mentions three essential components of trustworthy, ethical A.I: it should be lawful (complying with all applicable laws and regulations), ethical (adhering to ethical principles and values), and robust (both technically and socially). An essential requirement for achieving ethical A.I is ‘explainability’. According to the guidelines, explainability means that where A.I has a significant impact on people’s lives, there should be an explanation of the A.I’s decision-making process.[13] A further, more nuanced take on trustworthy and legal A.I appears in the European Commission’s (EC) white paper on Artificial Intelligence.[14] The white paper discusses how the development of A.I is already subject to European legislation on fundamental rights (such as data privacy); however, features of A.I such as opacity can make the enforcement of that legislation difficult.[15] Thus, legislation regulating A.I must be able to effectively address the risks of A.I and enforce fundamental rights. 


The legal and ethical challenges to the adoption of legal tech are ultimately rooted in legislation. Legislative change that allows for the advancement of legal tech will allow the legal system to become more inclusive and accessible. Moreover, when legal tech is created keeping in mind the three essential components mentioned by the EC (lawful, ethical, robust), it would improve the human understanding of A.I, make its use more efficient and increase human trust in A.I. That, combined with regulatory legislation able to effectively address the risks of legal tech while accommodating its dynamic needs, would significantly reduce the hindrances to the adoption of legal tech in law firms. 

Further, as of April 15, 2021, there were 4.4 crore cases pending across all Indian courts.[16] The brunt of this pendency is borne most by the rural sections of society, for whom courts are often physically inaccessible, and who do not have the means to afford long-drawn litigation. Legal tech thus has the potential to bridge the inequality in legal justice between rural and urban India. However, since only 35% of internet users in India are from rural India,[17] issues of digital literacy and the accessibility gap require immediate focus. As Alarie et al. emphasise,[18] “the true benefits of artificially intelligent tools in the legal profession may be realized only once lawyers completely rethink the provision of legal services”. The challenges that law firms face in adopting legal tech are not overly complicated. With the right incentives, accurate information, illustrations of successful adoption (or reliable and verifiable predictions about adoption), and adherence to regulations, these challenges can be overcome to create a more accessible, simplified and robust legal system. 

[1] Chay Brooks, Cristian Gherhes and Tim Vorley, ‘Artificial Intelligence in the Legal Sector: Pressures and Challenges of Transformation’, Cambridge Journal of Regions, Economy and Society, Volume 13, Issue 1, March 2020, Pages 135–152, at 148, available at. See also Lexis Nexis, 16th April 2020, available at
[2] European Commission’s White Paper on Artificial Intelligence, ‘A European Approach to Excellence and Trust’, 2020, available at
[3] Brooks et al., supra note 1, at 148.
[5] 33 O 35/19 (08.10.2019), Regional Court of Cologne. See also Jenny Gesley, ‘Germany: Court Prohibits Legal Contractor Generator Platform’, Library of Congress, 31st October 2019, available at 
[6] David Freeman Engstrom and Jonah B. Gelbach, ‘Legal Tech, Civil Procedure, and the Future of American Adversarialism’, University of Pennsylvania Law Review (2020), available at 
[7]Id., at 14
[8]Benjamin H. Barton & Deborah L. Rhode, ‘Access to Justice and Routine Legal Services: New Technologies Meet Bar Regulators’, 70 HASTINGS L.J. 955 (2019), available at 
[9]Id., at 974-976. 
[10]James E. Cabral, Abhijeet Chavan, Thomas M. Clarke, John Greacen, Bonnie Rose Hough, Linda Rexer, Jane Ribadeneyra & Richard Zorza, ‘Using Technology to Enhance Access to Justice’, Harvard Journal of Law and Technology (2012), available at 
[11]Jenny Gesley,‘Germany: Federal Court of Justice Rules on Legal Tech’, Library of Congress, 23rd December 2019, available at 
[12]European Commission’s High Level Expert Group on Artificial Intelligence (AIHLEG), ‘Ethics Guidelines for Trustworthy AI’, 2019 available at: 
[13]Id., at 18
[14] European Commission, supra note 2, at 10.
[15]Id., at 10
[16] Pradeep Thakur, ‘Pending Cases in India Cross 4.4 Crore, up 19% Since Last Year’, Times of India, 15th April 2021, available at
[17]Osama Manzer and Kriti Singh, ‘How to Bridge the Gaping Digital Divide in India’, Digital Empowerment Foundation, 7th May 2020, available at 
[18] B. Alarie, A. Niblett and A. H. Yoon, ‘How Artificial Intelligence Will Affect the Practice of Law’, University of Toronto Law Journal (2018), 106–124.