Legal chatbots: something for nothing?

By Jeff Pettit (US)
Norton Rose Fulbright

In June, we introduced the topic of chatbots and highlighted some key risks and concerns associated with this growing area of technology. One business in particular, DoNotPay, made headlines recently by announcing that it would begin building legal chatbots for free.

The claim? In a July 14, 2017, posting to the online publishing platform Medium, Joshua Browder, founder of UK-based DoNotPay, writes, “Starting today, any lawyer, activist, student or charity can create a bot with no technical knowledge in minutes. It is completely free.” Sound too good to be true? To be sure, DoNotPay is not the first company to develop law-related chatbots—these bots are already popping up all over the world. But because this technology is still fairly new, chatbots that are attempting to automate services previously performed by licensed attorneys will almost certainly attract scrutiny.

While platforms such as DoNotPay and ChatFuel seek to lower barriers to entry by providing free tools to get a chatbot up and running, corporations and attorneys who integrate these chatbots into the delivery of legal services could expose themselves to costly legal liability. Before submitting your idea for automating legal services, it is important to stop and consider a few threshold questions:

  1. Who owns “your” bot? Property rights in the digital realm can be a murky area, especially when a user’s original content is mixed with a digital platform’s intellectual property. Previously on the blog we have discussed US IP ownership concerns as they relate to APIs, Instagram, and Facebook “likes.” Generally, the terms and conditions of the platform will address ownership rights and define the relationship between platform and user. Currently, DoNotPay does not appear to require that a user agree to any particular terms and conditions when submitting an idea for a chatbot. In the July 14 Medium posting, Browder asks and answers the ownership question this way: “Who owns the rights to the document? You do. However, you give us permission to make a bot for you and send the link.” Would this simple statement be sufficient under US law to allow you to claim a copyright interest in the resulting software code that DoNotPay generates? The answer is far from certain. What if a DoNotPay employee were to take your submission, develop it into a commercial product, and then monetize that product without your consent or involvement? Would you have any recourse? Again, there is very little certainty.
  2. Once the bot is up and running, who will own the input from users? Legal chatbots may need to collect sensitive personal data from the intended users. In this case, who owns (and is legally responsible to protect) the personal information collected by the bot? Privacy compliance should be a serious concern, as this is an area of heavy regulation that is fraught with liability. In today’s digital economy, “big data” and the “internet of things” are valuable commodities. Will the platforms claim any right to profit from the information collected by these bots? Are these types of arrangements being properly disclosed to the intended users so that consumers understand how their personal information is being used? These issues should be carefully addressed with the platform host upfront.
  3. Does this chatbot put you at risk for the unauthorized practice of law? In the legal realm, chatbots are sometimes praised for adding an element of efficiency to routine legal documents with a limited number of variables provided by the client. But this kind of legal document production could violate a jurisdiction’s unauthorized practice of law (“UPL”) rules. Most, if not all, jurisdictions require legal service providers to be licensed, but what constitutes UPL varies greatly from jurisdiction to jurisdiction. For instance, a jurisdiction could provide that a deed conveying an interest in real property may only be prepared by an attorney licensed in that jurisdiction. In the US, the American Bar Association reported in 2012 that 23 states were actively enforcing UPL rules. Courts have also expressed skepticism toward attempts at automating the generation of legal documents. See, e.g., Janson v. LegalZoom.com, Inc., 802 F. Supp. 2d 1053, 1065 (W.D. Mo. 2011) (“A computer sitting at a desk in California cannot prepare a legal document without a human programming it to fill in the document using legal principles derived from Missouri law that are selected for the customer based on the information provided by the customer.”). Broadly speaking, the more interactive the service in question, the higher the risk that the service may run afoul of UPL rules. And this sort of personalized interaction seems to be a primary goal of chatbot technology.
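To make concrete what “routine legal documents with a limited number of variables” means in practice, here is a minimal, hypothetical sketch of the template-filling at the core of most document bots. Every name, field, and template here is illustrative only; it is not DoNotPay’s (or any provider’s) actual implementation:

```python
# A document bot, reduced to its essence: collect a fixed set of
# answers from the user and substitute them into a pre-written
# template. The template and field names below are invented for
# illustration.
from string import Template

TEMPLATE = Template(
    "I, $name, residing at $address, dispute parking ticket "
    "number $ticket_no issued on $date."
)

REQUIRED_FIELDS = {"name", "address", "ticket_no", "date"}

def generate_letter(answers: dict) -> str:
    """Fill the template with the user's answers, refusing to
    produce a document if any required field is missing."""
    missing = REQUIRED_FIELDS - answers.keys()
    if missing:
        raise ValueError(f"Missing fields: {sorted(missing)}")
    return TEMPLATE.substitute(answers)

letter = generate_letter({
    "name": "Jane Doe",
    "address": "1 Main Street",
    "ticket_no": "PT-1234",
    "date": "1 July 2017",
})
print(letter)
```

The UPL risk discussed above tends to scale with how far a service moves beyond this kind of mechanical substitution and toward selecting or tailoring legal content for a specific user.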

The questions raised in this post are just the tip of the iceberg. Chatbots certainly have the potential to be a major technological disrupter in the legal industry, but there remains a host of risks to consider before integrating this technology into practice.

Source: Norton Rose Fulbright