By Tom Jarvis

In recent years, artificial intelligence (AI) has been swiftly permeating the legal world. Legal research tools like Westlaw and LexisNexis have used a type of AI called Natural Language Processing (NLP) for more than a decade, and now ChatGPT has passed the bar exam (for more on that, see Misty Griffith’s article on this page). But when I first heard the term “robot lawyer,” I just had to find out what that was all about. Enter DoNotPay, a New York-based tech company that dubbed itself the “world’s first robot lawyer.”

As a child of the ’80s, when I heard about a possible robot lawyer, I recalled several science fiction films that featured “futuristic” technology like self-driving cars, video calls, and military drones, all of which have since come true. This line of thought inevitably brought to mind the film The Terminator and its tech corporation, Cyberdyne Systems, which was responsible for the development of Skynet, a self-aware AI bent on eradicating humanity. I envisioned an Arnold Schwarzenegger lookalike in a suit bursting through the courtroom doors, which raised some questions: Would his titanium alloy endoskeleton set off the metal detectors? When court breaks for recess, would he say, “I’ll be back”?

However, after a bit of research, I learned that DoNotPay’s robot lawyer is just a legal service chatbot (aka lawbot) app, and its creator pulled a publicity stunt on Twitter.

DoNotPay founder and CEO Joshua Browder created the company in 2015 as web-based software to help consumers contest parking tickets. It later became an app that adopted OpenAI’s GPT-3 platform, which has become a hot topic of late thanks to ChatGPT, and expanded to include other legal services, such as generating demand letters and tracking down money from unclaimed inheritances and forgotten refunds.

The landing page of the DoNotPay website boasts that the company can help consumers “fight corporations, beat bureaucracy, find hidden money, and sue anyone.”

In January 2023, Browder announced that on February 22, DoNotPay’s “robot lawyer” would represent a defendant fighting a parking ticket in an actual courtroom. Through the use of Apple AirPods in the defendant’s ears, the AI would listen to the case and provide real-time advice to its client.

A few days later, Browder took to Twitter to raise the stakes with the following statement:

“DoNotPay will pay any lawyer or person $1,000,000 with an upcoming case in front of the United States Supreme Court to wear AirPods and let our robot lawyer argue the case by repeating exactly what it says. We have upcoming cases in municipal (traffic) court next month. But the haters will say ‘traffic court is too simple for GPT.’ So, we are making this serious offer, contingent on us coming to a formal agreement and all rules being followed. Please contact me if interested!”

Browder’s plans never came to fruition, though. In late January, he tweeted that he was pulling the plug after receiving “threats” from “State Bar prosecutors.” He claimed one of the prosecutors told him that if he proceeded, he could face six months of jail time for the unauthorized practice of law. He later told the Twitterverse that the company is “postponing our court case and sticking to consumer rights.”

Attorney John Weaver, author of Robots Are People Too: How Siri, Google Car, and Artificial Intelligence Will Force Us to Change Our Laws, says he doesn’t think robot lawyers will become a thing anytime soon. Weaver serves on the Board of Editors for RAIL: The Journal of Robotics, Artificial Intelligence & Law and writes a column, “Everything Is Not Terminator.”

“People are overestimating what it [AI software] can do in the next two years and underestimating what it can do in the next ten,” Weaver says. “But I suspect that leap from using technology as an assistant in court to actually licensing attorneys that are not human beings is going to be a bridge too far for the foreseeable future.”

Weaver says there are a lot of ways using software like DoNotPay for legal services could go wrong.

“Chef’s kiss as a publicity stunt – very well done,” Weaver says of DoNotPay’s robot lawyer scheme. “But I would say there’s a word of caution – lessons from this story for two groups. One, for bar associations and courts, there’s a certain population where these services and products are appealing. Courts and bar associations should think about how they want to respond to that. The other cautionary tale is for the consumers that use these to think carefully about the quality of the service or representation they are receiving. What data is being used to train it? Do the parties behind these services and applications have ulterior motives? Are they really in the business of providing legal services or are the legal services they claim to provide just a loss leader to fund and support their actual business model?”

On March 3, Chicago-based law firm Edelson PC filed a proposed class action complaint against DoNotPay in San Francisco Superior Court. The complaint, filed on behalf of former DoNotPay customer Jonathan Faridian, alleges the company is practicing law without a license and that it misleads the public with respect to its services.

In the complaint, Attorney Jay Edelson says, “Unfortunately for its customers, DoNotPay is not actually a robot, a lawyer, nor a law firm. DoNotPay does not have a law degree, is not barred in any jurisdiction, and is not supervised by any lawyer.”

Browder denies any wrongdoing and says he will vigorously fight the lawsuit. He subsequently took to Twitter once again, saying Faridian’s claims have “no merit” and that DoNotPay is “not going to be bullied by America’s richest class action lawyer.”

“This is just a guy playing P.T. Barnum with something, and it sounds like it backfired on him,” Attorney Kirk Simoneau says. “I don’t think we are close to Skynet.”

Simoneau is all for AI that provides more immediate access to information, such as Westlaw with its NLP, but in practice areas like his own, which involve persuasion and courtroom work, he believes the effectiveness of AI starts to wane.

“Here’s a really good example,” Simoneau says. “I was at the law school this morning with the Webster Scholars. We do a training every year on the DOVE Project, and today was the day the students presented their cases. They had a pretend trial, pretend witnesses, the whole shooting match. All those students had the exact same information. They were all given the same intelligence, if you will: the same law, the same statutes, the same exact fact pattern, the same cases. Every single one of those students presented it differently and with different levels of effectiveness.”

Simoneau adds, “AI can be super helpful in the legal profession to quickly search through every case that’s out there for the relevant precedents. But then what do I do with it? I don’t think machine learning and AI are going to be able to take over the ‘what you do with it’ part very effectively.”

In the personal injury field, people have been using AI for decades with a system called Colossus. The program uses algorithms to search prior verdicts and uses them to place a value on an injury.

“The computer program says, ‘well, your client has a broken arm? Here is what the broken arm is worth – we are going to pay you based on what the medical costs are across the country.’ It’s all AI-driven,” Simoneau says. “But what does the computer do when you say, ‘well, wait a minute. My client with a broken arm is deaf, and they use their arm to communicate using sign language.’ The computer program doesn’t know what to do with that.”

As for me, I’m just happy that the robot lawyer is not programmed for terminations, so humanity is safe…for now.