Where does AI in legal work go from here?
Hype and fads might be giving way to more exciting things.
AI has seen several cycles of hype over the past couple of years.
Remember when everyone wanted to build new models? The recent quest for AGI?
ChatGPT turned language processing, once a niche corner of computing, into a headline startup fad.
Everyone can now use an “intelligent” language model. So intelligent, in fact, that it is difficult to see where the “intelligence” stops and the “language” kicks in.
This has led to expectation mismatches among potential users of AI products.
In the legal world, this would translate to customer expectations like:
Build a model to do my legal research for me.
Make an AI that will solve my clients’ problems.
The AI should tell me what’s in the data room.
Make the AI read everything I wrote and then it should write based on my knowledge and style.
You see, people riding horses, having seen cars, will buy nothing less than a rocket.
Those of us who were already trying to solve language processing problems (in our case, legal drafting) cheered language models for very different reasons, and with very different expectations.
For example, many things that couldn’t be built before, could now be built.
Computers using traditional logical (versus statistical) methods struggle with language processing.
This made the things we were doing hard.
Automating drafting was a hard problem. Review was a hard problem.
Are these now easier to solve?
We now know that drafting automation problems are not only easier to solve but also offer a competitive advantage to those with strong domain expertise.
Example:
If you do not know how to code at all, ChatGPT (or a similar model) will not help you write an app.
The better you know how to code AND how to use ChatGPT, the greater your efficiency and mastery in building apps.
In this manner, it isn’t very different from, say, drafting a legal document.
But legal drafting is different in other ways.
Given that code is universal, there is little need for the user to prompt intelligently. You ask for help in plain language and iterate towards a result. Once you see something that makes sense, you copy-paste, tweak, and make it work.
And it works. For everyone, everywhere.
Legal drafting is in this way more complicated.
Drafting is more than just substance; it is also about style, localisation, and applying a knowledge base that is constantly in flux.
Therefore, there are things that matter to lawyers (or legal drafters) that don’t matter to coders:
Quality, specificity and knowledge infused in prompting along with choice of models; and
Integration with workflow.
Are there tools that achieve this well? Yes.
Are enough professionals using them? No.
Why not?
Isn’t the next wave of innovation going to be driven by professionals empowered by the best of AI?
That is what we believed.
The change isn’t happening the way, or as fast as, we expected.
Metaphorically, you are trying to sell cars to horse riders.
People will buy cars if they want to drive something faster, better and more comfortable.
Horse riders will resist it, and their bosses are now dependent on them, and on the horses. They may buy the car, but they will keep spending on the horses.
This changes if more people know how to drive a car, and there are more cars on the road.
This is why many AI companies are now choosing to sell a taxi service rather than sell cars outright.
What use is selling cars to people who leave them in the driveway, not knowing how to drive, while their staff sabotage them?
AI, particularly in legal tech, may need to be delivered differently.
Instead of fighting irrational resistance to innovation, what if we sidestep the resistance altogether?
Until next time.
Hi Suhas,
I work with a legal AI startup, and I fully agree with your analogy of AI companies selling taxi services rather than cars. In fact, the tech company itself does a lot of legal work (research done by AI, with formatting and statute searches by associates) for clients that are themselves law firms.
This was a good read!