> In one case, a judge imposed a fine on New York lawyers who submitted a legal brief with imaginary cases hallucinated by ChatGPT — an incident the lawyers maintained was a good-faith error.
They need to be disbarred. Submitting legal filings that contain errors because you used ChatGPT to make up crap is the opposite of a "good-faith" error.
In that NY case, they were only fined $5000 each, which seems like a slap on the wrist.
I think the court was sympathetic to them, as was I. They were the first lawyers to get burned by ChatGPT. How were they supposed to know that a highly publicized product from a major tech company would just make shit up?
But the penalty for lawyers who do it now, after the first case got so much publicity, should be more severe.
When we had to register the ownership transfer of our family home, despite taking YEARS to prepare the documents, the notary hadn't even bothered to look up the name of the street on Google Maps and wrote the wrong name on the contract. So we had to waste an hour just to fix that.
They get paid a percentage of the value of the property being transferred, and couldn't even spend two minutes writing the name of the correct street.
By "taken a look at what they were submitting", I assume you mean use a different tech product, like LexisNexis? In the end you're still trusting technology.
And technology was usually trustworthy, until AI. If Google Scholar cited a legal case, I wouldn't doubt its existence.
It took some time for people to recalibrate for this new world where technology lies.
Whatever tool they used, even if it’s microfiche or a roll of sheepskin, their eyes should have looked at the contents of the case they were citing, before they submitted it to a judge.
If I had to submit code to a judge that decided the life of a person, I’d be reading through my NPM dependencies file by file, if not line by line.
In an idealized world, with infinite time and no financial constraints. In the real world, clients get screwed over all the time because they can't afford to pay lawyers that much.
It's a shame that ChatGPT isn't trustworthy, because if it were, it could really help reduce the cost of legal representation and create a fair legal system.
I'm not sure if disbarring is appropriate here, but then again I'm no legal professional.
I don't know what the appropriate response would be to a lawyer lying to the court and making up facts. I don't think something as bad as making up lawsuits even happens in normal legal proceedings. I'd presume fines and other types of punishment, depending on whether the lawyer is stupid enough to lie about using ChatGPT, like in the American case.
The person who made the mistake of hiring this lawyer will probably have grounds to sue them for malpractice, especially if they end up losing this case. I know I'd want my money back if my lawyer didn't even bother to read the paperwork they were filing.
This lawyer will now have "lawyer lied to the court" show up the moment you Google their name. I think that, plus a hefty fine, is more than enough punishment. Whether or not their future clients will trust them after this is up to them.
A fine is appropriate. There's no reason to destroy someone's life because of this. There are numerous forms of disciplinary actions available that don't involve needlessly and permanently destroying a person's livelihood.
Does disbarment really destroy someone's life? It considerably reduces their career options and means they'll probably end up much less wealthy than they would otherwise have been, but they can still find another job.
By contrast, if a lawyer makes a mistake that gets someone a criminal record they didn't deserve, that person has much more ground to say it destroyed their life.
I don't think cases like these are comparable to bumping into another car. Legal proceedings can have life-altering effects, and lawyers are trusted to take on that responsibility. In this case, we're talking about a family that wants to visit China, and I'm not sure whether they'll be punished for their lawyer's incompetence, but civil lawyers also deal with life-changing amounts of money. This isn't just their own reputation and livelihood they're putting on the line.
I don't think this particular lawyer should be disbarred, but I do think submitting lies and confabulations to the court should be punished strictly. Attempts to deceive the court should not be tolerated, especially not when the lawyer didn't even do the work they put their signature under.
The justice system is screwed up more than enough, we don't need professionals getting away with this crap to make it even worse.
Imagine if a doctor performed an operation based on the first result of a Google search, without verifying that it was the correct one or even presented accurately. Should they not lose their medical license?
Using an LLM to produce a document and then not verifying each part of it is willful negligence. Either they don't care to do the right thing, or they're too ignorant. In both cases disbarment seems reasonable. They can always go work in fast food or something afterwards.
I found out that people are more evil and cruel in general than I had believed. The internet just made it first explicit and, later, socially acceptable.
If the user didn't understand that ChatGPT makes up crap sometimes, despite the warnings everywhere about it that they may not have read, it could still be a good-faith error to me. ChatGPT had only just been released.
If the user doesn't understand their tool then they should review what it spits out.
> I found a guy who claims to have the sum of all knowledge, and I've just copied and pasted code from him that I haven't reviewed.
Then either you're an idiot and should be disbarred, or you are negligent and should be disbarred. As a lawyer you have people's lives in your hands; this is not the time for "whoops, sorry, I just couldn't be arsed to read what I submitted".
If the user is a lawyer who officially referenced a case that a) they never read and b) never existed, then the user can’t be allowed to practice law anymore.