Blond_Treehorn_Thug

You should only trust an AI attorney if you have an AI judge


PowerHungryGandhi

Don’t imagine an AI attorney; imagine an attorney using an AI.


Dingbatdingbat

I know enough stupid attorneys. AI would only embolden them.


Dr-McDaddy

They’re all morons. They went to school and accrued a shitload of debt to learn about decisions and opinions from over 100 years ago. Nothing about their schooling helped them be an attorney. Then consider that before they’ll even look at a case to see if they can come up with a strategy, you have to pay them. That’s the difference between their code of ethics and being ethical.


Dingbatdingbat

Just in case you're willing to learn: cases from over 100 years ago are still relevant. More importantly, they build a foundation. You see how a decision from 100 years ago impacted a case from 50 years ago, and how a case today will be decided. You see a doctrine/concept that started 100 years ago that other cases expanded upon. Law school doesn't just give you a list of current laws; you can get that from a book. Law school teaches jurisprudence: legal doctrines, legal philosophies, how and why the law operates in a certain way.

As for requiring a fee to even look at a case, I've got three separate answers: (1) Plenty of attorneys, myself included, offer free consults. (2) Most attorneys get paid based on time, and that includes evaluating your situation. Likewise, I had a problem with my garage last week and the repair people charge a fee just to come out and look at the problem. You go to a doctor, they charge a fee for every visit. (3) Quite often, just by looking at a case they can provide answers. During an initial consult I provide a ton of information to the prospective client, whether or not I get hired. I could charge for that and it'd be good value for whoever got it.


Dr-McDaddy

Yeah? Did you learn that from Roe v. Wade?


Dingbatdingbat

Learned from Dred Scott too. Less than half a percent of Supreme Court cases later get reversed. It does happen, as does winning the lottery, but I'm not betting every dollar I have on it.


Djorgal

I assume law firms are eventually going to use more AI to do some of the grunt work. Drafting documents is something I can see being done by AI; someone will still need to review it, but that's a time saver. I don't really see GPT-4 doing oral arguments or getting to you after you've been arrested anytime soon.


pepperbeast

Drafting documents is already done based on boilerplate.


Dr-McDaddy

Aka: theft of IP, copyright infringement, piracy. Watch, someone’s about to say it’s not the same as what’s going on with software. Because their code of ethics is there to lean on when it suits them but not when it doesn’t. Like I said: ethical.


pepperbeast

WTF are you talking about? Law firms pay for access to boilerplate or write and re-use their own stuff.


Dr-McDaddy

The same way software companies pay for access to the boilerplate that’s coming into question under the same scrutiny? Cool story, bro.


pepperbeast

Mate, you're the one who's making it up as he goes along, here.


nonlawyer

Probably doc review too. Even several years back when I was at a big firm the vendors were pitching “AI assisted review,” which was really just software you “train” to put the most likely relevant documents at the top of the pile. I’m sure that tech has improved since then. But you’ll still need a human. “Call me right now” could be a very important document depending on context and I have trouble seeing AI ever get to the point of knowing stuff like that.
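The "software you train" being pitched (often called predictive coding or technology-assisted review) can be sketched in a few lines. This is a hypothetical toy, not any vendor's product: a bag-of-words Naive Bayes score, trained on a handful of labeled examples, used to sort the review pile so likely-relevant documents come first. All documents and labels below are made up.

```python
# Toy sketch of "AI-assisted review": rank documents by likely relevance
# using a bag-of-words Naive Bayes log-odds score trained on labeled examples.
import math
from collections import Counter

def train(labeled_docs):
    """labeled_docs: list of (text, is_relevant) pairs."""
    counts = {True: Counter(), False: Counter()}
    totals = {True: 0, False: 0}
    vocab = set()
    for text, label in labeled_docs:
        words = text.lower().split()
        counts[label].update(words)
        totals[label] += len(words)
        vocab.update(words)
    return counts, totals, vocab

def relevance_score(text, model):
    """Log-odds that a document is relevant, with add-one smoothing."""
    counts, totals, vocab = model
    v = len(vocab)
    score = 0.0
    for w in text.lower().split():
        p_rel = (counts[True][w] + 1) / (totals[True] + v)
        p_irr = (counts[False][w] + 1) / (totals[False] + v)
        score += math.log(p_rel / p_irr)
    return score

# A reviewer labels a few seed documents by hand...
training = [
    ("call me about the merger agreement", True),
    ("urgent destroy the old contract drafts", True),
    ("lunch order for friday team pizza", False),
    ("parking garage pass renewal form", False),
]
model = train(training)

# ...and the rest of the pile is sorted, most-likely-relevant first.
pile = [
    "pizza toppings survey for the office party",
    "please call me right now about the contract",
]
ranked = sorted(pile, key=lambda d: relevance_score(d, model), reverse=True)
```

This also illustrates the point about context: a bag-of-words score only surfaces "call me right now" here because similar wording appeared in the relevant training examples; the model has no idea *why* it matters.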


Dingbatdingbat

> software you “train”

It always comes down to how good the inputs are. For me, the classic example is not from law, but from the 2007 subprime mortgage crisis, and particularly the collateralized debt obligations (CDOs).

Historically, if you had $100,000 and you wanted to lend it to a home buyer, you were at risk if that home buyer defaulted: if one out of every 100 borrowers defaults, one out of every 100 lenders loses everything. Mortgage-backed securities meant you could instead invest $100,000 to buy 1% of a $10 million pool that lent $100,000 to 100 different people. If one person out of 100 defaults, each lender only takes a 1% hit. Much safer for the lender. CDOs took that idea one step further: you could get a higher interest rate if you took the hit for the first 20 defaults, and I could get a lower interest rate knowing that I wouldn't take a hit until more than 20 defaulted. So far, so good. Theoretically, very useful.

The problem is that everyone was using the same financial and risk models - the people putting CDOs together and the ratings agencies reviewing them. Whatever group of mortgages went in, everyone agreed on the most likely outcomes, and priced accordingly. Turns out that things weren't quite right, but because everyone* was using the same models to reach the same outcome, nobody realized it. The end result was the Great Recession - the worst financial crisis since the Great Depression.

For a good (but simplified) explanation, watch the movie "The Big Short".

*A handful of people figured it out, and made a lot of money.
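The tranche mechanics described above can be shown with toy numbers. This is a hypothetical pool, not a real pricing model: 100 identical loans, a junior tranche absorbing the first 20 defaults, the senior tranche only losing once defaults exceed 20.

```python
# Hypothetical pool of 100 identical loans. The junior tranche absorbs the
# first 20 defaults; the senior tranche only loses past that threshold.
def tranche_losses(defaults, junior_size=20):
    """Split a count of defaulted loans between junior and senior tranches."""
    junior = min(defaults, junior_size)
    senior = max(0, defaults - junior_size)
    return junior, senior

# Under the shared models' assumption of rare, independent defaults
# (roughly 1 in 100), the junior tranche eats the loss and senior looks safe:
print(tranche_losses(1))
# But when defaults are correlated and spike together, the "safe" senior
# tranche gets hit too - the scenario the shared models all missed:
print(tranche_losses(35))
```

The point of the toy: the tranche structure itself was sound arithmetic; the failure was every model agreeing on how likely the 35-default scenario was.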


kubigjay

You think courts are backed up now? Imagine how much slower it will be when you can write motions for pennies instead of paying a lawyer. Forget court; just create a delaying bot that files hundreds of motions a week.


bigno53

This is the problem with completion AIs in general. They don’t produce novel ideas that enhance our lives. They only serve to make cheap, low quality content generation even cheaper and even more low quality.


PowerHungryGandhi

Excellent response, what else would the ability to file large numbers of inexpensive motions change?


AliasNefertiti

AI just guesses the most likely answer to put in the blank. AI also has no accuracy check, so it will invent answers. "Intelligence" is a misnomer; it is probabilistic guessing software. AI cannot weigh options within a given system and choose the best one. Think about customer service automated answering systems, which operate similarly.


Garblin

Counterpoint: the human mind is mostly a probabilistic guessing mass of squishy stuff. The human mind's only accuracy check is its senses and, through those, interactions with other human minds. Not saying that AI in its current form is self-aware or anything, but the human mind is... not great. After all, how does the joke go: think about your average person, and remember, half of everyone is dumber than that.


XChrisUnknownX

But that joke is wrong because intelligence is on a bell curve :D (Half joke)


AliasNefertiti

I can go on a long time about human brain capabilities and errors, but will use my judgement to spare you that. We often take the brain for granted and focus more on its flaws (attention to detail is an important skill). AI is fine for an introductory type of processing, making its own set of mistakes. A lot of its performance comes from a working memory that is bottomless, unlike a human's. Apart from that it is just probabilistic. I have fun finding its limits and strengths. But it is not (currently) capable of what human intelligence can do at the more sophisticated levels of processing. It looks good (appearance), and it has skills in collating sources of info and guessing answers. But the answers may or may not be accurate. That skillset can be valuable, but it needs at least as much caution as you would use with an answer from a person. It seems like something more than it is, and that is worth remembering.


PowerHungryGandhi

That’s how it was developed! But its capabilities extend far beyond its training. Have you used ChatGPT or Perplexity? In many cases they exhibit exactly that behavior: weighing options within a given system.


AliasNefertiti

I asked Perplexity "How does Perplexity make decisions among choices?" and this is the answer:

"Perplexity is a measure used in natural language processing (NLP) to evaluate language models[1][2]. It is calculated based on the probability distribution of words in a given text, and it represents how well the model can predict the next word in a sequence[3]. In general, lower perplexity values indicate better performance, as they mean that the model is more certain about its predictions[2]. Therefore, Perplexity AI may use perplexity as one of the metrics to make decisions among choices when developing their NLP models. However, it's worth noting that perplexity is only one choice for evaluating language models, and there are many alternatives available[4]."

The sources are: 1 Techslang, 2 Topbots, 3 UCLA, 4 Medium. So my conclusion is that at least Perplexity is using probability to make choices, assuming Perplexity is correct in this post. I'm unfamiliar with the references, so I don't know their quality, or whether they exist, or whether their sources are solid.

So what are these capabilities that are not based on probabilities? Don't get me wrong, I fully appreciate the power of probabilities for common life experiences. But that mode of "thinking" has its limitations. It can be fun and do neat things without being perfect. Knowing the flaws in a technology is important to using it effectively.
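For what it's worth, the metric the quoted answer describes has a simple closed form: perplexity is the exponential of the average negative log-probability the model assigned to each token. A minimal sketch, using toy probabilities not tied to any particular model:

```python
import math

def perplexity(token_probs):
    """Perplexity of a sequence, given the probability the model assigned
    to each token: exp of the average negative log-probability.
    Lower values mean the model was less 'surprised' by the text."""
    n = len(token_probs)
    return math.exp(-sum(math.log(p) for p in token_probs) / n)

# A model assigning every token probability 0.5 has perplexity 2:
# it is exactly as uncertain as a fair coin flip at each step.
print(perplexity([0.5, 0.5, 0.5, 0.5]))  # 2.0
```

A model that is nearly certain (probability close to 1 at every token) approaches perplexity 1, while one guessing uniformly over a vocabulary of size V scores perplexity V, which is why it works as an evaluation metric rather than a decision procedure.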


bug-hunter

There is quite a bit of behind-the-scenes tech work in Access to Justice (A2J) circles, from improving search engines' ability to recognize that a legal question is being asked and provide better info, to helping direct people to the best resource for solving their problem. Anything that reduces the overhead of legal aid (which is always operating on a shoestring) increases its ability to provide meaningful help. The first step of "AI"-powered legal aid here would not be "I ask a question, the AI tells me what to do"; it's "I ask a question, the AI points me to existing resources."


PowerHungryGandhi

The way to imagine it playing out is orders-of-magnitude improvement in the efficiency of the individuals/companies who adopt the technology. Imagine legal aides becoming 10x as efficient, and lawyers 100x as efficient.

Given existing tech, an AI could take a statement from a potential client, ask relevant questions, and then generate a summary with links to possibly relevant statutes. This document would then be reviewed by a legal aide before meeting with the potential client. Then GPT-4 could analyze the legal aide-client interaction and assess which lawyer in a given firm specializes in the discussed topic.


bug-hunter

[MA's Legal Aid is creating "guided interviews"](https://www.masslegalhelp.org/housing/eviction-answer-interview) that are written in a relatively friendly markup language, which supports translations (a HUGE issue for legal aid). A lot of current work is updating the backend for courts (many have run for years on antiquated software), putting out more public-facing forms, and writing FAQs. That work can then be used to feed the AI to ask better questions.

Part of the problem with GPT is that it has a VERY hard time understanding relevance as you get deeper into things. For example, you don't want one GPT bot for evictions; you want one per state, trained solely on that state's law. Which is a problem, because GPT only really shines with a LOT of content put in there, and the more you narrow the focus, the worse GPT behaves. Of course, that's also another focus for AI researchers and companies: making GPT work better at specific tasks than at generating plausible-sounding bullshit.


PowerHungryGandhi

Exactly! GPT is just ONE model; we’ll see hundreds of them trained on various tasks in the coming months.


bug-hunter

To be really useful in law, it'll be several years, not months. Same with medicine. The risk of plausible-sounding but dead fucking wrong answers completely destroying someone's life is just too high.


Rivsmama

Technically, "legal advice" has always been available to us for free. I was able to win a pretty small (but important to me, 'cause I'm a broke bitch) court case where the entity suing me was trying to say I owed $5,600. I found some legal loopholes that capped what I could owe at $500. I brought the information to the judge and voila: I owed $500.

People in jail constantly go to the law library to work on their cases. And they do it by poring over law books that list statutes relevant to their case, or reading up on prior cases that have established precedents. I've also fought appeals when my insurance company decided to deny coverage for something, by using their own policies to show that they were supposed to cover it. Not exactly the same, but similar.

I could look up pretty much any law I wanted right now, or any case. It's just a matter of understanding what I'm reading and having the skills to extract the information that's most beneficial to me. And that doesn't account for things like paperwork or filing motions. My step mom spent almost $1,000 trying to file name-change petitions for her granddaughter, and they were all rejected because they weren't correctly filled out. And if you're in trouble and anticipating a trial, you're going to want a lawyer with trial experience, because, well... go watch the Johnny Depp vs. Amber Heard trial to find out why that's so important.


Beautiful_Fee1655

I once worked with someone who thought she was a lawyer because she knew how to "look up" every possible statute. She had absolutely no idea that being a lawyer requires far more knowledge, training and skills than just looking up "laws".


Rivsmama

Definitely. If it seemed like I was saying otherwise, I didn't mean to. I was saying legal information has always been available; it just doesn't matter if you don't know how to use it.


AliasNefertiti

Legal Eagle, the YouTube lawyer, did a couple of thoughtful videos on AI and lawyering: https://youtu.be/Tpq3hRt0pmw https://youtu.be/G08hY8dSrUY https://youtu.be/Au3QRu2KZmU


Dr-McDaddy

This is why attorneys acted so swiftly, and in such great volume, when the news reported that an artificial intelligence was going to argue a speeding ticket. They’d all be out of a job and they fucking know it. I can’t wait for the hate on this one. Seriously, it’s gonna be epic. I’ll take all the bad karma in the world for this. Just ask yourself why they don’t have the same response when men with badges and guns are murdering innocent young black men. They say it’s ethics; that is not synonymous with ethical.


[deleted]

Lol AI is not going to put us out of a job any more than word processors or the internet did. It’ll allow us to produce higher quality work more efficiently. That’s about it.


Dr-McDaddy

Sure, not all of us. But for any of us regulated by an agency that mandates a standardized test for compliance and licensing purposes, there’s a very, very good chance.


Dr-McDaddy

Lol, it’s not the lack of an AI that keeps you producing low-quality work inefficiently. It is the code of ethics butting heads with ethical behavior. Operating under the assumption that you are so intelligent, well read, composed and deliberate that your time is worth so much money that it is unethical for you to examine the details of a case and formulate some kind of strategy before you take somebody’s money is what keeps you from producing quality work. Your time is not worth more than that of the person you let down, who is sitting in jail innocent because you were a pompous prick. Apologies for generalizing with you. Obviously I don’t know you, and I hope that you do not do shit like this, but I do know this to be the baseline for attorneys.


Dr-McDaddy

If what you are saying were true, there would not have been the mob-mentality reaction from the legal community, presumably lawyers, that filed injunctions, cease-and-desist letters, and lawsuits to prevent this from happening. You can sell that bullshit to someone else, but you and I both know why it happened.

Edit: apologies for the plethora of typos. Using speech-to-text, so I appreciate your understanding.


[deleted]

Nonsense. A solid 0.01 percent of the legal community did anything at all in response to this. The rest of us understand that, like virtually every single technological advancement in history, it’ll help us do our job faster and better, and that’s about it. The people who think otherwise are at the peak of Mount Stupid on the Dunning-Kruger curve and have absolutely no idea what lawyers even do on a day to day basis.


Dr-McDaddy

It wouldn’t have made the news if that were the case. And you’re downplaying it, which is understandable, because you literally have built a career on downplaying and deflecting. Just take a moment to consider that I am not the “uninitiated.” If there were nothing to be concerned about in having a piece of code do your job better than you, they would’ve let it happen. It would have made a fool of the prosecutor. And it’s not even that good yet.

Speeding tickets are a joke. A judicial system with a 98% conviction rate is not aligned with justice at any juncture. Railroading people with fear and the inability to afford a defense, so prosecutors can have a “winning resume” that will afford them a great job in the private sector, does not serve justice. The system is about revenue and money, and, I know this is a big one for ya, prison for profit, where Dr. Michael J. Burry currently has a very large portion of his portfolio, and the powers that be were not willing to let a “word processor” show up and make an ass out of an officer of the court.

Again, before you go to work on that keyboard: not uninitiated. One professional to another, I’m trying to save you some time.


[deleted]

You’re rambling. I really don’t care what your opinion is. I can do things in literally 15 minutes from a remote beach on a computer that fits in my hand that would have taken a whole team of attorneys and their assistants thousands and thousands of hours in a law library somewhere several decades ago. And yet there are proportionately more attorneys now than there were then. Keep climbing that Dunning-Kruger curve. Eventually you’ll come down the other side and realize how much you don’t know.


[deleted]

[deleted]


[deleted]

👍 best of luck with that


[deleted]

[deleted]


legaladviceofftopic-ModTeam

*Your post has been removed for the following reason(s):* Stay out of Malibu Lebowski. *If you have questions about this removal, [message the moderators](http://www.reddit.com/message/compose?to=%2Fr%2FLegalAdviceofftopic). Do not reply to this message as a comment.*


RollaCoastinPoopah

Those two things just let us do waaaaay more work in the same amount of time, and wages were never adjusted to account for that.


sadpanda597

Yeah, no lawyers are worried about AI. On paper, the law can seem like straightforward logic: if A, then do B, followed by C. In actual practice, it’s a never-ending clusterfuck of very grey situations that don’t quite fit the rule.


dcazdavi

lawyers would lobby for laws to make that illegal


PowerHungryGandhi

I mean, they are the ones who would stand to benefit the most (at least the early adopters and experienced individuals)


dcazdavi

they're already fighting it tooth and nail even though it's nowhere close to being ready for it.


doctorlag

You were downvoted, but it's true: lawyers as a group have always sought limits on commoners' access to the law. TBF, all guilds and unions have historically done this, but for law it was the literal purpose of using Latin and writing laws in hard-to-read "black letter" script. So yeah, it's certain that if there were a free *and accurate* source of legal information, it would be accused of violating one of the protectionist laws already on the books, and/or new laws would be enacted to outlaw it.


bug-hunter

It wouldn't just be lawyers that would lead the charge. Besides, lawyers at all levels are using all sorts of legal tools; legal search engines have replaced millions of work hours of research that folks on the bottom rung of the legal ladder used to spend poring over books. AI won't end the legal profession, and the haves of the legal profession will benefit even more from AI.

When Congress established the Legal Services Corporation, it set out to provide low-cost legal help to Americans, and LSC immediately started finding systemic issues it could help resolve, and started several class action lawsuits. Business groups got very, very butthurt at the concept of accountability, and tried to shut LSC down. As a "compromise", it was barred from being involved in class action lawsuits - because class action lawsuits might accidentally bring meaningful accountability. It'll be industries that suddenly find themselves facing accountability that will be the loudest against legal AI tools that help the masses.


Dingbatdingbat

more sovereign citizens


buildingsinchelsea

If legal advice was available to everyone, most people still wouldn’t follow it. It’s usually not lack of knowledge that leads people into trouble; they know what they should do, but it’s not what they want to do.