


Chatbots produce bogus citations (was: Re: Lawyer admits using AI for research after citing 'bogus' cases from ChatGPT)

From vallor <vallor@vallor.earth>
Subject Chatbots produce bogus citations (was: Re: Lawyer admits using AI for research after citing 'bogus' cases from ChatGPT)
Newsgroups comp.ai.shells
References <1qbffa5.1adff801ndrrcbN%snipeco.2@gmail.com> <y4OdM.3421888$iS99.3297960@fx16.iad> <1qbnhi9.3p966e1nlu1yjN%snipeco.2@gmail.com>
Message-ID <ue6eM.2302478$gGD7.405554@fx11.iad> (permalink)
Organization blocknews - www.blocknews.net
Date 2023-06-01 19:23 +0000



On Thu, 1 Jun 2023 14:33:13 +0100, Sn!pe wrote:

> vallor <vallor@vallor.earth> wrote:
> 
>> On Sun, 28 May 2023 06:17:15 +0100, Sn!pe wrote:
>> 
>> > Lawyer admits using AI for research after citing 'bogus' cases from
>> > ChatGPT.
> 
> ---
> <https://www.telegraph.co.uk/world-news/2023/05/27/lawyer-chatgpt-made-up-cases/>
> 
> The above, bypassing paywall:
> <https://12ft.io/proxy?q=https%3A%2F%2Fwww.telegraph.co.uk%2Fworld-news%2F2023%2F05%2F27%2Flawyer-chatgpt-made-up-cases%2F>
>   
> TinyURL of above:  <https://tinyurl.com/yntupbe4>
> ---
>  
>> Poor example of "don't trust, do verify".
>> 
>> 
> Hence my earlier (unchallenged) point that ChatGPT does not provide
> citations.
> 

It does provide citations sometimes.  And sometimes, those citations
are bogus, which is what happened to our hero the cyberlawyer...

> 
>      ---[remainder left unsnipped for context]---
> 
> 
>> > ---
>> > Steven Schwartz used program to 'supplement' his work for a 10-page
>> > submission to the Manhattan federal court.
>> > ---
>> > A New York lawyer has been forced to admit he used the artificial
>> > intelligence tool ChatGPT to carry out legal research after it
>> > referenced several made-up court cases.
>> > ---
>> > Steven Schwartz, who works for Levidow, Levidow and Oberman, is on a
>> > team representing airline passenger Roberto Mata who is suing the
>> > firm Avianca for injuries suffered when a serving cart hit his knee
>> > during a flight from El Salvador to JFK airport in New York in 2019.
>> > Mr Schwartz used the AI program to "supplement" his research for a
>> > 10-page submission to the Manhattan federal court outlining why his
>> > client's case should not be thrown out.
>> > 
>> > The legal brief, submitted in March, cited six previous cases dated
>> > from 1999 to 2019 to bolster his argument for why the case should be
>> > heard despite the statute of limitations having expired.  But neither
>> > the airline's lawyers nor the judge could find the decisions or
>> > quotations summarised in the brief.   [continues]
>> > ---
>> > <https://www.telegraph.co.uk/world-news/2023/05/27/lawyer-chatgpt-made-up-cases/>
>> > 
>> > The above, bypassing paywall:
>> > <https://12ft.io/proxy?q=https%3A%2F%2Fwww.telegraph.co.uk%2Fworld-news%2F2023%2F05%2F27%2Flawyer-chatgpt-made-up-cases%2F>
>> >   
>> > TinyURL of above:  <https://tinyurl.com/yntupbe4>
>> > ---
>> >   
>> >   
>> > - and so the nightmare begins...
>> 
>> It (the nightmare) has been around for over a year.
>> 
>> There's talk about the licenses for code (or text) that these gadgets
>> auto-generate, since they might take snippets of code (or text) right
>> off the net.  But on the other hand:  sometimes they just make up stuff
>> that sounds plausible.
>> 
>> Another example: mrs. vallor got it to generate a story about a rabbit,
>> then googled the resulting text.  She found similar text online, and
>> thought she'd found evidence of potential plagiarism -- but it was
>> dated *after* ChatGPT's cutoff date.
>> We figured that they might be using ChatGPT to write those children's
>> stories and post them to the web.

-- 
-v



Thread

Lawyer admits using AI for research after citing 'bogus' cases from ChatGPT snipeco.2@gmail.com (Sn!pe) - 2023-05-28 06:17 +0100
  Re: Lawyer admits using AI for research after citing 'bogus' cases from ChatGPT Nic <Nic@none.net> - 2023-05-28 08:29 -0400
  Re: Lawyer admits using AI for research after citing 'bogus' cases from ChatGPT Blue-Maned_Hawk <bluemanedhawk@gmail.com> - 2023-05-28 15:53 -0400
  Re: Lawyer admits using AI for research after citing 'bogus' cases from ChatGPT vallor <vallor@vallor.earth> - 2023-05-31 20:27 +0000
    Re: Lawyer admits using AI for research after citing 'bogus' cases from ChatGPT snipeco.2@gmail.com (Sn!pe) - 2023-06-01 14:33 +0100
      Chatbots produce bogus citations (was: Re: Lawyer admits using AI for research after citing 'bogus' cases from ChatGPT) vallor <vallor@vallor.earth> - 2023-06-01 19:23 +0000
        Re: Chatbots produce bogus citations snipeco.2@gmail.com (Sn!pe) - 2023-06-01 20:45 +0100
          Re: Chatbots produce bogus citations vallor <vallor@vallor.earth> - 2023-06-07 23:32 +0000
            Re: Chatbots produce bogus citations Andy Burns <usenet@andyburns.uk> - 2023-06-08 06:07 +0100
              Re: Chatbots produce bogus citations snipeco.2@gmail.com (Sn!pe) - 2023-06-08 10:37 +0100
              Re: Chatbots produce bogus citations Nic <Nic@none.net> - 2023-06-08 10:46 -0400
