‘If You Need True Answers, Don’t Use It’


David Schwartz is a renowned AI skeptic who frequently warns about the potential dangers of modern AI-powered tools. He has now commented on the practice of using AI to work with court documents.

ChatGPT’s answers only look right, Ripple CTO says

The AI-powered chatbot ChatGPT and similar large language models (LLMs) behind generative AI programs provide answers that are not true but definitely look like they are. As such, they should not be used to find true answers to "serious" questions.

Schwartz made this statement in a comment on an analysis by prominent lawyer Steve Vladeck, the Charles Alan Wright Chair in Federal Courts at the University of Texas School of Law. Vladeck shared examples of forged court case documents generated by ChatGPT.

The lawyer recommended never using ChatGPT or similar tools for legal research. He also urged people not to try to train LLMs with fake court documents.

Schwartz is most concerned about the dangerous similarity between AI-generated fake documents and real ones. Even a skilled researcher can fail to identify which is genuine:

ChatGPT’s job is to give you output that looks as much like the output that kind of question typically gets as possible. Whether it’s actually *true* in the sense of corresponding with actual things in the real world is simply not a consideration

As covered by U.Today previously, Schwartz started raising concerns about the possible dangers of AI during the first phase of ChatGPT hype.

He opined that, in the near future, AI will be able to generate live footage of terrorist attacks that never happened.

Crypto segment getting tired of AI hype?

While the AI mania has triggered some spectacular rallies in low-cap altcoins that somehow managed to ride the "AI narrative," not everyone in crypto is enthusiastic about it.

DeFi analyst and Cinneamhain Ventures partner Adam Cochran admitted that, to him, using AI in crypto tech marketing strategies in 2023 looks like a "red flag":

I will 100% insta market sell any token I have that pivots to something in the AI space. Trend chasing just means they can’t build a real product

The influence of AI on crypto and blockchain can be heavily overmarketed. As we demonstrated in our guide, AI-generated text might be useful for practicing basic smart contract coding skills, but ChatGPT's answers are too vague and confusing for crypto research.

For instance, its characterization of the XRP Ledger looks detailed and accurate but can in fact be misleading, and it adds little to one's knowledge of the blockchain.

Sourced from u.today.

Written by Vladislav Sopov on 2031-10-21 09:38:07.
