AI Writing Bots Will Change the Lives of Students and Professionals, Says University of Houston Law Center Scholar Salib
The use of artificial intelligence platforms such as ChatGPT will have widespread implications for education and professional work, according to University of Houston Law Center Assistant Professor Peter Salib. While ChatGPT can be misused and can produce inaccuracies, Salib added that, used properly, it can benefit students and professionals.
“For everybody whose main work is writing things on a computer, this is a tool that is going to change how you work, especially as it gets better,” Salib said. “ChatGPT produces mediocre content in response to complex questions. There might be some incentive to plagiarize, but probably not if a student wants an A.”
“On the other hand, I’m not sure it’s right to think of using those kinds of language models in the classroom just through the lens of plagiarism. They’re extremely useful tools, and they’re going to be extremely useful tools for real people doing real work. I think we do a disservice to students if we say these tools are not part of education, and that they’re forbidden to use them as they work their way through law school or their undergraduate education.”
Salib said students should learn to use resources like ChatGPT, which can help shape how a student thinks through issues and articulates ideas. When it comes to setting university policy, however, he said the question becomes more complicated.
“There probably shouldn’t be just one policy for all kinds of assignments,” he said. “We probably need something that’s not one size fits all. There should be some kinds of assignments where students are told not to use a language assistant at all, so they develop the chops of writing something from scratch, thinking of something from scratch. There probably should be some assignments for which the requirements are use whatever tools you would like to produce something, but the final product should be more than 70% words you wrote.
“As professors, we have to ensure that when students do work for us that doing it well requires that you actually learn something. If an essay question is just copied and pasted into a ChatGPT prompt and the answer earns a B or B+, then we’re not teaching students to use ChatGPT as a tool that helps them think, we’re teaching them to use it as a replacement for thinking, and that’s not good either.”
Salib said ChatGPT can help students formulate a first draft or an outline, but relying on it as the sole source for an assignment is unlikely to translate into good grades. A key flaw of the technology is that it can produce fabricated information.
“In most fields, especially in a field like law, you need what you write to be true,” Salib said. “Most of what ChatGPT writes is true, but some of it is made up. If you’re a lawyer, you have to check that the cases ChatGPT cites and characterizes actually exist and say what the platform claims they do.”
In addition to his role at the Law Center, Salib is an associate faculty member at UH’s Hobby School of Public Affairs. Prior to joining the Law Center’s faculty, Salib was a Climenko Fellow and Lecturer on Law at Harvard Law School. After graduating from law school, Salib clerked for the Honorable Frank H. Easterbrook and practiced law at Sidley Austin LLP, where he specialized in appellate litigation.