Story ‘written’ by Leeds professor a hit on social media, but the tech has him ‘oscillating between extreme excitement and abject terror.’
If you’ve ever taken a Zoom call with Kai Larsen from his home, you’ve probably met—or at least heard—Dewey.
Dewey is Larsen’s pet cockatoo, and since disrupting video calls is Dewey’s expertise, Larsen thought maybe his pet could showcase some truly disruptive technology. So he tasked ChatGPT with writing a bedtime story for his son, Alex, starring Dewey.
Dr. Seuss it is not, but while “The Cat in the Hat” is supposed to have taken 18 months to write, ChatGPT generated Dewey’s story in seconds. Larsen continued to feed it prompts to help the software perfect the story; all told, he spent about 10 minutes working on it, using a similar platform, DALL-E, to generate appropriate images.
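Larsen worked through the chat interface, but the same workflow could, in principle, be scripted. The sketch below is purely illustrative, assuming OpenAI’s Python client (v1+), an API key in the environment, and stand-in model names and prompts rather than Larsen’s actual ones.

```python
# Illustrative sketch of the draft-refine-illustrate workflow described above.
# Assumes the official `openai` Python package (v1+) and OPENAI_API_KEY set in
# the environment; model names and prompts are assumptions, not Larsen's own.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Ask the chat model for a first draft of the bedtime story.
draft = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user",
         "content": "Write a short bedtime story about Dewey, a rescued cockatoo, "
                    "for a teenager named Alex."},
    ],
)
story = draft.choices[0].message.content

# 2. Feed follow-up prompts to refine the draft, much as Larsen did over ~10 minutes.
revision = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "assistant", "content": story},
        {"role": "user",
         "content": "Make the ending warmer and add a scene where Dewey interrupts a Zoom call."},
    ],
)

# 3. Generate a matching illustration with DALL-E.
image = client.images.generate(
    model="dall-e-3",
    prompt="A storybook illustration of a white cockatoo perched on a laptop during a video call",
    n=1,
    size="1024x1024",
)
print(image.data[0].url)
```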
At 15, Alex is a little old for a bedtime story, “but he was very excited about it,” Larsen said. When his child was born, Larsen ordered a “custom” book celebrating the start of a new life, “but all they did was print the name into the same book, right? But I noticed that it really resonated with my child at that time—you know, seeing yourself in a story. It isn’t high art by any means, but it can be more personal and more targeted.”
ChatGPT is a chatbot that OpenAI launched in the fall. It’s not the first of its kind, but ChatGPT has generated incredible buzz for its natural-sounding dialog—generated in response to user prompts—which has been used for everything from writing and composing to teaching novices how to code and helping students write term papers.
“It’s hard to think of a place it will not create impact. I think that’s the much easier, if incomplete, way of looking at it.”
Professor Kai Larsen
For Larsen—an expert in machine learning and natural language processing, and faculty director of Leeds’ master’s program in business analytics—ChatGPT, and the possibilities it presents for business, have him “oscillating between extreme excitement and abject terror.”
“We have never found a technology that didn’t create opportunities as well as risks,” Larsen said. “But this is as powerful a technology as I’ve seen in a long time, and we need to be very thoughtful about how we implement it.”
Take something as straightforward as email. Larsen said he may engage in as many as 50 conversations a day, “and it feels like all day, I’m making decisions in Zoom, email, and LinkedIn and Slack messages, instead of directly furthering my research or teaching,” he said. “For industry, we’re going to see even more communications that will be harder to distinguish from something a genuine person would create. And as that takes up more of our bandwidth, we will need A.I. tools to manage our lives. It will function as a feedback loop.”
And unlike the custom book he bought for his child all those years ago, created and printed at significant cost, A.I. can quickly generate and customize a video message for thousands of viewers. A business could send customers a personalized thank-you video for purchasing a product, then recommend other services they might like based on their purchase history, at a speed previously unimaginable.
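As a rough sketch of what that kind of personalization at scale might look like, the snippet below loops over hypothetical customer records and drafts a note for each one. The customer data, prompt wording, and model name are illustrative assumptions, not a description of any real company’s system.

```python
# Hypothetical sketch: drafting a personalized thank-you message per customer
# with a language model. All data, prompts, and the model name are stand-ins.
from openai import OpenAI

client = OpenAI()

customers = [
    {"name": "Sam", "purchase": "noise-cancelling headphones",
     "history": ["travel pillow", "carry-on suitcase"]},
    {"name": "Priya", "purchase": "espresso machine",
     "history": ["coffee grinder", "milk frother"]},
]

for c in customers:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{
            "role": "user",
            "content": (
                f"Write a two-sentence thank-you note to {c['name']} for buying "
                f"{c['purchase']}, and suggest one related product given their past "
                f"purchases: {', '.join(c['history'])}."
            ),
        }],
    )
    print(response.choices[0].message.content)
```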
Reinforcing the digital divide?
However, as A.I. is entrusted with more responsibilities, fluency with these tools will become crucial not just to how we work, but to how we learn—in higher education and even K-12 schools.
Students who can afford services like ChatGPT—free for now, but not forever—and who understand how these tools work and how they can help manage the digital deluge of the A.I. age will have a major advantage over those who cannot. Call it another crack in the digital divide.
“These tools are going to teach students better than they could learn from humans alone,” Larsen said. “Students are going to be smarter and more efficient because they have access to these tools. But if we don’t have equal access to these technologies, we might just further cement inequality in our society.”
And growing corporate investment in this space, along with how quickly the technology can learn and adapt, means it is evolving and improving all the time. But rapid improvement is not the same as reliability.
“You have to remember, at its core, ChatGPT does not know a single sentence. It’s generating this content from scratch based on an understanding of language, rather than an understanding of the underlying facts,” Larsen said.
So, if you ask ChatGPT for a list of ways to manage a troubled employee, the suggestions you get will not necessarily be grounded in science or ranked by importance—or even be true to the subject at hand.
In “Dewey the Unhappy Cockatoo,” the protagonist is happy and sociable after being rescued. The real bird was different, Larsen noted: “Dewey never really made friends. He was sort of a loner. But I accepted that the story was good enough for the purpose.”
Will it be good enough when the stakes are higher? Or will the tools, and our ability to use them effectively, improve to the point where these discrepancies disappear?
Yes, Larsen said: “Look to the areas where people and language models work together toward faster and better outputs.”
On the question of where it will affect our lives, he suggested a different approach. “It’s hard to think of a place it will not create impact,” he said. “I think that’s the much easier, if incomplete, way of looking at it.”