Is it Times for limits on A.I. and the news?

By Joe Arney

If you’re like most people, when you saw an update to the terms and conditions for using The New York Times’ website last week, you just accepted them and moved on. 

But there was something unusual in this particular update—a prohibition on artificial intelligence. 

“A lot of folks who are creating content—reporters and writers, but also artists and others—are discovering that their work is essential for these models to function,” said Robin Burke, professor and chair of the information science department at CU 鶹ӰԺ’s College of Media, Communication and Information. “And yet there’s this business model for which this is the input, but there’s no compensation for it.”

Generative A.I. platforms like ChatGPT create content based on user input. If you ask it to write a thank-you note to your grandmother for the sweater she knitted for your birthday, it draws upon all the text it has “read” online and generates a fairly convincing note. But there is no recognition for the writers whose prose makes up the source material that makes the A.I.’s output possible. 

The Times’ action forbids A.I. systems from scraping its content to train machine learning systems. So far, it’s the most influential shot fired as A.I.’s perceived impact looms in newsrooms, creative fields and beyond.  

“The first round with A.I. has kind of been a free ride, because nobody was paying attention to what they were doing,” Burke said. “Now, I think it makes sense that the organizations producing content are thinking, ‘Do I really agree with this as a usage of my work?’”

A unique perspective on news, A.I.

Burke has unique expertise in this arena. He’s the son of a newspaper publisher and a scholar who is part of a team that’s creating tools for the close study of news recommender systems and their impacts on users, including journalists and editors. 

The Times is facing the same challenges as other papers in this new chapter of the digital age. But with a robust subscriber base and a global audience, it is not really in the same category as the daily newspapers that have been constricted by technologies that moved audiences online and siphoned away significant advertising revenue. It’s easy to read journalists’ concerns over A.I. as a chance to correct what the industry got wrong at the dawn of the internet—when publishers made their news free to everyone online, counting on the new technology of digital advertising to pay the bills. 


“In the early days of the Internet, people had a lot of different crazy ideas,” Burke said. “And certain models came out of that—some thrived, some failed—but as it relates to A.I., we’re not far enough along to understand who the winners are.”

Need proof? A month before the Times changed its terms, The Associated Press signed a deal allowing OpenAI, the company behind ChatGPT, to train on its archive of stories going back to 1985.

“AP is a little different, in that their model is very different from the Times—they get their money mostly from publishers for using their content,” Burke said. “It might also be the case that OpenAI saw the writing on the wall and looked to AP as a reliable source, especially in case other publishers start to lock them out.”

It’s something Burke feels is worth watching as he continues his research, particularly as smaller papers face the choice of whether to restrict access to their reporting or to consider A.I.’s role in the newsroom. If you task A.I. with analyzing government records in search of scandal, it’s not a far leap to simply ask an algorithm to write the story, leaving out human judgment altogether. 

“Part of that recommendation equation is this question of credibility,” Burke said. “So when an article is recommended to you, what does the system need to do to ensure it’s credible—even if I might prefer some version of the news that suits my ideal ideological inclinations better?

“It’s why I think it’s such an important research goal to explore more of this space.” 
