For the second time in less than a month, Luke Lafreniere was awakened in the middle of the night by a company crisis. Running on little sleep, Lafreniere said he decided to have ChatGPT, an artificial intelligence-powered chatbot developed by OpenAI, help him write a difficult company-wide email explaining what had happened and that everything was OK.
“I was just like, ‘Man I’m wiped, I don’t want to make sure this is very eloquent right now,’” Lafreniere said. “So, I just fed all the notes to ChatGPT … had it write everything out, and then I edited it from there.”
Lafreniere is the chief financial officer of Linus Media Group, or LMG, a tech-focused media company that operates seven YouTube channels with a combined total of over 27.5 million subscribers. He is also the chief operating officer of Floatplane Media, a subsidiary of LMG that distributes paid-subscription-based content made by creators.
While he ultimately still had to rewrite portions of the AI’s draft to match his style and tone, Lafreniere said the chatbot allowed him to complete the email in less time.
“The original message from it was extremely formal and not in a way that I would talk at all,” Lafreniere said. “It did take editing, but it got me a very significant portion of the way there.”
While this use case for ChatGPT is not particularly exciting, Lafreniere said it illustrates how commonplace the technology could become if widely adopted.
“I’ve been interested in these types of tools for a long time, but the only things that I could get my hands on were very, very rudimentary,” Lafreniere said. “When ChatGPT itself came out … the thing that really was amazing with that was the barrier of entry being dropped by a huge amount.”
The modern day: The rise of AI
Released to the public on Nov. 30, 2022, ChatGPT made waves throughout the tech world and into the mainstream. The AI-powered chatbot was developed by OpenAI, a company that has built several varieties of AI software, each with its own specialized use case.
Part of what made ChatGPT so revolutionary was the support it received from large corporations, said Ricardo Gutierrez-Osuna, Ph.D., professor of computer science and engineering at A&M.
“I think that the main advantage of OpenAI is [the] injection of money and access to lots of data,” Gutierrez-Osuna said. “In this type of game, or industry, you need lots of data. If you have access to lots of data, you have an unfair advantage over the competition.”
Research into how AI technology can assist journalists has already begun with funding from organizations like the Knight Foundation, a nonprofit founded in 1950 by John S. and James L. Knight. The foundation’s goal is to support informed and engaged communities through journalism in the modern age, according to its website.
In May 2021, the foundation began a $3 million initiative to assist local newsrooms in implementing AI into their reporting, according to the website. One such program included a $750,000 endeavor with The Associated Press, or AP, to begin AI workshop training within local newsrooms.
The AP itself also partnered with five local newsrooms across the country to further develop the use of AI in journalism, according to the AP’s website. One of these five newsrooms is the radio station WUOM-FM at the University of Michigan.
The project at WUOM-FM was started around four years ago and was developed to better cover city council meetings throughout Michigan that reporters might not otherwise have been able to attend, WUOM-FM reporter Dustin Dwyer said.
“The idea was to use the videos that are online, [and] create a program that could spot when there was a new meeting video for a city council meeting … and then generate a transcript of that meeting,” Dwyer said.
While the program originally received funding from the Google News Initiative — an initiative to support local newsrooms — Dwyer said the AP got involved to help develop the tool into something that more journalists could access.
“What we’re really focused on right now is really two things,” Dwyer said. “Improving the transcription model that we use … [and] creating a system of alerts so you can go onto our site, you can log in, create your profile and then you specify the keywords that you’re interested in as a reporter.”
The new transcription model WUOM-FM is switching to is called Whisper, Dwyer said. Whisper is free audio transcription software developed and released by OpenAI, the same company that created ChatGPT, according to OpenAI’s website.
“I think everyone has experienced these automated transcripts and how they’ve developed these last few years,” Dwyer said. “They’ve gone from being kind of crappy to now being potentially pretty good … With Whisper’s largest model, the transcripts can be pretty decent.”
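For readers curious what that workflow looks like in practice, here is a minimal sketch of how a meeting recording could be run through OpenAI’s open-source Whisper package and then scanned for a reporter’s keywords. The file name, model size and keyword list are hypothetical placeholders, not details of WUOM-FM’s actual system.

```python
# Minimal sketch: transcribe a meeting recording with OpenAI's open-source
# Whisper package, then scan the transcript for keywords a reporter cares about.
# The file name, model choice and keywords are illustrative, not WUOM-FM's setup.
import whisper

model = whisper.load_model("large")                 # larger models are slower but more accurate
result = model.transcribe("council_meeting.mp3")    # hypothetical recording
transcript = result["text"]

keywords = ["zoning", "budget", "water rates"]      # hypothetical reporter interests
hits = [kw for kw in keywords if kw.lower() in transcript.lower()]
if hits:
    print("Alert: meeting mentions " + ", ".join(hits))
```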
Even though AI writing has only recently garnered widespread attention, Alex Miller, the digital content coordinator for The Eagle newspaper in Bryan-College Station, said he remembers AI being used in journalism as far back as 2019.
“I heard about AI being used back when I was interning at the San Antonio Express-News,” Miller said. “At that time, we would pull just news briefs on pro sports games … and sometimes the AI would come up with a paragraph summary of a baseball game with the results.”
These short briefs were generated by a company called Automated Insights in partnership with the AP to save sports journalists time, according to the Automated Insights website. While Miller said these early automated recaps were simple and short, he said the technology impressed him at the time.
“It was very generic, but it got the job done to encapsulate what happened [in a game],” Miller said. “I was like, ‘Oh wow, that’s kind of wild that they do that.’ That was the first time I had ever heard about anything like that.”
The potential of AI in applications like these cannot be ignored, Lafreniere said, and its influence might rival that of the widespread introduction of the internet.
“There was a massive transformation in how people work,” Lafreniere said. “There was a massive transformation in how efficient people were. … We have no idea where [AI] is going to be in six months, let alone five years.”
This does not mean, however, that media companies should start replacing their workforce with AI, Lafreniere said.
“I think a lot of the companies that use this as an opportunity to trim their workforce are going to be the companies that fall off, not the companies that succeed necessarily,” Lafreniere said. “I think the more aggressive, and therefore more successful approach, is going to be to use this [for] extreme efficiency to outperform other companies.”
A similar sentiment was shared by ChatGPT itself, which said AI should not substitute for human input outright.
“AI can assist in various aspects of journalism such as analysis, fact-checking, language translation and even content creation,” ChatGPT said. “However, it’s important to note that AI should not be viewed as a replacement for human intelligence and judgment … it should be seen as a tool.”
Another way to think of AI as a tool is to compare it to the calculator, Gutierrez-Osuna said.
“This happened in my lifetime, you know, calculators arrived, they became available to the public in the 1970s when I was in school,” Gutierrez-Osuna said. “We learned to use calculators and there was a lot of hesitation with people learning to use the calculator and forgetting to be able to do math by hand.”
The introduction of AI into journalism is just the next step in the continued digitization of media, according to the 2019 article, “Artificial Intelligence and Journalism” by Meredith Broussard and Nicholas Diakopoulos in the journal “Journalism & Mass Communication Quarterly.”
Even though this next step in digitization has begun, humans still need to be at the heart of journalism, according to the article.
“Journalism is a deeply human endeavor,” the article reads. “Whether we are researching how humans use machines in journalism, or we are using machines to research or produce a story, the point is to report on and find insights into humanity.”
The future of AI in the newsroom, as envisioned by Broussard and Diakopoulos in their article, is one that includes both humans and AI. They advocate for approaching the technologies used in journalism from a human-centered perspective.
“In short, the future of AI in journalism has a lot of people around,” the article reads. “Scholarship and practice should, therefore, seek to undertake an agenda for studying human-centered AI in journalism.”
The history of AI: Where it came from and how it works
Although AI technology, such as ChatGPT, has gained mainstream attention only recently, its origins date back to the 1950s, as highlighted by the 2019 journal article “A Brief History of Artificial Intelligence” by Michael Haenlein and Andreas Kaplan.
The term “artificial intelligence” first emerged in New Hampshire in 1956, according to the article. It appeared in the title of the “Dartmouth Summer Research Project on Artificial Intelligence,” or DSRPAI. As the article details, the DSRPAI was created to bring together researchers to begin exploring the possibility of creating computers that could emulate human intelligence.
One of the largest advancements in the field of AI came with the idea of neural networks, Gutierrez-Osuna said, which enable AI to accomplish more complex tasks as additional layers are added.
“Neural networks have been known for decades,” Gutierrez-Osuna said. “The very early neural networks were proposed in the late 1950s … people knew how to train them, but these were networks with one layer … [and] the community realized that these single layer networks were not all that useful.”
As the name suggests, a neural network is modeled on the workings of the human brain, and Gutierrez-Osuna said a “cartoon version” of a single neuron simply takes a set of inputs and produces an output.
“They call them neurons because of the similarity at a very, very high level with how [human] neurons work,” Gutierrez-Osuna said. “When you have all these neurons connected together and you squint hard enough, it kind of looks like a neural network even though the analogy breaks very, very quickly.”
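In code, that “cartoon version” of a neuron amounts to a weighted sum of inputs pushed through a squashing function. The sketch below is illustrative only; the inputs, weights and bias are made-up numbers.

```python
# A "cartoon" neuron: weight each input, add them up with a bias,
# and squash the total through a sigmoid activation function.
import math

def neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))   # output between 0 and 1

# Made-up inputs and weights, purely for illustration
print(neuron([0.5, 0.2, 0.9], [0.4, -0.6, 0.8], bias=0.1))
```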
The implementation of multiple neuron layers was revolutionary to the field, Gutierrez-Osuna said, and is where the term “deep learning” comes from — a key component of how AI learns and functions.
“The advantage of this was if you have two layers or more, then these neural networks are universal approximators, meaning that you can approximate any function with them,” Gutierrez-Osuna said. “In practice, deep networks, meaning networks with multiple layers, were not always used because it was very difficult to train them.”
In the last 20 years, however, algorithmic innovations allowed more complex neural networks to be more easily trained, Gutierrez-Osuna said.
“They figured out algorithms that they could use for training networks with a large number of layers, hence deep networks or deep learning,” he said.
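A rough sketch of what “multiple layers” means in practice is shown below, using the PyTorch library to stack a few layers of neurons into a small network. The layer sizes and the random input batch are arbitrary, chosen only to illustrate the idea of depth rather than any particular system described in this story.

```python
# A small "deep" network: several layers of neurons stacked together.
# Layer sizes and the random input batch are arbitrary, for illustration only.
import torch
from torch import nn

model = nn.Sequential(
    nn.Linear(10, 32),   # input layer -> first hidden layer
    nn.ReLU(),
    nn.Linear(32, 32),   # a second hidden layer: the "depth" in deep learning
    nn.ReLU(),
    nn.Linear(32, 1),    # output layer
)

x = torch.randn(4, 10)   # a batch of four made-up input vectors
print(model(x).shape)    # -> torch.Size([4, 1])
```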
The future: Growing pains to come
The use of AI as an aggregator of information would be revolutionary for the way media companies currently market their work, Lafreniere said.
“It’s going to be a nightmare,” Lafreniere said. “What [AI] is doing right now is distilling information from sources then delivering it directly to you instead of getting people to go there.”
It is almost the complete opposite of the way many publications generate their revenue, Lafreniere said.
“It’s going to be a very interesting world of trying to get people to actually come to your service if [companies] want to stick with these ad-supported models,” Lafreniere said. “But if you don’t stick with ad-supported models, and if it’s paid access, then it might just grab the information from somewhere else.”
If lawmakers and governments don’t react quickly, Lafreniere said AI might become just another lawless technology.
“We need to get politicians and regulatory bodies and stuff to get off their asses and actually do something relatively quickly for once,” Lafreniere said. “Instead of us being in this situation like we are with 3D printing, with drones, with the internet, with all this type of stuff where regulation is like 10 years, 20 years, behind.”
It is too late to try to stop or pause AI development because it is already here, Lafreniere said, and the best thing individuals can do is learn to use it rather than ignore it.
“What I’m trying to figure out is how people in my life are going to be able to have careers in five to 10 years,” Lafreniere said. “I think if people are concerned, they should put their effort in trying to guide us in a brighter direction … because [AI] is not going anywhere.”
While the rapid advancement of AI does present the media industry with a new challenge, Miller said he believes the steady decline of local reporting jobs is a greater threat to journalists’ job security than AI currently is.
“I think sure, AI and ChatGPT programs like that will make an impact on the writing world,” Miller said. “Somebody’s going to have to ask follow-up questions, somebody’s going to have to make sure that all these facts line up. … Maybe that’s kind of an OK thing because it frees up time for journalists to go out and do bigger stories.”
The introduction of AI is just the next step in a long line of information revolutions that journalists are going to have to adapt to, Dwyer said.
“I’m not afraid of AI replacing journalists at a time when the journalists have already been lost,” Dwyer said. “I’m worried about people in communities being informed in a world where there just aren’t as many journalists. In that sense, I think AI can potentially give us an opportunity to serve our communities in a way that we’re not serving them now.”