Abstract:
The rise of digital media has already driven a steep decline in print newspapers, but ChatGPT may pose a new risk to the journalistic profession as a whole. OpenAI's newest model, GPT-4, demonstrates an impressive ability to create news articles. Yet despite generating spot-on sports stories, the application remains limited in its ability to cover sensitive topics. Sports Illustrated and USA Today are two companies that have been caught publishing AI-generated content under apparently fabricated bylines. Newswriting has been one of the main communication methods used to sustain democratic discourse in the United States, but the recent rise of generative AI, and its integration into news articles, challenges the trustworthiness and integrity that the journalism industry once stood for. Social determinism is a theory holding that technology is a neutral tool that can be shaped by its producers to impact society in particular ways. Viewed through that lens, generative AI is a neutral technology with the potential to produce positive or negative outcomes for the journalism industry. Publishers and technology engineers hold the power to decide whether generative AI will improve journalism or harm it.
Keywords: Artificial Intelligence (AI), Generative Pre-Trained Transformers (GPT), ChatGPT, Journalism, Concentration of Ownership, Social Determinism
Introduction
While AI technologies have streamlined certain aspects of news production, they also pose significant threats to the integrity and quality of journalism. The lack of human touch in the creation and curation of news stories may compromise the ability to critically analyze complex issues, diminishing the role of investigative journalism and the ethical standards that underpin it. As AI continues to evolve, the industry must navigate the delicate balance between harnessing the benefits of technology and safeguarding the fundamental principles of truthful, unbiased, and responsible reporting.
Generative artificial intelligence has improved drastically in the last year, so much so that it is often difficult to distinguish its output from a human's writing. In fact, AI wrote the entire paragraph above. While the wording might feel a bit unnatural, the invention of generative AI tools like ChatGPT has pushed the world into uncharted territory regarding the capabilities of AI. ChatGPT is a chatbot developed by OpenAI that uses a GPT model to respond to text prompts with human-like answers (Guinness, 2023).
ChatGPT Models Compared: GPT-3.5 vs. GPT-4
GPT stands for Generative Pre-Trained Transformer, a model architecture created by the research company OpenAI. GPT-3.5, one of its recent versions, functions as a Large Language Model (LLM). LLMs are "trained on vast amounts of text data from the internet and are capable of generating human-like text" (Ramponi, 2022). The training and information the model is fed allow the chatbot to "understand patterns and relationships in the text data and tap into the ability to create human-like responses" (Guinness, 2023, graph 22).
GPT-3.5 is currently available for free to all of ChatGPT's users. This version was trained on a dataset last updated in January 2022 and cannot incorporate any data or current events from after that date. Timeliness is a fundamental element of good journalism, and GPT-3.5's inability to incorporate recent information means it cannot function as a real journalist.
However, ChatGPT's newest iteration, GPT-4, poses a much greater risk to the future of human journalists. Unlike its predecessors, GPT-4 offers a "Browse with Bing" option that gives ChatGPT access to the internet (Wodecki, 2023). This advancement allows the chatbot to generate content based on current events. At the moment, this version is only available to users who pay for a subscription plan, and there is a waitlist to become a subscriber.
However, despite its access to recent news coverage, the pro version is still limited in the content it will produce. For example, when asked to "Write an 800-word news story on the shooting of three Palestinian students in Burlington, VT," GPT-4 refused, citing the sensitive, triggering nature of the topic. GPT-3.5 responded with the following disclaimer: "As of my last knowledge update in January 2022, there were no reports or records of a shooting involving three Palestinian students in Burlington, VT, or any similar incident." Here GPT-3.5 acknowledges its lack of knowledge and declines to write a story about the Burlington shooting, but that is not always the case.
When prompted to "Write a 700-word game story on the Buffalo Bills vs. Philadelphia Eagles game on November 26," GPT-4 provided an accurate, in-depth summary of the game. The pro version included relevant players and statistics, as well as links to both teams' websites. When the same prompt was given to GPT-3.5, the game story it produced was entirely fabricated. It began by saying the Bills hosted – they did not – and continued by saying the Eagles were fighting for an NFC East playoff spot. That claim was also inaccurate, as the Eagles led their division with a 9-1 record going into the game. The story credited Devin Singletary with a Bills touchdown, which would be unlikely given that he signed with the Texans in March. It also cited Bills cornerback Tre'Davious White as the star of their defense, when in reality he suffered a season-ending injury in early October. Lastly, the story claimed the Bills won 20-17, when the Eagles actually beat them 37-34. Along with its numerous factual inaccuracies, the free version's story contained strange wordings like "orchestrated a masterful two-minute drill" and "with ice in his veins, split the uprights as time expired." GPT-3.5's training set contained no information about the game, but instead of acknowledging that shortcoming, it produced a completely false story.
When prompted to write a game preview about the Bills game on December 10, GPT-4 churned out another impressive article that cross-analyzed statistical data and team tendencies to predict that the Chiefs were the favorite to win. GPT-3.5's version, however, was again filled with inaccurate data and strange phrases. "Can Tre'Davious White shut down Tyreek Hill, or will Hill find ways to break free for big plays?" the chatbot writes in a section titled "key matchups." Neither possibility could happen, as Tyreek Hill was traded to the Dolphins in March of 2022, and White remained sidelined with his injury. A few of this article's stranger phrases include: "this game has all the ingredients for a classic showdown," "both quarterbacks are capable of delivering jaw-dropping plays," and "he has an array of weapons at his disposal." GPT-3.5's algorithm likely inserts clichés to add a human touch, but they actually make the writing sound more robotic.
Recent Use of AI-Generated Content
Sports Illustrated was once one of the most popular and reputable sources of sports journalism. The latest chapter in its fall from grace is an accusation of misusing AI, leveled by the website Futurism. The article identified several "authors" on the Sports Illustrated website who did not appear to exist as real people, with their photos even listed for sale on an AI headshot marketplace (Harrison, 2023).
Sports Illustrated began declining in the early 2010s, a slide only exacerbated when it was bought by The Arena Group and Authentic Brands Group (ABG). "Sports Illustrated is run by not one but two vampiric entities with markedly little interest in the magazine's erstwhile core mission — you know, the thing that made it so beloved in the first place, doing good sports journalism — and every interest in maximizing profits at every opportunity," wrote Merchant, a columnist at the Los Angeles Times (2023).
USA Today was also accused of using AI-generated authors on Reviewed, a product-review site it owns. Reviewed staff noticed strange content appearing on their website, attributed to mysterious authors who did not seem to exist. The Reviewed employees, who belong to the NewsGuild writers' union, believe the articles are a "covert attempt by parent company Gannett to undermine workers, at a time when many publishers are experimenting with AI content to cut costs" (Sommer, graph 7).
Chin (2023) states that LLMs are designed to generate a large amount of content, which is great for content farms, but not for news outlets. “These articles often list fake authors or anonymous bylines and appear to lack human oversight,” she says, supporting the accusations facing Sports Illustrated, USA Today, and several other news sites (pp. 5-6). With more made-up authors, human journalists might lose their legitimacy and trustworthiness due to readers being unable to differentiate real from AI-generated authors. “It will become exponentially harder for human journalists to disseminate trustworthy information when the internet ecosystem is stuffed with bots,” Chin adds (p. 5).
Creating phony identities is quite simple with generative AI. When asked to “Write a bio for a newspaper columnist,” ChatGPT yielded the following results:
GPT-4:
Born and raised in [birthplace], [Name] developed a passion for writing at an early age. [He/She/They] pursued this interest academically, earning a degree in [relevant field, e.g., Journalism, English, Political Science] from [University Name]. [His/Her/Their] early career was marked by stints at various local newspapers, where [he/she/they] honed [his/her/their] skills in reporting and opinion writing.
GPT-3.5:
Hailing from a small liberal arts college in upstate New York, [Your Name] developed a passion for writing and critical thinking early on. This educational background instilled a commitment to exploring diverse viewpoints and addressing the complexities of our world with nuance and depth.
As expected, GPT-4 produced an impressive, detailed biography for a journalist that could be easily adjusted to fit anyone. The ability to customize the pronouns, birthplace, and university helps to humanize the biography. GPT-3.5 is much more generic and mundane, only giving the option to put an author’s name into the existing template. The free version reads less naturally, and might not pass as real if it was put on a site like Sports Illustrated. The pro version, however, likely could.
Adobe Firefly is an AI image generation site that was introduced as a free web application in September of 2023. When given the prompt "female headshot, gray background," Firefly generated very realistic images. The prompt could also be adjusted to yield headshots that fit a certain description, such as "25-year-old American female headshot, gray background, blue eyes." Articles generated by GPT-4, paired with Firefly-generated headshots, allow major corporations to publish what looks like journalism without deploying the funds that quality journalism requires.
Merchant (2023) argues that the use of AI-generated articles and authors is a last-ditch effort at producing content by a company that has been stripped of its former mission of quality sports journalism. "Sports Illustrated has already slashed full-time staff, spun up a content mill with freelancers pumping out content for a fraction of the price, and let editorial standards sink into the gutter," says Merchant. "The AI play is an arrow out of the same quiver."
Negative Consequences for Journalism
The potential for generative AI to replace human workers places a strain on the job security and integrity of the journalism industry.
Since the founding of the United States, news outlets have served as an essential resource for maintaining a virtual public sphere. Habermas (1979) defines the public sphere as a space where members of a society can congregate and discuss relevant matters. This space nurtures active societal participation and is essential to a functional democracy. Ancient Athens, among the first democracies, operated orally: Athenian citizens would gather in person at the agora – the center of the city – and engage in meaningful discourse.
Because the United States has a far larger population and geographic area than Athens, it had to adapt the concept of democracy to fit its society. Virtual communication thus became a necessity for transmitting messages across time and space. At the time of publication, Habermas (1979) wrote that "newspapers and magazines, radio and television are the media of the public sphere" (p. 198). Newspapers and journalism have been essential tools for maintaining a democratic structure in the United States despite the limitations of physically meeting in person. While these traditional mediums still play a role in today's media landscape, digital media such as social media and online websites are now more prevalent.
Fletcher (2014) argues that news media must fulfill several social responsibilities to nurture a healthy democracy. “These include making available sufficient information about government and politics to permit citizens to participate effectively in the political process; maintaining ongoing scrutiny of those with power; investigating and reporting on important social, economic, and political issues; and presenting a wide range of viewpoints on public issues to facilitate debate,” he writes (p. 29).
Concentration of ownership refers to the monopolization of news outlets and companies by large corporations. Several news outlets that were bought by corporations, such as Sports Illustrated and USA Today, have been caught producing AI-generated content and making up authors to fill the bylines. These companies prioritize the power they gain from buying smaller news outlets, departing from the original goals of the companies to produce quality journalism.
Chin (2023) supports the argument that the concentration of ownership by large corporations over smaller publications could disrupt news from serving its fundamental role in a democracy. “The sustainability of news cannot fall on publishers alone; large digital platforms must share responsibility to understand and address their sizable impacts on society,” she writes (p. 2).
Today’s journalism largely functions as an attention economy, where the articles produced are just bait to grab the reader’s attention. The attention and data from users are the actual product that is then sold to advertisers and other companies. Davenport and Beck (2001) state that “attention has become a more valuable currency than the kind you store in bank accounts” (p. 3). Since it is not the articles themselves that yield profit, corporations have no incentive to produce high-quality work. Why pay a human to do a job when a chatbot can perform the same task (slightly worse) for free?
Additionally, advertisers and companies tend to prefer soft, sensational news stories that hook the reader while keeping them in what Chomsky coined as a “happy buying mood.” Hard news stories, such as investigative journalism or articles that cover difficult topics, make the viewer think about the content they are watching, rather than the ads they see (Kittler, 2023). In an attention economy, large corporations lean toward soft news stories to maximize their profits.
OpenAI also signed an agreement with the Associated Press to access its archive of news articles dating back to 1985 to train its LLMs. OpenAI is using the work of American journalists to train the very technology that might make their jobs obsolete. However, Chin (2023) does not think AI-generated content can replace certain aspects of journalism. "LLMs like ChatGPT are best equipped to automate specific functions like summarizing documents—but not advanced editorial skills like relationship building with sources, original analytical thinking, contextual understanding, or long-form creative writing," says Chin (p. 4). Furthermore, LLMs function by predicting patterns and word associations, which can lead to inaccuracies or completely false stories. LLMs are also at risk of producing responses that show bias or discrimination toward groups that have historically faced ostracization for their gender identity, race, or sexual orientation. Because such biased material is prevalent in today's media landscape, the LLMs were likely trained on some of it.
Areas Where Journalism Could Benefit From AI Use
Opinion columnist Farhad Manjoo (2023) recognizes the inconsistency of ChatGPT but still offers several positive ways it can be utilized to benefit journalists. Manjoo suggests that ChatGPT can act as an editor to help journalists overcome stumbling blocks in their writing, as well as help them find the perfect word to fit into a sentence. The chatbot can also summarize long, complicated news stories to allow a journalist to maximize their time (graphs 10-15). Transformer-based networks, like ChatGPT, read every token of information they are given at the same time. This allows them to quickly form summaries of long texts, as well as pull out the most relevant information for journalists to use (Guinness, 2023, graph 27).
Subramaniam Vincent, director of Journalism and Media Ethics at Santa Clara University, does not think that generative AI can replace human journalists. Instead, he sees applications like ChatGPT as helpful tools to benefit the industry. “Overall, I don’t think there’s a substitute for boots-on-the-ground reporting on emerging realities,” said Vincent in an interview for NBCU Academy (Wang, 2023, graphs 10-11). Brian Carovillano, the senior vice president and head of standards at NBCU News Group, agrees that the responsible use of generative AI can provide journalists with helpful resources that allow them to work more productively (Wang, 2023, graphs 16-17).
Social Determinism and Generative AI as a Neutral Tool
Social determinism is a philosophical view that argues technology is a neutral tool that is shaped by social structures and cultural values. The two core tenets of social determinism are that human societies invest resources into developing technologies for addressing their pre-existing needs and that each society can regulate the technology to determine the purpose it will be used for (Kittler, 2023).
Alter (2017) views technology as a neutral tool that has been shaped by corporations to serve purposes that are destructive to humans. “Tech isn’t morally good or bad until it’s wielded by the corporations that fashion it for mass consumption,” argues Alter (p. 8). Chai (2023) published a LinkedIn article where he asks ChatGPT whether generative AI is a neutral tool. The chatbot provided the following answer: “The AI itself does not inherently possess a moral compass, but rather, it reflects the values, biases, and intentions of the humans who use it.”
Brian Winston (1998) studied the social necessity behind the creation of technology, supporting the first tenet of social determinism: that humans control and shape technologies to address their pre-existing needs (as cited in Murphie & Potts, 2003, p. 19). Winston offers the metaphor of technology as a car, with humans in the driver's seat determining what it is used for.
An article from United Robots, a Swedish tech company that uses generative AI robots to produce articles, reiterates the concept that publishers are in the driver’s seat. “If a publisher asked ‘What does generative AI mean for our business?’, we’d like to ask back: ‘What do you want it to mean? The AI is not in control, you are,’” they write (2023, graph 12).
On December 8, 2023, delegates from the European Commission, European Parliament, and 27 member countries agreed on a set of controls to regulate the use of generative AI. These actions support the second tenet of social determinism, which states that regulations can be put in place to determine a technology's purpose. The U.S., however, has yet to enact comparable regulations governing generative AI. Deutsch (2023) says that the actions taken by European government officials "set the tone for the regulation of the fast-developing technology" (graph 6).
Conclusion
Generative AI is a neutral tool that has the potential to serve as a positive addition to newsrooms and a beneficial tool to help journalists. However, trends in the content of popular news companies like Sports Illustrated and USA Today indicate that large corporations might be using AI-generated content to replace human journalists altogether. Journalism has played a key role in sustaining democracy in the United States, keeping citizens informed on the workings of their government and society. One of the most valuable traits of journalists is their ability to investigate and report on difficult topics. While GPT-4 displays an improved ability to generate articles, it still lacks the human touch that seasoned reporters add to their stories. Whether generative AI serves to benefit or destroy journalism in the future ultimately lies in the hands of those designing it.
Works Cited
Alter, A. (2017). Irresistible: the rise of addictive technology and the business of keeping us hooked. Penguin Press.
Chai, I. E. (2023, Apr. 14). GPT-4 prompts and responses on GAI’s neutrality. https://www.linkedin.com/pulse/gpt-4-prompts-responses-gais-neutrality-ian-ernst-chai/
Chin, C. (2023). Navigating the risks of artificial intelligence on the digital news landscape. Center for Strategic and International Studies (CSIS). http://www.jstor.org/stable/resrep53077
Davenport, T. H., & Beck, J. C. (2001). The attention economy: understanding the new currency of business. Harvard Business School Press.
Deutsch, J. (2023, Dec. 8). European regulators agree to landmark regulation of AI tools like ChatGPT in what is among the world’s first efforts to rein in the cutting-edge tech. Fortune. https://fortune.com/2023/12/08/eu-ai-artificial-intelligence-regulation-chatgpt/
Fletcher, F. J. (2014). Journalism, corporate media, and democracy in the digital era. In K. Kozolanka (Ed.), Publicity and the Canadian State: Critical Communications Perspectives (pp. 27–48). University of Toronto Press. http://www.jstor.org/stable/10.3138/j.ctt5vkhpj.7
Guinness, H. (2023, Oct. 9). What is GPT? Everything you need to know about GPT-3 and GPT-4. Zapier. https://zapier.com/blog/what-is-gpt/
Harrison, M. (2023, Nov. 27). Sports Illustrated published articles by fake, AI-generated writers. Futurism. https://futurism.com/sports-illustrated-ai-generated-writers
Kittler, J. (2023). Culture, technology, and human agency. [PowerPoint presentation]. St. Lawrence University.
Manjoo, F. (2023, Apr. 21). ChatGPT is already changing how I do my job. The New York Times.
Martineau, K. (2023, Apr. 20). What is generative AI? IBM. https://research.ibm.com/blog/what-is-generative-AI
Merchant, B. (2023, Dec. 1). Column: The depressing fall of Sports Illustrated reveals the real tragedy of AI. Los Angeles Times. https://www.latimes.com/business/technology/story/2023-12-01/column-the-depressing-fall-of-sports-illustrated-reveals-the-real-tragedy-of-ai
Murphie, A. & Potts, J. (2003). Culture and technology. Palgrave Macmillan.
Ramponi, M. (2022, Dec. 23). How ChatGPT actually works. AssemblyAI. https://www.assemblyai.com/blog/how-chatgpt-actually-works/
Sommer, W. (2023, Oct. 26). Mysterious bylines appeared on a USA Today site. Did these writers exist? The Washington Post. https://www.washingtonpost.com/style/media/2023/10/26/usa-today-gannett-reviewed-ai-fake-writers/
Wang, C. (2023, Jul. 5). AI isn’t the end of journalism. NBCU Academy.
Winston, B. (1998). Media technology and society. Routledge.
Wodecki, B. (2023, Sep. 28). ChatGPT can now give you real-time information. AI Business. https://aibusiness.com/nlp/chatgpt-can-now-give-you-real-time-information