AI in the newsroom

The Associated Press has issued guidelines on artificial intelligence, saying staff members should familiarize themselves with the technology but not use the tools to produce publishable content or images for the news agency.

Only a few news organizations, AP among them, have begun to set guidelines for integrating ChatGPT and other rapidly evolving tools into their daily operations. On Thursday, the organization will pair its guidelines with a chapter in its venerable Stylebook that advises reporters on how to cover the technology and includes a glossary of terms.

“We want to make sure that people know that we can experiment a little bit while staying safe,” stated Amanda Barrett, the Associated Press vice president of journalistic standards and inclusion.

In a statement this spring, the journalism think tank Poynter Institute called it a “transformational moment” and urged news organizations to develop AI usage guidelines and disseminate them to viewers and readers.

Although generative AI can produce text, images, music, and video on demand, it still cannot reliably distinguish fact from fabrication.

Because of this, the AP said content generated by artificial intelligence must be vetted as carefully as material from any other news source. It likewise said AI-generated images, video, or audio should not be used unless the altered material is itself the subject of a story.

This is consistent with the statement made by tech publication Wired, which states that it does not publish AI-generated stories “unless the fact that it is AI-generated is the point of the whole story.”

Insider editor-in-chief Nicholas Carlson wrote to staff members, “Your stories must be completely written by you,” in a memo that was also shared with readers.

“Every word in your stories must be true, fair, unique, and of high quality. You are accountable for these things.”

Consumers should be aware that standards are in place to “make sure the content they’re reading, watching, and listening to is verified, credible, and as fair as possible,” Poynter stated in an editorial, in light of well-publicized incidents of AI-generated “hallucinations,” or made-up facts.

The uses of generative AI outside of publication have been described by news organizations. For instance, it can assist AP editors in compiling digests of stories that are currently being worked on and distributed to subscribers. It might assist editors in crafting headlines.

For ten years, AP has been experimenting with more basic forms of artificial intelligence, turning corporate earnings data and sports box scores into brief news articles. Barrett said that although that was valuable experience, “we still want to enter this new phase cautiously, making sure we protect our credibility and our journalism.” Last month, the Associated Press and ChatGPT maker OpenAI announced a partnership under which OpenAI will license AP’s archive of news stories for training purposes.

News organizations worry that AI companies may use their content without permission or payment. To safeguard its members’ intellectual property rights, the News Media Alliance, which represents hundreds of publishers, released a statement of principles.

Some journalists have voiced concerns about artificial intelligence eventually taking over human jobs. The issue is of keen interest, for instance, in contract talks between the Associated Press and the News Media Guild. The union’s president, Vin Cherwoo, said the guild had not yet had time to fully analyze the proposed provisions.

Cherwoo stated, “We have questions about some provisions and were encouraged by others.”

Barrett stated that even with protections in place, AP wants its reporters to learn about the technology since they will need to cover stories about it in the future.

The chapter being released Thursday will be part of AP’s Stylebook, a guide to journalistic standards and to the use of terminology in stories, and will explain many of the concepts it covers.

According to the AP, “the artificial intelligence story goes far beyond business and technology.” It also touches on human rights, the economy, politics, entertainment, education, sports, equality and inequality, international law, and many other topics. Promising AI stories show the wide range of ways these technologies affect people’s lives. The chapter includes a glossary of terms covering machine learning, training data, facial recognition, and algorithmic bias.

It is not meant to be the last word on the subject. Barrett added that a committee examining guidelines on the topic meets once a month.

PC Soni Editor

Last Update: 3 July 2024