However, respondents were open to it being used to speed up production processes
The BBC has released a report it commissioned on the use of AI in media, which found the public uneasy about the use of generative AI to fully create video content.
Based on detailed interviews with more than 150 people in the UK, US, and Australia, the report found that while people are open to the technology being used to speed up production and personalise video content, they did not want it to take over from human creativity.
Personalised trailers and summaries were seen as “low stakes”, as long as they were true to the content being advertised, as were personalised recommendations and avatars for use in video content - as they already are in gaming.
Meanwhile, “medium stakes” issues, where audiences “need reassurances that human skill and creativity will be respected to feel comfortable”, included using GenAI to swap out actors or change characters’ age, sex, or ethnicity - as this could undermine a producer or director’s decisions on casting and storyline. Respondents were also conflicted over its use for editing and non-hero VFX, with worries over job losses. In terms of ideation, people were open to generative AI being used as inspiration, more so for content such as reality shows and less so for drama and films.
Finally, “high stakes” uses, which audiences were particularly worried about and less open to, included the use of generative AI to create storylines or scripts, and to produce or direct shows. This was felt particularly strongly for film and TV, with more openness to its use in digital content. Participants were least comfortable with generative AI being used to “create life-like actors, presenters or newscasters that appeared to be human.”
In general, interviewees were most open to generative AI being used in audio, less so in video, and particularly worried about its use in news content.
As part of the report’s release, BBC programme director for generative artificial intelligence Peter Archer wrote: “It’s important to note that while this new report helps us understand how audiences view the use of Gen AI in media, it is not a roadmap for future development. It doesn’t consider the views of other communities like creative talent or production teams, whose views are very important to how any media organisation uses GenAI. Nor does it consider the technology itself, including questions of accuracy and the incorrect or misleading results that AI models can generate (hallucinations).
“Many of the issues raised in the report are not unique to the BBC. So, it’s important that we work together with our colleagues in the media industry to discuss the opportunities and challenges of GenAI.
“There’s also the simple truth that this is new territory for all of us, and many people’s views are likely to evolve as the potential of GenAI becomes better understood and its use becomes more commonplace. We will reflect carefully on what audiences have told us and use it as one input to shape further experimentation.”
Image: Claude AI