
AI Tool Can Be Used to Rate a Movie's Content in Advance

Movie ratings can determine a movie's appeal to consumers and the size of its potential audience, and thus have an impact on a film's bottom line. Typically, rating a movie is a tedious manual task: human reviewers watch the film and judge the presence of violence, drug abuse and sexual content.

Now, researchers at the USC Viterbi School of Engineering, armed with artificial intelligence tools, can rate a movie's content in a matter of seconds, based on the movie script and before a single scene is shot.

Such an approach could give movie executives the ability to design for a desired rating in advance, by making the appropriate edits to a script before a single scene is shot.

Beyond the potential financial impact, such instantaneous feedback would allow storytellers and decision-makers to reflect on the content they are creating for the public and the impact such content might have on viewers.

Using artificial intelligence applied to scripts, Shrikanth Narayanan, University Professor and Niki & C. L. Max Nikias Chair in Engineering, and a team of researchers from the Signal Analysis and Interpretation Lab (SAIL) at USC Viterbi have demonstrated that linguistic cues can effectively signal the violent acts, drug abuse and sexual content that a film's characters are about to engage in (behaviors that are often the basis for a film's rating).

Method

Using 992 movie scripts that included violent, substance-abuse and sexual content, as determined by Common Sense Media, a non-profit organization that rates and makes recommendations for families and schools, the SAIL research team trained artificial intelligence to recognize corresponding risk behaviors, patterns and language.

The AI tool receives the entire script as input, processes it through a neural network and scans it for semantics and expressed sentiment. In the process, it classifies sentences and phrases as positive, negative, aggressive or other descriptors.

The AI tool automatically classifies words and phrases into three categories: violence, drug abuse and sexual content.
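The paper describes a neural-network model trained on full scripts; as a rough illustration only, the sketch below uses a much simpler bag-of-words, multi-label classifier built with scikit-learn. The example script lines and labels are made up and are not from the study; the snippet merely shows the general idea of mapping script text to violence, substance-abuse and sexual-content scores.

# Illustrative sketch only: multi-label classification of script text into
# violence / substance abuse / sexual content. The actual SAIL model is a
# neural network over full scripts; this toy pipeline and its data are
# invented for demonstration purposes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

# Toy training data: script excerpts paired with content labels
# (in the study, labels came from Common Sense Media ratings).
scripts = [
    "He draws a gun and fires twice across the bar.",
    "She pours another drink and lights a cigarette, hands shaking.",
    "They kiss and fall back onto the bed.",
    "The family sits down to dinner and talks about school.",
]
labels = [
    {"violence"},
    {"substance_abuse"},
    {"sexual_content"},
    set(),
]

binarizer = MultiLabelBinarizer(
    classes=["violence", "substance_abuse", "sexual_content"]
)
y = binarizer.fit_transform(labels)

# Bag-of-words features feeding one binary classifier per content category.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(scripts, y)

# Score a new line of script text for each content category.
new_line = ["He slams the bottle down and swings at the stranger."]
probs = model.predict_proba(new_line)[0]
for category, p in zip(binarizer.classes_, probs):
    print(f"{category}: {p:.2f}")

In practice, per-sentence or per-scene scores like these would be aggregated over the whole script to estimate the overall level of each type of content.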

Victor Martinez, a doctoral candidate in computer science at USC Viterbi and the lead researcher on the study, which will appear in the Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, said, "Our model looks at the movie script, rather than the actual scenes, including e.g. sounds like a gunshot or explosion that occur later in the production pipeline. This has the benefit of providing a rating long before production to help filmmakers decide e.g. the degree of violence and whether it needs to be toned down."

The research team also includes Narayanan, a professor of electrical and computer engineering, computer science and linguistics; Krishna Somandepalli, a Ph.D. candidate in electrical and computer engineering at USC Viterbi; and Professor Yalda T. Uhls of UCLA's Department of Psychology. They discovered many interesting connections among the portrayals of risky behaviors.

"There seems to be a correlation in the amount of content in a typical film focused on substance abuse and the amount of sexual content. Whether intentionally or not, filmmakers seem to match the level of substance abuse-related content with sexually explicit content," said Martinez.

Another interesting pattern also emerged. "We found that filmmakers compensate for low levels of violence with joint portrayals of substance abuse and sexual content," Martinez said.

Moreover, while many movies contain depictions of rampant drug abuse and sexual content, the researchers found it highly unlikely for a film to have high levels of all three risky behaviors, perhaps because of Motion Picture Association (MPA) standards.

They also found an interesting connection between risk behaviors and MPA ratings. As sexual content increases, the MPA appears to put less emphasis on violence/substance-abuse content. Thus, regardless of violent and substance abuse content, a movie with a lot of sexual content will likely receive an R rating.

Narayanan, whose SAIL lab has pioneered the field of media informatics and applied natural language processing to bring awareness to the creative community about the nuances of storytelling, calls media "a rich avenue for studying human communication, interaction and behavior, since it provides a window into society."

"At SAIL, we are designing technologies and tools, based on AI, for all stakeholders in this creative business - the writers, film-makers and producers - to raise awareness about the varied important details associated in telling their story on film," Narayanan said.

"Not only are we interested in the perspective of the storytellers of the narratives they weave," Narayanan said, "but also in understanding the impact on the audience and the 'take-away' from the whole experience. Tools like these will help raise societally-meaningful awareness, for example, through identifying negative stereotypes."

Added Martinez: "In the future, I'm interested in studying minorities and how they are represented, particularly in cases of violence, sex and drugs."
