TECH: Meta’s new AI “Make-A-Video” generates short video clips from text prompts


Like DALL-E and Midjourney, it uses machine-learning models (trained on a vast database of online imagery) to turn written prompts into pictures.

On Thursday, Meta CEO Mark Zuckerberg unveiled “Make-A-Video,” a more animated follow-up to the company’s “Make-A-Scene” system.

As the name suggests, Make-A-Video is “a new artificial intelligence system that allows users to turn text prompts into short, high-quality video clips,” Zuckerberg wrote in a Meta blog post on Thursday.

Functionally, Make-A-Video is similar to Make-A-Scene in that it combines natural language processing and generative neural networks to convert non-visual prompts into imagery, only in a different form of content.

“Our intuition is simple: we learn what the world looks like and how it is described from paired text-image data, and we learn how the world moves from unsupervised video footage,” Meta researchers wrote in a paper released Thursday morning.

This approach preserves the “breadth” (aesthetic diversity, fantastical expression, and so on) of current image-generation models while shortening the time needed to train the video model and removing the need for paired text-video data.
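The factorization the researchers describe can be illustrated with a toy sketch: one component stands in for an image model that learned appearance from text-image pairs, and a second stands in for a temporal module, trained only on unlabeled video, that fills in motion between frames. Everything here (function names, linear interpolation, random “images”) is a purely hypothetical illustration of that idea, not Meta’s actual method:

```python
import numpy as np

def text_to_image(prompt: str, size: int = 8, seed: int = 0) -> np.ndarray:
    """Stand-in for a text-to-image model trained on text-image pairs.
    Returns a pseudo-random RGB 'image' conditioned on the prompt."""
    rng = np.random.default_rng(abs(hash(prompt)) % (2**32) + seed)
    return rng.random((size, size, 3))

def temporal_interpolate(key_frames: np.ndarray, steps: int) -> np.ndarray:
    """Stand-in for a temporal module learned from unlabeled video:
    fills in motion between key frames (here, linear interpolation)."""
    frames = []
    for a, b in zip(key_frames[:-1], key_frames[1:]):
        for t in np.linspace(0.0, 1.0, steps, endpoint=False):
            frames.append((1 - t) * a + t * b)
    frames.append(key_frames[-1])
    return np.stack(frames)

def make_a_video_sketch(prompt: str, n_key: int = 3, steps: int = 4) -> np.ndarray:
    # 1) Appearance: sample key frames from the (text-conditioned) image model.
    keys = np.stack([text_to_image(prompt, seed=i) for i in range(n_key)])
    # 2) Motion: the temporal module turns key frames into a smooth clip,
    #    with no text-video pairs involved at this stage.
    return temporal_interpolate(keys, steps)

clip = make_a_video_sketch("a dog wearing a superhero cape")
print(clip.shape)  # (n_key - 1) * steps + 1 frames of 8x8 RGB
```

The point of the split is that the text conditioning lives entirely in the image stage, so the motion stage can be trained on any video footage without captions.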

Like most of Meta’s AI research, Make-A-Video is being published as an open-source project. “We want to be thoughtful about how we build new generative AI systems like this,” Zuckerberg said.

“We are openly sharing this generative AI research and its results with the community and soliciting feedback as the technology keeps evolving.”

As with all generative AI on the market, Make-A-Video offers plenty of opportunities for abuse.

The research team anticipated the possibility of malicious use and removed NSFW images and harmful phrases from the Make-A-Video training dataset in advance.
