Into the Spider-Verse is one of my favorite comic book films ever made. It made huge leaps in bringing a more artistic appeal to CG animation, and it perfectly captured the essence of what a comic book feels like to read. I would love to see more stories told in this world, but the problem is, it took years of research and development to innovate the style of the film. It took so long, in fact, that at the end of the process the creative team behind the film started to implement machine learning into their workflow. And since the film's release in 2018, it will be four years until we get the sequel. So in the meantime, the question I'm asking is: is it possible to replicate the style of the film in a procedural manner? We've been experimenting with a lot of AI tools here at the studio, in particular AI image generation. Last week I had a huge breakthrough with a program called Stable Diffusion, and I think this method will allow us to tell a story in the Spider-Verse universe, by letting us take a real piece of footage and taking it into the Spider-Verse.

Dean, you have something cool to show me, right? I'm not looking; I want you to spoil it. We put out a video called the death of visual effects; you broke new ground with AI image generation. I found some new ground, and I'm going to break that ground now, in front of you. Here is my first test. Wow, it looks like you're in Arcane. This stuff usually looks like garbage, and yours doesn't look like garbage. Okay, I'll show you a little more. This is amazing. In the popular film Man of Steel. That looks really good; that's like deepfake-level quality. I call it the stable fake. It's a stable fake. The closest we got to an animation run through Stable Diffusion that looked anywhere near decent was the low-polygon Jurassic Park test, but even then it was all over the place; just every once in a while there's a glimpse, like, oh, that almost worked. But if you could fix that, you basically make it so people can make
animations without having to actually hand-draw every frame; they can just take video. And you appear to have fixed it. How are you doing this? We're applying a process that you have used previously: when you were running those deepfake programs, you were using DeepFaceLab, which would kick out these stabilized face sets. Yeah, I applied that process here: did a track of the face, punched in on that, exported that, and then ran that through Stable Diffusion. What that does is it really just allows a consistent character to emerge, by using a locked seed.

Here, Niko, what does that mean? How does Stable Diffusion, how do diffusion models work? In the simplest of terms, it's a noise pattern that the computer looks at and tries to interpret an image from. It's basically the electronic equivalent of looking at clouds and trying to see what they look like: oh, it's kind of a face with a hat, that kind of looks like an alligator. What you're doing here is you're basically telling Stable Diffusion: take this noise pattern and interpret it as Dean from Arcane. And because it's a picture of you with noise put on top, you kind of have your likeness, but it has some flexibility to put that style on there. The problem is, Stable Diffusion usually changes the noise on each frame. You have a blotch here that gets viewed as shadow, but the next frame, now all the blotches are in different spots, so maybe the hair is a little bit different, and there's a shadow over here, and the eye got adjusted weird. But what you're doing is you're locking the noise, so it's the same noise pattern in every frame. The problem with that is, when things move under that noise pattern, you're going to see everything kind of stick to the spots: you pause on one frame and it looks really good, but I'm wearing a wristwatch in this one frame, and then the next frame I'm not. But you're solving that problem by locking everything to the face, and having the face locked in the middle of the frame, so the noise stays locked on top.
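The locked-seed idea can be sketched without any diffusion library. This is an illustrative stand-in, not Corridor's actual pipeline: the `initial_noise` function simply plays the role of the latent noise a diffusion model starts from, to show why a fixed seed keeps the "clouds" identical across frames while a fresh seed per frame makes the details boil.

```python
import numpy as np

def initial_noise(seed, shape=(4, 64, 64)):
    """Stand-in for the latent noise a diffusion model would start from."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal(shape)

# Unlocked: a fresh seed per frame -> different noise every frame, so
# hair, shadows, and small details "boil" from one frame to the next.
unlocked = [initial_noise(seed=frame) for frame in range(3)]

# Locked: the same seed for every frame -> identical noise, so the
# model's interpretation stays stable as long as the subject stays put.
locked = [initial_noise(seed=42) for frame in range(3)]

assert not np.array_equal(unlocked[0], unlocked[1])
assert np.array_equal(locked[0], locked[1])
```

This is also why stabilizing the face in the middle of the frame matters: the noise is locked to screen space, so the subject has to be locked to screen space too.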
So I'm able to achieve this art style thanks to a guy named Nitrosocke, on this website called Hugging Face. He's created these different Stable Diffusion models that have specific training on different art styles, and he's also created a model for Into the Spider-Verse, and you can apply the same process. As a matter of fact, I took a scene from Spider-Man 2 and applied the Into the Spider-Verse model. "I drove Spider-Man away. He was the only one who could have stopped Octavius." And this is the result. Wow, so that's J. Jonah Jameson in the Spider-Verse. Wow. The face is the most important thing, and in the face, the eyes are the most important thing: I can see where he's looking, and I can see what he's thinking. I really want to take this process we're seeing here, because this J. Jonah Jameson shot is my favorite; I would love to see this style applied to, maybe, the MCU. Wait wait wait wait. Spider-Verse took four years to make, and thousands and thousands of artists. You have the arrogance to think that you, just one man with one computer, can make any shot of Spider-Verse? Well, I might need one other man with one other computer, and that guy's name is Fenner. I think with the help of Fenner we could take a look at the style of the Spider-Verse films and try to replicate that in Nuke. Not only would we have procedurally generated faces from Stable Diffusion, we would have a procedurally generated look from those films. I have one challenge for you: can you give me some new Spider-Verse shots that aren't just other frames through the filter? Can you show me that you can truly create Spider-Verse on your own using this technique? Any excuse to put on a Spider-Man costume is a worthy one to me. I will make my Spider-Verse fan film.

All right, Fenner, we have decided to move forward putting the MCU into the Spider-Verse. We're going to pull up the movie, and we're going to dissect it frame by frame, and figure out how
we can build a system that procedurally replicates that look, so that we can plug a shot in and it'll already look like it's coming out of the Spider-Verse. Sweet, I love it, let's do it. All right, let's do this one last time. The first thing I'm seeing here: anything that has exponential glow has that comic book halftone pattern; the transition from dark to light always has that halftone pattern. And in the shadows on Spider-Man's face, you see the cross-hatching. That's not a crazy effect for us to do: you basically have a simple luma key here, and we can have the highlights reveal a halftone pattern, and the shadow areas can reveal our cross-hatch pattern. There's another post effect here that I think is critical to achieving the look: you can see in the deep background, the colors begin to separate and break apart, and that's chromatic aberration. Without using depth of field, this allowed them to create a separation between the subject and the background. Also, this is the sort of stuff we should be adding in, like the spidey-sense squiggle lines. Yeah, and here, the motion blur is almost like a ghosting, a stepping. Yeah, so that's something to pay attention to as well. A lot of times in the movie, the foreground character will be on twos, so they'll move every other frame, but the background is on ones, and it'll move every frame. Any time we have a character on an abstract background, we should totally do twos. The biggest thing about this film's aesthetic: they embraced imperfections. They brought imperfections back into CG, which is so perfect, whether that's the offset lines you see on a character's iris, or sudden pop frames that are from a completely different art style. All of these things lend themselves to AI-generated imagery. Well, I feel like we kind of have it: highlights are going to reveal our halftone patterns, shadows are going to reveal our cross-hatch pattern, and chromatic aberration for separating out background from foreground.
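Those two post effects are simple to sketch in numpy. This is a toy version only; the thresholds, dot period, and channel shift below are made-up numbers, and the real version of this look was built in Nuke:

```python
import numpy as np

def chromatic_aberration(img, shift=3):
    """Slide the red and blue channels apart horizontally so colors
    separate at edges, like the film's deep-background look."""
    out = img.copy()
    out[..., 0] = np.roll(img[..., 0], -shift, axis=1)  # red slides left
    out[..., 2] = np.roll(img[..., 2], shift, axis=1)   # blue slides right
    return out

def halftone_dots(h, w, period=6):
    """A crude dot grid standing in for a printed halftone pattern."""
    yy, xx = np.mgrid[0:h, 0:w]
    return (((yy % period) < period // 2) & ((xx % period) < period // 2)).astype(float)

def comic_grade(img, hi=0.8, lo=0.2):
    """Luma-key the frame: highlights reveal halftone dots, shadows reveal
    a diagonal cross-hatch (img is an HxWx3 float array in [0, 1])."""
    luma = img @ np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luminance
    h, w = luma.shape
    dots = halftone_dots(h, w)
    hatch = ((np.add.outer(np.arange(h), np.arange(w)) % 5) == 0).astype(float)
    out = img.copy()
    out[luma > hi] *= dots[luma > hi][:, None]         # halftone in highlights
    out[luma < lo] += 0.3 * hatch[luma < lo][:, None]  # cross-hatch in shadows
    return chromatic_aberration(np.clip(out, 0.0, 1.0))
```

The key point is that both patterns are gated by a luma threshold rather than painted by hand, which is what makes the look procedural.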
And then also paying attention to those style frames: maybe we run a few frames through Stable Diffusion that just lean heavy into the Jack Kirby art style, or lean heavy into an anime art style. I think what we need to do is test this process on a shot from No Way Home, to see if it's actually possible to bring a real Spider-Man into the Spider-Verse.

Guys, we have our first Spider-Verse test ready to watch. All right: three, two, one, go. Wow, it looks like he's in the Spider-Verse. Oh dude, holy smokes, it's so consistent. Yeah, Fenner just did a little bit of magic on it, and then it looks perfect; it looks like something that a professional animation studio created for Across the Spider-Verse. So with this Andrew Garfield test working, I think we have our look dialed in, and we're ready to start diving into this Spider-Verse short. We're going to write and edit a little piece using footage from the MCU films and Into the Spider-Verse, and Fenner is going to take a bunch of those shots and run them through his new script, thus applying the Spider-Verse look to these MCU shots. In the meantime, we have a few original scenes that we need to film. So I'm going to squeeze into our dumbest Spider-Man suit and hop in front of the green screen, and with the help of Matt, I'm going to shoot some plates that are basically just going to end up as reference for Stable Diffusion. We'll run my face independently through Stable Diffusion to get me looking like a Spider-Verse Tom Holland, and then we'll isolate the pajama suit in Stable Diffusion and really push it to look like the suit from Spider-Man: Homecoming. All we need is one usable frame, and then we can use this program called EbSynth to do a style transfer of that image back onto the original plate. And when Fenner combines the face with the suit and his Nuke magic, it'll look like an animated shot of Tom Holland in the Spider-Verse. I just, it's like, I don't know what I mean, it looks great. I have no
idea what I'm looking at. I can't tell if you're just playing these clips from the Spider-Verse movie or not. No, this is, this is what I wanted to show you. So you saw me yesterday; I was like, Sam's gonna think this is super lame. But once you add the cool filter, you become the real Spider-Man. What? And so then you're running it through EbSynth too? The costume is through EbSynth, and the face is through Stable Diffusion. That was so crazy; the other day I was just in stupid pajamas. If you guys want, like, a free Oscar, do something original with this. I think that's the next one.

So we're having a freaking blast, and we're getting some really cool results, but I have a few reservations. I feel like there's an element of this where I just hit generate on a button and get this sweet face out that looks just like this film I love. I kind of question myself; I'm like, are we just stealing from these artists that we super respect? I don't know. You've been messing with this stuff longer than anyone here, and I just wonder what your perspective is on that. I mean, it just comes down to being open about things, right? Like, you guys didn't make the Spider-Verse style. If you made a short film and you just used the Spider-Verse style, and you didn't credit them or acknowledge it, you just said, no, this is mine, yeah, that'd be a dick move, right? There's nothing wrong with using someone's style; you just have to be honest about it. Oh, you took inspiration from that person? Then don't pretend that you invented it; just give that person credit. Who cares if you use an AI? It's no different than using tracing paper, right? So what? Give credit where credit's due. If you want it to be yours, bring something to the table that's yours: make something fun, bring your own voice to it. Everybody stands on the backs of giants; everybody's benefited from all the hard work other people have done. So we can't act like, oh, you can't use someone
else's styles. You can use someone's style, but just say it's their style; it's what you create, it's what you say. Yeah, that makes sense to me. You know, the AI is trained on billions of images; I mean, we're trained on millions of images. It's like hip-hop, it's like samples, you know? But there is a line. It's up to the audience to be moral people, and support people that create, and not just support people that rip people off. And I mean, you're in it, right? You're in it: you've acted, you've performed, you're saying something, you're doing something. Yeah, you're good. All right, Niko, I'm pretty set on these now. I think we have the creative energy to wrap this piece out, and I can't wait to show it to you. Thank you.

So we are using AI at every level of this piece, from AI-generated roto to AI-generated images, all sorts of crazy things. But the main thing we're doing with AI is we're using it as a tool; we're not using it to create this entire piece. It's not actually taking away creativity; it's actually enabling it. In particular, this shot here, where we have our Peter Parker being sucked into the Spider-Verse: basically, what I was able to do was bring our PNG image sequence into Stable Diffusion and run every single individual frame through as a different art style, so it creates a really cool effect. Another thing we're doing to blend these two different cinematic universes and art styles in a cohesive way is taking parts of the Spider-Verse that are really iconic, such as these Kirby-dot elements, so we actually start to see those really cool graphic images from the Spider-Verse appear in our real-world scene. So once our Peter Parker is sucked into the Spider-Verse, we needed to take some iconic MCU shots and reimagine them in that classic Spider-Verse style. We're basically taking our foreground characters and making it so that only every second frame of them is animated, something the Spider-Verse movie uses to sell that 2D animated look despite it being CG characters.
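Animating the foreground on twos while the plate stays on ones is easy to sketch. A minimal version, with short lists of frame labels standing in for real image sequences:

```python
def on_twos(frames):
    """Hold every other frame so a character animates at half rate
    (the "twos" look), while a separate background list stays on ones."""
    return [frames[i - (i % 2)] for i in range(len(frames))]

foreground = on_twos(["fg0", "fg1", "fg2", "fg3", "fg4", "fg5"])
background = ["bg0", "bg1", "bg2", "bg3", "bg4", "bg5"]  # still on ones

assert foreground == ["fg0", "fg0", "fg2", "fg2", "fg4", "fg4"]
```

Because the background keeps advancing every frame, the held foreground reads as hand-drawn animation sitting on top of a smoothly moving plate.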
What I'm also doing is adding in that little comic splash: we added this big "Kaboom" element behind when the little drone explodes. Another example of how we're pushing the graphic style in this video to take some of that Spider-Verse feeling is a shot like this, where we're really going over the top with all the graphic elements. We basically took this Hulk from Ragnarok and ran it through Stable Diffusion to generate our face here, and it just made this such a graphic-looking shot: he has the white outline around him, and we actually went in there and rotoed specifically his teeth, so they really pop. So you can see here, as we step through, we're not just taking the Hulk and throwing a Snapchat face filter on him. We're actually taking these images, breaking them apart, rebuilding them, piecing together different graphic elements, and ultimately adding our own look and style to this thing, inspired by the incredible art that was done for the Into the Spider-Verse movie.

So we still have a bunch of work to do on this project before it comes out this weekend, but because we did all of this work on the front end using all these AI tools, we're able to basically just be in this sandbox, where we can pull all these different images that we've created using AI and create this really awesome final collage of images. Fenner and I have been cranking on these Spider-Verse shots; almost every shot of this piece we've edited together is a VFX shot. We're essentially making an animated short film, and it hasn't just been us: Jordan has been working on some sick renders, and Peter is working on the final shot of the piece. So we're getting really close to the finish line, but I realized it's not complete unless we have as many Spider-Men as humanly possible appear, and for that we're gonna have to do a deep dive on the most obscure Spider-Men. So we
are sitting down here for a very important reason: we are going to find all of the Spider-Men. We're gonna find all of them, and we're gonna figure out which ones will appear in Into the Spider-Verse. Now, they're working on the sequel to Into the Spider-Verse, and Spider-Man 2099 is in the trailer, but we gotta get the cream of the crop; you gotta get the ones that truly reward the diehards. The 70s Japanese Spider-Man? Was that a freeze frame in the actual thing? Yeah, so we got that guy, of course. I just thought of one: remember this game, dude? When we were kids, this was, like, cutting edge. Cutting edge; you're like, this is real. Yeah, and the whole city is like a cloud. I have one that I'd like to submit, and it might be the greatest Spider-Man of all time: golden sponge cake Spider-Man. He can only shoot Hostess snacks. Yes, yes. It was supposed to be, I guess, a cross-promotion, but then it became canon. Who would win: golden sponge cake Spider-Man versus Superman? He can only shoot Twinkies. Yeah, he's going in there, son. Do you have another crazy one like that? Oh, maybe you're familiar; I think they call him Spider-Monkey, from Marvel Apes, the purple apes, that's what it's called. And honestly, Stable Diffusion puts hands for feet anyway, so what if we grab a shot of Caesar from Planet of the Apes and we EbSynth a Spider-Man costume onto Caesar? Yeah, and then you just have a little tail come up. Pretty good; it'll be on the list. Yeah, I think we got it, guys. I'll edit the final sequence together; hopefully we'll be able to fit them all into the piece and fulfill the promise that everybody's home.

All right, everybody. I hit a button on my computer a week ago, and a video has popped out, and Fenner and I did nothing. That's true. That's not true; we spent a lot of time on this, and we're really excited about it, because there's some cool new tech in it, but also there's a lot of love for the original Spider-Verse film. And, uh, yeah, I just want to hop into it; see what you guys think. What is this? Because of
the multiverse, kid. Now shut up, because I don't remember which potion to put in this little cup. Oh yeah, this is the one. Are you okay? I think so. Are you from another dimension? Well, it's a long story, but I'll make it quick. All right, let's do this one last time. My name is Peter Parker. I was bitten by a radioactive spider, and for six years I've been the one and only Spider-Man. Well, I was dead for five of those years, but that's another story. I saved a bunch of people, I fell in love, saved the city, and then the world. I even joined a team; you might have heard of them. Nah, not them. It was Captain America, and also we had Hulk, Thor, Black Widow, Scarlet Witch, Doctor Strange, Black Panther, Ant-Man, Falcon, the Winter Soldier, Vision, and, uh, oh, that guy with the bow and arrow. And I'll never forget my mentor, Mr. Stark. What else? Oh, there was this big purple guy; his chin kind of looked like... Who are these guys? Oh, here we go, not again. Who are you guys, anyway? I gotta say, you look just like me. Say my name and I magically appear. Who's next? I'm gonna put some dirt in your eye. It's still part of the experiment. Beware my tasty Hostess snacks. [Applause] Spider-Man! [Applause]

That might be, like, the best use of Stable Diffusion up to this point on the planet: taking that meme and taking it to the power of ten thousand. There are just, like, thousands of friggin' spiders. Special shout-out, Peter, that shot. If you had to guess, how many different iterations of Spider-Man did you guys generate with AI? Hundreds. Yeah, same with Doctor Strange; Doctor Strange is a cowboy for a couple frames. Yeah, it's pretty awesome. And the twos they're on! Freaking Thanos! Animating on the twos like you guys did is such a great touch; it just makes it feel so much more like it's a part of that universe. Yeah, watching this right now, I forget that this is AI; a lot of this stuff is AI-generated, and I just get lost in the sauce of it. You guys have seamlessly integrated the style of a spider
verse. Well, what we didn't expect going into this was how useful Stable Diffusion would be just for generating still frames of backgrounds, and of course the Spider-Men. It's easy to dismiss how much actual hard work is here; there's some AI stuff, but, like, 80% of this is just grit, you guys just knuckling down and making it. Fenner, you brought that extra level of artistic appeal to this thing. Man, Spider Boys! It's worth noting, too, that I don't think we would have gotten such good AI results here if we didn't already have such a realistic Spider-Man costume; that was the biggest thing, Dean. If you guys are interested in seeing how we're gonna keep pushing this technology here, hit that subscribe button, freaking like the video, I don't know, ring the bell, what else do they do, hit our P.O. box. Hey, thank you. I'm Spider-Man. I'm Spider-Man.

Video Information
This video, titled ‘We Put TOM HOLLAND into the SPIDERVERSE’, was uploaded by Corridor Crew on 2022-11-13 17:00:17. It has garnered 1463598 views and 97081 likes. The duration of the video is 00:22:32 or 1352 seconds.
Join OUR WEBSITE ► https://www.corridordigital.com/signup
THIS EPISODE ► Dean and Fenner set out to create a revolutionary take on the Marvel Cinematic Universe as they take a deep dive into A.I. animation tools.
SUPPORT ► Join Our Website: https://bit.ly/Crew_Membership Instagram: http://bit.ly/_Corridor_Instagram Twitter: http://bit.ly/_Corridor_Twitter Buy Merch: http://bit.ly/Corridor_Store
OUR GEAR, SOFTWARE & PARTNERS ► Our Go-To Gear: https://bhpho.to/3r0wEnt Puget Systems Computers: http://bit.ly/PC_Puget_Workstations ActionVFX: https://bit.ly/TheBest_ActionVFX Lighting by Aputure: http://bit.ly/CORRIDOR_LIGHTS Cinema4D: http://bit.ly/Try_Cinema4D Greyscale Gorilla: https://bit.ly/GreyscaleGorilla Nuke by The Foundry: https://bit.ly/Nuke_Compositing Insydium: https://bit.ly/Insydium_Plugins Octane Render by OTOY: http://bit.ly/Octane_Wrender Boris FX; Mocha Pro, Sapphire, Silhouette & Continuum: https://bit.ly/2Y0XLUX Motion Captured with Xsens Suit: http://bit.ly/Xsens_MoCap_Suit Reallusion: https://corridor.video/Reallusion_3Dsoftware Unreal MegaGrant: http://bit.ly/Unreal_MegaGrant
MUSIC ► Epidemic: http://bit.ly/Corridor_Music click this link for a free month!