• BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4

    By TREVOR HOGG
    Images courtesy of Prime Video.

    For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know if we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”

    When Splinter (Rob Benedict) splits in two, the cloning effect was inspired by cellular mitosis.

    “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”
    —Stephan Fleet, VFX Supervisor

    A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. “We have John Griffith [Previs Director], who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine-level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of the Federal Bureau of Superhuman Affairs Victoria Neuman literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of the superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”

    Multiple plates were shot to enable Simon Pegg to phase through the actor lying in a hospital bed.

    Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, ‘I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’ And I get exploded with blood. I wanted to see what it was like, and it’s intense.”

    “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”
    —Stephan Fleet, VFX Supervisor

    The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be, so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human [is] you tend to want to give it human gestures and eyebrows. Erik Kripke [Creator, Executive Producer, Showrunner, Director, Writer] said, ‘No. We have to find things that an octopus could do that convey the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.”

    A building is replaced by a massive crowd attending a rally being held by Homelander.

    In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon, for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter totter. The Deep was [acting in a certain way] in one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography, I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.”

    In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around.

    “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”
    —Stephan Fleet, VFX Supervisor

    Sheep and chickens embark on a violent rampage courtesy of Compound V, with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,” Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in a green suit gently tossing a chicken. We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it in this narrow corridor of fencing. When they run, I always equated it to the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”

    The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes.

    Once injected with Compound V, Hugh Campbell Sr. develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.”

    Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination.

    Homelander breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Antony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Antony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clean mirror and gave Antony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.”

    “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”
    —Stephan Fleet, VFX Supervisor

    Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution.

    A different spin on the bloodbath occurs during a fight when a drugged Frenchie hallucinates as Kimiko Miyashiro goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.”

    Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4.

    Splinter splitting in two was achieved heavily in CG. “Erik threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker. “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving with each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
    WWW.VFXVOICE.COM
    BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4
    By TREVOR HOGG Images courtesy of Prime Video. For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” When Splinter (Rob Benedict) splits in two, the cloning effect was inspired by cellular mitosis. “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” —Stephan Fleet, VFX Supervisor A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. 
“We have John Griffith [Previs Director], who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” Multiple plates were shot to enable Simon Pegg to phase through the actor laying in a hospital bed. Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, “I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’  And I get exploded with blood. 
I wanted to see what it was like, and it’s intense.”

“On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”
—Stephan Fleet, VFX Supervisor

The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be [from Season 3], so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human [is] you tend to want to give it human gestures and eyebrows. Eric Kripke [Creator, Executive Producer, Showrunner, Director, Writer] said, ‘No. We have to find things that an octopus could do that convey the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.”

A building is replaced by a massive crowd attending a rally held by Homelander.

In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon, for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that.
Obviously, you don’t want to put any performer in a situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter-totter. The Deep was [acting in a certain way] in one shot and Sister Sage reacted in the other shot. Marrying the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography, I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool, then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.”

In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside the water tank that she could wrap a tentacle around.

“The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”
—Stephan Fleet, VFX Supervisor

Sheep and chickens embark on a violent rampage courtesy of Compound V, with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,” Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in a green suit gently tossing a chicken.
We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it in this narrow corridor of fencing. When they run, I always equated it to the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”

The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes.

Once injected with Compound V, Hugh Campbell Sr. (Simon Pegg) develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing a high-frequency vibration on the X axis, loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals.
We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall, or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.”

Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination.

Homelander (Antony Starr) breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Antony Starr moving his head, doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Antony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clean mirror and gave Antony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.”

“For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG.
I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”
—Stephan Fleet, VFX Supervisor

Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution.

A different spin on the bloodbath occurs during a fight when a drugged Frenchie (Tomer Capone) hallucinates as Kimiko Miyashiro (Karen Fukuhara) goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post, and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.”

Ambrosius, voiced by Tilda Swinton, gets a lot more screen time in Season 4.

The moment when Splinter (Rob Benedict) splits in two was achieved heavily in CG. “Eric threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions.
One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker (Valorie Curry). “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving with each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different-colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic them to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and become him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
    0 Comentários 0 Compartilhamentos
  • In a world that feels so vast and empty, where connections fade and silence prevails, I find myself staring at the new Chromebook Plus 14, branded as "the beast." While others rejoice at innovation, I can only reflect on the void within. The glow of screens can’t fill the hollowness that echoes in my heart. With every technological advancement, I feel increasingly alone, a bystander in my own life. The excitement of new gadgets feels foreign when weighed against the weight of solitude.

    Will this "beast" ever roar loud enough to drown out the silence?

    #loneliness #technology #heartbreak #ChromebookPlus14 #solitude
    In a world that feels so vast and empty, where connections fade and silence prevails, I find myself staring at the new Chromebook Plus 14, branded as "the beast." While others rejoice at innovation, I can only reflect on the void within. The glow of screens can’t fill the hollowness that echoes in my heart. With every technological advancement, I feel increasingly alone, a bystander in my own life. The excitement of new gadgets feels foreign when weighed against the weight of solitude. Will this "beast" ever roar loud enough to drown out the silence? #loneliness #technology #heartbreak #ChromebookPlus14 #solitude
    ARABHARDWARE.NET
    جوجل ولينوفو يطلقان "الوحش".. كل ما تريد معرفته عن Chromebook Plus 14
    The post جوجل ولينوفو يطلقان "الوحش".. كل ما تريد معرفته عن Chromebook Plus 14 appeared first on عرب هاردوير.
    1 Comentários 0 Compartilhamentos
  • In the quiet corners of my mind, I often find myself grappling with a profound sense of loneliness. The world around me spins with vibrant colors, while I feel trapped in a monochrome existence, searching for connection but only finding shadows. Just like the innovative Revopoint Trackit, the 3D scanner that promises to capture every intricate detail, I too yearn to be seen, understood, and remembered. Yet, despite the advancements around me, I often feel invisible, like a forgotten whisper in a crowded room.

    Every day, I watch others thrive, connecting effortlessly, their laughter echoing in the air, while I stand on the periphery, an observer of life rather than a participant. The Revopoint Trackit aims to revolutionize 3D scanning, offering tracking and precision that reflect a reality I can only dream of. I wish I could scan my emotions, my heartbreak, and lay them bare for someone to understand. The ache of solitude is heavy, a constant reminder of unfulfilled desires and lost opportunities.

    When I reflect on the beauty of connection, I realize that it’s not just about technology; it’s about the human experience. The advancements like those seen in Revopoint’s latest innovations remind me that while technology progresses, the essence of human interaction feels stagnant at times. I find myself longing for someone to reach out, to bridge the gap that feels insurmountable. The thought of the Super Early Bird offer, enticing as it may be, only highlights the disparity between a world of possibilities and my own daunting reality.

    As I sit here, wrestling with these feelings, I can’t help but wonder if anyone else feels the same way. Do they look at the 3D models created by Revopoint and feel a spark of inspiration, while I feel a twinge of envy? Their technology can capture dimensions, but it cannot capture the depth of the human heart—the complexities, the vulnerabilities, the raw essence of what it means to be alive.

    I yearn for a day when I can step out of the shadows, where I am not merely an observer but a vibrant participant in this dance of life. Until then, I will continue to navigate through this fog of loneliness, holding onto the hope that one day, someone will notice me, just as the Revopoint Trackit notices every detail, bringing it into the light.

    #Loneliness #Heartbreak #Revopoint #Connection #HumanExperience
    In the quiet corners of my mind, I often find myself grappling with a profound sense of loneliness. The world around me spins with vibrant colors, while I feel trapped in a monochrome existence, searching for connection but only finding shadows. Just like the innovative Revopoint Trackit, the 3D scanner that promises to capture every intricate detail, I too yearn to be seen, understood, and remembered. Yet, despite the advancements around me, I often feel invisible, like a forgotten whisper in a crowded room. Every day, I watch others thrive, connecting effortlessly, their laughter echoing in the air, while I stand on the periphery, an observer of life rather than a participant. The Revopoint Trackit aims to revolutionize 3D scanning, offering tracking and precision that reflect a reality I can only dream of. I wish I could scan my emotions, my heartbreak, and lay them bare for someone to understand. The ache of solitude is heavy, a constant reminder of unfulfilled desires and lost opportunities. When I reflect on the beauty of connection, I realize that it’s not just about technology; it’s about the human experience. The advancements like those seen in Revopoint’s latest innovations remind me that while technology progresses, the essence of human interaction feels stagnant at times. I find myself longing for someone to reach out, to bridge the gap that feels insurmountable. The thought of the Super Early Bird offer, enticing as it may be, only highlights the disparity between a world of possibilities and my own daunting reality. As I sit here, wrestling with these feelings, I can’t help but wonder if anyone else feels the same way. Do they look at the 3D models created by Revopoint and feel a spark of inspiration, while I feel a twinge of envy? Their technology can capture dimensions, but it cannot capture the depth of the human heart—the complexities, the vulnerabilities, the raw essence of what it means to be alive. 
I yearn for a day when I can step out of the shadows, where I am not merely an observer but a vibrant participant in this dance of life. Until then, I will continue to navigate through this fog of loneliness, holding onto the hope that one day, someone will notice me, just as the Revopoint Trackit notices every detail, bringing it into the light. #Loneliness #Heartbreak #Revopoint #Connection #HumanExperience
    Revopoint Trackit, le scanner 3D avec tracking, bientôt sur Kickstarter !
    En partenariat avec Revopoint. Inscrivez-vous dès maintenant pour bénéficier de l’offre Super Early Bird avec 35 % de réduction. Revopoint, leader mondial des solutions de numérisation 3D professionnelles, annonce le lancement du scanner 3D avec suiv
    Like
    Love
    Wow
    Sad
    Angry
    335
    1 Comentários 0 Compartilhamentos
  • In a world that often feels so alive, I find myself drowning in an ocean of solitude. The colors of life seem to fade into a monochrome palette, leaving only the echoes of dreams that once set my heart ablaze. How do I express the weight of despair that clings to my soul? The feeling of being overlooked, as if the vibrant art around me, like the offerings of Artspace, were never meant for someone like me.

    Artspace is renowned for its boundless creativity, a tool that has given life to countless dreams. Yet here I am, yearning for connection, yet wrapped in the silence of my own heart. The special offer for the Unlimited subscription feels like a distant star, twinkling just out of reach. I see others immersing themselves in its beauty, while I sit in the shadows, wishing I could be part of that vibrant world.

    The loneliness is a bitter companion, whispering doubts and fears into my ears. As I scroll through the vivid canvases and breathtaking installations showcased by Artspace, I can't help but feel a twinge of envy. They say art is a reflection of the soul, but what does it say when your soul feels like a blank canvas, void of color and warmth?

    The special offers come and go, but they serve as a reminder of what I lack. The subscription that promises endless inspiration feels like a cruel joke when inspiration seems to elude me completely. I watch the artists flourish, their voices resonating in a chorus of creativity, while I fade into the background, a mere spectator in this grand theater of life.

    Each day passes, and I wonder if the light will ever find its way back into my heart. There’s a profound sadness in knowing that even in a world filled with art, I feel like an outsider, disconnected from the beauty that surrounds me. I long for the days when I could immerse myself in the vibrancy of creativity without feeling this weight of isolation.

    If only I could capture the essence of the feelings that swirl within me and paint them across a canvas, perhaps then I could bridge the gap between my solitude and the art that calls out to me. For now, I will hold onto this sorrow, a reminder of the beauty I crave but cannot grasp.

    Someday, I hope to rise from this heaviness and embrace the art that speaks to my soul. Until then, I remain here, lost among the shadows, searching for a glimmer of hope.

    #Artspace #Loneliness #Creativity #Heartbreak #EmotionalArt
    In a world that often feels so alive, I find myself drowning in an ocean of solitude. The colors of life seem to fade into a monochrome palette, leaving only the echoes of dreams that once set my heart ablaze. How do I express the weight of despair that clings to my soul? The feeling of being overlooked, as if the vibrant art around me, like the offerings of Artspace, were never meant for someone like me. Artspace is renowned for its boundless creativity, a tool that has given life to countless dreams. Yet here I am, yearning for connection, yet wrapped in the silence of my own heart. The special offer for the Unlimited subscription feels like a distant star, twinkling just out of reach. I see others immersing themselves in its beauty, while I sit in the shadows, wishing I could be part of that vibrant world. 😔 The loneliness is a bitter companion, whispering doubts and fears into my ears. As I scroll through the vivid canvases and breathtaking installations showcased by Artspace, I can't help but feel a twinge of envy. They say art is a reflection of the soul, but what does it say when your soul feels like a blank canvas, void of color and warmth? The special offers come and go, but they serve as a reminder of what I lack. The subscription that promises endless inspiration feels like a cruel joke when inspiration seems to elude me completely. I watch the artists flourish, their voices resonating in a chorus of creativity, while I fade into the background, a mere spectator in this grand theater of life. Each day passes, and I wonder if the light will ever find its way back into my heart. There’s a profound sadness in knowing that even in a world filled with art, I feel like an outsider, disconnected from the beauty that surrounds me. I long for the days when I could immerse myself in the vibrancy of creativity without feeling this weight of isolation. 
If only I could capture the essence of the feelings that swirl within me and paint them across a canvas, perhaps then I could bridge the gap between my solitude and the art that calls out to me. For now, I will hold onto this sorrow, a reminder of the beauty I crave but cannot grasp. Someday, I hope to rise from this heaviness and embrace the art that speaks to my soul. Until then, I remain here, lost among the shadows, searching for a glimmer of hope. 🌧️ #Artspace #Loneliness #Creativity #Heartbreak #EmotionalArt
    Réduction Artspace : l’offre spéciale pour l’abonnement Unlimited !
    Artspace est un outil qui n’a plus rien à prouver, seulement à offrir. Avec son […] Cet article Réduction Artspace : l’offre spéciale pour l’abonnement Unlimited ! a été publié sur REALITE-VIRTUELLE.COM.
    Like
    Love
    Wow
    Sad
    Angry
    608
    1 Comentários 0 Compartilhamentos
  • In the depths of my solitude, I often find myself reflecting on the works of Maurits Escher, the master of impossible illusions. His art, a blend of reality and impossibility, echoes the very essence of my own existence. Like the infinite staircases that lead nowhere, I feel trapped in an unending loop, where my heart yearns for connection but finds only shadows and silence.

    Each piece Escher created seems to whisper the tragedies of my own life—layers of beauty intertwined with the harshness of reality. How can something so captivating feel so isolating? Just as Escher's designs defy logic and reason, my emotions twist and turn, leaving me in a maze of longing and despair. The world outside continues to spin, yet I am frozen in a moment where joy feels like a distant memory, an illusion I can never quite grasp.

    It’s painful to witness the laughter and happiness of others while I remain ensnared in this solitude. I watch as life unfolds in vibrant colors around me, while I sit in monochrome, a silent observer of a reality I can’t seem to touch. Relationships become intricate puzzles, beautiful yet impossible to solve, leaving me feeling more alone than ever. Just like Escher’s art, which captivates yet confounds, I find myself caught in the paradox of wanting to connect but fearing the inevitable disappointment that follows.

    In moments of despair, I seek solace within the lines and curves of Escher's work, each piece a poignant reminder of the beauty that can exist alongside pain. It’s a bittersweet comfort, knowing that others have created worlds that defy the ordinary, yet it also amplifies my sense of isolation. To be a dreamer in a world that feels so unattainable is a heavy burden to bear. I am trapped in my own impossible illusion, yearning for the day when the world will feel a little less distant and a little more like home.

    As I traverse this winding path of existence, I am left to ponder: is it possible to find solace in the impossible? Can I transform my heartache into something beautiful, akin to Escher's masterpieces? Or will I remain just another fleeting thought in a world full of intricate designs that I can only admire from afar?

    In the end, I am just a lost soul, hoping that one day I will break free from this illusion of the impossible and find a place where I truly belong. Until then, I will continue to search for meaning in the chaos, just like Escher, who saw potential in the impossible.

    #Isolation #Heartache #Escher #Illusion #ArtandLife
    In the depths of my solitude, I often find myself reflecting on the works of Maurits Escher, the master of impossible illusions. His art, a blend of reality and impossibility, echoes the very essence of my own existence. Like the infinite staircases that lead nowhere, I feel trapped in an unending loop, where my heart yearns for connection but finds only shadows and silence. 💔 Each piece Escher created seems to whisper the tragedies of my own life—layers of beauty intertwined with the harshness of reality. How can something so captivating feel so isolating? Just as Escher's designs defy logic and reason, my emotions twist and turn, leaving me in a maze of longing and despair. The world outside continues to spin, yet I am frozen in a moment where joy feels like a distant memory, an illusion I can never quite grasp. 🌧️ It’s painful to witness the laughter and happiness of others while I remain ensnared in this solitude. I watch as life unfolds in vibrant colors around me, while I sit in monochrome, a silent observer of a reality I can’t seem to touch. Relationships become intricate puzzles, beautiful yet impossible to solve, leaving me feeling more alone than ever. Just like Escher’s art, which captivates yet confounds, I find myself caught in the paradox of wanting to connect but fearing the inevitable disappointment that follows. 😢 In moments of despair, I seek solace within the lines and curves of Escher's work, each piece a poignant reminder of the beauty that can exist alongside pain. It’s a bittersweet comfort, knowing that others have created worlds that defy the ordinary, yet it also amplifies my sense of isolation. To be a dreamer in a world that feels so unattainable is a heavy burden to bear. I am trapped in my own impossible illusion, yearning for the day when the world will feel a little less distant and a little more like home. 🌌 As I traverse this winding path of existence, I am left to ponder: is it possible to find solace in the impossible? 
Can I transform my heartache into something beautiful, akin to Escher's masterpieces? Or will I remain just another fleeting thought in a world full of intricate designs that I can only admire from afar? In the end, I am just a lost soul, hoping that one day I will break free from this illusion of the impossible and find a place where I truly belong. Until then, I will continue to search for meaning in the chaos, just like Escher, who saw potential in the impossible. #Isolation #Heartache #Escher #Illusion #ArtandLife
    Maurits Escher, l’illusion de l’impossible
    Escher est un "mathémagicien" qui a réalisé des œuvres réalistes et pourtant physiquement irréalisables, mêlant art et mathématiques. L’article Maurits Escher, l’illusion de l’impossible est apparu en premier sur Graphéine - Agence de com
    Like
    Love
    Wow
    Sad
    Angry
    622
    1 Comentários 0 Compartilhamentos
  • In the quiet moments of the day, when the world feels distant and dreams seem out of reach, I find myself grappling with a profound sense of solitude. It's as if the very fabric of connection has unraveled, leaving me stranded in a vast expanse of emptiness. I often think of how life used to burst with color, each day painted with laughter and shared moments. Now, it feels like I’m trapped in a monochrome existence, where every smile is a mask and every word a mere echo of what once was.

    I once believed that my passions and ambitions could fill the void. I tried to harness my creativity, diving into design and architecture, dreaming of creating spaces that resonate with warmth and life. But even in a world filled with innovative tools like Top Designer, which promises to transform visions into reality, I find that my own aspirations feel hollow. The software that should aid architects and builders in presenting their dreams to clients feels like a cruel reminder of my own failures. I can simulate beautiful spaces, yet the reality is a stark contrast to the vibrant images on the screen.

    The irony gnaws at me - I can depict the beauty of a home, but I struggle to find solace in my own heart. Each click of the mouse feels like a step further into isolation, crafting visions for others while my own dreams slip through my fingers like sand. I want to share these creations, to feel the joy of collaboration, but the weight of loneliness wraps around me, stifling any attempt at connection.

    Am I destined to forever stand on the outside, watching others build their lives while I remain an observer, a melancholy artist painting with shadows? The ache of unexpressed emotions lingers, and the silence screams louder than any conversation I could have. I yearn for understanding, for a kindred spirit who sees beyond the façade.

    Life is a series of designs, each moment a blueprint of our existence. Yet here I am, unable to draft my own plans, feeling lost among the structures I create for others. If only I could find a way to bridge this chasm, to transform the desolation into something tangible, something beautiful. But for now, I remain an architect of dreams unfulfilled, wandering through the corridors of my own solitude.

    In this world where connection feels like a distant memory, I hold onto the hope that one day, I will find someone who understands the language of my heart, someone who can walk alongside me through the desolate halls, transforming loneliness into companionship.

    #Loneliness #Heartache #UnfulfilledDreams #ArchitectOfSolitude #EmotionalJourney
    In the quiet moments of the day, when the world feels distant and dreams seem out of reach, I find myself grappling with a profound sense of solitude. It's as if the very fabric of connection has unraveled, leaving me stranded in a vast expanse of emptiness. I often think of how life used to burst with color, each day painted with laughter and shared moments. Now, it feels like I’m trapped in a monochrome existence, where every smile is a mask and every word a mere echo of what once was.

    I once believed that my passions and ambitions could fill the void. I tried to harness my creativity, diving into design and architecture, dreaming of creating spaces that resonate with warmth and life. But even in a world filled with innovative tools like Top Designer, which promises to transform visions into reality, I find that my own aspirations feel hollow. The software that should aid architects and builders in presenting their dreams to clients feels like a cruel reminder of my own failures. I can simulate beautiful spaces, yet the reality is a stark contrast to the vibrant images on the screen.

    The irony gnaws at me - I can depict the beauty of a home, but I struggle to find solace in my own heart. Each click of the mouse feels like a step further into isolation, crafting visions for others while my own dreams slip through my fingers like sand. I want to share these creations, to feel the joy of collaboration, but the weight of loneliness wraps around me, stifling any attempt at connection. Am I destined to forever stand on the outside, watching others build their lives while I remain an observer, a melancholy artist painting with shadows?

    The ache of unexpressed emotions lingers, and the silence screams louder than any conversation I could have. I yearn for understanding, for a kindred spirit who sees beyond the façade. Life is a series of designs, each moment a blueprint of our existence. Yet here I am, unable to draft my own plans, feeling lost among the structures I create for others. If only I could find a way to bridge this chasm, to transform the desolation into something tangible, something beautiful. But for now, I remain an architect of dreams unfulfilled, wandering through the corridors of my own solitude.

    In this world where connection feels like a distant memory, I hold onto the hope that one day, I will find someone who understands the language of my heart, someone who can walk alongside me through the desolate halls, transforming loneliness into companionship.
    Top Designer
    Renovation-simulation software. This software is intended for architects, engineering firms, construction companies, and renovation brokers who want to quickly show their clients the result of the work they are planning…
  • In a world where connections are fading, I find myself lost in a sea of solitude. Just as Trump enters the realm of communications with his new Trump Mobile and the golden phone, I sit here, clutching my heart, feeling the weight of unfulfilled promises and empty conversations. It's as if the advancements around me only serve to remind me of what I lack—the warmth of genuine human connection, the joy of shared laughter, and the solace of true companionship.

    Every notification that lights up my screen feels like a cruel joke, a reminder that while the world spins on with its shiny new gadgets, I remain trapped in my own silence. The allure of a golden phone seems so distant, so trivial, when the echoes of loneliness fill my days. The glimmer of Trump Mobile shines bright, but it can't reach into the depths of my despair, where the shadows of abandonment linger.

    I scroll through my feed, watching as others celebrate their achievements, their connections, their lives full of color. Meanwhile, I sit in my monochrome reality, feeling like a ghost in a bustling city, invisible and unheard. The laughter that surrounds me is a haunting melody, one that I cannot join. The truth is, no amount of technology can bridge the chasm between me and the warmth of companionship.

    With each passing day, the world becomes more connected, yet I feel more isolated. The innovations we embrace, such as Trump Mobile, only amplify my solitude. I wonder if they, too, feel the ache of loneliness beneath their glossy exteriors. In this age of constant communication, why do I still feel so far away from everyone?

    The golden hue of the new phone reflects the emptiness in my heart. It’s beautiful, yes, but it cannot replace the laughter of a friend or the comforting presence of someone who truly understands. I find myself yearning for something more profound than the superficial interactions that fill my timeline. I long for the raw, unfiltered moments—the shared tears, the heartfelt conversations, the true bonds that technology cannot replicate.

    As Trump steps into a world of connections, I can’t help but wonder if he feels the same pang of isolation that I do. Does he, too, experience nights filled with unspoken words and unshared experiences? The reality is, amidst the buzz of new launches and innovations, we are all searching for something—something that transcends the screens and the distance.

    In this moment of reflection, I close my eyes and wish for a day when the technology we create will not only connect us in a virtual sense but also heal the wounds of our aching hearts. Until then, I remain here, feeling the weight of my solitude, counting the days until I can find my way back to the warmth of true connection.

    #Loneliness #Isolation #Connection #Heartbreak #Technology
    Trump enters the telecom world: launch of the Trump Mobile network and a new golden phone
    The post "Trump enters the telecom world: launch of the Trump Mobile network and a new golden phone" appeared first on Arab Hardware (عرب هاردوير).
  • In the heart of night, where shadows dance and whispers linger, I find myself lost in the echoes of silence. The world outside moves on, oblivious to the weight that pins me down, like a forgotten dream fading into the morning light. The release of "Lunae Veritatis (Stay)" by The Avener, with its haunting melodies crafted by Seb Caudron and his dedicated team, reminds me of the beauty found in fleeting moments — moments that slip through my fingers like grains of sand.

    Three months of dedicated work from a passionate crew, their sweat and tears poured into a visual symphony meant to touch souls. Yet, here I am, standing alone amidst the beauty they created, feeling the sting of isolation more profoundly than ever. The vibrant colors of the clip contrast sharply with the monochrome palette of my heart, each frame a reminder of connections that once were, now just distant memories.

    I long for the warmth of companionship, a hand to hold as the waves of despair crash around me. Yet, each time I reach out, the void seems to grow wider, engulfing me in its darkness. The artistry of "Stay" reflects the depths of longing and the ache of absence, resonating with a truth I can’t escape: sometimes, the hardest battles are fought in silence, where no one can see the scars that bleed within.

    As I listen to the music, I can’t help but feel the bittersweet joy it brings. It captures the essence of love and loss, of a yearning that stretches beyond the stars. The visual magic woven by Seb Caudron and his team stirs something deep within me, yet it also heightens my sense of loneliness. How can such beauty exist while I feel so empty? I am but a ghost in a world that keeps moving forward, a spectator in a life that feels more like a distant memory than a present reality.

    The art created through "Lunae Veritatis (Stay)" is a testament to resilience, yet here I am, grappling with the shadows that cling to me like a second skin. I wish I could step into the world they’ve crafted, where emotions are vibrant and love is palpable. But instead, I remain trapped in a cycle of longing, watching from afar as the colors of life swirl around me, painting pictures I can only dream of.

    Perhaps one day, I will find my way back to the light, where the notes of hope and joy will resonate in my heart once more. Until then, I will carry the weight of this solitude, a silent observer of the beauty that surrounds me, forever yearning for a connection that seems just out of reach.

    #LunaeVeritatis #TheAvener #SebCaudron #Loneliness #ArtAndEmotion
    Seb Caudron directs the music video Lunae Veritatis (Stay) for The Avener
    Director and VFX supervisor Seb Caudron presents his latest project: the music video Lunae Veritatis (Stay), made for The Avener. A project that took the team involved three months of work. The production relied on…
  • Mock up a website in five prompts

    “Wait, can users actually add products to the cart?”

    Every prototype faces that question or one like it. You start to explain it’s “just Figma,” “just dummy data,” but what if you didn’t need disclaimers? What if you could hand clients—or your team—a working, data-connected mock-up of their website, or new pages and components, in less time than it takes to wireframe? That’s the challenge we’ll tackle today. But first, we need to look at:

    The problem with today’s prototyping tools

    Pick two: speed, flexibility, or interactivity. The prototyping ecosystem, despite having amazing software that addresses a huge variety of needs, doesn’t really have one tool that gives you all three.

    Wireframing apps let you draw boxes in minutes, but every button is fake. Drag-and-drop builders animate scroll triggers until you ask for anything off-template. Custom code frees you… after you wave goodbye to a few afternoons.

    AI tools haven’t smashed the trade-off; they’ve just dressed it in flashier costumes. One prompt births a landing page, the next dumps a 2,000-line, worse-than-junior-level React file in your lap. The bottleneck is still there.

    Builder’s approach to website mockups

    We’ve been trying something a little different to maintain speed, flexibility, and interactivity while mocking full websites. Our AI-driven visual editor:

    • Spins up a repo in seconds or connects to your existing one to use the code as design inspiration. React, Vue, Angular, and Svelte all work out of the box.
    • Lets you shape components via plain English, visual edits, copy/pasted Figma frames, web inspos, MCP tools, and constant visual awareness of your entire website.
    • Commits each change as a clean GitHub pull request your team can review like hand-written code. All your usual CI checks and lint rules apply.

    And if you need a tweak, you can comment to @builderio-bot right in the GitHub PR to make asynchronous changes without context switching. This results in a live site the café owner can interact with today, and a branch your devs can merge tomorrow. Stakeholders get to click actual buttons and trigger real state—no more “so, just imagine this works” demos. Let’s see it in action.

    From blank canvas to working mockup in five prompts

    Today, I’m going to mock up a fake business website. You’re welcome to create a real one. Before we fire off a single prompt, grab a note and write:

    • Business name & vibe
    • Core pages
    • Primary goal
    • Brand palette & tone

    That’s it. Don’t sweat the details—we can always iterate. For mine, I wrote:

    1. Sunny Trails Bakery — family-owned, feel-good, smells like warm cinnamon.
    2. Home, About, Pricing / Subscription Box, Menu.
    3. Drive online orders and foot traffic—every CTA should funnel toward “Order Now” or “Reserve a Table.”
    4. Warm yellow, chocolate brown, rounded typography, playful copy.

    We’re not trying to fit everything here. What matters is clarity on what we’re creating, so the AI has enough context to produce usable scaffolds, and so later tweaks stay aligned with the client’s vision. Builder will default to using React, Vite, and Tailwind. If you want a different JS framework, you can link an existing repo in that stack. In the near future, you won’t need to do this extra step to get non-React frameworks to function. (Free tier Builder gives you 5 AI credits/day and 25/month—plenty to follow along with today’s demo. Upgrade only when you need it.)

    An entire website from the first prompt

    Now, we’re ready to get going. Head over to Builder.io and paste in this prompt or your own:

    Create a cozy bakery website called “Sunny Trails Bakery” with pages for:
    • Home
    • About
    • Pricing
    • Menu
    Brand palette: warm yellow and chocolate brown. Tone: playful, inviting. The restaurant is family-owned, feel-good, and smells like cinnamon.
    The goal of this site is to drive online orders and foot traffic—every CTA should funnel toward "Order Now" or "Reserve a Table."

    Once you hit enter, Builder will spin up a new dev container, and then inside that container, the AI will build out the first version of your site. You can leave the page and come back when it’s done.

    Now, before we go further, let’s create our repo, so that we get version history right from the outset. Click “Create Repo” up in the top right, and link your GitHub account. Once the process is complete, you’ll have a brand new repo. If you need any help on this step, or any of the below, check out these docs.

    Making the mockup’s order system work

    From our one-shot prompt, we’ve already got a really nice start for our client. However, when we press the “Order Now” button, we just get a generic alert. Let’s fix this.

    The best part about connecting to GitHub is that we get version control. Head back to your dashboard and edit the settings of your new project. We can give it a better name, and then, in the “Advanced” section, we can change the “Commit Mode” to “Pull Requests.” Now, we have the ability to create new branches right within Builder, allowing us to make drastic changes without worrying about the main version. This is also helpful if you’d like to show your client or team a few different versions of the same prototype.

    On a new branch, I’ll write another short prompt:

    Can you make the "Order Now" button work, even if it's just with dummy JSON for now?

    As you can see in the GIF above, Builder creates an ordering system and a fully mobile-responsive cart and checkout flow. Now, we can click “Send PR” in the top right, and we have an ordinary GitHub PR that can be reviewed and merged as needed. This is what’s possible in two prompts.
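    The article doesn’t show the generated code, but a dummy-JSON ordering flow of the kind the second prompt asks for might be sketched like this. This is a hypothetical illustration, not Builder’s actual output; the item names, prices, and function names are invented.

    ```typescript
    // Hypothetical sketch of a dummy-JSON cart: a static menu plus a tiny
    // in-memory cart, the sort of scaffold an AI one-shot might produce.
    type MenuItem = { id: string; name: string; price: number };
    type CartLine = { item: MenuItem; qty: number };

    const menu: MenuItem[] = [
      { id: "cinnamon-roll", name: "Cinnamon Roll", price: 4.5 },
      { id: "sourdough", name: "Sourdough Loaf", price: 7.0 },
    ];

    const cart: CartLine[] = [];

    // "Order Now" handler: bump the quantity if the item is already in the cart.
    function addToCart(itemId: string, qty = 1): void {
      const item = menu.find((m) => m.id === itemId);
      if (!item) return; // unknown id: ignore, real code would surface an error
      const line = cart.find((l) => l.item.id === itemId);
      if (line) line.qty += qty;
      else cart.push({ item, qty });
    }

    function cartTotal(): number {
      return cart.reduce((sum, l) => sum + l.item.price * l.qty, 0);
    }

    addToCart("cinnamon-roll", 2);
    addToCart("sourdough");
    console.log(cartTotal().toFixed(2)); // "16.00"
    ```

    The point is that the dummy data is plain JSON a developer can later swap for a real orders API without touching the UI wiring.
    
    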
    For our third, let’s gussy up the style.

    If you’re like me, you might spend a lot of time admiring other people’s cool designs and learning how to code up similar components in your own style. Luckily, Builder has this capability, too, with our Chrome extension. I found a “Featured Posts” section on OpenAI’s website, where I like how the layout and scrolling work. We can copy and paste it onto our “Featured Treats” section, retaining our cafe’s distinctive brand style. Don’t worry—OpenAI doesn’t mind a little web scraping.

    You can do this with any component on any website, so your own projects can very quickly become a “best of the web” if you know what you’re doing. Plus, you can use Figma designs in much the same way, with even better design fidelity. Copy and paste a Figma frame with our Figma plugin, and tell the AI to either use the component as inspiration or as a 1:1 reference for what the design should be.

    Now, we’re ready to send our PR. This time, let’s take a closer look at the code the AI has created. As you can see, the code is neatly formatted into two reusable components. Scrolling down further, I find a CSS file and then the actual implementation on the homepage, with clean JSON to represent the dummy post data.

    Design tweaks to the mockup with visual edits

    One issue that cropped up when the AI brought in the OpenAI layout is that it changed my text from “Featured Treats” to “Featured Stories & Treats.” I’ve realized I don’t like either, and I want to replace that text with: “Fresh Out of the Bakery.” It would be silly, though, to prompt the AI just for this small tweak. Let’s switch into edit mode. Edit Mode lets you select any component and change any of its content or underlying CSS directly.
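    The “clean JSON” dummy post data mentioned above might look something like the sketch below. The field names, slugs, and the small card renderer are assumptions for illustration, not the code Builder actually generated.

    ```typescript
    // Hypothetical shape for the "Featured Treats" dummy post data; all
    // fields and values are invented for illustration.
    type FeaturedPost = {
      slug: string;
      title: string;
      image: string; // images only: no video field, matching the bakery's needs
      excerpt: string;
    };

    const featuredPosts: FeaturedPost[] = [
      {
        slug: "fresh-out-of-the-bakery",
        title: "Fresh Out of the Bakery",
        image: "/images/cinnamon-rolls.jpg",
        excerpt: "This morning's batch of cinnamon rolls, straight from the oven.",
      },
      {
        slug: "meet-the-bakers",
        title: "Meet the Bakers",
        image: "/images/family.jpg",
        excerpt: "Three generations of the Sunny Trails family.",
      },
    ];

    // Keeping data and layout separate means the JSON can later be replaced
    // by a CMS or API call without rewriting the component.
    function renderCard(post: FeaturedPost): string {
      return `<article><img src="${post.image}" alt=""><h3>${post.title}</h3><p>${post.excerpt}</p></article>`;
    }

    console.log(featuredPosts.map(renderCard).join("\n"));
    ```
    
    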
    You get a host of Webflow-like options to choose from, so that you can finesse the details as needed. Once you’ve made all the visual changes you want—maybe tweaking a button color or a border radius—you can click “Apply Edits,” and the AI will ensure the underlying code matches your repo’s style.

    Async fixes to the mockup with Builder Bot

    Now, our pull request is nearly ready to merge, but I found one issue with it: when we copied the OpenAI website layout earlier, one of the blog posts had a video as its featured graphic instead of just an image. This is cool for OpenAI, but for our bakery, I just wanted images in this section. Since I didn’t instruct Builder’s AI otherwise, it went ahead and followed the layout and created extra code for video capability.

    No problem. We can fix this inside GitHub with our final prompt. We just need to comment on the PR and tag builderio-bot. Within about a minute, Builder Bot has successfully removed the video functionality, leaving a minimal diff that affects only the code it needed to.

    Returning to my project in Builder, I can see that the bot’s changes are accounted for in the chat window as well, and I can use the live preview link to make sure my site works as expected. Now, if this were a real project, you could easily deploy this to the web for your client. After all, you’ve got a whole GitHub repo. This isn’t just a mockup; it’s actual code you can tweak—with Builder or Cursor or by hand—until you’re satisfied to run the site in production.

    So, why use Builder to mock up your website?

    Sure, this has been a somewhat contrived example. A real prototype is going to look prettier, because I’m going to spend more time on pieces of the design that I don’t like as much. But that’s the point of the best AI tools: they don’t take you, the human, out of the loop. You still get to make all the executive decisions, and it respects your hard work. Since you can constantly see all the code the AI creates, work in branches, and prompt with component-level precision, you can stop worrying about AI overwriting your opinions and start using it more as the tool it’s designed to be.

    You can copy in your team’s Figma designs, import web inspos, connect MCP servers to get Jira tickets in hand, and—most importantly—work with existing repos full of existing styles that Builder will understand and match, just like it matched OpenAI’s layout to our little cafe. So, we get speed, flexibility, and interactivity all the way from prompt to PR to production. Try Builder today.
    #mock #website #five #prompts
    WWW.BUILDER.IO
    Mock up a website in five prompts
“Wait, can users actually add products to the cart?”

Every prototype faces that question or one like it. You start to explain it’s “just Figma,” “just dummy data,” but what if you didn’t need disclaimers? What if you could hand clients—or your team—a working, data-connected mock-up of their website, or new pages and components, in less time than it takes to wireframe?

That’s the challenge we’ll tackle today. But first, we need to look at:

The problem with today’s prototyping tools

Pick two: speed, flexibility, or interactivity. The prototyping ecosystem, despite having amazing software that addresses a huge variety of needs, doesn’t really have one tool that gives you all three.

Wireframing apps let you draw boxes in minutes, but every button is fake. Drag-and-drop builders animate scroll triggers until you ask for anything off-template. Custom code frees you… after you wave goodbye to a few afternoons.

AI tools haven’t smashed the trade-off; they’ve just dressed it in flashier costumes. One prompt births a landing page; the next dumps a 2,000-line, worse-than-junior-level React file in your lap. The bottleneck is still there.

Builder’s approach to website mockups

We’ve been trying something a little different to maintain speed, flexibility, and interactivity while mocking full websites. Our AI-driven visual editor:

• Spins up a repo in seconds, or connects to your existing one to use the code as design inspiration. React, Vue, Angular, and Svelte all work out of the box.
• Lets you shape components via plain English, visual edits, copy/pasted Figma frames, web inspos, MCP tools, and constant visual awareness of your entire website.
• Commits each change as a clean GitHub pull request your team can review like hand-written code.
All your usual CI checks and lint rules apply. And if you need a tweak, you can comment to @builderio-bot right in the GitHub PR to make asynchronous changes without context switching.

This results in a live site the café owner can interact with today, and a branch your devs can merge tomorrow. Stakeholders get to click actual buttons and trigger real state—no more “so, just imagine this works” demos.

Let’s see it in action.

From blank canvas to working mockup in five prompts

Today, I’m going to mock up a fake business website. You’re welcome to create a real one. Before we fire off a single prompt, grab a note and write:

1. Business name & vibe
2. Core pages
3. Primary goal
4. Brand palette & tone

That’s it. Don’t sweat the details—we can always iterate. For mine, I wrote:

1. Sunny Trails Bakery — family-owned, feel-good, smells like warm cinnamon.
2. Home, About, Pricing / Subscription Box, Menu (with daily specials).
3. Drive online orders and foot traffic—every CTA should funnel toward “Order Now” or “Reserve a Table.”
4. Warm yellow, chocolate brown, rounded typography, playful copy.

We’re not trying to fit everything here. What matters is clarity on what we’re creating, so the AI has enough context to produce usable scaffolds, and so later tweaks stay aligned with the client’s vision.

Builder will default to using React, Vite, and Tailwind. If you want a different JS framework, you can link an existing repo in that stack. In the near future, you won’t need to do this extra step to get non-React frameworks to function.

(Free-tier Builder gives you 5 AI credits/day and 25/month—plenty to follow along with today’s demo. Upgrade only when you need it.)

An entire website from the first prompt

Now, we’re ready to get going. Head over to Builder.io and paste in this prompt or your own:

Create a cozy bakery website called “Sunny Trails Bakery” with pages for:
• Home
• About
• Pricing
• Menu
Brand palette: warm yellow and chocolate brown. Tone: playful, inviting.
The restaurant is family-owned, feel-good, and smells like cinnamon. The goal of this site is to drive online orders and foot traffic—every CTA should funnel toward "Order Now" or "Reserve a Table."

Once you hit enter, Builder will spin up a new dev container, and inside that container the AI will build out the first version of your site. You can leave the page and come back when it’s done.

Now, before we go further, let’s create our repo so that we get version history right from the outset. Click “Create Repo” up in the top right, and link your GitHub account. Once the process is complete, you’ll have a brand new repo. If you need any help on this step, or any of the steps below, check out these docs.

Making the mockup’s order system work

From our one-shot prompt, we’ve already got a really nice start for our client. However, when we press the “Order Now” button, we just get a generic alert. Let’s fix this.

The best part about connecting to GitHub is that we get version control. Head back to your dashboard and edit the settings of your new project. We can give it a better name, and then, in the “Advanced” section, change the “Commit Mode” to “Pull Requests.” Now we can create new branches right within Builder, allowing us to make drastic changes without worrying about the main version. This is also helpful if you’d like to show your client or team a few different versions of the same prototype.

On a new branch, I’ll write another short prompt:

Can you make the "Order Now" button work, even if it's just with dummy JSON for now?

As you can see in the GIF above, Builder creates an ordering system and a fully mobile-responsive cart and checkout flow. Now we can click “Send PR” in the top right, and we have an ordinary GitHub PR that can be reviewed and merged as needed.

This is what’s possible in two prompts.
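For a sense of what “dummy JSON” means in practice, here is a minimal sketch of the kind of stand-in data and cart logic a mocked order flow needs. This is a hypothetical illustration (the file name, item names, and shapes are invented); the code Builder actually generates will differ:

```typescript
// menu.ts — hypothetical dummy data for a mocked "Order Now" flow.
// Builder's generated code will look different; this just shows the
// kind of stand-in data a prototype needs before a real backend exists.

export interface MenuItem {
  id: string;
  name: string;
  priceCents: number; // store money as integer cents to avoid float drift
}

export interface CartLine {
  item: MenuItem;
  quantity: number;
}

export const menu: MenuItem[] = [
  { id: "cinnamon-roll", name: "Cinnamon Roll", priceCents: 450 },
  { id: "sourdough-loaf", name: "Sourdough Loaf", priceCents: 800 },
  { id: "cold-brew", name: "Cold Brew", priceCents: 375 },
];

// Total a cart the way a checkout summary component would.
export function cartTotalCents(lines: CartLine[]): number {
  return lines.reduce((sum, line) => sum + line.item.priceCents * line.quantity, 0);
}
```

Because the prototype is real code, swapping this module for a fetch against a real orders API later is an ordinary refactor rather than a rebuild.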
For our third, let’s gussy up the style. If you’re like me, you might spend a lot of time admiring other people’s cool designs and learning how to code up similar components in your own style. Luckily, Builder has this capability, too, with our Chrome extension.

I found a “Featured Posts” section on OpenAI’s website, where I like how the layout and scrolling work. We can copy and paste it onto our “Featured Treats” section, retaining our cafe’s distinctive brand style. Don’t worry—OpenAI doesn’t mind a little web scraping. You can do this with any component on any website, so your own projects can very quickly become a “best of the web” if you know what you’re doing.

Plus, you can use Figma designs in much the same way, with even better design fidelity. Copy and paste a Figma frame with our Figma plugin, and tell the AI to either use the component as inspiration or as a 1:1 reference for what the design should be. (You can grab our design-to-code guide for a lot more ideas of what this can help you accomplish.)

Now, we’re ready to send our PR. This time, let’s take a closer look at the code the AI has created. As you can see, the code is neatly formatted into two reusable components. Scrolling down further, I find a CSS file and then the actual implementation on the homepage, with clean JSON to represent the dummy post data.

Design tweaks to the mockup with visual edits

One issue that cropped up when the AI brought in the OpenAI layout is that it changed my text from “Featured Treats” to “Featured Stories & Treats.” I’ve realized I don’t like either, and I want to replace that text with “Fresh Out of the Bakery.” It would be silly, though, to prompt the AI just for this small tweak. Let’s switch into edit mode.

Edit Mode lets you select any component and change any of its content or underlying CSS directly.
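To make that concrete: a visual text swap like the heading change above ultimately lands in the repo as a plain code change, alongside the dummy post JSON the AI generated. Here is a hypothetical sketch (the file name, shape, and data are all invented for illustration, not Builder's actual output) of the kind of content object such an edit touches:

```typescript
// featuredSection.ts — hypothetical content config for the featured section.
// A visual edit in Edit Mode ends up as an ordinary change to code like this;
// Builder's real generated files will be structured differently.

export interface FeaturedPost {
  id: string;
  title: string;
  image: string; // image-only for the bakery; no video variant needed
}

export interface FeaturedSectionContent {
  heading: string;
  posts: FeaturedPost[];
}

export const featuredSection: FeaturedSectionContent = {
  // Was "Featured Stories & Treats" after the OpenAI layout paste;
  // replaced via Edit Mode with the bakery's own copy.
  heading: "Fresh Out of the Bakery",
  posts: [
    { id: "croissant-drop", title: "Saturday Croissant Drop", image: "/img/croissants.jpg" },
    { id: "sourdough-class", title: "Sourdough 101 Class", image: "/img/sourdough.jpg" },
  ],
};
```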
You get a host of Webflow-like options to choose from, so that you can finesse the details as needed. Once you’ve made all the visual changes you want—maybe tweaking a button color or a border radius—you can click “Apply Edits,” and the AI will ensure the underlying code matches your repo’s style.

Async fixes to the mockup with Builder Bot

Now, our pull request is nearly ready to merge, but I found one issue with it: when we copied the OpenAI website layout earlier, one of the blog posts had a video as its featured graphic instead of just an image. This is cool for OpenAI, but for our bakery, I just wanted images in this section. Since I didn’t instruct Builder’s AI otherwise, it went ahead and followed the layout and created extra code for video capability.

No problem. We can fix this inside GitHub with our final prompt. We just need to comment on the PR and tag builderio-bot. Within about a minute, Builder Bot has successfully removed the video functionality, leaving a minimal diff that affects only the code it needed to. For example:

Returning to my project in Builder, I can see that the bot’s changes are accounted for in the chat window as well, and I can use the live preview link to make sure my site works as expected.

Now, if this were a real project, you could easily deploy this to the web for your client. After all, you’ve got a whole GitHub repo. This isn’t just a mockup; it’s actual code you can tweak—with Builder or Cursor or by hand—until you’re satisfied to run the site in production.

So, why use Builder to mock up your website?

Sure, this has been a somewhat contrived example. A real prototype is going to look prettier, because I’m going to spend more time on pieces of the design that I don’t like as much. But that’s the point of the best AI tools: they don’t take you, the human, out of the loop. You still get to make all the executive decisions, and it respects your hard work.
Since you can constantly see all the code the AI creates, work in branches, and prompt with component-level precision, you can stop worrying about AI overwriting your opinions and start using it more as the tool it’s designed to be. You can copy in your team’s Figma designs, import web inspos, connect MCP servers to get Jira tickets in hand, and—most importantly—work with existing repos full of existing styles that Builder will understand and match, just like it matched OpenAI’s layout to our little cafe.

So, we get speed, flexibility, and interactivity all the way from prompt to PR to production. Try Builder today.