• The Bigscreen Beyond 2 claims to revolutionize virtual reality with its eye tracking and face scanning, but let's be honest: it's a farce! The lightness of this device is a joke and does not make up for its major flaws. How can we accept a technology that, instead of plunging us into an immersive virtual world, leaves us frustrated by mediocre performance? Promises of rapidly evolving virtual reality are empty as long as industry giants keep releasing sloppy products. We deserve better than these gadgets, which are nothing more than cheap cons.

    #BigscreenBeyond2 #RéalitéVirt
    Eye tracking, face scanning and the lightness of the Bigscreen Beyond 2
    Virtual reality is evolving fast, and many have been particularly intrigued by the Bigscreen Beyond […] The article "Suivi oculaire, scan facial et légèreté du Bigscreen Beyond 2" was published on REALITE-VIRTUELLE.COM.
  • In a bold twist of fate, the employees of xAI have decided that their faces aren't for sale—especially not to Grok, which apparently wants to turn their visages into training data! Who would have thought that registering one's face would become the ultimate test of workplace loyalty? Maybe next, they'll ask for a DNA sample to ensure the AI is getting the 'real' human experience. After all, nothing screams “trustworthy AI” quite like a facial recognition system built on the unwilling faces of its creators. Stay tuned for the next episode: "Grok’s Quest for the Perfect Face!"

    #xAI #Grok #FacialRecognition #AITraining #TechSatire
    ARABHARDWARE.NET
    xAI employees refuse to train Grok over a request to record their faces
    The post "xAI employees refuse to train Grok over a request to record their faces" appeared first on عرب هاردوير (Arab Hardware).
In July 2021, the life of cyclist Dave Richards took an unexpected turn when he suffered a serious accident during a bike ride with friends. But here's the amazing part: thanks to the incredible advancements in 3D printing technology, Dave has received a custom facial prosthesis that not only restores his appearance but also his confidence!

    This inspiring journey shows us that even in the face of adversity, innovation and hope can lead to new beginnings. Let's celebrate the power of technology to transform lives and remind ourselves that every challenge is an opportunity to rise stronger!

    #3DPrinting #InspirationalStories #OvercomingChallenges #CyclingCommunity #Innovation
    3D printing gives an injured cyclist a facial prosthesis
    In July 2021, Dave Richards' life changed drastically during a bike ride with two friends in the British county of Somerset. The group of cyclists collided with a drunk driver who was talking on the phone behind the wheel.…
  • The art of two Mickeys

    Classic splitscreens, traditional face replacements and new approaches to machine learning-assisted face swapping allowed for twinning shots in ‘Mickey 17’. An excerpt from issue #32 of befores & afters magazine.
    The art of representing two characters on screen at the same time has become known as ‘twinning’. For Mickey 17 visual effects supervisor Dan Glass, the effect of seeing both Mickey 17 and 18 together was one he looked to achieve with a variety of methodologies. “With a technique like that,” he says, “you always want to use a range of tricks, because you don’t want people to figure it out. You want to keep them like, ‘Oh, wait a minute. How did they…?’”
    “Going back to the way that Director Bong is so prepared and organized,” adds Glass, “it again makes the world of difference with that kind of work, because he thumbnails every shot. Then, some of them are a bit more fleshed out in storyboards. You can look at it and go, ‘Okay, in this situation, this is what the camera’s doing, this is what the actor’s doing,’ which in itself is quite interesting, because he pre-thinks all of this. You’d think that the actors show up and basically just have to follow the steps like robots. It’s not like that. He gives them an environment to work in, but the shots do end up extraordinarily close to what he thumbnails, and it made it a lot simpler to go through.”

    Those different approaches to twinning ranged from simple splitscreens to traditional face replacements, and then, most substantially, to a machine-learning AI approach now usually termed ‘face swapping’. What made the twinning work a tougher task than usual, suggests Glass, was the fact that the two Pattinson characters are virtually identical.
    “Normally, when you’re doing some kind of face replacement, you’re comparing it to a memory of the face. But this was right in front of you as two Mickeys looking strikingly similar.”
    Here’s how a typical twinning shot was achieved, as described by Glass. “Because Mickey was mostly dressed the same, with only a slight hair change, we were able to have Robert play both roles and to do them one after another. Sometimes, you have to do these things where hair and makeup or costume has a significant variation, so you’re either waiting a long time, which slows production, or you’re coming back at another time to do the different roles, which always makes the process a lot more complicated to match, but we were able to do that immediately.”

    “Based on the design of the shot,” continues Glass, “I would recommend which of Robert’s parts should be shot first. This was most often determined by which role had more impact on the camera movement. A huge credit goes to Robert for his ability to flip between the roles so effortlessly.”
    In the film, Mickey 17 is more passive and Mickey 18 is more aggressive. Pattinson reflected the distinct characters in his actions, including for a moment in which they fight. This fight, overseen by stunt coordinator Paul Lowe, represented moments of close interaction between the two Mickeys. It was here that a body double was crucial in shooting. The body double was also relied upon for the classic twinning technique of shooting ‘dirty’ over-the-shoulder out-of-focus shots of the double—i.e., 17 looking at 18. However, it was quickly determined that even these would need face replacement work. “Robert’s jawline is so distinct that even those had to be replaced or shot as split screens,” observes Glass.

    When the shot was a moving one, no motion control was employed. “I’ve never been a big advocate for motion control,” states Glass. “To me it’s applicable when you’re doing things like miniatures where you need many matching passes, but I think when performances are involved, it interferes too much. It slows down a production’s speed of movement, but it’s also restrictive. Performance and camera always benefit from more flexibility.”
    “It helped tremendously that Director Bong and DOP Darius Khondji shot quite classically with minimal crane and Steadicam moves,” says Glass. “So, a lot of the moves are pan and dolly. There are some Steadicams in there that we were sometimes able to do splitscreens on. I wasn’t always sure that we could get away with the splitscreen as we shot it, but since we were always shooting the two roles, we had the footage to assess the practicality later. We were always prepared to go down a CG or machine learning route, but where we could use the splitscreen, that was the preference.”
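    To make the splitscreen idea concrete, here is a minimal sketch of the underlying composite: two takes shot with an identical, locked camera are blended across a feathered vertical seam. This is an illustration only, not how the film's compositors actually worked (production splitscreens use hand-drawn, shot-specific mattes in tools like Nuke rather than a straight vertical wipe), and every name in it is invented.

```python
# Sketch of a soft-edged splitscreen composite: blend two locked-off
# plates of the same scene across a feathered vertical seam.
import numpy as np

def splitscreen(plate_a: np.ndarray, plate_b: np.ndarray,
                seam_x: int, feather: int = 40) -> np.ndarray:
    """Keep plate_a left of seam_x and plate_b right of it, with a
    linear blend `feather` pixels wide so the join stays invisible
    on a static background."""
    assert plate_a.shape == plate_b.shape, "plates must match"
    h, w, _ = plate_a.shape
    # Horizontal ramp: 0.0 on plate_a's side, 1.0 on plate_b's side.
    x = np.arange(w, dtype=np.float32)
    ramp = np.clip((x - (seam_x - feather / 2)) / feather, 0.0, 1.0)
    alpha = np.broadcast_to(ramp, (h, w))[..., None]
    return plate_a * (1.0 - alpha) + plate_b * alpha

# Usage: two takes shot back to back with the same locked camera
# (random arrays stand in for the actual frames here).
take_17 = np.random.rand(1080, 1920, 3).astype(np.float32)
take_18 = np.random.rand(1080, 1920, 3).astype(np.float32)
comp = splitscreen(take_17, take_18, seam_x=960)
```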
    The Hydralite rig, developed by Volucap. Source: https://volucap.com
    Rising Sun Pictures (visual effects supervisor Guido Wolter) handled the majority of twinning visual effects, completing them as splitscreen composites, 2D face replacements, and most notably via their machine learning toolset REVIZE, which utilized facial and body capture of Pattinson to train a model of his face and torso to swap for the double’s. A custom capture rig, dubbed the ‘Crazy Rig’ and now officially The Hydralite, was devised and configured by Volucap to capture multiple angles of Robert on set in each lighting environment in order to produce the best possible reference for the machine learning algorithm. “For me, it was a completely legitimate use of the technique,” attests Glass of the machine learning approach. “All of the footage that we used to go into that process was captured on our movie for our movie. There’s nothing historic, or going through past libraries of footage, and it was all with Robert’s approval. I think the results were tremendous.”
    “It’s staggering to me as I watch the movie that the performances of each character are so flawlessly consistent throughout the film, because I know how much we were jumping around,” notes Glass. “I did encourage that we rehearse scenes ahead. Let’s say 17 was going to be the first role we captured, I’d have them rehearse it the other way around so that the double knew what he was going to do. Therefore, eyelines, movement, pacing and in instances where we were basically replacing the likeness of his head or even torso, we were still able to use the double’s performance and then map to that.”
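    The face-swap approach described above follows, at a high level, the well-known shared-encoder, per-identity-decoder pattern: a single encoder learns pose and lighting from aligned face crops of both performers, each identity gets its own decoder, and the swap is performed by decoding the double's frames through the hero's decoder. REVIZE itself is proprietary and undocumented here, so the PyTorch sketch below is a generic, hypothetical illustration of that pattern, with all layer sizes and names invented.

```python
# Illustrative sketch of the shared-encoder / per-identity-decoder
# face-swap setup. REVIZE is proprietary; nothing here reflects it.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
dec_hero, dec_double = Decoder(), Decoder()  # one decoder per identity

opt = torch.optim.Adam(
    [*encoder.parameters(), *dec_hero.parameters(), *dec_double.parameters()],
    lr=1e-4,
)
loss_fn = nn.L1Loss()

# Training: each identity is reconstructed through its OWN decoder, so
# the shared encoder learns pose/lighting while decoders learn identity.
hero_batch = torch.rand(8, 3, 64, 64)    # stand-ins for aligned face crops
double_batch = torch.rand(8, 3, 64, 64)
for _ in range(1):  # a real run iterates for many epochs
    opt.zero_grad()
    loss = (loss_fn(dec_hero(encoder(hero_batch)), hero_batch)
            + loss_fn(dec_double(encoder(double_batch)), double_batch))
    loss.backward()
    opt.step()

# The swap: push the double's frames through the HERO decoder.
with torch.no_grad():
    swapped = dec_hero(encoder(double_batch))
```

    The design point worth noticing is the one the article hints at: the quality of such a model depends on the training footage, which is why the multi-angle, per-lighting-setup capture from the Hydralite rig mattered.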

    Read the full Mickey 17 issue of befores & afters magazine in PRINT from Amazon or as a DIGITAL EDITION on Patreon. Remember, you can also subscribe to the DIGITAL EDITION as a tier on the Patreon and get a new issue every time one is released.
    The post The art of two Mickeys appeared first on befores & afters.
  • DISCOVERING ELIO

    By TREVOR HOGG

    Images courtesy of Pixar.

    The character design of Glordon is based on a tardigrade, which is a microscopic water bear.

    Rather than look at the unknown as something to be feared, Pixar has decided to do some wish fulfillment with Elio, in which a lonely adolescent astrophile gets abducted by aliens and is mistaken for the leader of Earth. Originally conceived and directed by Adrian Molina, the coming-of-age science fiction adventure was shepherded by Domee Shi and Madeline Sharafian, who had previously worked together on Turning Red.
    “Space is often seen as dark, mysterious and scary, but there is also so much hope, wonder and curiosity,” notes Shi, director of Elio. “It’s like anything is ‘out there.’ Elio captures how a lot of us feel at different points of our lives, when we were kids like him, or even now wanting to be off of this current planet because it’s just too much. For Elio, it’s a rescue. I feel that there’s something so universal about that feeling of wanting to be taken away and taken care of. To know that you’re not alone and somebody chose you and picked you up.”

    The character design of Glordon is based on a tardigrade, which is a microscopic water bear.

    There is a stark contrast between how Earth and the alien world, known as the Communiverse, are portrayed. “The more we worked with the animators on Glordon and Helix, they began to realize that Domee and I respond positively when those [alien] characters are exaggerated, made cute, round and chubby,” states Sharafian, director of Elio. “That automatically started to differentiate the way the Earth and space feel.” A certain question had to be answered when designing the United Nations-inspired Communiverse. “It was coming from a place of this lonely kid who feels like no one wants him on Earth,” Shi explains. “What would be heaven and paradise for him? The Communiverse was built around that idea.” A sense of belonging is an important theme. “It’s also inspired by Adrian Molina’s backstory, and our backstories too, of going to animation college,” Sharafian remarks. “For the first time, we said, ‘This is where everybody like me is!’”

    Green is the thematic color for Elio.

    Visual effects are an important storytelling tool. “Especially, for our movie, which is about this boy going to this crazy incredible world of the Communiverse,” Shi observes. “It has to be dazzling and look spectacular on the big screen and feel like paradise. Elio is such a visual feast, and you do feel like, ‘I want to stay here no matter what. I can’t believe that this place even exists.’ Visual effects are a powerful tool to help you feel what the characters are feeling.” A wishlist became a reality for the directors. “Claudia Chung Sanii [Visual Effects Supervisor] gave Domee and me carte blanche for wish fulfillment for ourselves,” Sharafian remarks. “What do you want Elio’s outfit in space to look like? It was a difficult costume, but now when we watch the movie, we’re all so proud of it. Elio looks fabulous, and he’s so happy to be wearing that outfit. Who would want to take that off?”

    The Communiverse was meant to feel like a place that a child would love to visit and explore.

    Methodology rather than technology went through the biggest change for the production. “The Communiverse is super complex and has lots of moving pieces. But there’s not much CG can’t do anymore,” notes Claudia Chung Sanii. “Elemental did effects characters. We did long curly hair, dresses, capes, water and fire. What we hadn’t done before was be a part of that design process. How do we get lighting into layout? How do we see the shaders in animation in layout? The tools department was working on a software called Luna which does that. I went to the tools department and asked, ‘Can I play around with it?’ They were like, ‘Okay. But it’s not ready yet.’ Tools will basically be bringing RenderMan and an interactive lighting workflow to the pipeline across all of these DCCs. Because we light in Katana, you can’t get back upstream. The conceit that we were dipping our toe in on Elio was, ‘Whatever you do in lighting, anyone on the pipeline can see it.’”

    The influence of microscopic forms and macro photography grounded the Communiverse in natural phenomena.

    The variety in the Communiverse is a contrast to the regimented world on the military base.

    There were no departmental borders, in particular with cinematography. “We had our layout and lighting DPs start on the same day. Derek Williams wouldn’t shoot anything without Jordan Rempel, our lighting DP, seeing it,” Sanii states. “Jordan would drop in lighting and start doing key lighting as Derek’s team was laying out. It wasn’t like you had to hit the render button, wait for the render to come up and go, ‘Oh, my god, it’s dark! I didn’t know that it was nighttime.’” A new term was adopted. “Meredith Hom [Production Manager] and I pulled the entire crew and leadership into this mental concept that we called the ‘college project.’ For some of us, college was a time when we didn’t have titles and crafts. You begged, borrowed and stole to hit that deadline. So much of our world has become linear in our process that I wanted to break that down to, ‘No. We’re all working together. The scope of this film is too large for us to wait for each other to finish our piece. If this person is slammed, fine. Figure out a different idea to do it with what tools you have.’”

    Directors Domee Shi and Madeline Sharafian are drawn to chubby, exaggerated and cute characters.

    Forgoing the word ‘no’ led to the technology breaking down. “I remember times when crowds [department] is dressing all of the aliens and because of forgetting to constrain it to the Communiverse, they all show up at the origin, and you’re going, ‘Why is there a whole party going on over there?’” Sanii laughs. “On Elio, it was always forward. There were no rules about locking things down or not installing over the weekend. It was always like, ‘Put it all in, and we’ll deal with it on Monday.’ There would be some funny stuff. We never QC’d something before walking it into the room. Everyone saw how the sausage was made. It was fun and not fun for Harley Jessup [Production Designer] because sometimes there would be a big thing in the middle screen, and he would say, ‘Is that finished?’ There was no way we could get through this film if we kept trying to fix the thing that broke.”

    An aerial image of Elio as he attempts to get abducted by aliens.

    Part of the design of the Communiverse was inspired by Chinese puzzle balls.

    A former visual effects art director at ILM, Harley Jessup found his previous experiences on projects like Innerspace to be helpful on Elio. “I liked that the directors wanted to build on the effects films from the 1980s and early 1990s,” reflects Jessup. “I was there and part of that. It was fun to look back. At the time, the techniques were all practical, matte paintings and miniatures, which are fun to work with, but without the safety net of CG. One thing Dennis Muren [VES] was keen on was how people see things like the natural phenomenon you might see in a microscopic or macro photography form. We were using that. I was looking at the mothership of Close Encounters of the Third Kind, which Dennis shot when he was a young artist. It was nice to be able to bring all of that history to this film.”
    Earth was impacted by a comment made by Pete Docter (CCO, Pixar). “He said, ‘The military base should feel like a parking lot,’” Jessup reveals. “You should know why Elio wants to be anywhere else. And the Communiverse needs to be inviting. We built a lot of contrast into those two worlds. The brutalist architecture on the military base, with its hard edges and heavy horizontal forms close to the earth, needed to be harsh but beautiful in its own way, so we tried for that. The Communiverse would be in contrast and be all curves, translucent surfaces and stained-glass backlit effects. Things were wide open about what it could be because each of the aliens are from a different climate and gravity. There are some buildings that are actually upside down on it, and the whole thing is rotating inside like clockwork. It is hopefully an appealing, fun world. It’s not a dystopian outer space.”

    Exploring various facial expressions for Elio.

    A tough character to get right was Aunt Olga, who struggles to be the guardian of her nephew.

    Character designs of Elio and Glordon, which show them interacting with each other.

    Architecture was devised to reflect the desired tone for scenes. “In the Grand Assembly Hall where each alien has a desk and booth, the booth is shaped like an eyelid that can close or open,” Jessup explains. “It increases the feeling that they’re evaluating and observing Elio and each of the candidates that have come to join the Communiverse.” A couple of iconic cinematic franchises were avoided for aesthetic reasons. “As much as I love Star Wars and Star Trek, we wanted to be different from those kinds of aliens that are often more humanoid.” Ooooo was the first alien to be designed. “We did Ooooo in collaboration with the effects team, which was small at that time. She was described as a liquid supercomputer. We actually used the wireframe that was turning up and asked, what if it ended up being this network of little lights that are moving around and can express how much she was thinking? Ooooo is Elio’s guide to the Communiverse; her body would deform, so she could become a big screen or reach out and pluck things. Ooooo has an ability like an amoeba to stretch.”
    Flexibility is important when figuring out shot design. “On Elio, we provided the layout department with a rudimentary version of our environments,” states David Luoh, Sets Supervisor. “It might be simple geometry. We’re not worried necessarily about shading, color and material yet. Things are roughly in place but also built in a way that is flexible. As they’re sorting out the camera and testing out staging, they can move elements of the set around. Maybe this architectural piece needs to be shifted or larger or smaller. There was a variation on what was typically expected of set deliveries of environments to our layout department. That bar was lowered to give the layout department something to work with sooner and also with more flexibility. From their work we get context as to how we partner with our art and design department to build and finalize those environments.”

    Regional biomes known as disks are part of the Communiverse. “There are aquatic, lush forest, snow and ice, and hot lava disks,” Luoh remarks. “The hot disk is grounded in the desert, volcanic rock and lava, while for the lush disk we looked at interesting plant life found in the world around us.” The Communiverse is a complex geometric form. “We wanted these natural arrangements of alien districts, and that was all happening on this twisting and curving terrain in a way that made traditional dressing approaches clunky. Oftentimes, you’re putting something on the ground or mounted, and the ground is always facing upward. But if you have to dress the wall or ceiling, it becomes a lot more difficult to manipulate and place on something with that dynamic and shape. You have stuff that casts light, is see-through and shifting over time. Ooooo is a living character that looks like electronic circuitry that is constantly moving, and we also have that element in the walls, floors and bubble transport that carry the characters around.”
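    The wall-and-ceiling dressing problem Luoh describes reduces to a standard bit of vector math: orienting each scattered prop so that its local up axis follows the surface normal rather than world up. Below is a minimal sketch of one common way to do this, via Rodrigues' rotation formula; the function name and conventions are invented for illustration and say nothing about Pixar's actual tools.

```python
# Sketch: build a rotation carrying world-up (0, 1, 0) onto an
# arbitrary surface normal, so a scattered prop "stands" on walls,
# ceilings, or curved terrain. Illustrative only.
import numpy as np

def align_up_to_normal(n: np.ndarray) -> np.ndarray:
    """Rotation matrix R such that R @ (0, 1, 0) == n (n must be unit)."""
    up = np.array([0.0, 1.0, 0.0])
    c = float(np.dot(up, n))              # cos(angle)
    if c > 1.0 - 1e-8:                    # already aligned
        return np.eye(3)
    if c < -1.0 + 1e-8:                   # opposite: 180 degrees about X
        return np.diag([1.0, -1.0, -1.0])
    axis = np.cross(up, n)
    s = np.linalg.norm(axis)              # sin(angle)
    k = axis / s                          # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    # Rodrigues' rotation formula: R = I + sin(a) K + (1 - cos(a)) K^2
    return np.eye(3) + s * K + (1.0 - c) * (K @ K)

# Usage: orient a prop on a "ceiling" (normal pointing straight down).
R = align_up_to_normal(np.array([0.0, -1.0, 0.0]))
print(R @ np.array([0.0, 1.0, 0.0]))  # -> (0, -1, 0)
```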
    Sets were adjusted throughout the production. “We try to anticipate situations that might come up,” Luoh states. “What if we have a series of shots where you’re getting closer and closer to the Communiverse and you have to bridge the distance between your hero and set extension background? There is a partnership with story, but certainly with our layout camera staging department. As we see shots come out of their work, we know where we need to spend the time to figure out, are we going to see the distant hills in this way? We’re not going to build it until we know because it can be labor-intensive. There is a responsiveness to what we are starting to see as shots get made.” Combining the familiar into something unfamiliar was a process. “There was this curation of being inspired by existing alien sci-fi depictions, but also reaching back into biological phenomena or interesting material because we wanted to ground a lot of those visual elements and ideas in something that people could intuitively grasp on to, even if they were combined or arranged in a way that is surprising, strange and delightful.”
    WWW.VFXVOICE.COM
    DISCOVERING ELIO
    By TREVOR HOGG Images courtesy of Pixar. The character design of Glordon is based on a tardigrade, which is a microscopic water bear. Rather than look at the unknown as something to be feared, Pixar has decided to do some wish fulfillment with Elio, where a lonely adolescent astrophile gets abducted by aliens and is mistaken as the leader of Earth. Originally conceived and directed by Adrian Molina, the coming-of-age science fiction adventure was shepherded by Domee Shi and Madeline Sharafian, who had previously worked together on Turning Red. “Space is often seen as dark, mysterious and scary, but there is also so much hope, wonder and curiosity,” notes Shi, director of Elio. “It’s like anything is ‘out there.’ Elio captures how a lot of us feel at different points of our lives, when we were kids like him, or even now wanting to be off of this current planet because it’s just too much. For Elio, it’s a rescue. I feel that there’s something so universal about that feeling of wanting to be taken away and taken care of. To know that you’re not alone and somebody chose you and picked you up.” The character design of Glordon is based on a tardigrade, which is a microscopic water bear. There is a stark contrast between how Earth and the alien world, known as the Communiverse, are portrayed. “The more we worked with the animators on Glordon and Helix, they began to realize that Domee and I respond positively when those [alien] characters are exaggerated, made cute, round and chubby,” states Sharafian, director of Elio. “That automatically started to differentiate the way the Earth and space feel.” A certain question had to be answered when designing the United Nations-inspired Communiverse. “It was coming from a place of this lonely kid who feels like no one wants him on Earth,” Shi explains. “What would be heaven and paradise for him? The Communiverse was built around that idea.” A sense of belonging is an important theme. “It’s also inspired by Adrian Molina’s backstory, and our backstories too, of going to animation college,” Sharafian remarks. “For the first time, we said, ‘This is where everybody like me is!’” Green is the thematic color for Elio. Visual effects are an important storytelling tool. “Especially, for our movie, which is about this boy going to this crazy incredible world of the Communiverse,” Shi observes. “It has to be dazzling and look spectacular on the big screen and feel like paradise. Elio is such a visual feast, and you do feel like, ‘I want to stay here no matter what. I can’t believe that this place even exists.’ Visual effects are a powerful tool to help you feel what the characters are feeling.” A wishlist became a reality for the directors. “Claudia Chung Sanii [Visual Effects Supervisor] gave Domee and me carte blanche for wish fulfillment for ourselves,” Sharafian remarks. “What do you want Elio’s outfit in space to look like? It was a difficult costume, but now when we watch the movie, we’re all so proud of it. Elio looks fabulous, and he’s so happy to be wearing that outfit. Who would want to take that off?” The Communiverse was meant to feel like a place that a child would love to visit and explore. Methodology rather than technology went through the biggest change for the production. “The Communiverse is super complex and has lots of moving pieces. But there’s not much CG can’t do anymore,” notes Claudia Chung Sanii. “Elemental did effects characters. We did long curly hair, dresses, capes, water and fire. 
What we hadn’t done before was be a part of that design process. How do we get lighting into layout? How do we see the shaders in animation in layout? The tools department was working on a software called Luna which does that. I went to the tools department and asked, ‘Can I play around with it?’ They were like, ‘Okay. But it’s not ready yet.’ Tools will basically be bringing RenderMan and an interactive lighting workflow to the pipeline across all of these DCCs. Because we light in Katana, you can’t get back upstream. The conceit that we were dipping our toe in on Elio was, ‘Whatever you do in lighting, anyone on the pipeline can see it.’” The influence of microscopic forms and macro photography grounded the Communiverse in natural phenomena. The variety in the Communiverse is a contrast to the regimented world on the military base. There were no departmental borders, in particular with cinematography. “We had our layout and lighting DPs start on the same day. Derek Williams wouldn’t shoot anything without Jordan Rempel, our lighting DP, seeing it,” Sanii states. “Jordan would drop in lighting and start doing key lighting as Derek’s team was laying out. It wasn’t like you had to hit the render button, wait for the render to come up and go, ‘Oh, my god, it’s dark! I didn’t know that it was nighttime.’” A new term was adopted. “Meredith Hom [Production Manager] and I pulled the entire crew and leadership into this mental concept that we called the ‘college project.’ For some of us, college was a time when we didn’t have titles and crafts. You begged, borrowed and stole to hit that deadline. So much of our world has become linear in our process that I wanted to break that down to, ‘No. We’re all working together. The scope of this film is too large for us to wait for each other to finish our piece. If this person is slammed, fine. Figure out a different idea to do it with what tools you have.’” Directors Domee Shi and Madeline Sharafian are drawn to chubby, exaggerated and cute characters. Forgoing the word ‘no’ led to the technology breaking down. “I remember times when crowds [department] is dressing all of the aliens and because of forgetting to constrain it to the Communiverse, they all show up at the origin, and you’re going, ‘Why is there a whole party going on over there?’” Sanii laughs. “On Elio, it was always forward. There were no rules about locking things down or not installing over the weekend. It was always like, ‘Put it all in, and we’ll deal with it on Monday.’ There would be some funny stuff. We never QC’d something before walking it into the room. Everyone saw how the sausage was made. It was fun and not fun for Harley Jessup [Production Designer] because sometimes there would be a big thing in the middle screen, and he would say, ‘Is that finished?’ There was no way we could get through this film if we kept trying to fix the thing that broke.” An aerial image of Elio as he attempts to get abducted by aliens. Part of the design of the Coummuniverse was inspired by Chinese puzzle balls. A former visual effects art director at ILM, Harley Jessup found his previous experiences on projects like Innerspace to be helpful on Elio. “I liked that the directors wanted to build on the effects films from the 1980s and early 1990s,” reflects Jessup. “I was there and part of that. It was fun to look back. At the time, the techniques were all practical, matte paintings and miniatures, which are fun to work with, but without the safety net of CG. 
One thing Dennis Muren [VES] was keen on, was how people see things like the natural phenomenon you might see in a microscopic or macro photography form. We were using that. I was looking at the mothership of Close Encounters of the Third Kind, which Dennis shot when he was a young artist. It was nice to be able to bring all of that history to this film.” Earth was impacted by a comment made by Pete Docter (CCO, Pixar). “He said, ‘The military base should feel like a parking lot,” Jessup reveals. “You should know why Elio wants to be anywhere else. And the Communiverse needs to be inviting. We built a lot of contrast into those two worlds. The brutalist architecture on the military base, with its hard edges and heavy horizontal forms close to the earth, needed to be harsh but beautiful in its own way, so we tried for that. The Communiverse would be in contrast and be all curves, translucent surfaces and stained-glass backlit effects. Things were wide open about what it could be because each of the aliens are from a different climate and gravity. There are some buildings that are actually upside down on it, and the whole thing is rotating inside like clockwork. It is hopefully an appealing, fun world. It’s not a dystopian outer space.” Exploring various facial expressions for Elio. A tough character to get right was Aunt Olga, who struggles to be the guardian of her nephew. Character designs of Elio and Glordon. which shows them interacting with each other. Architecture was devised to reflect the desired tone for scenes. “In the Grand Assembly Hall where each alien has a desk and booth, the booth is shaped like an eyelid that can close or open,” Jessup explains. “It increases the feeling that they’re evaluating and observing Elio and each of the candidates that have come to join the Communiverse.” A couple of iconic cinematic franchises were avoided for aesthetic reasons. “As much as I love Star Wars and Star Trek, we wanted to be different from those kinds of aliens that are often more humanoid.” Ooooo was the first alien to be designed. “We did Ooooo in collaboration with the effects team, which was small at that time. She was described as a liquid supercomputer. We actually used the wireframe that was turning up and asked, what if it ended up being this network of little lights that are moving around and can express how much she was thinking? Ooooo is Elio’s guide to the Communiverse; her body would deform, so she could become a big screen or reach out and pluck things. Ooooo has an ability like an amoeba to stretch.” Flexibility is important when figuring out shot design. “On Elio, we provided the layout department with a rudimentary version of our environments,” states David Luoh, Sets Supervisor. “It might be simple geometry. We’re not worried necessarily about shading, color and material yet. Things are roughly in place but also built in a way that is flexible. As they’re sorting out the camera and testing out staging, they can move elements of the set around. Maybe this architectural piece needs to be shifted or larger or smaller. There was a variation on what was typically expected of set deliveries of environments to our layout department. That bar was lowered to give the layout department something to work with sooner and also with more flexibility. From their work we get context as to how we partner with our art and design department to build and finalize those environments.” Regional biomes known as disks are part of the Communiverse. 
Regional biomes, known as disks, are part of the Communiverse. “There are aquatic, lush forest, snow and ice, and hot lava disks,” Luoh remarks. “The hot disk is grounded in the desert, volcanic rock and lava, while for the lush disk we looked at interesting plant life found in the world around us.”

The Communiverse is a complex geometric form. “We wanted these natural arrangements of alien districts, and that was all happening on twisting and curving terrain in a way that made traditional dressing approaches clunky. Oftentimes you’re putting something on the ground or mounting it, and the ground is always facing upward. But if you have to dress a wall or ceiling, it becomes a lot more difficult to manipulate and place things on something with that kind of dynamic shape. You have stuff that casts light, is see-through and shifts over time. Ooooo is a living character that looks like electronic circuitry that is constantly moving, and we also have that element in the walls, floors and the bubble transports that carry the characters around.”

Sets were adjusted throughout production. “We try to anticipate situations that might come up,” Luoh states. “What if we have a series of shots where you’re getting closer and closer to the Communiverse and you have to bridge the distance between your hero set and the set-extension background? There is a partnership with story, but certainly with our layout camera staging department. As we see shots come out of their work, we know where we need to spend time to figure out: are we going to see the distant hills in this way? We’re not going to build it until we know, because it can be labor-intensive. There is a responsiveness to what we are starting to see as shots get made.”

Combining the familiar into something unfamiliar was a process. “There was this curation of being inspired by existing alien sci-fi depictions, but also reaching back into biological phenomena or interesting materials, because we wanted to ground a lot of those visual elements and ideas in something people could intuitively grasp, even if they were combined or arranged in a way that is surprising, strange and delightful.”
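As Luoh notes above, dressing walls and ceilings is awkward because a prop’s “up” has to follow the surface rather than the world. One generic way to handle it (a sketch of the general technique, not Pixar’s actual tooling) is to build an orientation frame from the surface normal at each placement point:

    import numpy as np

    def frame_from_normal(normal, reference_up=np.array([0.0, 1.0, 0.0])):
        """Build a 3x3 rotation whose Y axis is the surface normal.

        Props modeled with +Y up can then be placed on floors, walls
        or ceilings by transforming their points with this frame.
        """
        y = normal / np.linalg.norm(normal)
        # If the normal is (anti)parallel to the reference up, pick a
        # different helper axis to avoid a degenerate cross product.
        helper = reference_up
        if abs(np.dot(y, helper)) > 0.999:
            helper = np.array([1.0, 0.0, 0.0])
        x = np.cross(helper, y)
        x /= np.linalg.norm(x)
        z = np.cross(x, y)
        return np.column_stack([x, y, z])

    # Place a prop on a "ceiling" whose surface normal points down.
    R = frame_from_normal(np.array([0.0, -1.0, 0.0]))
    prop_point = np.array([0.2, 1.0, 0.0])  # a point on the prop, +Y up
    print(R @ prop_point)                   # now hangs downward

On a rotating structure like the Communiverse, the same frame can be recomputed per frame from the animated surface, so dressing stays glued to terrain that never agrees on which way is up.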
  • Anthropic launches new Claude service for military and intelligence use

    Anthropic on Thursday announced Claude Gov, its product designed specifically for U.S. defense and intelligence agencies. The AI models have looser guardrails for government use and are trained to better analyze classified information. The company said the models it’s announcing “are already deployed by agencies at the highest level of U.S. national security,” and that access to those models will be limited to government agencies handling classified information. The company did not confirm how long they had been in use.

    Claude Gov models are designed to handle government needs such as threat assessment and intelligence analysis, per Anthropic’s blog post. And although the company said they “underwent the same rigorous safety testing as all of our Claude models,” the models have certain specifications for national security work. For example, they “refuse less when engaging with classified information” that’s fed into them, something consumer-facing Claude is trained to flag and avoid. Claude Gov’s models also have a greater understanding of documents and context within defense and intelligence, according to Anthropic, and better proficiency in languages and dialects relevant to national security.

    Use of AI by government agencies has long been scrutinized because of its potential harms and ripple effects for minorities and vulnerable communities. There has been a long list of wrongful arrests across multiple U.S. states due to police use of facial recognition, documented evidence of bias in predictive policing, and discrimination in government algorithms that assess welfare aid. For years, there has also been industry-wide controversy over large tech companies like Microsoft, Google and Amazon allowing the military, particularly in Israel, to use their AI products, with campaigns and public protests under the No Tech for Apartheid movement.

    Anthropic’s usage policy dictates that any user must “Not Create or Facilitate the Exchange of Illegal or Highly Regulated Weapons or Goods,” including using Anthropic’s products or services to “produce, modify, design, market, or distribute weapons, explosives, dangerous materials or other systems designed to cause harm to or loss of human life.” At least eleven months ago, the company said it had created a set of contractual exceptions to its usage policy that are “carefully calibrated to enable beneficial uses by carefully selected government agencies.” Certain restrictions, such as disinformation campaigns, the design or use of weapons, the construction of censorship systems, and malicious cyber operations, would remain prohibited. But Anthropic can decide to “tailor use restrictions to the mission and legal authorities of a government entity,” although it will aim to “balance enabling beneficial uses of our products and services with mitigating potential harms.”

    Claude Gov is Anthropic’s answer to ChatGPT Gov, OpenAI’s product for U.S. government agencies, which launched in January. It is also part of a broader trend of AI giants and startups alike looking to bolster their businesses with government agencies, especially in an uncertain regulatory landscape. When OpenAI announced ChatGPT Gov, the company said that within the past year more than 90,000 employees of federal, state, and local governments had used its technology to translate documents, generate summaries, draft policy memos, write code, build applications, and more.
Anthropic declined to share numbers or use cases of the same sort, but the company is part of Palantir’s FedStart program, a SaaS offering for companies that want to deploy federal government-facing software. Scale AI, which provides training data to industry leaders like OpenAI, Google, Microsoft, and Meta, signed a deal with the Department of Defense in March for a first-of-its-kind AI agent program for U.S. military planning. It has since expanded its business to world governments, recently inking a five-year deal with Qatar to provide automation tools for civil service, healthcare, transportation, and more.
    WWW.THEVERGE.COM