• TECHCRUNCH.COM
    OpenAI is reportedly developing its own X-like social media platform
    In Brief · Posted: 9:12 AM PDT · April 15, 2025 · Image Credits: Jakub Porzycki/NurPhoto / Getty Images

OpenAI is building its own X-like social media network, according to a new report from The Verge. The project is still in the early stages, but there is an internal prototype, focused on ChatGPT's image generation, that contains a social feed. The report states that it is unknown whether OpenAI plans to launch the social network as a standalone app or integrate it within the ChatGPT app. With this new social network, OpenAI would be taking on Elon Musk's X and Meta's social platforms, Facebook and Instagram. The new app would also give OpenAI access to real-time data for training its AI models, something that both X and Meta already have. OpenAI CEO Sam Altman has reportedly been privately asking outsiders for feedback about the social network. At this point, it is not clear whether the project will ever launch publicly, but the existence of a prototype shows that OpenAI is looking to expand beyond its current offerings.
  • WWW.ARTOFVFX.COM
    The Electric State: Joel Behrens (VFX Supervisor) & Elizabeth “Liz” Bernard (Animation Supervisor) – Digital Domain
    Interviews · By Vincent Frei · 15/04/2025

In 2018, Joel Behrens discussed Digital Domain's work on Ant-Man and the Wasp. He later contributed to Morbius and Doctor Strange in the Multiverse of Madness. Now, he walks us through a very different kind of project: The Electric State. Back in 2022, Elizabeth "Liz" Bernard told us about Digital Domain's work on She-Hulk: Attorney at Law. She then worked on Secret Invasion, and today, she returns to discuss her latest project: The Electric State.

How did you and Digital Domain get involved on this show?

Joel Behrens: Matthew Butler was brought in to talk with the Russo Brothers very early on, before the project was with Netflix, and brought me on. We obviously wanted to take on a big part of the show and decided to do some preliminary testing on our mocap stage to show how these characters might interact in the world of the film. From there we got the award and moved forward with character designs and builds.

How was the collaboration with the Russo Brothers and VFX Supervisor Matthew Butler?

Joel Behrens: Matthew and I have worked together for many years, so working with him was great as usual. This was my first time working with the Russo Brothers, and I felt like we had a fantastic collaboration on this film. I was able to go into AGBO often for reviews and meetings throughout the entire post-production run. They were always very receptive to ideas, and I enjoyed the experience of reviewing our work with them on a regular basis. Obviously, it was a very large movie in terms of scale and scope, and they always had a pretty clear vision for what they wanted.

What are the sequences made by Digital Domain?

Joel Behrens: We worked on quite a few throughout the film.
Cosmo's first appearance with Michelle, the Fort Hull robot battlefield graveyard, The Mine sequence, the Exclusion Zone and traveling through the Ex to the mall, arriving at the mall and Mr. Peanut's intro, all of Happyland and the haunted house, and the epilogue back at the mall.

Can you walk us through the design process for the main robots such as Cosmo, Herman, The Marshall, and Mr. Peanut? What were the core ideas you wanted to convey through their appearance and movements?

Joel Behrens: All the bots had their own sets of challenges and goals that we went through. Cosmo, from an overall design point, was probably the easiest of them, since his overall look and silhouette were pulled directly from Simon's book. We still took great care with Cosmo, as well as the rest of the bots, to try to ground them in reality as much as we could. We studied modern-day robotics and how the structure and the joints are put together. We wanted them to have realistic gears, servos, motors, pistons, actuators, etc. We tried to infuse these bots with as much believability as we could, from the materials we chose to use to the structural components that allowed them to move with an acceptable range of motion.

Herman had the unique challenge of a retro-futuristic domed screen for his head. The desire was to evoke the feeling of old CRT television technology, with RGB pixels under glass, while including some modern touches: being able to use it essentially like a modern LCD screen, with the ability to create whatever facial expressions and graphic elements we wanted.

The Marshall went through a few design changes. Initially drawn up with a cowboy aesthetic, he was eventually brought in line with the rest of the "drones" in the film. A big part of that was mirroring the iconic headset pulled from the book as the actual heads on the drones, to reinforce to the audience that these were being controlled by humans.

Mr. Peanut obviously has a well-known look and design to work from. The thought behind his build was that he was some sort of mascot robot for the company. He was a somewhat simpler robot with a sort of latex shell that has not aged well, around an interior robotic structure, like the mid-to-late-'80s animatronics from Chuck E. Cheese and Showbiz Pizza that were made to entertain kids.

How did you approach the animation of each of these robots to ensure they had distinct personalities? Were there specific inspirations or references that helped guide their animation?

Liz Bernard: When you begin with this idea of animating these characters to look "robotic," that could mean almost anything, and in a show with this many unique characters, it doesn't even always mean the same thing from 'bot to 'bot. I've always loved this concept that physicality drives personality, that those two things are irrevocably linked. So, we started out by studying each robot's body structure to figure out what each bot could and couldn't do: some of them were on wheels, others were bipedal but had limited joints that could only rotate on one axis, some had faces, some didn't, others were suspended from the ceiling, etc. A great example of this is that our piano-playing Tacobot was, naturally, shaped like a hardshell taco that came to a sharp point where his butt would be, and thus he couldn't really sit on a bench at the piano. So we had to have him stand at the piano instead, which meant it made sense for him to dance and sway as he was playing: a little character detail that added extra sparkle and life to his performance. We also started to think about the timeline of when these characters would have been built: Mr. Peanut and Popfly are more vintage and clunky, whereas Herman is a little more advanced tech with his screen face.
That allowed us to make a rough timeline of when each of our characters would have been built, and that impacted how we felt they should move and behave as well. With all of that in mind, in the summer of 2023, we spent some time creating a "four-pack" of different movement styles per main character to pitch to Joe and Anthony. The styles ranged from stiff and clunky to smoother and more natural, so that the directors could choose which style they liked best for each character. That early effort established the hero 'bots like the Marshall and Cosmo, but we still had dozens of background characters to flesh out for the Exclusion Zone and the Mall. For those, I assigned two or three robots per animator and asked them to use their imaginations and put their own personalities into some longer animation clips that we could use throughout the Mall. What we got out of that was a really organic mix of styles and distinct personalities, and when we put them all together in the mall, it was magic.

Cosmo and Herman are very different in terms of function and personality. What unique challenges did you face animating these two characters, and how did you solve them?

Liz Bernard: Cosmo is the heart of the film and the book, and he needed to be able to do a lot, both physically and emotionally. His feet were huge, and his lollipop head was absolutely enormous, giving him a top- and bottom-heavy silhouette with spindly little limbs, and hands based on the old-school white gloves used in classic cartoons. This odd anatomy gave his movements a kind of clown-shoe-shuffling dodder that was quite charming once we worked out the kinks in the mechanics of his boots. His white hands were the brightest part of his body and naturally drew the eye, so hand posing was critically important. And, of course, it's tough when you have a character with a giant smiley face permanently painted on to have him emote pathos, sadness, determination, and depth.
With a character like this, restraint is important. Context tells a lot of the story, and the audience knows what a character might be thinking without us knocking them over the head with it.

Herman was at the other end of the spectrum in some respects. He is loud, sarcastic, confident, and, most importantly, he can talk. His body was built to lift and move things, and we put a lot of time into making sure his joints were functional and could support the weight he was expected to carry around (in all his different size iterations). Because of that contrasting lightness in his humor and personality, we gave Herman a little extra spring in his step (both figuratively and literally), and a touch of swagger. His banter with Keats is often deadpan, so there were also moments when it was important to dial back the movement and let the comic timing play out in stillness. As we developed Herman's pixelated face, our Rigging department gave us the capability to project images onto it, and although we used that feature very sparingly to ensure that he stayed on model, we got some funny moments out of it.

The Marshall has a more humanoid form compared to the others. What considerations did you take into account to make sure he felt both robotic and expressive in his movements?

Liz Bernard: We started with motion capture for most of the Marshall's performances, and the performer played the character with a touch of cowboy swagger, which suited Giancarlo's vocal performance. From there, it was up to us in animation to adjust the movements to make his body feel heavier, his joints stiffer, and his range of motion a bit limited by rust and disrepair. Giancarlo played the Marshall's voice and face with a quiet professionalism and dignity, even though his drone was a built-to-task bot-killing machine.
When we animated him, we were usually looking for a balance between those two things: the restraint of the man controlling the drone, and the immense strength and ruthless violence his heavy drone body was clearly capable of.

Mr. Peanut has such a quirky design. Can you share the creative process behind his look, and how his animations helped bring out his character traits?

Liz Bernard: This character was maybe the most challenging one in our roster of hero characters. We had a clear silhouette to maintain and specific features like the top hat, cane, spats, and monocle, because we were basing him on a real corporate mascot that is instantly recognizable in North America. He was one of the older 'bots in the world of The Electric State, a savvy and inspiring politician, and the founder of this "oasis of safety" in the desert. We realized that if we kept Mr. Peanut as a solid peanut shell (as he is as a corporate mascot), his body would be too stiff, and we would be forced to teeter-totter him around like a child playing with an action figure; not exactly the dignity that this character deserved. To avoid the totter, we separated his head from his body and sliced off his butt so that we could swivel it like hips to help him move around without breaking that all-important peanut-shell silhouette. The resulting stiffness in his walk worked perfectly with his iconic cane and with the slightly elderly warble that Woody Harrelson gave the character in his performance.

Mr. Peanut's face was a unique challenge, too. As they say, "the eyes are the window into the soul," and we knew from Woody's voice that this character had a lot of soul. Even though Mr. Peanut's eyelids and eyebrows were simplified and non-deforming, the eye animation we developed for him was fundamentally human: long gazes, quick darts when he felt threatened, blinks to punctuate dialogue and bump up the humor, etc. The mouth was another matter, however.
As Joel mentioned, the shell of Mr. Peanut was meant to be a metal endoskeleton covered in a layer of thick, rubbery latex, so we started out by making a whole batch of rubbery blend shapes, similar to how we would approach a normal fleshy human face. However, too much articulation in his lips while speaking meant that he started to look like a character in an animated feature, and that look didn't fit into the gritty world we had built with all of the other bots. After some experimentation and research, we settled on the look of somewhat dilapidated classic animatronics from the '80s and '90s, which meant that we kept restricting and removing blend shapes until we were down from about eighty to six. Less is sometimes a whole lot more.

Was there a specific moment or sequence in the film where animating the robots felt particularly challenging or rewarding for you and your team?

Liz Bernard: One of the most beautiful and understated scenes we animated was an intimate one in the car between Michelle and Cosmo after they escaped from her deadbeat foster dad's house. Cosmo can only speak in these canned kid Cosmo doll expressions (e.g., "the solar system's gone haywire!") when he's trying to communicate with Michelle. He is not fully used to his new robot form either, and yet he finds a way to overcome his own physical limitations, his weird body, his inability to say what he wants to say, and he manages to tell her what she needs to know. We had good acting reference from Devyn, the stand-in actor who played Cosmo on set with Millie, but editorial changes and other adjustments to the acting beat meant that we started to veer away from that performance in the details. We used pantomime and subtle body language, particularly carefully timed head movements, to steer the conversation and develop the empathy and chemistry with Michelle needed for that scene to work. This film explores themes about humanity and technology, and how those ideas intersect.
What's more relatably human than figuring out a way to communicate with another person, even when the language is not there? I love that scene because Cosmo shows that he can be interpretive and creative; he is so human in that moment.

Tricky question: out of all the robots in The Electric State, which one is your personal favorite, and why?

Joel Behrens: That's a hard one; there are so many. I think, for me, it's Cosmo. I was a big fan of the book before, and I was fortunate enough to be involved in bringing him to life on the screen. I love the character that Simon created in the book, and I think the life and soul that Liz and her team put into his performance really cements him as my favorite.

Liz Bernard: I'm going to do a top three because it's too hard to choose: Herman for his sarcasm, Cosmo for his charm, and Perplexo for his bombast (plus, it was a geeky childhood dream come true to work on a character voiced by the great Hank Azaria).

The Haunted Amusement Park has such a unique and eerie feel. How did you approach the design and VFX work to make the park come to life, especially with the blend of horror and nostalgia?

Joel Behrens: Happyland was a pretty incredible set to go to every day for a couple of weeks. The production design team did an amazing job of turning an Atlanta water park parking lot into this retro fun fair environment. Honestly, we didn't have to do much. The set was pretty much fully built, and for the exteriors we did some minor environment extensions, some FX fog and smoke, and added all of our scavenger bots, of course. For the interior of the haunted house, the set was built for principal photography with full-scale Tesla coils with LEDs in their cores to give us interactive light on the environment and actors, which we later replaced with the crackling blue lightning.
However, new choreography and a revised scene were created during additional photography, which necessitated building the whole environment in CG to complete the Mr. Peanut and Marshall fight.

The Mall sequence appears to be a huge moment in the film. What were the key elements you focused on to ensure the mall's scale and its deserted nature were effectively conveyed through VFX?

Joel Behrens: The interiors of the mall were, once again, some brilliant production design from Dennis Gassner and his team. We took over a large portion of an essentially abandoned mall in Atlanta. The interior was decorated beautifully, so we had to augment very, very little. For the exterior, we used the footprint of the actual Atlanta mall, which had a partially dressed facade and parking lot, as a base, and did a lot of environment extension and matte painting, combined with footage shot in Utah by our splinter unit for the deeper background mesas and desert. For wider establishing shots of the mall when our heroes first come upon it, we placed a 3D version of our mall and surroundings, along with digital matte painting extensions from our outstanding environment department led by Juan Pablo Allgeier, into the background plates shot in and around Moab, Utah.

The walk across the Exclusion Zone desert feels like a desolate yet fascinating location. What kind of challenges did you face in creating such a barren, expansive environment, and how did you use VFX to enhance its storytelling potential?

Joel Behrens: That was a fun environment to build out. It was a great mix of practical plates, re-projected photography on geometry, digital matte painting, and full CG terrain builds. We shot the VW bus with our actors on a large gimbal, since they were being carried by Herman 20 on his shoulder. We would end up replacing the VW to add reflections, shadows, and light interaction, and went full CG for everything on some of the wider shots we couldn't get on the bluescreen stage.
JP and his enviro team then built various pieces of desert terrain and background mesas, along with the incredible giant dead robot skeleton we walk through at the beginning of the scene.

How did you collaborate with the director and production team to ensure the environments felt cohesive with the tone and emotional beats of the story?

Joel Behrens: The directors and production designer had a pretty clear vision for the design from the start, which stayed very true to the original source material. The practical sets that Dennis and his team built were really beautiful and helped ground us in that world. We took our cues from that and really tried to maintain the look that Simon had created for the book.

Looking back on the project, what aspects of the visual effects are you most proud of?

Joel Behrens: Looking back, I'm most proud of how our visual effects helped create such a compelling and immersive world that truly brought the unique, dystopian atmosphere of Simon's book to life. I'm very proud of our team for managing that many assets so well, and of all the work that was put into the characters. I think our characters played a pivotal role in communicating the emotional weight of the story, holding their own with the human actors, and that's something I'm really proud of.

How long have you worked on this show?

Joel Behrens: It ended up being a little over 2.5 years for me, from pre-production through the shoot and post.

What's the VFX shot count?

Joel Behrens: 857 shots finaled, and we worked on a little over 1,000.

What is your next project?

Joel Behrens: On to another exciting project, but unfortunately I can't share what it is yet.

Liz Bernard: Joel and I are keeping the party going and working together again on the next one, a sci-fi feature film. We can't say more right now.

A big thanks for your time.

WANT TO KNOW MORE?
Digital Domain: Dedicated page about The Electric State on the Digital Domain website.

© Vincent Frei – The Art of VFX – 2025
  • 3DPRINTINGINDUSTRY.COM
    Wind Turbine Blade Reused in 3D Printed Modular Bridge
    Poly Products, a Dutch company specializing in composite processing, has developed a modular bridge that reuses a decommissioned wind turbine blade as its primary girder. The structure, now installed in Almere, was created as part of the Circular Viaduct program led by the Ministry of Infrastructure and Water Management, which supports pilot initiatives for circular infrastructure across the Netherlands. The bridge was developed in cooperation with Antea Group, an engineering consultancy; GKB Group, which specializes in ground, road, and waterway infrastructure; and the Amsterdam University of Applied Sciences. The concept emerged after a visit to Eneco's decommissioning operations at the Herkingen Wind Farm. Blade-Made, a company focused on wind blade repurposing, facilitated access to the LM38.8 turbine blade used in the project.

"We want to demonstrate what is possible," said Michiel de Bruijcker, managing director of Poly Products. "With our 55 years of experience in processing composite materials, it is interesting to investigate what role we can play in reuse, while preserving the original shape where possible."

3D printed blade bridge in Almere reuses an LM38.8 wind turbine blade. Photo via Poly Products.

Over the past year, the team worked on developing a full-scale prototype. The completed bridge spans 12 meters, is 3 meters wide, and can occasionally support loads of up to 5 tons. Prior to installation, the blade was tested in Heerenveen, where weak points were identified and reinforced. Additional recycled materials were used throughout the structure, including thermoplastic and thermoset components and a deck surface made from repurposed sheet piling. To accommodate the curved geometry of the wind turbine blade and enable integration with the modular road deck, Poly Products employed large-format 3D printing. Using its in-house production capacity, the company developed customized 3D printed shanks that serve multiple roles.
These parts support the deck, secure the handrail, and provide the necessary bridge width. The shanks were designed to ensure mechanical continuity and modular adaptability. During the design phase, Poly Products engaged various market stakeholders to assess demand. The bridge received positive feedback and was determined to be financially viable at market prices. A patent application for the structural solution is currently pending, and discussions are underway for additional installations based on the same design.

Modular deck with printed shanks and recycled components. Photo via Poly Products.

3D Printed Bridges Showcase Circular Design and Material Reuse in Europe

In Italy, researchers at the Polytechnic University of Bari recently developed a six-meter pedestrian bridge inspired by Leonardo da Vinci's self-supporting arch design. The structure, called "Da Vinci's Bridge," was 3D printed using a specialized mortar made from waste stone powders and a lime-based binder. Fabricated in partnership with 3D printing firm WASP and engineered by startup B&Y, the bridge consists of 13 modular blocks printed using the WASP 3MT LDM Concrete system. The project emphasized sustainable construction through recycled material use and stereotomy-based modular design, offering a precedent for the functional reuse of industrial byproducts in civil structures.

In 2021, the city of Nijmegen unveiled what is reportedly the world's longest 3D printed concrete pedestrian bridge, spanning 29 meters. Developed by Rijkswaterstaat and designer Michiel van der Kley, the bridge was realized through a collaboration with TU Eindhoven, Witteveen+Bos, and manufacturing partners BAM and Weber Beamix. The parametric design enabled optimized structural geometry, while the 3D printing process significantly reduced material use and construction time.
The project demonstrated the scalability of 3D printed concrete structures while highlighting their potential for faster, lower-impact infrastructure development.

The Da Vinci Bridge in action. Photo via Politecnico di Bari, B&Y, and WASP.

Ready to discover who won the 2024 3D Printing Industry Awards? Subscribe to the 3D Printing Industry newsletter to stay updated with the latest news and insights. Featured image shows the 3D printed blade bridge in Almere. Photo via Poly Products.

Anyer Tenorio Lara
Anyer Tenorio Lara is an emerging tech journalist passionate about uncovering the latest advances in technology and innovation. With a sharp eye for detail and a talent for storytelling, Anyer has quickly made a name for himself in the tech community. Anyer's articles aim to make complex subjects accessible and engaging for a broad audience. In addition to his writing, Anyer enjoys participating in industry events and discussions, eager to learn and share knowledge in the dynamic world of technology.
  • WWW.ARCHPAPER.COM
    An exhibition of Syd Mead’s concept art showcases an optimistic vision of future worlds
    Future Pastime · 534 West 26th Street, New York, New York · Through May 21

Imagine a future where we aren't beholden to smartphones or touchscreens. How would we socialize? How would we get around? What would we wear? Syd Mead offers a kaleidoscopic series of answers in Future Pastime, an exhibition on view in New York City. Celebrated in his lifetime for his masterfully rendered science-fiction concept art, the late designer is often referred to as a "visual futurist," a sui generis designation that is readily evident in the selection of 16 works dating from 1969 to 2004. "'Pastime' is very literal: capturing moments of leisure, of play, of togetherness, of going to sporting events," William Corman, who curated the show with Elon Solo, told AN. "Syd is able to imbue a sort of familiarness and nostalgia in his works—that's what makes him so profound."

Space Wheel Interior, 1979 (Courtesy Estate of Syd Mead)

The curators intentionally excluded Mead's most familiar works from the show: his seminal concept art for Blade Runner, Tron, and Aliens, among others. Although Mead's work entered the cultural imagination as a touchstone of science fiction in the 1970s, he remains more a cult figure than a household name; like that of author Philip K. Dick, whose Do Androids Dream of Electric Sheep? was adapted into Blade Runner, Mead's work remains one degree removed from mainstream pop culture. Corman sees this show as an effort to change that: "For two decades leading up to Hollywood recognizing him—he was 45 years old [when] Robert Wise brought him in to do [his first film,] Star Trek: The Motion Picture—this man was a titan of industrial design and, more generally, optimistic futurism." In contrast to the perpetual downpour of Ridley Scott's dystopian Los Angeles, Future Pastime offers glimpses into bolder, brighter worlds in crisp gouache—and in medias res, vividly foregrounding industrial design as a storytelling device.
Space Wheel Interior (1979) evokes scenes from 2001: A Space Odyssey and Don Davis's Stanford Torus: Interior View (1975), from MoMA's recent Emerging Ecologies exhibition, while Moon 2000 (1979) is by far the most restrained piece, a celestial body emerging from chalky white linework. The balance of the works depict dazzling starscapes, social gatherings small and large, or Mead's signature trope: "Syd loved arrivals—you see that in a lot of his scenes," Corman related, noting that Mead had loved to host parties himself. Many of the scenes look like fun: one can almost hear the cheers of the crowd as 20-story-tall robo-hounds—gargantuan incarnations of Boston Dynamics' quadrupeds—thunder down the track in Running of Six Drgxx (1983). Other works depict the specular sheen of chrome in ethereal light; mood-setting use of color, from twilit blues to an otherworldly scarlet; and demigod physiques scantily clad in what Mead called "steel couture." "In writings that accompany [his preliminary sketches for these works], he tried to make sense of these worlds," noted Corman, "whether it's through a sociological perspective or an engineer's mind—[contemplating] how you would build it even if the tech wasn't ready."

RAYS Wheels, 1985 (Courtesy Estate of Syd Mead)

As prolific as he was prodigious, Mead produced an oeuvre ranging from graphic design, including the logo for Tron, to interiors, such as a short-lived Manhattan restaurant and bar, over the course of a career that spanned more than half a century. Coming of age in the postwar era, Mead got his start at Ford Motor Company's Advanced Design Studio upon completing a degree in industrial design at what is now the Art Center College of Design, after a three-year stint in the army. He quickly parlayed his superlative drafting skills and gift for visual communication into a series of commissions for U.S. Steel in the '60s—Corman generously allows visitors to peruse a coveted copy of Concepts, a 1961 hardbound book illustrated entirely by Mead, depicting dozens of concept vehicles and other use cases for the material—and established his own studio within a decade. Alongside work for clients such as Philips and Raymond Loewy, architectural renderings were his "bread and butter" throughout the '70s and '80s, as Roger Servick, his partner in life and work, recounted in a fascinating 2023 interview, until CAD software obsolesced hand-rendering in the '90s.

Beyond the garish hues and gorgeous details, the works express an unbridled optimism bordering on outright hedonism, with a few being sensual to the point of campiness. (A wall-sized mural of Pebble Beach (2000), a tour-de-force triptych in the show, can be seen in archival photos and video of Mead in his Pasadena home, where he and Servick had it reproduced at what appears to be tenfold scale.) That a queer reading of works such as Party 2000 (1977), as in Evan Moffitt's essay for Future Pastime, may be lost on the primary audience of the show—the fanboys skew male, cis, and het—scarcely detracts from their appeal. Corman, for his part, is happy to geek out over Mead lore or gush about gouache with all comers. "Going into this, I knew that Syd had many fans from the automotive world, industrial design, cinema… now you're starting to see fine art people come into the mix."

Party 2000, 1977 (Courtesy Estate of Syd Mead)

For all its richness, Future Pastime's willful escapism comes at an uncanny moment, when the future seems more uncertain—but almost certainly worse—than ever. Ironically, I found out about the show via algorithm: a recommended post, liked by a friend, from the account @syd_mead, which appeared in my feed alongside AI-generated Ghibli artwork, news of car tariffs (!), Severance memes, etc.
Meanwhile, the Cybertruck—which Mead lauded upon its unveiling in November 2019, the month in which the original Blade Runner is set—has become a political statement in addition to being a surreal sight against the familiar backdrop of New York City's polyglot streetscape. Fanciful though his work may be, Mead provided a couple of clues to our present in one work in particular, Running of the 200th Kentucky Derby (1975), in which a spectator brandishes a smartphone-like handheld device while an aircraft labeled "INRNET" hovers overhead. The storied visual futurist certainly foresaw invisible, planetary-scale innovations such as cloud computing, big data, and machine learning; he simply chose to fantasize about their utopian potential as opposed to the darker aspects of distraction, manipulation, and surveillance. (One could make the case that show du jour Severance is essentially the inverse of Meadian futurism: insidious in its mundanity.) After all, Mead started out in 1959, when cars and steel were good for the economy rather than bad for the environment; he was, in Servick's telling, a "hardware guy" through and through. The ineffable materiality of his worlds—sensuously rendered in metal, textile, and flesh—is precisely their appeal. No wonder their inhabitants are fully and enviably present in the moment: enraptured, as in the best works of art.

Ray Hu is a Brooklyn-based design writer and researcher.
  • WWW.THISISCOLOSSAL.COM
    Formidable Bronze Crowns by Marianna Simnett Conjure Myth and the Sublime Feminine
    “Megaera” (2023), bronze and velvet, 59 x 17 1/2 x 17 1/2 inches. All images courtesy of Marianna Simnett and SOCIÉTÉ, Berlin, shared with permission Formidable Bronze Crowns by Marianna Simnett Conjure Myth and the Sublime Feminine April 15, 2025 Art Kate Mothes For Marianna Simnett, sticking to one medium or theme defies her interpretation of what art can be. She fights the natural proclivity of her audience to typecast her practice as one thing. “Trying to shed those expectations every time—trying to do something different—it’s exhausting but so worth it,” she says in an interview for Art Basel. “Now the signature is that people don’t know what to expect, and that’s the best outcome possible.” Among myriad strains of her practice—which include filmmaking, sculpture, installation, painting, and performance—a collection of bronze crowns created between 2022 and 2024 command our attention. Situated on top of bespoke velvet cushions, Simnett’s Crowns are cast in an alloy that would make the elaborate headpieces burdensome or even painful to wear, yet the meticulously formed arches, band, and spikes manifest as delicate mammals and birds. “Hydra” (2023), bronze and velvet, 55 1/2 x 17 1/2 x 17 1/2 inches “Simnett uses vivid and visceral means to explore the body as a site of transformation,” says a statement from SOCIÉTÉ, which represents the artist. “In psychologically charged works that challenge both herself and the viewer, Simnett imagines radical new worlds filled with untamed thoughts, strange tales, and desires.” Named for powerful female figures from ancient lore like Discordia, the Greek goddess of strife, or Lilith, a she-demon in Jewish and Mesopotamian mythology, Simnett’s Crowns examine the power, ferocity, and sublimity of allegorical female figures. One can imagine that only supernatural beings could wear these pieces and feel comfortable. Simnett’s sculptures were first shown in her exhibition OGRESS in 2022. 
“In fairy tales and folklore, the ogress is a voracious monster who deceives men and torments children in her quest to ravish them whole,” says an exhibition statement. Simnett wielded “the ogress’ insatiable hunger as a radical force,” illuminating the role of women in myth and legend, especially the symbolic tension between embracing and fearing those who are different. Simnett’s solo exhibition Charades opens at SOCIÉTÉ on May 1, coinciding with Berlin Gallery Weekend. Explore a wide range of the artist’s multimedia work on her website and Instagram. “Laverna” (2023), bronze and velvet, 55 1/2 x 17 1/2 x 17 1/2 inches Detail of “Laverna” “Discordia” (2023), bronze and velvet, 17 1/2 x 17 1/2 x 16 inches “Maniae” (2022), bronze and velvet, 17 1/2 x 17 1/2 x 16 inches “Lilith” (2024), bronze and velvet, 57 1/2 x 17 1/2 x 17 1/2 inches Detail of “Lilith” “Astraea” (2023), bronze and velvet, 55 1/2 x 17 1/2 x 17 1/2 inches
  • WWW.COMPUTERWEEKLY.COM
    Roadmap for commercial adoption of quantum computing gains clarity
Over the past few months, some significant breakthroughs in quantum computing technology have indicated how quickly the technology is evolving. While it remains very much in the domain of academia and researchers tackling error correction, the roadmaps of quantum computing businesses suggest that useful machines are on their way. IBM’s roadmap shows that this year, there will be a move away from its current Heron machine to a new device called Flamingo, which is effectively based on connecting two Heron devices together. During its first quantum computing developer conference in November 2024, IBM demonstrated the connectivity technology, called L-couplers, which connects two Heron R2 chips with four connectors measuring up to a metre long. Flamingo marks the start of a three-year programme to evolve the number of gates on a quantum device from 5,000 to 15,000 by 2028, using a modular quantum computing architecture. In February, Microsoft published research on topological qubits based on Majorana fermions, which the company anticipates will offer more stable qubits, requiring less error correction. It is also working on a device called Majorana 1, built to host and measure these qubits so they can be used in running quantum computing calculations. Will Ashford-Brown, director of strategic insights at Heligan Group, said: “Every day we inch closer to realising commercial quantum usage for real applications. Size, cooling, price, speed and impact are all part of the long tail of improvements, but it would seem we are at the point where commercial application, investment and opportunity are knocking at the door.” He anticipates that the availability of a new generation of quantum computing platforms will result in a surge in customer demand. “Presently, the market has been mostly limited to national research laboratories and supercomputing labs,” said Ashford-Brown. “But commercial adoption is getting started, beginning with the tech giants.
Microsoft, Amazon, Google and IBM have all partnered with quantum computing startups to provide quantum-based cloud services or are developing their own machines.” While quantum computing evolves, there’s also a lot of interest in hybrid approaches that can take advantage of the technology to speed up computationally complex calculations. D-Wave, for instance, recently expanded its quantum-optimisation offering, with several initiatives aimed at boosting adoption. It said Ford Otosan, a joint venture between Ford Motor Company and Koç Holding in Turkey, has deployed a hybrid-quantum application in production based on D-Wave technology, which streamlines manufacturing processes for the Ford Transit. The US has an eight-year plan to make quantum computing commercially useful. Alice & Bob, Quantinuum and Rigetti are among 10 quantum computing businesses selected by the US Department of Defense to participate in the first stage of the agency’s Quantum Benchmarking Initiative (QBI). This aims to assess the feasibility of building an industrially useful quantum computer by 2033. These developments represent a small snapshot of the immense work taking place across the tech sector to develop quantum computing and hybrid architectures that use quantum technology to accelerate computationally difficult tasks. Graeme Malcolm, founder and CEO of M Squared Lasers, believes there is now a need for a decisive commercial push. “The industry is on the cusp of crossing the so-called ‘quantum valley of death’ – a pivotal transition from research excellence to commercial reality,” he said.
Given the government’s recent injection of funding, which will see £121m being put up to drive development of quantum technology in the UK, he added: “Our collective focus must now shift to industrialisation. A nation without quantum will be a nation without critical advantage.” However, in spite of the progress being made, a survey from Economist Impact recently reported that 57% of respondents believe misconceptions about quantum computing are actively hindering advancement. The findings suggest a disconnect between technological development and business readiness, reinforcing the need for better communication, education and alignment at the executive level to maintain the momentum of progress. Helen Ponsford, head of trade, technology and industry events programming at Economist Impact, said: “With 80% of respondents stating that demonstrating industry-specific use cases is essential to accelerating adoption, and two-thirds highlighting the importance of proving return on investment, the message is clear: commercial relevance must closely follow scientific breakthroughs for quantum to sustain its growth.” Although there has been plenty of progress in making quantum computing technology available to software developers through platforms and software developer kits, no discussion on quantum computing is complete without addressing security concerns, which need to be in place well before mass adoption. Looking at quantum-safe cryptography, Daniel Shiu, chief cryptographer at Arqit, said: “Even though the timeline for a viable quantum computer is uncertain, two things are clear – the industry is advancing and the threat is already here. Any systems compromised today could have their data decrypted once quantum machines arrive, unless adequately protected. Quantum security is a concern we need to address now.”
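The developer kits mentioned above generally expose a gate-based programming model: a program is a sequence of gates applied to a state vector, followed by repeated measurements. As a rough illustration of that model only — this is plain Python, not the API of any actual vendor SDK — here is a toy single-qubit simulator applying a Hadamard gate and sampling the result:

```python
import math
import random

# Toy state-vector simulator for one qubit, illustrating the gate-based
# model that quantum SDKs expose. A state is a pair of amplitudes (a0, a1)
# for the basis states |0> and |1>.

def hadamard(state):
    """Apply a Hadamard gate, which maps |0> to an equal superposition."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

def measure(state, shots, seed=0):
    """Sample measurement outcomes; probability of reading |1> is |a1|^2."""
    _, a1 = state
    p1 = abs(a1) ** 2
    rng = random.Random(seed)  # fixed seed for reproducibility
    ones = sum(rng.random() < p1 for _ in range(shots))
    return {"0": shots - ones, "1": ones}

state = hadamard((1.0, 0.0))       # start in |0>, apply H
counts = measure(state, shots=1000)
print(counts)                      # roughly a 50/50 split between "0" and "1"
```

Real SDKs do the same conceptual work — build a circuit, run shots, return counts — but against simulators or cloud-hosted hardware rather than a hand-rolled state vector, and with error rates that the error-correction research above aims to tame.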
  • WWW.ZDNET.COM
    How you succeed in business is shifting fast - and not because of AI
Igor Ovsyannykov/Getty Images

Have you ever wondered why some businesses thrive without spending a dime on ads? 🤔 Meanwhile, when you try the good old-fashioned "build it, and they will come," you're as busy as a Chick-fil-A on a Sunday? 😖 Some business owners have figured out something most refuse to see. But in an ever-evolving world of AI and tech, this "secret" isn't just essential to building a sustainable marketing campaign; it might be the only way to do business moving forward. I wish I had known this "secret" sooner. It would have saved me countless sleepless nights and millions in ad spend. In this article, I will share how the business world could change forever and why creators are the future of business. If you're new to my work, my name is Lester, but feel free to call me Les. I'm chairman of a group of DTC brands and an award-winning performance marketer who's built and scaled brands by developing internal data and analytics tools. I've crafted and managed tens of millions in digital marketing campaigns, so I have a unique perspective on how AI is changing the future of marketing, but not for the reason you think. 🤓☝️ All that said, if you're into data-driven marketing insights and strategies, along with AI tips on how to drive growth, check out my free newsletter, No Fluff Just Facts. But enough about me. Let's talk about how business is shifting and how to capitalize on what might be the biggest opportunity this century. 🚀

Understanding the landscape

Let me give you some context so you can understand what has changed and what the future holds. You see, life used to be really simple for online businesses. Spend some money on online ads; people show up ready to buy, and boom, money in the bank. Or maybe you did some SEO voodoo and got tons of organic traffic. Either way, money in the bank. 🤑 Fast forward a few years, and the algorithms are out for blood.
If you're not careful, they'll squeeze every penny out of your account. The Apple iOS update made it harder for advertisers to target their audience, driving up costs, messing with tracking, and turning every campaign into a gamble. On top of that, every online industry got flooded with new, well-funded players, and customer acquisition costs hit record highs. All of this created the perfect storm for a full-blown pay-to-play ecosystem. But then TikTok entered the chat and said hold my beer! 🤭 TikTok focused on creators and didn't care about the old rules of the internet, which required thousands or even millions of followers to get noticed. Anyone had a shot to go viral, and that changed everything. TikTok also gave creators the tools they needed, from content editing apps like CapCut to native selling through TikTok Shop. Naturally, new creators flocked to the platform, hoping to build a following for themselves. That shift toward creators pushed every platform to adapt or risk becoming irrelevant. Suddenly, it wasn't all about pay-to-play anymore. Organic reach is now at an all-time high, giving creators the chance to build real, meaningful businesses. This shift goes beyond TikTok; it's happening across all platforms. This is precisely why creators are shaping the future. 💪 For once, the small guys like you and I got the win we desperately needed. This shift started around 2020. While creator-led businesses existed before, this time, it felt different. This time, it felt permanent. Fast-forward to now, and headlines like "YouTube Star Mr.Beast Is Raising Money at a $5 Billion Valuation" are appearing on the front pages of your favorite news sites. Sure, that's an extreme example, but it's real.
At the same time, Gen Z is driving a cultural shift built on trust in creators, a desire for authenticity, and a need for genuine connection. Some might say this is a story about what changed. But maybe it's the opposite. Maybe this is a story about how we never really changed at all. Perhaps this is proof that we've always craved authentic human connection. Maybe we just forgot how powerful it could be. 🤔

The secret explained and the future of business

Based on current evidence, the future of business is in the hands of creators who build authentic connections with their audience. Future entrepreneurs won't just need to create exceptional products and services; they'll also need to be their own biggest "influencer." Traditional paid customer acquisition is becoming harder to sustain, pushing businesses to rely more on owned channels like email, SMS, and content. At the same time, ad fatigue is growing, especially among the next generation, who are increasingly skeptical of paid ads. 🙅 In fact, 77% of social media users prefer content created by influencers over traditional ads, according to IZEA's 2025 report. This is precisely why creators play a central role in what comes next. According to Shopify, 69% of consumers trust influencer recommendations. Moreover, 74% of Gen Z shoppers say they spend most of their free time online, and influencers are the primary way they discover new products. It's easy to see why creators are shaping the future of business. And it's not just B2C; B2B is shifting too. 🚀 According to Sprout Social's Q1 2025 Pulse Survey, 67% of B2B brands use influencer marketing to increase brand awareness, while 54% use it to build credibility and trust. This approach works because people trust people. According to Nielsen, 92% of consumers trust recommendations from individuals over brands, even if they don't know them personally.
😱 As AI and competition flood every market, and users are overwhelmed with options, people will turn to the ones they know, like, and trust to help them decide what to buy. According to Statista, 60% of marketers say influencer marketing delivers a higher ROI than traditional ads, proving this isn't just a trend. Influencer marketing brings in an average return of $5.78 for every $1 spent, according to Influencer Marketing Hub. It's not hard to see why this might be the solution to your rising customer acquisition costs. 🤯 The future is already here in many ways, and while the mediums have evolved, the core hasn't changed. People still do business with people.

Your big opportunity

This future isn't about going viral or hiring a bunch of influencers to promote your product or service. Sure, those things can help. What I'm saying is this: now is the time to get closer to your customers. Show behind the scenes. Share the wins, the losses, what's new, what's coming. You might not realize it, but a lot of what you do every day is already interesting to your audience. Being human will be rare in a world filled with AI and automation. 🎯 So here's your big opportunity. This isn't about being a creator or chasing views. It's about leverage. The kind of leverage that comes from owning attention, not renting it. When you build an audience, you don't have to beg platforms to talk to your people. You don't have to rely on discounts to drive sales, and you're not at the mercy of another algorithm update wiping out your growth overnight. That leverage turns into attention. And if you have a good product or service, that attention turns into money in the bank. 💸🤑💰 I know what you might be thinking: what influencer or creator should I use? Hear me out… I think you should be the creator.
😩 If creators are the future, shouldn't you be the one telling the world about your product or service? You don't need to be Mr. Beast making Emmy-level content, but it's important that you show up as your authentic self. Whether you're running a service business, building a product, or launching a new brand, it starts with telling your story. You can tell your story on social channels, but if the past has taught us anything, it's that owning your audience is the smart play. Think of ways to grow your email and SMS lists, not just followers you rent from someone else. It's not one or the other. It's both. 🤓☝️ And yeah, I get it. It's one more thing to do on our daily grind. But it is what it is. I'd rather not overpay for customer acquisition. As my mother would say… pick a struggle. And in this case, I'd rather choose the one that leaves the most money in the bank.

My two cents

My biggest, and I mean biggest, regret is not taking content more seriously earlier. I started 10 years ago, but I didn't follow through simply because paid traffic worked, and at the time, I didn't see the value in creating content. In many ways, I thought I could "hide" behind my laptop and make a living. It turns out that not investing in content was a massive miss. I'd also tell myself that the Jeff Bezoses and Mark Zuckerbergs of the world didn't create content, so why should I? 😤 I didn't realize back then that they were content creators, too. It just looked different. Now I see it clearly. Every interview, every event, every magazine cover was content. It just came disguised as "press." But I see now that regardless of what you call it, it was a mechanism for them to get in front of their audience. The good news?
It's not too late for me, and it's not too late for you. All that said, I'll leave you with this. Get out there and show the world how special you are. It won't be easy, but it is simple. Don't overthink it. With time, you'll get better. 😇 Hope this helps. P.S. If you want more easy and helpful AI tips and tricks, sign up for my free newsletter, No Fluff Just Facts. Plus, you'll get to see behind the scenes as I pay my penance for not creating content 10 years ago.
  • WWW.FORBES.COM
    Apple’s Beats Unveils 1st-Ever USB-C Cables In New Colors And Unexpected Sizes
    After headphones, speakers and iPhone cases, the newest Beats product is a range of charging cables.
  • TIME.COM
    Exclusive Clip: In War-Torn Kyiv, Vitalik Buterin Makes the Case for Crypto’s Future
31-year-old Vitalik Buterin is one of crypto’s most important figures. As the founder of Ethereum, he pioneered the idea that crypto and blockchains could serve larger purposes beyond money. But Buterin also sought to diminish his own role within the ecosystem, and encouraged his community to think much bigger than their own short-term gains. A new documentary, Vitalik: An Ethereum Story, tells his story, following him around the world as he confronts difficult technical problems and evangelizes a strange new technology that attempts to reorient the world around its decentralized value system. “In our minds, tech is never neutral: it’s a reflection of the values and blind spots of its creators,” says co-director Chris Temple. “As we head into conversations around AI and crypto, Vitalik models a new kind of leadership compared to the tech leaders that we’re used to—the Elon Musks, Mark Zuckerbergs and Jeff Bezoses of the world—that are structured within these centralized organizations and decision-making apparatuses.” Temple and co-director Zach Ingrasci filmed Buterin across two years, and much of their footage didn’t make it into the 86-minute film. That includes one scene in Kyiv, Ukraine, in which Buterin plays chess with Mykhailo Fedorov, Ukraine’s Minister of Digital Transformation. The scene, published exclusively by TIME, offers a fascinating peek into Buterin’s ideological leanings and his acute desire for crypto to have real-world use cases beyond speculation. Buterin traveled to Ukraine in September 2022, six months after Russia’s invasion. While Buterin was born in Russia, he staunchly opposed the invasion and personally donated millions to Ukrainian relief efforts. Thanks in part to his vocal support, almost $100 million in crypto poured into Ukraine in the first couple weeks of the invasion, offering fast relief and easy, direct transactions. The deleted scene shows Fedorov and Buterin talking over a game of chess.
In the early days of the invasion, Fedorov tells him, the country’s national bank had banned international transactions, so the Ukrainian government instead used crypto to receive funding and buy weapons and military supplies. “All of the first drones, lots of ammunition, arrived thanks to crypto,” Fedorov tells him. “We saved the lives of hundreds—maybe thousands of our military. So it was highly important.”Buterin responds: “We love Ukraine… For the blockchain community itself, this was the first opportunity to make a real difference with blockchain and cryptocurrency.” Later, when Fedorov points out that Ukrainians continue to live in the war-torn country, Buterin adds: “I think continuing real life means, even if I’m a good person, to show Putin a middle finger.” Ingrasci says the clip didn’t make it into the final cut because the moment it depicts is referenced in other ways. “But I think it's the most important moment for Vitalik in our journey with him, because it’s this real world use case of how crypto can really make the world a better place,” he says. Vitalik: An Ethereum Story is available on VOD on April 15. TIME Studios served as one of the film’s production companies. Andrew R. Chow’s book about crypto and Sam Bankman-Fried, Cryptomania, was published in August.
  • WWW.TECHSPOT.COM
    AMD Epyc 'Venice' will be built on TSMC's N2 node, 5th-gen Epyc to be fabbed in Arizona
In a nutshell: The 6th-generation AMD Epyc processors, codenamed Venice, will be the first high-performance computing product built using TSMC's 2nm (N2) process node. Team Red also confirmed that TSMC's new Fab 21 facility in Arizona has successfully validated 5th-generation Epyc silicon and will handle some of the chip production in the United States. Venice, built on AMD's upcoming Zen 6 microarchitecture, represents a major milestone in the company's data center roadmap and remains on track for release next year. While AMD withheld further details, it confirmed the silicon has been taped out and brought up – indicating the CCD powered on successfully and passed initial tests. Leaked details indicate that Epyc Venice CPUs will use the new SP7 socket, replacing the SP6 (LGA 4094) platform used by Zen 4c-based Siena processors. Rumors also point to support for both 12-channel and 16-channel memory configurations with faster DIMM speeds, along with a PCIe Gen 6 interface. The new N2 process node marks TSMC's first use of gate-all-around (GAA) nanosheet transistors. The company calls it the industry's most advanced technology for density and energy efficiency, claiming the nanosheet structure delivers a 15 percent performance boost at the same voltage or a 24 to 35 percent drop in power use compared to the older 3nm FinFET (N3) process. The announcement comes on the heels of Intel's delay of its Xeon "Clearwater Forest" data center processors, now expected in the first half of 2026. Initially slated for release this year and based on the company's 18A process technology, the chips will arrive at least a few quarters late – even if Intel sticks to its revised timeline. Unlike Clearwater Forest, Intel's Panther Lake CPUs for client PCs, built on the 18A process, remain on track for release later this year. Like other recent Intel chips, Panther Lake will use a hybrid architecture combining Cougar Cove performance and Skymont efficiency cores.
The processors will also feature the Xe3 'Celestial' GPU, offering up to 12 Xe3 cores.