• TECHCRUNCH.COM
    The race is on to make AI agents do your online shopping for you
    Millions of Americans will pop open their laptops to buy gifts this holiday season, but tech companies are racing to turn the job of online shopping over to AI agents instead.

Perplexity recently released an AI shopping agent for its paying customers in the United States. It's supposed to navigate retail websites for you, find the products you're looking for, and even click the checkout button on your behalf.

Perplexity may be the first major AI startup to offer this, but others have been exploring the space for a while, so expect to see more AI shopping agents in 2025. OpenAI and Google are reportedly developing their own AI agents that can make purchases, such as booking flights and hotels. It would also make sense for Amazon, where millions of people already search for products, to evolve its AI chatbot, Rufus, to help with checkout as well.

Tech companies are using a mix of new and old techniques to get around the barriers erected by retailers to block unwanted bots from using their sites. Rabbit released its LAM Playground earlier this month, which lets an AI agent navigate websites on your behalf using a computer in a data center. Anthropic's computer use agent does the same thing, but it's hosted on your personal computer.

Meanwhile, Perplexity is partnering with Stripe to leverage some older payments features that have been repurposed for AI agents. Stripe is allotting single-use debit cards for Perplexity's AI agent to spend money with online, a repurposed version of the Stripe Issuing feature. This makes it so the agent can buy you a pair of socks without needing access to your entire bank account. That way, if it hallucinates, the agent just buys the wrong socks for a few bucks and doesn't spend your rent money on, well, socks.

Google's AI agent reportedly needs access to your credit card information, which could give consumers pause.
However, several companies, such as Google, Amazon, Apple, and Shopify, already know your billing info and regularly fill out forms for you when you're shopping online. This could give these companies an advantage when they ship products in the space.

These tools could reshape online shopping, something retailers and advertisers making a fortune from the status quo may not be happy about. Just as AI chatbots have proven somewhat useful for surfacing information that's hard to find through search engines, AI shopping agents have the potential to find products or deals that you might not otherwise have found on your own. In theory, these tools could save you hours when you need to book a cheap flight, or help you easily locate a good birthday present for your brother-in-law.

There's a long way to go before AI agents can buy everything on your holiday wishlist, but there are a lot of companies vying to do it. Based on our early attempts, Perplexity's shopping agent takes hours to process purchases and sometimes runs into issues where it can't purchase items at all. Overall, using the agent today seems more complicated than buying something on Amazon.

Perplexity also says there are human checkers involved to ensure its AI agent is working accurately. Having a human in the loop is not uncommon for the AI industry, but then, most AI chatbots don't see the items I'm purchasing and my billing address. This raises some privacy issues for Perplexity, and for whatever company is hiring its human checkers.

TechCrunch tested out Perplexity's shopping agent by asking it to buy us toothpaste. After prompting Perplexity with "I'd like to buy toothpaste," the agent returned several options from Walmart, Amazon, and some smaller websites. For a few options, Perplexity offers a button under the product called "Buy with Pro," while other options take you straight to the website of the retailer.
"Buy with Pro" is Perplexity's shopping agent at work.

Prompting Perplexity's shopping agent (left), results (middle), and purchase confirmation (right). Image Credits: Perplexity/Maxwell Zeff (screenshot)

I chose a tube of Crest from Walmart. Without leaving the Perplexity app, I was able to check out and (seemingly) purchase the toothpaste. But instead of paying Walmart, my bank statement showed that I had paid Perplexity's agent. Three hours later, I received an email from Perplexity that its agent was not able to buy me the toothpaste, because it was sold out at Walmart. The next day, I tried to purchase another tube of Crest with Perplexity's shopping agent. Eight hours later, I got a confirmation from Perplexity that it worked.

So what gives? Why did my first purchase get rejected, and why did both take hours to complete?

While Perplexity Shopping might seem a lot like Amazon or the TikTok Shop, where you can buy items from a wide array of merchants who upload and manage storefronts on the platform, it's actually completely different. Perplexity's AI agent seems to be scraping retailers' websites and giving you information about their products. Because this process isn't necessarily in real time, it can cause a disconnect between what Perplexity tells you and what a store actually has in stock, which appears to be what happened in my case.

Perplexity declined to comment on whether retailers like Walmart were aware that their products were appearing on its app. This suggests that its scraping and purchase process is not authorized by those companies, something that could complicate buying or returning items.

You're also not actually buying anything when you check out in Perplexity's app. You're paying Perplexity the exact amount that item costs, giving its AI agent instructions to buy a specific item, and telling it to fill out your name and shipping address in the process.
Some time later, perhaps hours, the agent executes that task, or at least tries to.

"This is the equivalent of giving a small pot of money to an assistant in the real world, and give them rules about how they're allowed to spend it," said Stripe product lead Jeff Weinstein, who helped build Stripe's AI agent toolkit, in an interview with TechCrunch.

But instead of giving money (in a pot or otherwise) to a real human assistant, who I would trust to buy toothpaste on their own, Perplexity's AI agent occasionally needs to be monitored by another human. And even then, it doesn't always work.

"I can't disclose specifics around how Buy with Pro works, but what I can say is that there is human oversight providing occasional support, which ensures that transactions are completed in a timely manner and we avoid issues like purchasing the wrong product," said Perplexity spokesperson Sara Platnick in an email to TechCrunch.

These days, hiring human checkers to watch AI systems is commonplace. Companies like Scale AI and Turing have built large businesses around the service. But in this case, Perplexity declined to answer TechCrunch's questions about how often human oversight was necessary, how involved humans are in the process, and whether human checkers are watching AI agents make purchases in real time. The lack of transparency here may not bother everyone, but it's certainly worth noting.

If AI shopping agents really take off, it could mean fewer people going to online storefronts, where retailers have historically been able to upsell them or promote impulse purchases. It also means that advertisers may not get the valuable information about shoppers they use to target them with other products. For that reason, those very advertisers and retailers are unlikely to let AI agents disrupt their industries without a fight.
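Weinstein's "pot of money with rules" analogy maps naturally onto code. The following is a minimal, hypothetical Python sketch, not Stripe's actual Issuing API (all names here are invented for illustration), of how a single-use card with spending controls might behave:

```python
# Illustrative sketch only -- not Stripe's real API. A single-use card
# that enforces the kind of spending controls described above: a fixed
# budget, one allowed merchant, and one-time use.

from dataclasses import dataclass


@dataclass
class SingleUseCard:
    budget_cents: int        # the "pot of money"
    allowed_merchant: str    # rule: where it may be spent
    used: bool = False

    def authorize(self, merchant: str, amount_cents: int) -> bool:
        """Approve a charge only if every rule passes, then burn the card."""
        if self.used:
            return False     # single-use: one purchase only
        if merchant != self.allowed_merchant:
            return False     # wrong store
        if amount_cents > self.budget_cents:
            return False     # over budget
        self.used = True
        return True


# An agent told to buy toothpaste can spend a few dollars at one store,
# but it can never touch the rest of your bank account.
card = SingleUseCard(budget_cents=500, allowed_merchant="walmart.com")
print(card.authorize("walmart.com", 399))   # first (valid) charge succeeds
print(card.authorize("walmart.com", 399))   # second attempt is refused
```

The point of the design is that a hallucinating agent's worst case is capped at the card's budget, regardless of what instructions it was given.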
That's part of why companies like Rabbit and Anthropic are training AI agents to use the ordinary user interface of a website; that is, the bot would use the site just like you do, clicking and typing in a browser in a way that's largely indistinguishable from a real person. That way, there's no need to ask permission to use an online service through a back end, permission that could be rescinded if you're hurting their business.

Rabbit CEO Jesse Lyu said in a recent interview that AI agents are getting better than humans at solving CAPTCHAs, the human verification tests that have previously prevented bots from shopping online. That means website owners will need to develop more sophisticated ways to prove personhood online.

It's possible that one day, AI agents could be part of a better online shopping experience than what exists today. Perplexity's shopping agent isn't that by a long shot, but it offers an early glimpse of what could be. In the next year, we're likely to see better versions of AI shopping agents from Perplexity, OpenAI, and Google. We may just be seeing the tip of the iceberg in terms of how this could reshape the online retail industry, and what sorts of problems AI agent developers could run into.
  • WWW.AWN.COM
    Patton Oswalt, Jordan Blum Present Marvel Unforgettable Stories
    Independent publisher The Folio Society presents Marvel Unforgettable Stories, a 280-page hardcover collection of 10 Marvel stories, selected by actor/writer Patton Oswalt and writer Jordan Blum. The selections range from the classic, such as The Amazing Spider-Man #33 from the heart of the Silver Age, to the contemporary Hawkeye #11. Marvel Unforgettable Stories features a new cover and slipcase design by Marvel artist Marcos Martín (Daredevil), an introduction by Oswalt, and 280 pages of adventures featuring Spider-Man, Wolverine, Daredevil, and Captain America.

"Unlike so many best-of collections, here Patton and Jordan have carefully provided a favorite selection of the comics that have defined their love of Marvel," said James Rose, Head of Editorial at The Folio Society. "It's a personal journey through some of Marvel's most unforgettable stories. Housed in a hardback volume featuring artwork by Marvel artist Marcos Martín, the selection also includes one or two surprises."

"Jordan Blum and I, both lifelong comics readers and make-mine-Marvel fanatics, had a hard time picking only 10 stories for this collection," writes Oswalt in the book's introduction. "I can hear the bleats and rumblings of, 'But you didn't include...' and 'How could you forget...' Lemme save your vocal cords the stress: you're right. Any objections you have to any stories being passed over? You're right. There's simply too many to print under one cover. Hey, maybe buy this edition, and get all of your friends to buy one, and they'll let us pick 10 more! Capitalism!"

Marvel Unforgettable Stories includes:

The Amazing Spider-Man #33, "If This Be My Destiny - The Final Chapter" (1966)
Daredevil #191, "Roulette" (1983)
Uncanny X-Men #205, "Wounded Wolf" (1986)
Captain America #367, "Magnetic Repulsion" (1990)
X-Factor #87, "X-aminations" (1993)
Untold Tales of Spider-Man #20, "Wings of Hatred" (1997)
Spider-Man's Tangled Web #4, "Severance Package" (2001)
The Runaways #1, "Pride and Joy, Chapter One" (2003)
Fantastic Four #587, "Three, Part 5: The Last Stand" (2011)
Hawkeye #11, "Pizza Is My Business" (2013)

Other Folio Society and Marvel collaborations include Marvel: The Golden Age 1939-1949, Marvel: The Silver Age 1960-1970, Marvel: The Bronze Age 1970-1980, Black Panther: A Nation Under Our Feet, and more. Marvel Unforgettable Stories is now available for purchase, including 100 signed copies.

Source: Superfan Promotions

Journalist, antique shop owner, and aspiring gemologist, L'Wren brings a diverse perspective to animation, where every frame reflects her varied passions.
  • WWW.AWN.COM
    Technicolor Group Names David Conley President of MPC
    Technicolor Group, one of the global leaders in creative technology and visual experiences, has appointed David Conley as the new President of MPC, reporting directly to Andrea Miloro, Chief Business & Strategy Officer, and Caroline Parot, Chief Executive Officer.

Conley brings over two decades of experience in the VFX industry. Most recently, he served as the Executive Producer at Wētā FX, where he played a pivotal role in delivering groundbreaking visuals for projects such as Avatar: The Way of Water, The Last of Us series, and War is Over!. His expertise in managing complex, large-scale productions while fostering collaboration between creative and technical teams has earned him a reputation as one of the most respected figures in the industry.

He has also held leadership roles at other globally recognized studios, spearheading initiatives to integrate emerging technologies into creative workflows, delivering results that set new benchmarks for quality and innovation. His work has garnered numerous accolades, including Academy Awards, BAFTAs, and Emmys, further highlighting his impact within the industry.

Andrea Miloro and Caroline Parot commented, "David's appointment embodies our commitment to delivering world-class visual experiences. Renowned as a visionary creative leader, his passion for pioneering emerging technologies perfectly aligns with our mission to redefine the boundaries of possibility in visual effects."

Under Conley's leadership, MPC will look to expand its creative partnerships with filmmakers by combining creative excellence with advanced trends in technology to deliver innovative content. The company noted that "this announcement underscores MPC's dedication to remaining at the forefront of the industry and delivering artistry and innovation that captivate audiences worldwide."

"I am honored to join MPC at such an exciting moment in its journey," Conley shared. "The team's talent, passion, and ambitions to push creative boundaries are inspirational, and I look forward to collaborating with Andrea, Caroline, and the entire team to shape the future."

The company also noted that, "As Technicolor continues to grow its global footprint and technological capabilities, this appointment signals the company's steadfast commitment to innovation and its unwavering dedication to the creative vision of its clients."

The Technicolor Group is home to a network of award-winning studios: MPC, The Mill, Mikros Animation, and Technicolor Games.

Source: Technicolor Group

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.
  • WWW.ARTOFVFX.COM
    David Conley, Former Weta FX Executive Producer, Joins MPC as President
    Studios News

David Conley, Former Weta FX Executive Producer, Joins MPC as President
By Vincent Frei - 02/12/2024

MPC welcomes David Conley as its new President! Renowned for his pivotal contributions to Avatar and The Last of Us at Wētā FX, Conley brings a wealth of experience to one of the industry's leading VFX studios. Exciting times ahead for MPC!

Here's the press release:

David Conley Appointed President, MPC by Technicolor Group

London, December 2, 2024 - Technicolor Group, a global leader in creative technology and visual experiences, is excited to announce the appointment of David Conley as the new President of MPC (Moving Picture Company). Reporting directly to Andrea Miloro, Chief Business & Strategy Officer, and Caroline Parot, Chief Executive Officer, this dynamic addition to the team highlights a steadfast dedication to fostering world-class talent and pushing the boundaries of excellence in visual effects, driving groundbreaking creativity for audiences across the globe.

David brings over two decades of experience in the VFX industry, having contributed to some of our most celebrated films and television projects. Most recently, he served as the Executive Producer at Wētā FX, where he played a pivotal role in delivering groundbreaking visuals for projects such as Avatar: The Way of Water, The Last of Us series, and War is Over!. His expertise in managing complex, large-scale productions while fostering collaboration between creative and technical teams has earned him a reputation as one of the most respected figures in the industry.

David has also held leadership roles at other globally recognized studios, spearheading initiatives to integrate emerging technologies into creative workflows, delivering results that set new benchmarks for quality and innovation. His work has garnered numerous accolades, including Academy Awards, BAFTAs, and Emmys, further highlighting his impact within the industry.

Andrea Miloro and Caroline Parot expressed their confidence in David's leadership: "David's appointment embodies our commitment to delivering world-class visual experiences. Renowned as a visionary creative leader, his passion for pioneering emerging technologies perfectly aligns with our mission to redefine the boundaries of possibility in visual effects. Under David's leadership, MPC will look to expand its creative partnerships with filmmakers by combining creative excellence with advanced trends in technology to deliver innovative content. This announcement underscores MPC's dedication to remaining at the forefront of the industry and delivering artistry and innovation that captivate audiences worldwide."

David Conley shared his excitement about joining the business: "I am honored to join MPC at such an exciting moment in its journey. The team's talent, passion, and ambitions to push creative boundaries are inspirational, and I look forward to collaborating with Andrea, Caroline, and the entire team to shape the future."

As Technicolor continues to grow its global footprint and technological capabilities, this appointment signals the company's steadfast commitment to innovation and its unwavering dedication to the creative vision of its clients.

ABOUT THE TECHNICOLOR GROUP

Technicolor Group is a creative technology company providing world-class production expertise driven by one purpose: the realization of ambitious and extraordinary ideas. Home to a network of award-winning studios, MPC, The Mill, Mikros Animation and Technicolor Games, we inspire creative companies across the world to produce their most iconic work. Our global teams of artists and technologists partner with the creative community across film, television, animation, gaming, brand experience and advertising to bring the universal art of storytelling to audiences everywhere.

© Vincent Frei - The Art of VFX - 2024
  • WWW.ARTOFVFX.COM
    Blitz: Andrew Whitehurst - Production VFX Supervisor
    Interviews

Blitz: Andrew Whitehurst - Production VFX Supervisor
By Vincent Frei - 02/12/2024

Earlier this year, Andrew Whitehurst took us behind the scenes of the visual effects for Indiana Jones and the Dial of Destiny. Now, he's back to share insights into his latest project, Blitz, and the challenges of crafting invisible visual effects.

How was the collaboration with director Steve McQueen?

It was a very close and collaborative working relationship. Steve likes people to be coming up with ideas and creative solutions to inform his process. It's part of my job to try and get into the head of the director so I can, hopefully, second-guess what he would want ahead of showing him. Once I'd had the chance to talk with Steve about his vision for the film, we were able to work together closely and to help guide the VFX houses in their work. Sometimes there were happy accidents. For example, Steve briefed me on one of the fully digital establishing shots and I produced some detailed previs. I thought I had fully understood and executed the brief, but when I showed it to Steve and asked if it was what he'd imagined, he said, "It's not what I imagined at all, but I love it!" So sometimes the unexpected is what's wanted.

How did you organize the work with your VFX Producer?

It was a pretty clear delineation of work, really. There was a lot of creative decision-making to be done in post, which fell to me, to work with the VFX houses and get the shots looking as Steve wanted, and that required a lot of production smarts and diligence to make sure we were sensible in the use of resources and that we would be able to deliver that work on time. We couldn't have managed that without Sona.

What is your role on set, and how do you work with other departments?

On set I will liaise with all the other departments to make sure that what we shoot will be usable in post. That means figuring out green screen placement with the grips, how the cameras and lenses are used, shooting plates and reference where needed, and how the lighting is going to complement the finished shot. It's a lot of dialogue, and then also being able to quickly react when creative decisions change on the day.

How did you approach the challenge of recreating 1940s London with invisible visual effects?

Casting visual effects houses whose strengths are in creating richly detailed environments is essential, as they have the workflow and aesthetic knowledge to make that kind of work sing. Cinesite, ILM, and Raynault VFX did beautiful work. We began with a lot of historical research so we could establish what the city looked like in 1940, how the buildings were constructed (so we could do realistic damage to them), and what the levels of smoke and atmosphere were like: London was a much smokier and dirtier city than today, even before there was bomb damage to add.

How do you balance historical accuracy with creative liberty in recreating London during WWII?

This is a really interesting part of the process. We would always start with historical reality, as that establishes a ground truth that keeps us honest. Then we would look at what the emotional feeling of the moment was (reflective, angry, frightened, etc.) and see how we could adjust the lighting of the city, the amount of smoke and fire, and so on, to push the shot in the aesthetic and emotive direction we wanted without compromising the historical authenticity. For example, the long aerial shot of the desolated city midway through the film was lit as if the sun was low, so the buildings were mostly in shadow. This added a melancholic quality to the shot which would have been lost if we had used brighter and more direct sunlight. Doing a wedge test of different lighting directions can really help figure these things out.

What were the main techniques used to depict the destruction caused by bombings in a realistic way?

We used the full range of digital VFX techniques to achieve the wider city. We were very fortunate that the production built large exterior set pieces of bombed-out streets that we could scan and use as a basis to work from. Then, after looking at archival material, we were able to extrapolate what the additional buildings should look like and model them in CG, and dress in digital debris, which provided enough fidelity to run the fluid simulations for the fire and smoke. Small details could be added with a paint pass to add extra texture and detail at the end.

How does creating invisible effects differ from more overt VFX work, such as creatures or explosions?

The tools and techniques are the same, but the aesthetics are a little different. We are always looking for ways we can help make a shot more beautiful, or more impactful, but we are always asking "Does this tread on the drama?" as the performance is the most important thing in the frame. When creating the full CG shots, we are looking at the photography of the surrounding shots to match the camera movement style, lens choices, and lighting decisions so our shot, hopefully, just feels like part of the fabric of the film.

How did you simulate the dust, smoke, and debris of bombed-out buildings without overwhelming the viewer?

This is less a question of technology and more one of the aesthetic and creative sensibilities of the artists. They were brilliant at judging when dramatic was verging on too much, and backing off appropriately. It's a delicate balance to achieve, but the VFX crews managed it beautifully.

What challenges did you face when recreating the effects of bombings on people's homes and historic landmarks?

The main challenge was to be respectful of the lived experience of the place and those who dwelt there. For example, there is a shot towards the end of the film where the camera cranes up, and in the foreground is a recently bombed-out house. The house on that site had really been bombed out in the Blitz, and the area was subsequently cleaned up and the tattered end wall of the surviving house patched up. So we were essentially charged with recreating the darkest moment in that street's history, which felt like quite a responsibility, and we took that seriously.

What details were essential in recreating a war-torn London that might go unnoticed but add authenticity?

As well as adding period-correct details, such as adding tape across windows in homes and businesses where needed, it was often what we removed that was key. For example, internet junction boxes, skylights in roofs, and modern doorbells all had to go, and that made a massive difference. They are things that in our day-to-day life are so ubiquitous that we just filter them out, but once they're gone, the sparser look of the buildings becomes very noticeable.

Can you describe how lighting and shadow were used to enhance the realism of the war scenes?

Our approach to lighting wasn't really driven by chasing realism; the look of the film was not stylised, so our lighting had to be naturalistic anyway. We did use light and shade to enhance the emotion of a shot, whether by keeping buildings under cloud-cover shadow to make the mood bleaker or by adding little patches of sun to contrast with that shade. It was judged on a sequence-by-sequence basis to maximise the dramatic impact of each shot.

What were the key emotional tones you wanted to achieve with these visual effects, and how did they influence your creative process?

The film has many phases, from the chaos and violence of the opening scene with the firefighters struggling to contain the burning buildings, to long, haunting aerial shots of the aftermath of the bombing. We also had occasional moments of joy, such as the children riding on the train roof. Each of these moments required the visual effects to enhance the emotion established by the lighting and production design. It was certainly a case, when reviewing shots, of looking at the frame and asking what the desired emotion was and whether there was anything more we could do to enhance that. Would adding a bit more smoke in the background help to suggest the loss a character feels at the destruction surrounding them? Questions like that were always being asked.

Looking back on the project, what aspects of the visual effects are you most proud of?

I am so proud of the recreation of London. It became a character in its own right, with moods and emotions of its own. I think the way visual effects combined with special effects and production design to achieve that was really effective, and it created a credible and visually engaging world for the characters to inhabit. It really was a case of all the departments pulling together to realise a cohesive vision. Steve's unerring eye was key to all of this.

A big thanks for your time.

WANT TO KNOW MORE?
Cinesite: Dedicated page about Blitz on the Cinesite website.

© Vincent Frei - The Art of VFX - 2024
  • 3DPRINTINGINDUSTRY.COM
    3DPI Awards 2024: Frontier Bio's Lab Grown Lung Tissue
    Eric Bennett, CEO of Frontier Bio, told us more about 3D printing in regenerative medicine and the development of lab-grown human tissues, aiming to ultimately eliminate the organ transplant waitlist. Frontier Bios groundbreaking applications, such as their brain on a chip technology, provide humane and highly relevant alternatives to animal studies in traumatic brain injury (TBI) research, with future potential for drug discovery in conditions like Alzheimers and Dementia.Their advances in vascular tissue engineering promise biologically derived grafts that outperform synthetic alternatives, improving outcomes in cardiac and peripheral artery procedures. By integrating 3D bioprinting and cellular self-assembly, Frontier Bio has overcome significant challenges in replicating complex structures like alveoli. Frontier Bios Lab Grown Lung Tissue is nominated for the 2024 3D Printing Industry Awards, Medical, Dental, or Healthcare Application of the year.3D printed bioreactor containing a cell-seeded scaffold that evolves into a blood vessel, used as a model to evaluate the effects of medical devices on human vessels. Photo by Frontier Bio.3DPI: Can you describe your application?Eric Bennett: Frontier Bio develops lab-grown human tissues aiming to ultimately replace damaged organs and eliminate the organ transplant waitlist. In the near term, we offer our tissues and engineering services to research entities seeking alternatives to animal studies. Our brain on a chip technology, for instance, replaces animal models in traumatic brain injury (TBI) research, providing more relevant human data and offering significant ethical advantages. This technology has potential future applications in disease modeling and drug discovery for conditions like Alzheimers and Dementia.Frontier Bios brain-on-a-chip replicates the blood-brain barrier (BBB), crucial in studying neurological disease and conditions. 
Photo by Frontier Bio.3DPI: How does your application address a specific unmet need in the medical, dental, or healthcare field, and what impact do you see it having on patient care or treatment outcomes?Eric Bennett: Our innovations address critical gaps in medical research and treatment, particularly through our developments in vascular applications. We envision our living grafts to be used for cardiac bypass, peripheral artery disease, and other application areas. Our engineered blood vessels aim to reduce the reliance on less durable synthetic grafts, potentially improving patient outcomes with more natural, functional alternatives. Similarly, our lung and brain models provide platforms for more effective drug testing and disease study, accelerating the development of treatments with direct relevance to human health. In the longer term, we envision creating full-size lungs for patients that need a lung transplant.3DPI: Can you describe the most significant technical or engineering challenge you faced while developing your application and how you overcame it?Eric Bennett: One of the most significant challenges was the creation of alveoli structures, which are intricate and difficult to replicate with conventional techniques. We tackled this by integrating 3D bioprinting with cellular self-assembly, guiding stem cells to form complex structures naturally. This approach, which some would refer to as a form of 4D printing, has allowed us to overcome limitations in current printing technologies and achieve breakthroughs in tissue engineering. But its not just about creating the shape and positioning cells. Weve successfully engineered our human lung models to produce natural substances, such as surfactant and mucus.Progression from 3D bioprinted stem cells (left image) to autonomously maturing and branching into alveolar air sacs (middle and right images), showcasing the capabilities of 4D bioprinting. 
Photo by Frontier Bio.3DPI: In what ways has your innovation streamlined or improved the efficiency of medical procedures, manufacturing, or patient recovery times?Eric Bennett: We have developed a new manufacturing technique for blood vessels that could significantly enhance patient recovery times and long-term treatment outcomes. Traditional synthetic grafts often have high failure rates; our biologically derived vessels promise better integration and functionality, potentially transforming the prognosis for patients requiring vascular interventions.3DPI: What role does collaboration with healthcare professionals or research institutions play in your development process, and how have these partnerships influenced your innovation?Eric Bennett: Collaboration is crucial for our growth and innovation. From the outset, Frontier Bio has engaged with leading institutions like Mayo Clinic, who have been both a customer and a collaborator, supported by the National Science Foundations SBIR program. These collaborations enrich our development, ensuring our solutions are attuned to real-world medical needs and can swiftly transition from concept to clinical application.3DPI: Can you discuss any case studies or patient outcomes that highlight the real-world benefits of your technology?Eric Bennett: While our technologies have yet to be used in human patients, one case study in neural tissue modeling for TBI research underscores the potential near-term applications. Traditional TBI studies often rely on animal models, raising ethical concerns and questions about applicability to human biology. Our human cell-based models provide a more relevant and humane alternative, paving the way for more effective treatments and therapies.3DPI: Is there anything else you would like to add?Eric Bennett: We are immensely grateful for the support from our investors, collaborators, and community, who help propel our mission forward. 
We look forward to a future where our technologies minimize the need for animal studies and make the organ transplant waitlist obsolete, enhancing patient care and medical research globally.

The Flux-1 by Frontier Bio is a custom-designed precision 3D bioprinter, pushing the boundaries of regenerative medicine. Photo by Frontier Bio.

Vote now in the 2024 3D Printing Industry Awards. All the news from Formnext 2024. Subscribe to the 3D Printing Industry newsletter to keep up with the latest 3D printing news.

Featured image shows a 3D printed bioreactor containing a cell-seeded scaffold that evolves into a blood vessel, used as a model to evaluate the effects of medical devices on human vessels. Photo by Frontier Bio.

Michael Petch is the editor-in-chief at 3DPI and the author of several books on 3D printing. He is a regular keynote speaker at technology conferences, where he has delivered presentations such as "3D printing with graphene and ceramics" and "the use of technology to enhance food security". Michael is most interested in the science behind emerging technology and the accompanying economic and social implications.
  • 3DPRINTINGINDUSTRY.COM
    3D Printing Industry Awards 2024: 3D Systems, Medical, Dental, or Healthcare Application of the Year
Ahead of this year's 3DPI Awards, Stijn Hanssen, Director of Application Development for Dental at 3D Systems, told us more about developing the industry's first monolithic jetted denture solution, transforming how dentures are fabricated and delivered. This innovation combines multiple materials in a single print, mimicking the properties of natural teeth and gums to create durable, aesthetically superior, and comfortable prosthetics. Leveraging 3D Systems' MultiJet printing technology, comprehensive workflow solutions, and decades of expertise in dental materials, this solution streamlines production for high-volume dental labs, reducing complexity, cost, and patient wait times. Collaborations with dental professionals have been integral to refining this technology, resulting in a product that enhances patient outcomes with precise fits, shorter workflows, and exceptional long-term comfort.

3D Systems' Dental Director of Application Development, Stijn Hanssen, at LMT Lab Day 2024. Photo via 3D Systems.

Vote now in the 2024 3D Printing Industry Awards.

3DPI: Can you describe your application?

Stijn Hanssen: 3D Systems' jetted denture solution is a first-to-market solution for monolithic dentures that utilizes multiple materials to deliver a durable, long-wear, aesthetically beautiful prosthetic to the patient.

3DPI: How does your application address a specific unmet need in the medical, dental, or healthcare field, and what impact do you see it having on patient care or treatment outcomes?

Stijn Hanssen: When these materials are used as part of 3D Systems' complete workflow solution comprising materials, MultiJet printing technology, software, and services, high-volume dental laboratories can efficiently deliver a multi-material, one-piece denture with properties that mimic teeth and gums in a single print.

3D Systems showcased its new monolithic denture product line at LMT Lab Day 2024. 
Photo via 3D Systems.

3DPI: Can you describe the most significant technical or engineering challenge you faced while developing your application and how you overcame it?

Stijn Hanssen: This product required the development of a full solution comprising materials, hardware, and software. Additionally, it required enabling multiple materials with different properties to be 3D printed simultaneously. Developing such a solution to produce a medical device for long-term use by a patient is a huge challenge. Throughout the entire process, it was important for our team to remain focused on the final application and outcome. As a result, making design and research decisions with the application in mind was of paramount importance to reaching our final goal.

3DPI: In what ways has your innovation streamlined or improved the efficiency of medical procedures, manufacturing, or patient recovery times?

Stijn Hanssen: Our multi-material, monolithic jetted denture solution reduces complexity and cost for single-print denture fabrication, providing fast delivery times and shorter workflows. This allows users to create more precise designs, leading to better fit and function for their patients, as well as enabling fast and easy replacement prints if the patient loses or damages their dentures.

3DPI: What role does collaboration with healthcare professionals or research institutions play in your development process, and how have these partnerships influenced your innovation?

Stijn Hanssen: After producing our first proof-of-concept, it was most critical to discuss it with dental professionals to get their feedback and input on the requirements. As a company, we have many decades of experience in making dental materials and almost 10 years in 3D printing these materials. We learned that developing good user requirements is key to developing a successful product. 
Therefore, it's imperative that we consult many dental professionals throughout the process and consolidate their feedback on different iterations to develop a comprehensive set of requirements.

3DPI: Can you discuss any case studies or patient outcomes that highlight the real-world benefits of your technology?

Stijn Hanssen: We have multiple patients who have been using the jetted dentures for more than six months, and the feedback has been overwhelmingly positive. What we hear most from patients is that the comfort and fit of these dentures is an improvement over their previous dentures. The digital workflow has many advantages, and this highly accurate and reliable production process delivers a best-in-class denture.

Featured image shows 3D Systems' jetted, monolithic dentures, which utilize multiple materials to deliver a durable, long-wear, aesthetically beautiful prosthetic to the patient. Photo via 3D Systems.
  • REALTIMEVFX.COM
    How to get beam to angle/bend like this?
Hey there! So I'm a bit stumped on how to get ribbons/beams to angle/bend like this for a continuous (always-on) effect, almost like a tesla coil. I don't need it to branch; I'm just looking for those nice angular, bendy shapes. Please do assist if you have any ideas, thank you!
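One technique that commonly produces exactly those sharp, angular shapes is recursive midpoint displacement: start with the beam's endpoints, repeatedly insert jittered midpoints, and shrink the jitter at each level. A minimal 2D sketch in Python, offered as a starting point rather than an engine-specific answer (the magnitudes and depth are illustrative; in Niagara you would feed the resulting points to a ribbon renderer or spline):

```python
import random

def displace(points, magnitude, roughness=0.5, depth=4):
    """Recursively insert jittered midpoints between consecutive points,
    producing the jagged, angular polyline of a tesla-coil-style beam.
    The jitter shrinks by `roughness` each subdivision, so large kinks
    come first and finer detail is layered on top."""
    for _ in range(depth):
        out = [points[0]]
        for a, b in zip(points, points[1:]):
            mid = tuple((pa + pb) / 2 for pa, pb in zip(a, b))
            # offset the midpoint by a random amount on each axis
            jitter = tuple(m + random.uniform(-magnitude, magnitude) for m in mid)
            out.extend([jitter, b])
        points = out
        magnitude *= roughness  # finer kinks at each subdivision level
    return points

# a straight beam from (0,0) to (100,0) becomes a jagged polyline;
# re-run every few frames for a continuous, flickering effect
beam = displace([(0.0, 0.0), (100.0, 0.0)], magnitude=20.0)
```

Because the endpoints are never displaced, the beam stays anchored at its source and target while everything in between zig-zags.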
  • REALTIMEVFX.COM
    Niagara Mesh Particle Stretch Effect: How Does It Work? Help Please
Hello, while watching tuatara's YouTube video I noticed the interesting and seemingly simple technique you see above (it's not simple). I want to understand the technical working principle; please help by answering the questions below.

(Writing about the screenshot above.) The material looks simple: we subtract the Particle Position from the XYZ position (XYZ = 0 in the Dynamic Parameter), then add a mask so that the bottom part is pulled by UV first, followed by the top part.

This material does not work without the following specific Niagara system settings:

An Interpolate Over Time module with Input Position set to (Particles: Position). As I understand it, the Interpolate Over Time module makes a smooth transition between numeric values (not 0 to 1, but 0, 0.1, 0.2, 0.3, etc.), where the smoothness of the transition depends on the Rate Of Change setting. Target Vector Value is the target we want to move towards (in our case, the dynamically updated Niagara position in the scene). Conversion Operation: Passthrough as a Non-Large World Vector treats values as vectors (directions), not positions; it doesn't say where to arrive, it just gives a direction and speed. (View of the effect when Rate Of Change = 0.5.)

A new Vector 4 parameter is created. For its Vector RGB, (Particles, Interpolate Over Time, Interpolate Over Time V3: Moving Average) is selected; nothing is added for the Float (Alpha). The Dynamic Material Binding is changed to this new Vector 4. The Float (Alpha) value then works as the alpha of our material (it controls the mask's intensity); if Dynamic Param Alpha is disabled in the material, the Float Alpha will not affect the emitter either.

Inherit Source Movement is a Niagara module that allows particles to inherit the movement of the Niagara System actor in the scene. It does the same thing as Local Space (when Local Space is switched off).

The Dynamic Parameter module is not essential (everything works without it), and in practice changing its parameters has no visual effect. Specifically, WPOIntensity (Dynamic Param Alpha) doesn't work in the Dynamic Material Parameters module; instead, the Float in the new Vector 4 parameter does what WPO Intensity should do.

Here's how it works. Please answer the questions:

1. Why do we set the start position in the material as a Dynamic Parameter (split via Make Float to XYZ)? Can we use something else, e.g. Object Position? What exactly happens when we subtract Particle Position from the Dynamic Parameter, and why does everything work with it?
2. Why is it that if you turn off the Dynamic Material Parameters module, nothing changes, but if you turn it off in the material (as coordinates), the effect breaks?
3. Why does the Float in the new Vector 4 parameter do what the Dynamic Material Parameters' WPO Intensity should do? Why doesn't WPO Intensity itself work?
4. Why, if I change XYZ anywhere (in the Niagara Dynamic Param module, or inside the Dynamic Param in the material), does nothing change in the scene? (It changes only visually in the Niagara viewport.)
5. Why is a new Vector 4 parameter created? Why is Moving Average inserted into its Vector RGB? Why is the new parameter set in Dynamic Material Binding? (Below is the result when I reverted the Vector RGB to the default values of 1 1 1, or just turned off the new V4 parameter.)
6. Why does the stretching effect not work when the Inherit Source Movement module is not active? (Below is an example of what the system looks like with Lifetime/LoopDuration = 0.2.)

In general, why are these three modules (Interpolate Over Time, the new V4 parameter, and Inherit Source Movement) required for correct operation?

I deeply appreciate your time in responding and thank you in advance to anyone who deigns to help. 
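Reading the setup described above, the core of the material math can be sketched in a few lines. This is a conceptual sketch in Python of my interpretation of the node graph, not the graph itself; `anchor_pos` stands for the position fed in through the Dynamic Parameter, and the mask is assumed to come from the mesh's V coordinate:

```python
def world_position_offset(vertex_uv_y, particle_pos, anchor_pos):
    """Sketch of what the material appears to compute: the offset
    vector from the particle to an anchor position, scaled by a
    UV-based mask so one end of the mesh is pulled toward the anchor
    while the other end stays put -- which is what stretches the mesh."""
    mask = vertex_uv_y  # assumed 0 at the bottom of the mesh, 1 at the top
    return [(a - p) * mask for a, p in zip(anchor_pos, particle_pos)]

# particle at x=5, anchor at the origin:
bottom = world_position_offset(0.0, [5.0, 0.0, 0.0], [0.0, 0.0, 0.0])
top = world_position_offset(1.0, [5.0, 0.0, 0.0], [0.0, 0.0, 0.0])
```

Under this reading, when the anchor is a smoothed (lagging) copy of the particle position, the masked end of each mesh trails behind the particle's current position, producing the stretch.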
While I was writing this, I was thinking how hard it must have been to do VFX 10-15 years ago, when artists had to find the answers to these questions themselves, and it's not clear where, because the information was probably not even in the documentation.

The clear answers that ChatGPT has given (your help is still needed):

Why use a Dynamic Parameter? A Dynamic Parameter allows the material to receive real-time data from Niagara. Here, it's used to set a reference position (XYZ = 0 in the Dynamic Parameter) to control how particles behave relative to it.

Why subtract Particle Position from the Dynamic Parameter? By subtracting the Particle Position from this reference position, you're calculating the offset between the particle and that reference.

Why does nothing change if I disable the Dynamic Material Parameters module in the emitter? The module acts as a bridge, sending data to the material. If you disable it in Niagara, the material still holds the last received values, so it doesn't break immediately. However, if you disable it in the material, the material no longer knows where to get the data.
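The description of Interpolate Over Time above amounts to exponential smoothing toward a target at a speed set by Rate Of Change. A minimal sketch of that idea, as an assumption about the module's observable behaviour rather than Niagara's actual implementation:

```python
def interpolate_over_time(current, target, rate, dt):
    """Move `current` a fraction of the way toward `target` each tick,
    like exponential smoothing. A higher `rate` closes the gap faster;
    the value approaches the target but never overshoots it."""
    t = min(rate * dt, 1.0)  # clamp so a large step can't overshoot
    return [c + (g - c) * t for c, g in zip(current, target)]

# one second of simulation at ~60 fps with Rate Of Change = 0.5
pos = [0.0, 0.0, 0.0]
target = [10.0, 0.0, 0.0]
for _ in range(60):
    pos = interpolate_over_time(pos, target, rate=0.5, dt=1 / 60)
# pos has eased part of the way toward the target
```

If this model is right, it would explain why the Moving Average output lags the live particle position, and why that lag (fed into the material as the anchor) is what creates the visible stretch.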