WWW.GAMESPOT.COM
This Persona 5-Inspired RPG Is Cris Tales Developer's Next Project

Cris Tales developer We Are Dreams has released a gameplay trailer for its next game, Prisma. The game's art direction is heavily inspired by Persona 5 but has plenty of mechanics to help make it stand out.

Prisma follows a photojournalist named Alma who finds a magical camera that transfers her to another world called Domacon, filled with other Almas from the multiverse. Friend, foe, or NPC, each Alma has a Prism Shard that grants them magical abilities in battle, and rival Almas will try to steal them. The game's turn-based combat is inspired by Persona 5 as well, but has some key differences. Players are able to use Alma's camera filters to distort reality and apply buffs to the entire party of alternate Almas, such as using a fisheye lens to bypass enemy shields or a shattered lens to turn a single-target attack into a multi-target one.

Continue Reading at GameSpot
WWW.GAMESPOT.COM
Players Who Found A Destiny 2 Exploit Can Keep Their Free Loot, Says Bungie

Destiny 2 players recently discovered an exploit that allowed them to purchase the game's Dawning Event Card Upgrade--which normally retails for $10 USD--for the low, low price of free. Surprisingly, Bungie's letting them keep it.

Players quickly noticed the upgrade was appearing in Destiny 2's in-game shop for free, with some even making videos and posts to spread the word about the free loot. Bungie quickly responded by disabling the upgrade, which had the unfortunate side effect of also putting all Dawning 2024 Event Card Triumphs and related progression on hold.

"Due to disabling the Dawning Event Card, all Event Card Triumphs and progression are on hold until the main issue is resolved. We'll provide updates once we have more information." - Destiny 2 Team (@Destiny2Team), December 10, 2024

Naturally, some players grew frustrated that their Dawning progress had been halted, while others were more upset about the fact that their ill-gotten gains might not be sticking around. After all, Bungie had said that once the issue was resolved, "players will need to re-acquire the Dawning Event Card Upgrade for 1,000 Silver."

Continue Reading at GameSpot
GAMERANT.COM
Pokemon Sun Hacked Save Seller Arrested

A person selling Pokemon Sun hacked saves has been arrested by the police in Japan. The move may surprise some fans of the series, but Pokemon Sun is far from the first game to be hacked, nor is it the first to have arrests connected to illegally modifying games.
GAMERANT.COM
Dune: Part Two Has A Surprising Number of Golden Globes Nominations

Dune: Part Two is undoubtedly one of the biggest blockbusters of the year and follows up the 2021 Academy Award nominee in grandiose style. While the sequel has been recognized in the recent crop of Golden Globe Award nominees, the number of nods it received is surprising.
BLOGS.NVIDIA.COM
AI Pioneers Win Nobel Prizes for Physics and Chemistry

Artificial intelligence, once the realm of science fiction, claimed its place at the pinnacle of scientific achievement Monday in Sweden.

In a historic ceremony at Stockholm's iconic Konserthuset, John Hopfield and Geoffrey Hinton received the Nobel Prize in Physics for their pioneering work on neural networks, systems that mimic the brain's architecture and form the bedrock of modern AI.

Meanwhile, Demis Hassabis and John Jumper accepted the Nobel Prize in Chemistry for Google DeepMind's AlphaFold, a system that solved biology's impossible problem: predicting the structure of proteins, a feat with profound implications for medicine and biotechnology.

These achievements go beyond academic prestige. They mark the start of an era where GPU-powered AI systems tackle problems once deemed unsolvable, revolutionizing multitrillion-dollar industries from healthcare to finance.

Hopfield's Legacy and the Foundations of Neural Networks

In the 1980s, Hopfield, a physicist with a knack for asking big questions, brought a new perspective to neural networks.

He introduced energy landscapes, borrowed from physics, to explain how neural networks solve problems by finding stable, low-energy states. His ideas, abstract yet elegant, laid the foundation for AI by showing how complex systems optimize themselves.

Fast forward to the early 2000s, when Geoffrey Hinton, a British cognitive psychologist with a penchant for radical ideas, picked up the baton. Hinton believed neural networks could revolutionize AI, but training these systems required enormous computational power.

In 1983, Hinton and Sejnowski built on Hopfield's work and invented the Boltzmann machine, which used stochastic binary neurons to jump out of local minima. They discovered an elegant and very simple learning procedure based on statistical mechanics, which was an alternative to backpropagation.

In 2006, a simplified version of this learning procedure proved to be very effective at initializing deep neural networks before training them with backpropagation. However, training these systems still required enormous computational power.

AlphaFold: Biology's AI Revolution

A decade after AlexNet, AI moved to biology. Hassabis and Jumper led the development of AlphaFold to solve a problem that had stumped scientists for years: predicting the shape of proteins.

Proteins are life's building blocks. Their shapes determine what they can do. Understanding these shapes is the key to fighting diseases and developing new medicines. But finding them was slow, costly and unreliable.

AlphaFold changed that. It used Hopfield's ideas and Hinton's networks to predict protein shapes with stunning accuracy. Powered by GPUs, it mapped almost every known protein. Now, scientists use AlphaFold to fight drug resistance, make better antibiotics and treat diseases once thought to be incurable.

What was once biology's Gordian knot has been untangled by AI.

The GPU Factor: Enabling AI's Potential

GPUs, the indispensable engines of modern AI, are at the heart of these achievements. Originally designed to make video games look good, GPUs were perfect for the massive parallel processing demands of neural networks.

NVIDIA GPUs, in particular, became the engine driving breakthroughs like AlexNet and AlphaFold. Their ability to process vast datasets with extraordinary speed allowed AI to tackle problems on a scale and complexity never before possible.

Redefining Science and Industry

The Nobel-winning breakthroughs of 2024 aren't just rewriting textbooks; they're optimizing global supply chains, accelerating drug development and helping farmers adapt to changing climates.

Hopfield's energy-based optimization principles now inform AI-powered logistics systems. Hinton's architectures underpin self-driving cars and language models like ChatGPT. AlphaFold's success is inspiring AI-driven approaches to climate modeling, sustainable agriculture and even materials science.

The recognition of AI in physics and chemistry signals a shift in how we think about science. These tools are no longer confined to the digital realm. They're reshaping the physical and biological worlds.
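To make the energy-landscape idea described above concrete, here is a minimal Hopfield-network sketch in Python. It is an illustration only, not code from the article or the laureates' work, and the tiny pattern and network size are invented for the example: patterns are stored with a Hebbian rule, and asynchronous neuron updates drive a noisy state toward a stable, low-energy configuration.

```python
import numpy as np

# Minimal Hopfield-network sketch (illustrative only).
# Stored patterns become low-energy attractors; asynchronous updates
# never increase the energy E(s) = -0.5 * s^T W s, so a corrupted input
# settles back toward the nearest stored pattern.

def train(patterns):
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:              # Hebbian outer-product learning rule
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)          # no self-connections
    return W / len(patterns)

def energy(W, s):
    return -0.5 * s @ W @ s

def recall(W, s, steps=200, seed=0):
    rng = np.random.default_rng(seed)
    s = s.copy()
    for _ in range(steps):          # flip one neuron at a time toward lower energy
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Toy example: store one +/-1 pattern, then recover it from a corrupted copy.
stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
W = train(stored)
noisy = stored[0].copy()
noisy[:2] *= -1                     # corrupt two bits
print("energy before:", energy(W, noisy), "energy after:", energy(W, recall(W, noisy)))
```

Running the toy example shows the recalled state sitting at a lower energy than the corrupted input, which is the "settling into a stable, low-energy state" behavior the article attributes to Hopfield's framing.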
BLOGS.NVIDIA.COM
Turn Down the Noise: CUDA-Q Enables Industry-First Quantum Computing Demo With Logical Qubits

Quantum computing has the potential to transform industries ranging from drug discovery to logistics, but a huge barrier standing between today's quantum devices and useful applications is noise. These disturbances, introduced by environmental interactions and imperfect hardware, mean that today's qubits can only perform hundreds of operations before quantum computations irretrievably deteriorate.

Though seemingly inevitable, noise in quantum hardware can be tackled by so-called logical qubits: collections of tens, hundreds or even thousands of actual physical qubits that allow the correction of noise-induced errors. Logical qubits are the holy grail of quantum computing, and quantum hardware builder Infleqtion today published groundbreaking work that used the NVIDIA CUDA-Q platform to both design and demonstrate an experiment with two of them.

These logical qubits were used to perform a small-scale demonstration of the so-called single-impurity Anderson model, a high-accuracy approach necessary for many important materials science applications.

This constitutes the first time that a demonstration of a materials science quantum algorithm has been performed on logical qubits. The creation of just a single logical qubit is extremely challenging. Infleqtion was able to achieve such a feat thanks to accurate modeling of its quantum computer using CUDA-Q's unique GPU-accelerated simulation capabilities.

Having developed and tested its entire experiment within CUDA-Q's simulators, Infleqtion could then, with only trivial changes, use CUDA-Q to orchestrate the experiment using the actual physical qubits within its Sqale neutral atom quantum processor.

This work sets the stage for quantum computing's move toward large-scale, error-corrected systems.

Many scaling challenges still stand between today's quantum devices and large systems of logical qubits, which will only be solved by integrating quantum hardware with AI supercomputers to form accelerated quantum supercomputers.

NVIDIA continues to work with partners like Infleqtion to enable the breakthrough research needed to make accelerated quantum supercomputing a reality.

Learn more about NVIDIA's quantum computing platforms.
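The simulate-first, retarget-later workflow described above can be sketched with CUDA-Q's Python API. This is a hedged illustration, not Infleqtion's experiment: the kernel below is a trivial two-qubit Bell circuit rather than a logical-qubit Anderson-model demonstration, and the hardware backend name in the comment is a placeholder that depends on which targets your CUDA-Q installation and accounts expose.

```python
import cudaq

# Illustrative only: build a kernel once, debug it on a GPU-accelerated
# simulator, then point the unchanged program at a hardware backend by
# swapping the target.

@cudaq.kernel
def bell():
    qubits = cudaq.qvector(2)
    h(qubits[0])                    # superposition on the first qubit
    x.ctrl(qubits[0], qubits[1])    # entangle the pair
    mz(qubits)                      # measure both qubits

# Develop and test on a simulator target first.
cudaq.set_target("nvidia")          # GPU-accelerated state-vector simulation
print(cudaq.sample(bell, shots_count=1000))

# Later, retarget the same kernel at real hardware (backend name is a
# placeholder; available targets depend on your installation and accounts):
# cudaq.set_target("<hardware-backend>")
# print(cudaq.sample(bell, shots_count=1000))
```

The point is the one the article makes: the same CUDA-Q program can drive both the simulator used for design and the physical device used for the demonstration.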
WWW.VFXVOICE.COM
ACHIEVING MAXIMUM ALTITUDE WITH THE VISUAL EFFECTS FOR EDGE OF SPACE

By TREVOR HOGG

Images courtesy of Jean de Meuron and VFX Los Angeles.

While the likes of Chuck Yeager, Neil Armstrong and John Glenn looked to soar humanity to new heights, Jean de Meuron took his fascination with cinema to commemorate their aerial accomplishments with the short film Edge of Space, which was a winner at the Oscar- and BAFTA-Qualifying LA Shorts International Film Festival 2024 (Jury Special Mention). Running 18 minutes, the story revolves around United States Air Force test pilots being sent on a suborbital mission with the hypersonic rocket-powered X-15, paving the way for America to land on the Moon before the Soviet Union.

An actual X-15 was used for exterior shots and scanned to create a digital version for when the hypersonic aircraft takes flight.

"The X-15 was able to penetrate the Kármán line, which, officially per NASA, is the edge of space, where you go 330,000 feet up in the air," states producer/director/writer de Meuron. "Many X-15 test pilots received astronaut wings."

A blueprint for the production was The Creator by Gareth Edwards. "The Creator was cost-effective, but it also gave a sense of real scale and scope. Denis Villeneuve or Gareth Edwards said, 'If you make the most of the frame reel, the added visual effects blend in naturally.' Charlie Joslain [Senior Visual Effects Supervisor], Izzy Traub [Visual Effects Supervisor] and I storyboarded everything in pre-production so on set we knew exactly what we wanted filmed. We also scanned multiple assets and locations during different times of the day that were then built in 3D."

Something that would easily go unnoticed is the addition of 3D tombstones created by VFX Los Angeles. "As often with films of this caliber, in terms of 100-plus visual effects, a lot of it is invisible effects," Joslain notes. "You expect all of the contrails and airplanes, but there is a ton of clean-up (little lights in the background, and traffic in the desert) to create an isolated place, since the airbase was supposed to be secret. Jean wanted some tombstones and they looked good, but were placed in a way that was too narrow and didn't quite give the gravitas and scale of the sacrifice of American pilots for that cause. We did some research to make sure that we found the right kind of tombstones, recreated multiple CGI ones, and ended up with not quite Arlington National Cemetery but something similar in the desert."

The framed picture of American President John F. Kennedy was incorporated into the set decoration to authentically recreate 1961.

Crafting the contrails of the X-15 was made even more complicated by the aircraft essentially being a hypersonic rocket. "That little gap between the back of the airplane and the contrail," Joslain explains. "Imagine that multiplied by x amount. Would you be able to see the X-15 up in the sky, when it's 45 feet long and 70,000 feet up in the air? You probably wouldn't be able to see it, but if you don't show it, what exactly is our character looking at up in the sky? We had to find that balance of what it should have looked like and how do we represent it so it's engaging for the audience? Through the use of plate versus recreating similar plates, then doing a lot of calculation, work and optical engineering as to what zoom lens would create what effect, that's how we created the best of both worlds. It looks historically and scientifically accurate, but it's telling a story, is still engaging, and the plane feels like it's there."

Channeling the cinematography of Days of Heaven, Jean de Meuron opted to shoot during the magic hour.

Clouds became aerial landmarks indicating size, scale and speed. "At one point, Glen Ford [Chad Michael Collins] penetrates [the Kármán line], and there is this massive, beautiful shot set against the sun," de Meuron recalls. "It's backlit and silhouetted, but then we wanted to give a sense of scale. This is still earthbound, but the minute he penetrates, we go to space where we don't have clouds. The clouds helped us give a sense of scale and depth as well as layers and nuances with light, shadows, and a little underexposed in the foreground. We played heavily into those cloud formations." Even more important than scale is the sense of speed. "We've seen a million films and sci-fi movies, and again the X-15 is supposed to fly Mach 5 or 6," Joslain notes. "When clouds of that scale start drifting past so fast, that helps to portray the sense of speed of the aircraft."

Only digital versions of the Huey helicopter and B-52 are shown flying.

Visor reflections are essential to have but hellish to pull off in a believable manner. "Anything to do with the visor or helmet is a mixture," Joslain reveals. "Roughly half of the scenes have the visor on, where we had to erase reflections from outside of the cockpits and therefore recreate the performance, repaint skin or add the twinkle in the eye. The opposite was true of the few shots that we got with the visor off, where we had to recreate a CG visor and then repaint reflections from the cockpit and Moon on that visor. That's more or less how this whole thing was tied together. No shot was untouched. If you look at a project like Edge of Space and think, 'Oh, my god, this shot with the spaceship is going to be the coolest and the hardest one.' Yeah, but the last shots to be approved were the ones with the visor because there's no place to hide. You've got to make sure that skin looks good. You've got to not distract from the performance. Everyone can sense what a reflection looks like on a curved piece of glass, so actually those were the hardest ones to get right."

Outer space was based on ISS footage. "I would text pictures and references from astronauts in space, either from the ISS, Mercury, Apollo or Gemini, when they filmed and took pictures in outer space," de Meuron states. "It's interesting because gradually the tones and shades of blue [change]. Charlie and I would look at that. You can see from the ISS how the blue gradually transitions into a dark black and then becomes pitch black." Discoveries could be shared at any moment. Joslain recalls a funny anecdote "that tells you a lot about Jean's dedication for the last two years: You would get a bunch of texts at 4 p.m., and I would go, 'I know this is Jean and he's found something!' But most of the time this would actually take place at 3 a.m., and you're like, 'Jean, not now!'"

The logos on the X-15 had to be digitally altered to make them period-accurate.

For exterior shots, a real X-15 was photographed and scanned. "Any sort of motion, such as the gears turning, was CG; even the front wheels, because they weren't quite right," Joslain remarks. "As far as the texture, the real X-15 was used as a reference, but a lot of the logos had to be painted out, recreated and redesigned to match the historical plane, as opposed to the current NASA museum piece that it is." The cockpit was sealed off, so it was recreated in the studio by Production Designer Myra Barrera, with the visual effects team producing a digital version as well. "We had LED panels and lights, and when you see the astronaut, it's frontal," de Meuron states. "I didn't want the actor's profile because First Man had already done that. I wanted to do my own interpretation. I wanted it to be tight and claustrophobic, in a real closeup or extreme closeup, so we see every nuance of his performance, and maybe how he twitches or is sweating."

One of the toughest elements to create and integrate were the contrails being generated by what is essentially a rocket.

The aerial establisher of the landing strip was captured with a drone. "That was a real shot," explains Traub. "As the camera keeps going, you see someone working on the plane; that was the same person doing the motion capture performances of everybody! Normally for a project, you can go in and purchase model packs and use them. We had to model everything from scratch because there wasn't anything that we could find, for the most part, that fit the historical references. One thing that is interesting is we actually replaced the X-15 in that particular shot because it gave us more control." One cannot have an airport landing strip without a control tower. "We obviously didn't have a lot of photos of the Edwards Air Force Base in the 1960s," Joslain states. "An important part of an airbase is going to be the control tower. We had an overview photograph of the base at the time. Assuming the picture and information were correct, we knew which month and year this was taken, and we did a bit of reverse engineering to figure out, according to the length of the shadow, normally how high the control tower was going to be. When we put the control tower in the shot, it was too small, so we had to make it bigger!"

A cool color palette was adopted for the outer space shots to make the cosmic environment feel colder.

The landing shot of the X-15 was extremely difficult. "The drone was more or less a continuous speed, but obviously an airplane landing and slowing down is not a continuous speed," Joslain remarks. "But how do you create that? We had to find the right balance of what would be an accurate speed for the X-15 to slow down and grind to a halt. But matching that stopping moment with the twist of the pan of the camera, then having the jeep and vehicles enter, that was a complicated one to figure out." Contributing to the believability were lens aberrations. "We were messing a little bit with the focus here and there," Joslain states. "Adding a little grain there. Adding a little bit of a deep camera shake and vibration. We want to respect Jean's vision because that's what you want to achieve, but at the same time, damaging the perfection of the whole thing is how you achieve true perfection. My favorite shot is fully CG-made. It's the X-15 taking off, and it's that super-long-lens-like 5000mm view of the X-15. There is enough shake in the camera and zoom play with the lens going on to add that sense there is an actual human being filming."

A mixture of shots were done with visor up and down, with the reflections added and removed as needed.

The X-15 does not actually take off but is attached to and released from a B-52.

Drone photography was essential for the aerial shots of Edwards Air Force Base, with the buildings, vehicles and individuals digitally recreated.

Cloud formations assisted with conveying the proper size, scale and speed of the aircraft.

Shots such as the B-52 releasing the X-15 were treated as if a camera operator was capturing the moment with a long lens.

Particle simulations had to be produced on the ground and in the air. "The stuff on the ground was the hardest, for sure," Traub notes. "We have this sequence where the X-15 lands, there is a touchdown where the back of the plane basically slaps the ground, and there is an explosion of dust that goes up in the air. Then we see underneath the plane. Basically, the tracks are ripping up the ground as it's coming to a halt. The X-15 pushes through a whole bunch of dust. In that same shot, you have this helicopter moving down and landing. The particle simulations become a lot more complicated because you're locked to lighting that is on the ground, so your lighting has got to be atmospherically correct. The shadows have to cast with the particle simulations that we're doing in Houdini, versus in the air. A lot of the particle simulations were atmospheric."

CG tombstones were added to create a setting that had the gravitas of Arlington National Cemetery.

One of the unique images features the death of a colleague reflected in the sunglasses worn by Glen Ford. "The reflection of the explosion in the sunglasses was one of those cases where we did an absurd amount of reverse-engineering," Joslain explains, "about the scale/size of the X-15 contrail, the amount of curvature the piece of glass would have applied to it, and how it should have all looked to be 100% accurate, versus what it needed to look like to be emotionally impactful as well as aesthetically pleasing." Unreal Engine became a major tool for Edge of Space. "One of the big things that we were dabbling with a little bit was Unreal Engine, but Unreal Engine became a key part of the pipeline when it came to all of the CG shots," Traub states. "The reason for that was simply because of real-time rendering, the ability to tweak the lighting, quickly change the camera and output multiple versions. Especially when the deadline was coming up, it enabled us to move at a speed that was a lot better. One thing that we had never done before was integrating Houdini simulations with Unreal Engine. Both of those paired up nicely, and by the time we had all of our renders, you could composite everything together in After Effects or Nuke. We got fairly adept with the Unreal Engine pipeline specifically for cinematic filmmaking, and it was a great experience. We'll continue to use Unreal Engine for the rest of our projects, most likely. It's an amazing tool."
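The control-tower height estimate Joslain describes above (working back from a shadow length in a dated overhead photo) comes down to basic solar-geometry trigonometry. Here is a rough Python sketch of that kind of calculation; it is not the production's actual method, and every number in it is invented for illustration.

```python
import math

# Rough sketch of a shadow-based height estimate (illustrative only).
# In practice you would take the sun's elevation angle for the photo's
# known date, time and location from an ephemeris, and measure the
# shadow length off the scaled overhead photograph.
sun_elevation_deg = 35.0   # assumed solar elevation at the photo's date/time
shadow_length_ft = 86.0    # assumed shadow length measured from the photo

# A vertical tower of height h casts a shadow of length L = h / tan(elevation),
# so h = L * tan(elevation).
tower_height_ft = shadow_length_ft * math.tan(math.radians(sun_elevation_deg))
print(f"Estimated control tower height: {tower_height_ft:.1f} ft")
```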
BLOG.PLAYSTATION.COM
Celebrate 30 years of PlayStation with PlayStation 2024 Wrap-Up, launching today

Since the launch of the original PlayStation console on December 3, 1994, PlayStation has redefined the gaming experience year after year, bringing three generations of lasting memories for players. To celebrate this milestone with our community, and look back at another momentous year for gaming, we're launching PlayStation 2024 Wrap-Up today, featuring a retro PlayStation aesthetic and graphics that call back to PlayStation's 30-year history.

From today through January 10, 2025, PS4 or PS5 players* can access their PlayStation 2024 Wrap-Up experience to view their gaming achievements, including most played games, a breakdown of monthly gaming stats, gaming style, and more.

New this year are personalized historical statistics, such as the total number of games a player has experienced since creating their account for PlayStation Network, as well as a look back at trophy milestones and personalized recommendations for games that are available on the PlayStation Plus Game Catalog.

2024 Wrap-Up will continue to update through the end of 2024, so be sure to check back again before January 10 for your full-year summary. Players who complete the experience can redeem a unique 30th anniversary-inspired avatar and PlayStation Stars digital collectible, as well as a shareable Wrap-Up summary card. Check out my summary below:

Experience your PlayStation 2024 Wrap-Up today at wrapup.playstation.com and share your favorite PlayStation experiences in the comments section below.

*Users need to have an account for PlayStation Network in their region, be aged 18 years or over, and have played games on a PS4 or PS5 console for at least 10 hours between January 1, 2024 and December 31, 2024.

Users who have not consented to the collection of Full Data from their PlayStation 5 system settings in 2024 will be unable to participate in the Wrap-Up campaign.

Users located in Europe, the Middle East, Africa, Australasia, India, and Russia, who have not consented to the collection of Additional Data from their PlayStation 4 system settings in 2024, will be unable to participate in the Wrap-Up campaign.