• The truth is, the topic of Donald Trump's 'National Design Studio' isn't very exciting. There are seven ways he could try to 'Make America Design Again', but honestly, I'm not sure it really matters. It all sounds a bit redundant and tired. I don't know, maybe we should be concerned, or maybe not. In the end, I think it's just noise.

    #DonaldTrump #DesignStudio #HacerAméricaDiseñarDeNuevo #Política #Desinterés
  • Team Falcons won the 2025 Esports World Cup. I mean, nothing special. Everything has become repetitive. The tournament was held somewhere, people gathered, and they won. Nothing new. That's it, end of story.

    #Falcons #كأس_العالم #الرياضات_الإلكترونية #بطولات #لا_شيء_جديد
    Team Falcons crowned champions of the 2025 Esports World Cup
    arabhardware.net
  • All Items In Peak Explained: Strange Gem, Crystal Skull, And More
    www.gamespot.com
    Every item in Peak serves an important purpose. Some are vital survival tools, aiding your injuries or keeping your tummy full to keep you climbing for longer. Others can jeopardize an entire expedition in an instant. Depending on who you're playing with, the latter can be dangerous--there's always that one player who won't hesitate to create a TikTok-worthy moment.

    Here's the trick with items in Peak: While the game tells you its name, it doesn't actually explain what an item does. Some are easy enough to guess from looks alone, but more often than not, you should be wary of using something to avoid being penalized or affecting the run by accident. Don't fret, though--if you're wondering what things like the strange gem or the crystal skull do, the list below includes every item in Peak and what they do, so you can have a helpful page at hand to dispel doubts in the group chat.

    I'm sorry, Bing Bong. I'm just doing research.

    Peak Items List and What They Do

    The list includes every item up to the Mesa update of Peak, which was officially released on August 11. There have been quite a few smaller patches, but by and large, here are all the items available thus far. Note that we're skipping most of the food, as their purpose is largely self-explanatory.
  • Helldivers 2: What to Expect From the Into the Unjust Update
    gamerant.com
    After over a year of being a PS5 exclusive, Helldivers 2 is finally available for Xbox players, expanding the fight for Managed Democracy. Helldivers 2 has continued to keep players invested with its thrilling updates and strong community focus, and now with Xbox players being thrown into the mix, it will be fascinating to see what the future holds for the hit third-person shooter.
  • NVIDIA Jetson Thor Unlocks Real-Time Reasoning for General Robotics and Physical AI
    blogs.nvidia.com
    Robots around the world are about to get a lot smarter as physical AI developers plug in NVIDIA Jetson Thor modules, new robotics computers that can serve as the brains for robotic systems across research and industry.

    Robots demand rich sensor data and low-latency AI processing. Running real-time robotic applications requires significant AI compute and memory to handle concurrent data streams from multiple sensors. Jetson Thor, now in general availability, delivers 7.5x more AI compute, 3.1x more CPU performance and 2x more memory than its predecessor, the NVIDIA Jetson Orin, to make this possible on device.

    This performance leap will enable roboticists to process high-speed sensor data and perform visual reasoning at the edge, workflows that were previously too slow to run in dynamic real-world environments. This opens new possibilities for multimodal AI applications such as humanoid robotics.

    Agility Robotics, a leader in humanoid robotics, has integrated NVIDIA Jetson into the fifth generation of its robot, Digit, and plans to adopt Jetson Thor as the onboard compute platform for the sixth generation of Digit. This transition will enhance Digit's real-time perception and decision-making capabilities, supporting increasingly complex AI skills and behaviors. Digit is commercially deployed and performs logistics tasks such as stacking, loading and palletizing in warehouse and manufacturing environments.

    "The powerful edge processing offered by Jetson Thor will take Digit to the next level, enhancing its real-time responsiveness and expanding its abilities to a broader, more complex set of skills," said Peggy Johnson, CEO of Agility Robotics. "With Jetson Thor, we can deliver the latest physical AI advancements to optimize operations across our customers' warehouses and factories."

    Boston Dynamics, which has been building some of the industry's most advanced robots for over 30 years, is integrating Jetson Thor into its humanoid robot Atlas, enabling Atlas to harness formerly server-level compute, AI workload acceleration, high-bandwidth data processing and significant memory on device.

    Beyond humanoids, Jetson Thor will accelerate various robotic applications, such as surgical assistants, smart tractors, delivery robots, industrial manipulators and visual AI agents, with real-time inference on device for larger, more complex AI models.

    A Giant Leap for Real-Time Robot Reasoning

    Jetson Thor is built for generative reasoning models. It enables the next generation of physical AI agents, powered by large transformer models, vision language models and vision language action models, to run in real time at the edge while minimizing cloud dependency.

    Optimized with the Jetson software stack to enable the low latency and high performance required in real-world applications, Jetson Thor supports all popular generative AI frameworks and AI reasoning models with unmatched real-time performance. These include Cosmos Reason, DeepSeek, Llama, Gemini and Qwen models, as well as domain-specific models for robotics like Isaac GR00T N1.5, enabling any developer to easily experiment and run inference locally.

    NVIDIA Jetson Thor opens new capabilities for real-time reasoning with multi-sensor input. Further performance improvement is expected with FP4 and speculative decoding optimization. With NVIDIA CUDA ecosystem support through its lifecycle, Jetson Thor is expected to deliver even better throughput and faster responses with future software releases.

    Jetson Thor modules also run the full NVIDIA AI software stack to accelerate virtually every physical AI workflow, with platforms including NVIDIA Isaac for robotics, NVIDIA Metropolis for video analytics AI agents and NVIDIA Holoscan for sensor processing. With these software tools, developers can easily build and deploy applications such as visual AI agents that can analyze live camera streams to monitor worker safety, humanoid robots capable of manipulation tasks in unstructured environments and smart operating rooms that guide surgeons based on data from multi-camera streams.

    Jetson Thor Set to Advance Research Innovation

    Research labs at Stanford University, Carnegie Mellon University and the University of Zurich are tapping Jetson Thor to push the boundaries of perception, planning and navigation models for a host of potential applications. At Carnegie Mellon's Robotics Institute, a research team uses NVIDIA Jetson to power autonomous robots that can navigate complex, unstructured environments to conduct medical triage as well as search and rescue.

    "We can only do as much as the compute available allows," said Sebastian Scherer, an associate research professor at the university and head of the AirLab. "Years ago, there was a big disconnect between computer vision and robotics because computer vision workloads were too slow for real-time decision-making, but now, models and computing have gotten fast enough so robots can handle much more nuanced tasks."

    Scherer anticipates that by upgrading from his team's existing NVIDIA Jetson AGX Orin systems to the Jetson AGX Thor developer kit, they'll improve the performance of AI models, including their award-winning MAC-VO model for robot perception at the edge, boost their sensor-fusion capabilities and be able to experiment with robot fleets.

    Wield the Strength of Jetson Thor

    The Jetson Thor family includes a developer kit and production modules. The developer kit includes a Jetson T5000 module, a reference carrier board with abundant connectivity, an active heatsink with a fan and a power supply.

    NVIDIA Jetson AGX Thor Developer Kit

    The Jetson ecosystem supports a variety of application requirements, high-speed industrial automation protocols and sensor interfaces, accelerating time to market for enterprise developers. Hardware partners including Advantech, Aetina, ConnectTech, MiiVii and TZTEK are building production-ready Jetson Thor systems with flexible I/O and custom configurations in various form factors. Sensor and actuator companies including Analog Devices, Inc. (ADI), e-con Systems, Infineon, Leopard Imaging, RealSense and Sensing are using NVIDIA Holoscan Sensor Bridge, a platform that simplifies sensor fusion and data streaming, to connect sensor data from cameras, radar, lidar and more directly to GPU memory on Jetson Thor with ultralow latency.

    Thousands of software companies can now elevate their traditional vision AI and robotics applications with multi-AI agent workflows running on Jetson Thor. Leading adopters include Openzeka, Rebotnix, Solomon and Vaidio. More than 2 million developers use NVIDIA technologies to accelerate robotics workflows. Get started with Jetson Thor by reading the NVIDIA Technical Blog and watching the developer kit walkthrough. To get hands-on experience with Jetson Thor, sign up to participate in upcoming hackathons with Seeed Studio and LeRobot by Hugging Face.

    The NVIDIA Jetson AGX Thor developer kit is available now starting at $3,499. NVIDIA Jetson T5000 modules are available starting at $2,999 for 1,000 units. Buy now from authorized NVIDIA partners. NVIDIA today also announced that the NVIDIA DRIVE AGX Thor developer kit, which provides a platform for developing autonomous vehicles and mobility solutions, is available for preorder. Deliveries are slated to start in September.
  • THE RULES OF ENGAGEMENT FOR WARFARE
    www.vfxvoice.com
    By TREVOR HOGG
    Images courtesy of DNA Films, A24 and Cinesite.

    What starts off as a routine military operation goes horribly wrong, and such an experience left a lasting impression on former American Navy SEAL Ray Mendoza, who recounts how his platoon came under fire during the Iraq War in 2006 while monitoring U.S. troop movements through hostile territory. The real-life incident serves as the basis for Warfare, which Mendoza co-directed with Alex Garland and shot over a period of 28 days at Bovingdon Airfield in Hertfordshire, U.K. Assisting with the environmental transformation, consisting of approximately 200 shots, was the visual effects team led by Simon Stanley-Camp and sole vendor Cinesite.

    Providing audience members with a sense of direction is the drone footage, which involved placing large bluescreen carpet down an airport runway.

    "Without the shadow of a doubt, this was the most collaborative movie I've ever worked on in 25 years," notes Visual Effects Supervisor Stanley-Camp. "Every department was so helpful, from production design to special effects, which we worked with hand-in-hand. There were probably three different layers or levels of smoke. There's smoke, dust and debris when the grenade goes off [in the room]. All of those special effects elements were captured mostly in-camera. We've occasionally added a little bit of smoke within the masonry. The big IED [Improvised Explosive Device] explosion was smoky, but over the course of the 50 shots where they're scrambling around in the smoke, we added 30% more smoke. It starts thick and soupy. You could have two guys standing next to each other and they wouldn't know it. There was this idea of layering more smoke to hide the surrounding action. We had lots of rotoscoping and layering in there."

    Practical explosions were used as the base, then expanded upon digitally.

    Principal photography took place outdoors. "It's funny because Bovingdon Airfield is a studio with five or six soundstages, but we didn't use any of them other than for some effects elements," Stanley-Camp reveals. "We were shooting in the car park next to the airfield. There was one building, which is the old control tower from the Second World War, that we repurposed for a market area. Just before I was involved, there was talk about building one house. Then, it went up to four and finally to eight houses that were flattage and worked from specific angles. If you go slightly off center, you can see the sides of the set or down the gaps between the set. We had two 20-foot by 120-foot bluescreens and another two on Manitous that could be floated around and walked in."

    Greenscreen assisted with digital set extensions.

    Ramadi, Iraq is a real place, so maps and Google Docs were referenced for the layout of the streets. "We lifted buildings from that reference, and Ray would say, 'No. That road wasn't there.' We put in water towers off in the distance, which Ray remembered being there and where they were then." Palm trees and bushes were dressed into the set, which was LiDAR scanned and photomontaged before and after the battle. "There is quite a lot of greens, and I shot ferns as elements blowing around with the smoke, and being blown with air movers as 2D elements to pepper back in, along with laundry," Stanley-Camp states. "I mention laundry because we were looking for things to add movement that didn't look out of place. There are air conditioning units and fans moving. We had some CG palm trees with three levels of pre-programmed motion to dial in, like high, medium and low, for ambient movement, but nothing too drastic. Then on the flybys of the Show of Force, we ran another simulation on that to create the air resistance of the planes flying through."

    The fighter jet in the Show of Force sequences was entirely CG.

    Over a period of 95 minutes, the action unfolds in real time. "One of the first questions I asked Alex was, 'What is the sky?' You imagine that it's blue the whole time," Stanley-Camp remarks. "[Even though shooting took place during the British summer], we're sitting in their winter, so the soldiers are always in full fatigues, and the insurgents are running around with jumpers, coats and sweatshirts. We got a couple of magical days of beautiful skies with lots of texture and clouds. It looked great, and Alex said, 'This is the look.' Anytime there was a spare camera and it was a good sky, we shot it. We didn't have to do so many replacements, probably about five. We had a couple of sunny days where we had to bring in shadow casters for consistency so the sun wasn't going in and out." What did require extensive work were the masonry, bullet hits and explosions. "There was a ton of special effects work there. A lot of what we were doing was a 'heal and reveal,' painting them out and letting them pop back in, then moving them, because with all of the wind, the practical ones are never going to go off in the right place. Maybe because they were too close or too far away. We would reposition and augment them with our own version of CG bullet holes and hits."

    The dust simulations featured in the Show of Force sequences were created using Houdini.

    Numerous explosions were captured in-camera. "When the main IED goes off, we shot that with the cast, and it plays out as they come through the gate," Stanley-Camp remarks. "It's predominately compressed air, some pyrotechnics, cork, dust and debris, safe stuff that you could fire and light. There are a lot of lighting effects built into that explosion. When the smoke goes off, flashbulbs go off, which provide the necessary brightness and impact. Then, we shot it for real with seven cameras and three buried. We did it twice. The whole crew was there watching it. It was like a big party when they set that off. We filled that up with a set extension for the top shot, and as the phosphorous started to die out and fall away, we took over with CG bright phosphorous that lands and rolls around. Then, additional smoke to carry it onto camera. The special effects guys had a spare explosion ready to go, so I shot that as well for an element we didn't use in the end, other than for reference on how combustible it was, how much dust and billowing smoke it let off."

    Muzzle flashes were specific to the rifles, rather than relying on a generic one.

    Assisting the platoon are screeching U.S. fighter jets that stir up massive amounts of dust as they fly overhead. "The Show of Force happens three times," Stanley-Camp notes. "That's purely effects-generated. It's a Houdini simulation. We had a little bit of help from fans blowing trees and laundry on set. Any ambient real stuff I could get to move, I did. Readability was important. The Show of Force occurs quickly. You cut back inside to be with the soldiers in the house. You don't linger outside and see the dust settling, blowing away and clearing. The first Show of Force we sped up to almost double the speed it was filmed. It's the one time we used the crane. On the whole, the action is always with the soldiers. It's handheld. It's Steadicam. You are a soldier."

    An effort was made to always have practical elements in-camera.

    The fighter jet was entirely CG. "You could get in it," Stanley-Camp reveals. "It's a full textured build. The canopy is reflecting anything that would be in shot from the HDRI. What was real were the Bradley Fighting Vehicles. We had two Bradleys and two drivers. The Bradleys were redressed with armor plating fitted on the sides to make them bulkier than when they came to us raw. The gun turret was modified and the barrel added. It didn't fire, so that's all us. The major misconception of the Bradleys is that it fires a big pyrotechnic shell. But the shell doesn't explode on contact. It punches holes through things. When it fires, what we see coming out the end is dust, debris, a little puff and a tiny bit of gunk. I've seen bigger tanks where the whole tank shakes when they fire. There is none of that. The Bradleys are quick and nimble reconnaissance vehicles."

    Unfolding in real time, Warfare was shot over a period of 28 days at Bovingdon Airfield in Hertfordshire, U.K.

    Muzzle flashes are plentiful. "We had about six different types of rifles, so we broke those down and shot extensively," Stanley-Camp states. "We did a day's effects shoot against black that included every rifle shot from every angle. More interesting from a technical perspective, we looked at different frame rates to shoot any of the live-action gun work to capture as much of the muzzle flashes as possible. Alex said he had to replace a lot of them during Civil War because they had all sorts of rolling shutter problems. We experimented with different frame rates and ended up shooting at 30 frames per second to capture the most of the muzzle flash, and that gave us the least rolling shutter effect. Muzzle flashes are a bright light source. Once the grenade has gone off and the rooms are filled with smoke, the muzzle flash illuminates in a different way; it lights the room and smoke. How much atmospherics were in the room depended on how bright the muzzle flash registered."

    The flattage sets were sturdy enough to allow shooting to take place on the rooftops.

    Not as much digital augmentation was required for wounds as initially thought. "The house is probably three feet off the ground, and we were also able to dig some holes," Stanley-Camp reveals. "There were trapdoors in the floor with leg-sized holes that you could slip your knee into, refit the tiles around the leg, and then [use] the prosthetic leg. Usually, from the knee down was replaced. Because of open wounds, arterial veins are exposed, so I thought there should be a bit of pumping blood, and we put a little blood movement on the legs and shins. Otherwise, not too much. It stood up. When they're being dragged up the drive into the house, the legs are meant to be broken in weird and awkward angles. We did a lot with repositioning angles. If you look at the before and after, you go, 'Oh, my god, they're at horrible angles.' However, if you look at it straight on and are not comparing it against a normal leg, it's less noticeable. We did quite a lot of bending, warping and breaking of legs!"

    The Bradley Fighting Vehicles were practical, then digitally enhanced.

    Drone footage provides audience members with a sense of direction. "Initially, the map was barely going to be seen," Stanley-Camp remarks. "It was a live play on set, on monitor, and that was it. I did those upfront, played them on the day, and the performance works. Those have stayed in. But the exposition grew, and we did another seven or eight map iterations telling the story where the soldiers and tanks are. One of those shots is four minutes long. I was going to do it as CG or motion capture, and Alex was like, 'I hate motion capture. Even with these tiny ants moving around, you'll know.' I looked for studios high enough to get wide enough. 60 feet is about as high as I could get. Then I said, 'Why don't we shoot it from a drone?' This was toward the end of post. We went back to Bovingdon Airfield for two days and had brilliant weather. We shot that on the runway because of the size of the place. It was the biggest carpet of bluescreen you can imagine. I had soldiers and insurgents walking the full length of that. Then I took those bluescreen elements and inserted them into the maps."

    Requiring extensive CG work were the masonry, bullet hits and explosions. The IED explosion consisted of compressed air, pyrotechnics, cork, dust and debris, which was then heightened digitally to make it feel more lethal. Skies were altered to get the desired mood for shots. Cinesite served as the sole vendor on Warfare and was responsible for approximately 200 visual effects shots.

    The Show of Force shots were always going to be challenging. "There is a lot of reference online, and everybody thinks they know what it should look like," Stanley-Camp remarks. "Those shots work in context. I'm pleased with them." Warfare has been praised for its realistic portrayal of soldiers in action. "I'm delighted and disappointed because no one knows there are visual effects, and there has been nothing said about the visual effects yet. In this climate, Warfare should be seen by a lot of people. It takes a snapshot of a moment. Like Ray has been saying, 'This is one of the thousands of operations that happen on a weekly basis that went wrong.'"
  • Lumines Arise launches Nov 11, PS5 demo available now
    blog.playstation.com
    Since we first announced Lumines Arise during the State of Play in June, we've been inundated with the same question from fans: When will the demo be available?! And the answer is: right now! You can play the limited-time Lumines Arise Demo on PlayStation 5 now through September 3, try out three single-player stages, and help us network test the all-new multiplayer Burst Battle mode.

    We also have a release date for the full game: November 11, 2025. Pre-orders start today (and include a 10% discount for PS Plus subscribers!); go to the PS Store page for that and to download the demo.

    Lumines for all

    Never played a Lumines game before? Or forgot how it works? Or never got it in the first place? Good news: Arise is incredibly easy for anyone to get into, thanks to an excellent interactive tutorial that walks you through everything, step by step. (And even old pros won't wanna miss the intro to new mechanics like Burst!)

    The Demo only features one difficulty (Easy; the final game will have four different levels), but you'll also find robust options to fit every play style under Accessibility in the Options menu. Want to just groove to the music and not worry about time pressure, or a Game Over when you top out? Try the No Stress Lumines options for that! Want to strip away the visual flourishes to focus more on the gameplay? There are options for that! Or playing on your PlayStation Portal and want to zoom in to get the most out of your portable screen real estate? There are options for that, too!

    An all-new multiplayer experience

    Burst Battle represents a complete reinvention of multiplayer Lumines, borrowing from the competitive-puzzle-game greats, but adding a twist all its own. Now, both players have an entire playfield to themselves and can send garbage blocks to attack their opponent. You generate these attacks by clearing 2x2 (or larger) Squares, or by triggering the all-new Burst mechanic (where you have a few Timeline passes to build a single color match as large as possible). The bigger the Burst, the larger the deluge your opponent will face! Meanwhile, garbage blocks can pile up on the sides, shrinking the available playfield; only matching blocks adjacent to garbage will clear it out. This ebb and flow can get super tense and really fun, and I hope you try it out!

    The Demo features a taste of Burst Battle via matchmaking, but the full version of the game will offer friend/CPU matches, custom matches, and local play. And you'll get to select your favorite stage music/block-visuals that you unlocked in the single-player Journey mode to use in multiplayer; it's kind of like having your own theme song as you head into battle!

    Everyone's here, including Astro Bot?

    Starting today, you can pre-order the Standard or Digital Deluxe Edition of Lumines Arise on PlayStation Store. And as mentioned above, PS Plus members get a 10% discount on the pre-order. The Digital Deluxe Edition (also available as an upgrade to the Standard Edition) includes the full game and four exclusive Loomii in-game avatars. You can customize your Loomii in-game to match your personality, and the set in the Digital Deluxe Edition includes skins based on Tetris Effect: Connected, Rez Infinite, Humanity, and, what's this? Astro Bot is appearing as a guest as well! A big thank you to our friends at Team Asobi for making this crossover possible. The image above is just a preview; the final look of these avatars will be revealed soon.

    Also, because it wouldn't be Lumines Arise news without some new music, a new single from the soundtrack has been released. Hydelic's hypnotically thumping anthem "Dreamland" is the sonic backdrop of the Chameleon Groove stage from the Demo, and is available now on Bandcamp with a release soon on your favorite streaming services. We know that after you play the demo, you'll want to add this to your favorite daily playlist.

    A quick note for PS VR2 owners: unfortunately, VR mode couldn't make it in time for this demo, but we can confirm it will be available at launch on November 11! Thank you for all your passion and excitement for VR, and in this case, for your patience. (And maybe you'll get a glimpse of Arise in VR somewhere sometime before launch after all?)

    We hope you'll check out the Demo, tell us what you think, and get ready for the launch of the full game on November 11.
  • Fresh Tracks review
    www.polygon.com
    Forests, glaciers, and ancient fortresses pass me by as I ski through the mythical realm of Norwyn. Guitars are shredding in the background, the northern lights are in the sky. But I've no time to gaze up, as I'm too busy dodging rocks, jumping over rivers, and slaying enemies with a flaming sword.
  • Optimizing PWAs For Different Display Modes
    smashingmagazine.com
    Progressive web apps (PWA) are a fantastic way to turn web applications into native-like, standalone experiences. They bridge the gap between websites and native apps, but this transformation can be prone to introducing design challenges that require thoughtful consideration.

    We define our PWAs with a manifest file. In our PWA's manifest, we can select from a collection of display modes, each offering different levels of browser interface visibility:

    fullscreen: Hides all browser UI, using the entire display.
    standalone: Looks like a native app, hiding browser controls but keeping system UI.
    minimal-ui: Shows minimal browser UI elements.
    browser: Standard web browser experience with full browser interface.

    Oftentimes, we want our PWAs to feel like apps rather than a website in a browser, so we set the display manifest member to one of the options that hides the browser's interface, such as fullscreen or standalone. This is fantastic for helping make our applications feel more at home, but it can introduce some issues we wouldn't usually consider when building for the web.

    It's easy to forget just how much functionality the browser provides to us. Things like forward/back buttons, the ability to refresh a page, search within pages, or even manipulate, share, or copy a page's URL are all browser-provided features that users can lose access to when the browser's UI is hidden.
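As a quick sketch, a minimal manifest that opts into an app-like experience by setting the display member might look like the following (the name, start_url, and icon values here are illustrative, not from the article):

```json
{
  "name": "Example App",
  "short_name": "Example",
  "start_url": "/",
  "display": "standalone",
  "icons": [
    { "src": "/icon-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```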
    There is also the case of things that we display on websites that don't necessarily translate to app experiences. Imagine a user deep into a form with no back button, trying to share a product page without the ability to copy a URL, or hitting a bug with no refresh button to bail them out!

    Much like how we make different considerations when designing for the web versus designing for print, we need to make considerations when designing for independent experiences rather than browser-based experiences, by tailoring the content and user experience to the medium. Thankfully, we're provided with plenty of ways to customise the web.

    Using Media Queries To Target Display Modes

    We use media queries all the time when writing CSS. Whether it's switching up styles for print or setting breakpoints for responsive design, they're commonplace in the web developer's toolkit. Each of the display modes discussed previously can be used as a media query to alter the appearance of documents depending on the mode in use.

    Media queries such as @media (min-width: 1000px) tend to get the most use for setting breakpoints based on the viewport size, but they're capable of so much more. They can handle print styles, device orientation, contrast preferences, and a whole ton more. In our case, we're interested in the display-mode media feature. Display mode media queries correspond to the current display mode.

    Note: While we may set display modes in our manifest, the actual display mode may differ depending on browser support.

    These media queries directly reference the current mode:

    @media (display-mode: standalone) will only apply to pages set to standalone mode.
    @media (display-mode: fullscreen) applies to fullscreen mode. It is worth noting that this also applies when using the Fullscreen API.
    @media (display-mode: minimal-ui) applies to minimal UI mode.
    @media (display-mode: browser) applies to standard browser mode.

    It is also worth keeping an eye out for the window-controls-overlay and tabbed display modes.
At the time of writing, these two display modes are experimental and can be used with display_override. display_override is a member of our PWA's manifest, like display, but provides some extra options and power. display has a predetermined fallback chain (fullscreen -> standalone -> minimal-ui -> browser) that we can't change, but display_override allows setting a fallback order of our choosing, like the following:

```json
"display_override": ["fullscreen", "minimal-ui"]
```

window-controls-overlay can only apply to PWAs running on a desktop operating system. It makes the PWA take up the entire window, with window control buttons appearing as an overlay. Meanwhile, tabbed is relevant when there are multiple applications within a single window. In addition to these, there is also the picture-in-picture display mode that applies to (you guessed it) picture-in-picture modes.

We use these media queries exactly as we would any other media query. To show an element with the class .pwa-only when the display mode is standalone, we could do this:

```css
.pwa-only {
  display: none;
}

@media (display-mode: standalone) {
  .pwa-only {
    display: block;
  }
}
```

If we wanted to show the element when the display mode is standalone or minimal-ui, we could do this:

```css
@media (display-mode: standalone), (display-mode: minimal-ui) {
  .pwa-only {
    display: block;
  }
}
```

As great as it is, sometimes CSS isn't enough.
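It can help to reason about that fallback behaviour in code. The sketch below is purely illustrative (the resolveDisplayMode helper and the supported set are made up for this example; browsers perform this resolution internally): display_override entries are tried first, in the author's order, before the fixed display chain.

```javascript
// Illustrative sketch of display-mode resolution, not a real browser API.
// display_override entries are tried first; then the standard fallback
// chain is walked starting from the requested display value.
function resolveDisplayMode(manifest, supported) {
  const fallbackChain = ["fullscreen", "standalone", "minimal-ui", "browser"];
  const override = manifest.display_override ?? [];
  for (const mode of override) {
    if (supported.has(mode)) return mode;
  }
  const start = fallbackChain.indexOf(manifest.display ?? "browser");
  for (const mode of fallbackChain.slice(Math.max(start, 0))) {
    if (supported.has(mode)) return mode;
  }
  return "browser";
}

const manifest = {
  display: "standalone",
  display_override: ["window-controls-overlay", "minimal-ui"],
};
// A browser that doesn't support window-controls-overlay skips it and
// lands on the next override entry:
console.log(resolveDisplayMode(manifest, new Set(["standalone", "minimal-ui", "browser"]))); // "minimal-ui"
```

With no display_override at all, the same helper simply walks the standard chain, which matches the fixed fallback order described above.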
In those cases, we can also reference the display mode and make the necessary adjustments with JavaScript:

```js
const isStandalone = window.matchMedia("(display-mode: standalone)").matches;

// Listen for display mode changes
window.matchMedia("(display-mode: standalone)").addEventListener("change", (e) => {
  if (e.matches) {
    // App is now in standalone mode
    console.log("Running as PWA");
  }
});
```

Practical Applications

Now that we know how to make display modifications depending on whether users are using our web app as a PWA or in a browser, we can look at how we might put these newly learnt skills to use.

Tailoring Content For PWA Users

Users who have installed an app as a PWA are already converted, so you can tweak your app to tone down the marketing speak and focus on the user experience. Since these users have demonstrated commitment by installing your app, they likely don't need promotional content or installation prompts.

Display More Options And Features

You might need to directly expose more things in PWA mode, as people won't be able to access the browser's settings as easily when the browser UI is hidden. Features like changing font sizing, switching between light and dark mode, bookmarks, sharing, tabs, etc., might need an in-app alternative.

Platform-Appropriate Features

There are features you might not want on your web app because they feel out of place, but that you might want on your PWA. A good example is the bottom navigation bar, which is common in native mobile apps thanks to the easier reachability it provides, but uncommon on websites. People sometimes print websites, but they very rarely print apps. Consider whether features like print buttons should be hidden in PWA mode.

Install Prompts

A common annoyance is a prompt to install a site as a PWA appearing when the user has already installed the site. Ideally, the browser will provide an install prompt of its own if our PWA is configured correctly, but not all browsers do, and it can be finicky.
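One way to keep that logic tidy is to separate the "should we show our custom prompt?" decision from any DOM work. This is a hedged sketch rather than an established API: shouldShowInstallPrompt is a made-up helper, isStandalone would come from the matchMedia check shown earlier, and hasDeferredPrompt would be set by listening for the Chromium-only beforeinstallprompt event.

```javascript
// Decide whether a custom "Install our app" button should be shown.
// Kept as a pure function so the decision is easy to test in isolation.
function shouldShowInstallPrompt({ isStandalone, hasDeferredPrompt }) {
  // Already running standalone means the app is installed and launched
  // as a PWA, so a prompt would be redundant.
  if (isStandalone) return false;
  // Without a captured beforeinstallprompt event there is nothing for
  // our custom button to trigger.
  if (!hasDeferredPrompt) return false;
  return true;
}

// Illustrative calls; in a real page these flags come from matchMedia
// and a beforeinstallprompt listener:
console.log(shouldShowInstallPrompt({ isStandalone: true, hasDeferredPrompt: true }));  // false
console.log(shouldShowInstallPrompt({ isStandalone: false, hasDeferredPrompt: true })); // true
```

Because the function takes plain flags, the same decision logic works whether the standalone check comes from a media query, a one-off matchMedia call, or a stub in a test.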
MDN has a fantastic guide on creating a custom button to trigger the installation of a PWA, but it might not fit our needs. We can improve things by hiding install prompts with our media query, or by detecting the current display mode with JavaScript and forgoing triggering popups in the first place.

We could even set this up as a reusable utility class so that anything we don't want displayed when the app is installed as a PWA can be hidden with ease:

```css
/* Utility class to hide elements in PWA mode */
.hide-in-pwa {
  display: block;
}

@media (display-mode: standalone), (display-mode: minimal-ui) {
  .hide-in-pwa {
    display: none !important;
  }
}
```

Then in your HTML:

```html
<div class="install-prompt hide-in-pwa">
  <button>Install Our App</button>
</div>

<div class="browser-notice hide-in-pwa">
  <p>For the best experience, install this as an app!</p>
</div>
```

We could also do the opposite and create a utility class to make elements show only when in a PWA, as we discussed earlier.

Strategic Use Of Scope And Start URL

Another way to hide content from your site is to set the scope and start_url properties. These aren't media queries as we've discussed, but they should be considered as ways to present different content depending on whether a site is installed as a PWA. Here is an example of a manifest using these properties:

```json
{
  "name": "Example PWA",
  "scope": "/dashboard/",
  "start_url": "/dashboard/index.html",
  "display": "standalone",
  "icons": [
    {
      "src": "icon.png",
      "sizes": "192x192",
      "type": "image/png"
    }
  ]
}
```

scope here defines the top level of the PWA. When users leave the scope of your PWA, they'll still have an app-like interface but gain access to browser UI elements. This can be useful if you've got certain parts of your app that you still want to be part of the PWA but which aren't necessarily optimised or making the necessary considerations.

start_url defines the URL a user will be presented with when they open the application.
This is useful if, for example, your app has marketing content at example.com and a dashboard at example.com/dashboard/index.html. It is likely that people who have installed the app as a PWA don't need the marketing content, so you can set the start_url to /dashboard/index.html so the app starts on that page when they open the PWA.

Enhanced Transitions

View transitions can feel unfamiliar, out of place, and a tad gaudy on the web, but they are a common feature of native applications. We can set up PWA-only view transitions by wrapping the relevant CSS appropriately:

```css
@media (display-mode: standalone) {
  @view-transition {
    navigation: auto;
  }
}
```

If you're really ambitious, you could also tweak the design of a site entirely to fit more closely with native design systems when running as a PWA by pairing a check for the display mode with a check for the device and/or browser in use as needed.

Browser Support And Testing

Browser support for display-mode media queries is good and extensive. However, it's worth noting that Firefox doesn't have PWA support, and Firefox for Android only displays PWAs in browser mode, so you should make the necessary considerations. Thankfully, progressive enhancement is on our side: if we're dealing with a browser lacking support for PWAs or these media queries, we'll be treated to graceful degradation.

Testing PWAs can be challenging because every device and browser handles them differently; each display mode behaves slightly differently in every browser and OS combination. Unfortunately, I don't have a silver bullet to offer you here. Browsers don't have a convenient way to simulate display modes for testing, so you'll have to test your PWA on different devices, browsers, and operating systems to be sure everything works everywhere it should, as it should.

Recap

Using a PWA is a fundamentally different experience from using a web app in the browser, so considerations should be made.
display-mode media queries provide a powerful way to create truly adaptive Progressive Web Apps that respond intelligently to their installation and display context. By leveraging these queries, we can do the following:

- Hide redundant installation prompts for users who have already installed the app,
- Provide appropriate navigation aids when browser controls are unavailable,
- Tailor content and functionality to match user expectations in different contexts,
- Create more native-feeling experiences that respect platform conventions, and
- Progressively enhance the experience for committed users.

The key is remembering that PWA users in standalone mode have different needs and expectations than standard website visitors. By detecting and responding to display modes, we can create experiences that feel more polished, purposeful, and genuinely app-like.

As PWAs continue to mature, thoughtful implementation and tailoring will become increasingly important for creating truly compelling app experiences on the web. If you're itching for even more PWA tips and tricks, check out Ankita Masand's Extensive Guide To Progressive Web Applications.

Further Reading On SmashingMag

- Creating A Magento PWA: Customizing Themes vs. Coding From Scratch, Alex Husar
- How To Optimize Progressive Web Apps: Going Beyond The Basics, Gert Svaiko
- How To Decide Which PWA Elements Should Stick, Suzanne Scacca
- Uniting Web And Native Apps With 4 Unknown JavaScript APIs, Juan Diego Rodríguez
  • Oh, how exciting! Google has catapulted 180 countries and the latest "Agentic" feature into the world of artificial intelligence, while traditional search slowly but surely lies on its deathbed. Who would have thought that one day we would ask an algorithm "How was your day?" and feel like we get more emotion out of it than from a real conversation?

    Buckle up, because the era of searching is over – now it's time for the machines to tell us what we should think! Or how did that go again? Maybe soon we can digitize our own thoughts and replace humanity entirely!

    #Google #K
    180 countries and "Agentic" features: Google develops "AI Mode" while traditional search is dying!
    arabhardware.net