• GAMINGBOLT.COM
    The Duskbloods' Roles Let You Mark Other Players as Companions or Rivals
    Posted by Ravi Sinha on April 4, 2025

While FromSoftware's Elden Ring Nightreign feels like a novel concept for the studio with its multiplayer, The Duskbloods sees it introducing several new features for the format. Speaking to Nintendo in a new Creators Interview, director Hidetaka Miyazaki revealed how players can further shape their interactions with other players via roles. In online play, roles give players special responsibilities and objectives that often lead to unique interactions and relationships between players based on their corresponding roles. For example, a player with the Destined Rivals role must locate another player who is their rival and defeat them; conversely, the Destined Companion role involves finding a player designated as your companion. Completing these objectives doesn't guarantee winning the match, but they count as personal goals and confer rewards. Note that these names aren't final. Roles add an extra layer of role-playing, since they're assigned by customizing one's blood history and fate. Interestingly, Miyazaki admitted this was akin to a tabletop RPG, even if it wasn't entirely intentional: "It may reflect my own interests a bit. It might seem a little unorthodox at first, but I hope players will give it a try." The Duskbloods is a PvEvP title in which the Bloodsworn battle for the First Blood during humanity's twilight (read: demise). It's out next year for Nintendo Switch 2 and offers a dozen playable characters with unique abilities and weapons.
  • WWW.POLYGON.COM
    Grand Theft Auto 5 crashes into Xbox Game Pass on April 15
    Grand Theft Auto 5 is making its way back to Xbox Game Pass on April 15 after being removed early last year. The long-in-the-tooth crime simulator will be made available to folks with either a Standard or Ultimate subscription. If you're on PC, an active membership will also grant you access to Grand Theft Auto 5's recently released Enhanced edition, which boasts ray tracing features like ambient occlusion and global illumination, support for AMD FSR 1/FSR 3 and NVIDIA DLSS 3, faster loading times, and a whole lot more.

Of course, with Grand Theft Auto 5 also comes access to Grand Theft Auto Online, including all expansions up to Oscar Guzman Flies Again.

Grand Theft Auto 5 first launched on PlayStation 3 and Xbox 360 in 2013, but it continues to show impressive staying power even in the face of the impending Grand Theft Auto 6 launch later this year. And if you're a lapsed player, now feels like the perfect time to reacquaint yourself with one of the best, most expansive games in the series.
  • WCCFTECH.COM
    Inworld AI GDC 2025 Q&A: AAA Games Want to Be Secret, But There's Going to Be Announcements in the Summer
    Inworld AI had a big presence at GDC 2024, where it demonstrated new tech demos of its AI Character Engine in collaboration with gaming giants like Microsoft, Ubisoft, and NVIDIA. One year later, during the recent GDC 2025, its presence was undoubtedly far more understated, with less flashy partnerships to talk about. That doesn't mean there's no development going on behind the scenes, though. During the convention in San Francisco, we caught up with Inworld AI CEO Kylan Gibbs to find out what they've been up to lately.

Let's talk about your company's evolution over the past few years.

We've been around for almost four years now. The first thing we started out with was this character engine, a server-side application connected to your game engine via an SDK. It was largely meant to abstract away a lot of the complexity of AI, mainly for designers and narrative writers. The biggest learning we had was that people wanted much more control, and they wanted the logic and everything to run locally.

A big focus for us product-wise has been shifting away from that server-side application: taking the logic and tools we built our own engine with and turning them into a series of libraries that developers can use directly in the engine to effectively build their own AI engines. That means it's a C++ runtime that can be adapted as needed for other engines.

That has been the transition from character engine to framework. As part of that, we've had a focus on observability and telemetry. One of the challenges is that, with AI, a lot of game developers don't have the transparency they need to understand, when something breaks, what went wrong, and when something works, why it worked. That's our portal tool, which allows developers to access the telemetry built into the framework.
The big thing, though, is that we need to bring not just the logic but, ideally, the models local as well, which is what every game developer wants, so we've had a huge focus on that. What we've built is a tool that allows us to use our cloud to distill down models that can be used locally. Of course, the challenge there is that a lot of consumer hardware is not ready to run everything locally. What we end up building into a lot of these applications is what we call a hybrid inference model: you have the actual model stored locally, but the system detects whether it can run on the hardware and, if not, falls back to a cloud version. For example, if it lands on a PC with a GeForce RTX 5090, you run it locally. If it lands on a Nintendo Switch, you're going to use the cloud.

The other big focus that we have is what we call controlled evolution. The biggest challenge with AI right now, for games and for consumer apps in general, is that if you launch a game today with a given model and keep that model, in six months it will be outdated, because AI is moving too quickly. You need to be able to constantly select from all the third-party models, or our own, that are available, figure out which one is the best at a given time, and then do a bunch of optimization on it based on your user usage.

We try to work with developers so that they do not have to make a $20 million commitment to a specific cloud model provider, but can use whatever the best model is at any given time and optimize it specifically for their use case, because every model is built for these huge general-purpose tasks. We need to do one thing super well, and so we do a lot of work there.

Because the AAA games and the largest studios that we work with at Inworld obviously have very long development cycles, the biggest launches today are largely mobile and browser-based applications. The AAA ones take a little longer.
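The hybrid fallback Gibbs describes can be sketched as a simple decision function. This is a minimal illustration, not Inworld's actual API; the `HardwareProfile` type and the VRAM threshold are invented for the example:

```python
# Hypothetical sketch of hybrid inference routing: run the model on-device
# when the hardware can handle it, otherwise call a cloud-hosted copy.
from dataclasses import dataclass


@dataclass
class HardwareProfile:
    has_gpu: bool
    vram_gb: float


def pick_backend(hw: HardwareProfile, model_vram_gb: float = 8.0) -> str:
    """Return 'local' if the device has a GPU with enough VRAM for the
    locally stored model, else 'cloud' (same model behind an endpoint)."""
    if hw.has_gpu and hw.vram_gb >= model_vram_gb:
        return "local"
    return "cloud"


# A desktop GPU with 32 GB of VRAM runs the model locally;
# a handheld with no discrete GPU falls back to the cloud.
print(pick_backend(HardwareProfile(has_gpu=True, vram_gb=32)))   # local
print(pick_backend(HardwareProfile(has_gpu=False, vram_gb=0)))   # cloud
```

In a real implementation the probe would also consider CPU, memory bandwidth, and quantized model variants, but the shape of the decision is the same.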
The ones that I think are most exciting are Status, for example, which is from a company called Wishroll. It's a game where you roleplay as a character in another universe's Twitter. Crazy idea. But they hit 500K users within 19 days of launch, with an average spend of an hour and a half per user per day, which is crazy traffic, and the whole thing, the achievements and the content, is AI-powered. It's just mind-blowingly creative in terms of what they built.

The other one is Little Umbrella. They have another part of the company called Playroom, which might be familiar. They built Death by AI as their first game and just released another one called The Last Show, which is effectively a Jackbox-style party game powered by AI. Those are super fun because they lean into AI orchestrating multiplayer scenarios in real time.

A few other cool ones include Streamlabs, where we created a streaming assistant in a collaboration between us, Streamlabs, Logitech, and NVIDIA. The game we're using for it is Fortnite. In that case, you have this system living alongside the game in real time: seeing what's happening in the game, understanding the game's state, observing the user comments, hearing what the streamers are saying, and being able to take complex actions. Do I need to overclock the GPU? Do I need to change the camera settings? Do I need to trigger an in-game event? All of those different things can actually happen, and they have to happen with millisecond latency. So, to make it all work performantly, that mix of hybrid and local inference is required.

Speaking of Streamlabs, does it have functionality for a sort of gaming coach, where it can monitor how you're proceeding with the game?

Yes, with Streamlabs, that's basically how it performs. In this case, it's often professional streamers using it, so they really don't need coaching. But if you were a player going into the game, you'd be like, what the heck does this item do, right?
What's the best next thing for me to do? It can do all of that.

The biggest class of use cases that we're seeing, which I call companions and assistants, comes in two varieties: disembodied and embodied companions. Disembodied is your Streamlabs assistant. It's outside the game, able to observe it, but not literally within the game. It's often used for coaching, assistance, questions, and live walkthroughs.

The other is embodied. You would use it for onboarding, which is a huge use case. Instead of having your blocks of text at the start, a character sees what you're doing, gives you suggestions, tells you how to play the game, and gives you comments. It can also be used later on for things like difficulty assistance. Maybe if you're stuck, it can show and tell you how to do this.

There are other use cases like player emulation, especially in multiplayer co-op games and MMOs. You jump in, and you're in hour one. You want to get a feel for the game, but you don't want to die, so how do we make it feel like you're playing with other players, maybe even with speech and everything else? Or maybe you and I are playing a co-op game and you drop off, and then I want a character that comes along and makes it feel like I'm still playing with you. There are a lot of different use cases in that companion and assistant space that are super exciting.

Is the monitoring integrated as an SDK within the program itself, or does it have the functionality to read video inputs, for example?

The logic is integrated into the application itself as much as possible. We actually integrate all the model understanding into it. You can embed local visual models that can understand things in real time. Really, the constraint is what hardware you want to run on. We have a demo that runs fully on an NVIDIA GeForce RTX 5090, an AMD Radeon RX 7900 XTX, or a Tenstorrent QuietBox. In that case, you can run it all locally.
In that case, your application is as old-school as it can be. It just happens to have AI logic and embedded models in there. That's where I think the industry needs to be going. For now, because not everybody has the hardware power, we're still in the situation where some of that needs to fall back to the cloud. But really, the only thing you're ever using in the cloud is a stored model that you have an endpoint to; we try to keep all the logic local, because developers need control over it.

For video monitoring, one example I've seen NVIDIA show off with their gaming co-pilot assistant is monitoring a region of the screen. Say you're playing a MOBA, you're looking at the mini-map, and you want to know when something disappears. How difficult would it be for either an end user or a programmer to set up the variables to have it monitor for something like that?

That's a great question. You can think about two things. The ideal for this is a full-screen visual language model or OCR. With OCR, you're basically taking screen captures. The reason you ideally want a visual language model is that it gives you spatial awareness, but we see two ways to do that.

For a developer, what you'd probably do is set it up so that it's pointed at specific pixels on the screen and builds its understanding from those. What we often push people to consider as well is that sometimes you don't need to understand the visuals at all, because you have the game state. You have code on the backend.

People often miss that. They try to understand the visuals when, actually, the code on the backend is telling them everything they need to know. It's that dance between what I can't capture from the game code and what I need the visuals for.

What's the possibility of using Inworld AI for quality assurance testing?

You could use it. To be honest, our focus is primarily on player-facing AI.
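The "pointed at specific pixels" approach mentioned a little earlier can be sketched very simply: crop a fixed region (say, the mini-map) out of each captured frame and flag when its contents change. This is an illustrative toy, not Inworld's or NVIDIA's tooling; frames are modeled as plain 2D lists of pixel values, and a real version would operate on screen captures:

```python
# Toy sketch of region-of-screen monitoring: watch a fixed rectangle
# (e.g. a MOBA mini-map) and report when its pixels change between frames.

def crop(frame, region):
    """frame: 2D list of pixel values; region: (left, top, width, height).
    Returns the sub-rectangle as a list of row slices."""
    left, top, w, h = region
    return [row[left:left + w] for row in frame[top:top + h]]


def region_changed(prev_frame, cur_frame, region) -> bool:
    """True if anything inside the watched region differs between frames."""
    return crop(prev_frame, region) != crop(cur_frame, region)


# Two tiny 4x4 "frames"; one pixel inside the watched 2x2 region flips.
f1 = [[0] * 4 for _ in range(4)]
f2 = [[0] * 4 for _ in range(4)]
f2[1][1] = 9
print(region_changed(f1, f2, (0, 0, 2, 2)))  # True
print(region_changed(f1, f2, (2, 2, 2, 2)))  # False
```

A production version would add a noise threshold rather than exact comparison, which is also why Gibbs suggests reading the game state directly whenever the backend code already exposes the answer.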
As I engage with more studios, my answer is: you should build that yourself. My reason is that the cost of using these large language models is just going to keep being driven down. QA is a bit different, but for any kind of content-creation productivity tooling, I think studios need to build it in-house and do it themselves.

For QA specifically, we don't build agents for testing or anything else. You could build that yourself using our tech; it's ultimately a more infrastructural piece of technology. We have some groups trying to create player-emulating bots that they can send into the world and use.

We don't build a specific solution for QA testing, but we've often seen the tech used for prototype testing. In this scenario, you might set up a world with a general cityscape and say, I want to see a hundred different varieties of this where the agents respond in slightly different ways. It helps with rapid prototyping, so that you can identify which of those hundred options is the most fun or engaging.

But in terms of core quality assurance or bug fixing, it's something developers could build themselves. My honest response is: use our tech for that if you want, but it's probably a good area to build yourself, because it's going to be a core part of your workflows in the future.

The main reason I ask is that it can monitor the game state, setting up variables where you're looking for, say, unintended interactions.

Right, that's super interesting. As I was mentioning, the telemetry piece that we have is super valuable there. Because it's built into the game code, you can set it up so that you're running telemetry against any part of the game. If you want to detect what types of character responses or NPC interactions tend to result in the player completing the mission, you need to know what kind of AI activity is actually happening there.

So, I guess I would say this: we don't do general QA.
We certainly are really focused on making sure that you can QA the crap out of your AI. For anything that is AI in there, we need to give you all the data, all the metadata, everything you could possibly need to figure out how it actually works.

I think that's essential, because honestly, one of the broken parts of AI today is that it's all a black box. If you're building and iterating on a game and doing playtests, you need to know when it breaks, how it breaks, and how that's all connected. We don't do QA for the broader space of game development, but as people integrate AI, you need to QA the crap out of that. That's where the telemetry piece comes in.

Have you noticed any issues that you're resolving with Inworld AI, such as hallucinations?

Not as much anymore, because we have the ability to distill down these models, train them for specific tasks, and run a lot of filters over them. The hallucinations can be controlled as much as you want, and you can also perform data structure validation.
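The data structure validation Gibbs mentions can be as simple as parsing the model's output and rejecting anything that drifts from a rigid shape. A minimal sketch, with an invented schema and length limit (this is not Inworld's tooling):

```python
# Hypothetical sketch of validating LLM output against a fixed JSON shape.
import json

SCHEMA = {"speaker": str, "line": str}  # required keys and their types
MAX_LINE_LEN = 120                      # cap dialogue length


def validate_reply(raw: str):
    """Return the parsed dict if it matches the schema exactly, else None
    (the caller can then retry generation or fall back to canned dialogue)."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if not isinstance(data, dict) or set(data) != set(SCHEMA):
        return None  # missing or extra keys
    if not all(isinstance(data[k], t) for k, t in SCHEMA.items()):
        return None  # wrong value types
    if len(data["line"]) > MAX_LINE_LEN:
        return None  # over the length constraint
    return data


print(validate_reply('{"speaker": "vendor", "line": "Fresh fish!"}'))
print(validate_reply('{"speaker": "vendor"}'))  # missing key -> None
```

Constrained decoding (forcing the model to emit only schema-conforming tokens) is the stronger version of this idea; post-hoc validation like the above is the simplest fallback.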
So if you're outputting, for example, a JSON format, you can constrain it to specific JSON formats, certain lengths, and certain types of words.

Where hallucination comes in, for example, is with that game I mentioned earlier, Status. They take advantage of it to a degree, because they want characters to come up with crazy ideas but still stay in character.

It depends on how you define hallucination. In some cases, breaking outside of IP norms is one form of hallucination. Another form is coming up with completely made-up stuff that doesn't make sense in the game and breaks the data structures. We focus a lot on the former, because we work with many IP holders who are super sensitive to it. The latter is a pretty solved problem; both are solvable, but one just requires a lot more machine learning depth.

Can you talk about the dynamic crowd tech that you are working on?

Yeah, I love this. One of the big problems I encounter when I engage with studio heads who work on any kind of open-world game is that there are two ways people have tried to build better player experiences. One is that they just make worlds bigger and bigger, thinking a bigger world means more playtime. And I'm like, I can't go on horseback for another 20 minutes. The other is graphical fidelity: they try to consistently increase the graphical fidelity, thinking that if they have a bigger world with higher fidelity, people will like it.

Dynamic crowds are part of the general solution to that, which is how to make the world feel more alive. Crowds are one of those areas that just haven't really evolved in about 10 years. For example, instead of just random people walking back and forth or standing still, as you see in every game, they notice each other.
Someone says something, someone walks up, they start having a conversation, they decide to do something, and they go off. As a player, you might not be able to put your finger on what is more immersive in that case, but it just feels more alive. We do a lot of that kind of stuff in terms of environmental awareness, too. We don't just power characters; we can power any part of the game state. How does the environment adapt to different people? How do you create different parts of quests or event generation?

Maybe, if you have just completed a quest, I want to generate an event. For example: OK, I just saved this cat. I'm walking up to an ice-cream vendor. The cat jumps on the ice-cream vendor, they shoo it away, and someone comes over. It's those little parts of the world coming alive that make it feel more immersive, which is almost like a new form of fidelity that we are pushing now.

Is there a toolset that Inworld AI is building so that developers can integrate a sample of the technology?

That's a great question. As I mentioned, we build these templates with our framework. As soon as we start working with developers, we provide them with a sample to get started and understand how the tech works, and then they build around that. But it's not a black-box component that they plug in. It's basically a chunk of code that they get to go in and change, because how you might want to build crowds is different from how another studio might, not to mention how it interfaces with your assets and the rest of your system. That's why we think in terms of templates rather than components.

Absolutely, especially when you're trying to adapt the same game and engine framework across a broad spectrum of devices, some of which may be intended to be played offline.

Exactly. That's where the story of local hybrid is really important, because most people want to launch their games on multiple devices. How do we create a sense of player parity? We all know there are different graphics, right?
If I play The Witcher 3 on another device, it's a completely different graphical experience. There's also the question of how we give a sense of parity while recognizing the different constraints of the different devices.

Do you think MetaHumans are a technology worth investing in, or is it petering out?

Honestly, I see so many people, especially people who come to us wanting to feel innovative, with this idea that it needs to be hyper-realistic fidelity. Every time you go to MetaHumans, you end up going, oh man, facial animations are really hard, and everything becomes difficult. Also, players get this uncanny valley effect.

Generally, we've seen people migrate away from that direction towards more stylized characters that engage people. For example, Metaphor: ReFantazio has super high-fidelity characters, but it's not MetaHumans. So I feel like there is certainly a transition in the other direction. There are certain interested parties who want to maximize that fidelity to max out GPU capacity, but I personally have consistently seen stylized characters and worlds play out a little better, and it makes development a lot easier. It also lets your game feel more differentiated for players, because otherwise every game feels like it has the same MetaHumans in it. I don't necessarily want to say it's petering out, but there's certainly a recognition that it's not the right solution for most projects.

Tell me a little bit about what Inworld AI is building with Nanobit for Winked.

In that case, they have this interactive-novel type of game. It's a great experience where, every few weeks or months, they release a new pack of episodes.
Those take a while to develop, and what happens is people get very attached to the characters during those experiences, and then go, oh, I'm going to wait for next month to get my favorite character back in the next episode.

In that case, they integrated these characters as a kind of stopgap. Now, every time you finish an episode, you can have a conversation with a character from the world. The work we did there was about making these beloved characters that people are really attached to feel the same as the ones in the human-written stories, so that people can continue experiencing them.

A lot of it was about achieving that dialogue quality without breaking the bank. A lot of that was custom model training to fit the specific character persona.

Lastly, for players who want to experience Inworld AI technology firsthand, can you talk about the next commercial release in which they might see that technology in a game?

I can't mention any of the AAA games, because they all want to be secret, but there's probably going to be some stuff this summer; hopefully, some very large titles will be announced. We will also have another large showcase happening around June, where we'll show off some new case studies and have our own event. So I would look around the summer for some big stuff to happen.

Thank you for your time.
  • WWW.GAMESPOT.COM
    How To Complete Captain Sims Quest in Atomfall
    Captain Grant Sims is the leader of the Protocol military force in Atomfall, and he can be found inside the force's headquarters in Wyndham Village. While Captain Sims is a daunting figure due to how much trouble he's caused in the village and surrounding areas, you can get on his good side with a few good deeds. This effort can result in learning of a new escape route out of the Quarantine Zone, but can Captain Sims and Protocol really be trusted? We'll go over how to complete the entire Captain Sims investigation and the ending it gives you in Atomfall in the full walkthrough below.

Finding and speaking with Captain Sims

First and foremost, you need to speak with Captain Sims to get started on his investigation. As soon as you enter Wyndham Village in Atomfall, you'll be instructed by the Protocol soldiers to go and talk to the captain at the Village Hall location. This is found in the center of town, just north of St. Katherine's Church. Inside the Village Hall, you'll find Captain Sims standing behind his desk at the end of the first room.

Talk to the captain, and he'll initially be wary of you. However, to earn his trust, all you need to do is complete an investigation for him. You can pick between two investigations, and both take place in Wyndham Village. The first is diving into a murder that occurred at St. Katherine's Church. This kicks off the Murder in the Church investigation, which we've already covered in a previous walkthrough. It doesn't take too long, and it has a couple of different endings. The other investigation takes place at the local bakery. The baker's husband has been acting strangely, and the captain wants you to find out what's going on. This is another short investigation.

After you complete just one of the investigations, return to Captain Sims and report your findings. This will make the captain trust you a little more, but now he wants you to look into a larger matter.
Find and confront Dr. Garrow

Captain Sims now wants you to interrogate a known fugitive, Dr. Garrow. Dr. Garrow is a scientist who's responsible for the Quarantine Zone being put up, and she's locked up in Skethermoor Prison. The doctor apparently tried to escape using a strange device, and Captain Sims wants you to find out more about what it is.

We have covered how to find Dr. Garrow in Skethermoor Prison in a previous walkthrough. However, that walkthrough doesn't fully apply if you go through Captain Sims before attempting to find Dr. Garrow. Captain Sims gives you permission to enter the prison, and when this happens, you won't have to fight through all the Protocol soldiers that surround it. You can easily get through the prison, without the use of a Signal Redirector, and find Dr. Garrow. If you try to break out Dr. Garrow before speaking to Captain Sims, it's highly recommended to follow our walkthrough, as that will let you get through with far fewer issues.

Either way you go about it, Dr. Garrow is located on the bottom floor of Skethermoor Prison. Keep heading straight in the prison, and you'll continue down different levels. Eventually, you'll find Dr. Garrow in a holding cell in the center of a large warehouse. Speak with Dr. Garrow to hear her story. You can choose to break her out of the prison if you believe her, but you mainly want to loot the Signal Redirector in a room near Dr. Garrow's cell. Once you have the Signal Redirector, you can choose to leave Dr. Garrow or break her out, which can lead to a different ending in Atomfall. With the Signal Redirector, return to Captain Sims.

Recover the Radio Part in Casterfell Woods

When you return to Captain Sims again, you can show him the Signal Redirector you looted. Now the captain lets you in on his escape plan, but he's still missing one crucial element before the escape can happen.
The Wyndham Village citizens supposedly stole a key radio part from Protocol and bartered it for supplies from the Druids in Casterfell Woods. Your job is to track down and retrieve the radio part. If you have already completed some quests for Mother Jago in Atomfall, it's entirely possible you already have this radio part, in which case you can hand it over to the captain and proceed with the investigation. However, if you need to find the radio part, you can do so by heading to Casterfell Woods via the Sewer Tunnels in Wyndham Village or by backtracking to Slatten Dale and entering the woods through there. Either way, you need to visit the Speaking Cave in Casterfell Woods, which is marked on your map if you're tracking the Captain Sims investigation. You can also find it at coordinates 23.0 E, 85.4 N.

The Speaking Cave is right in the middle of a Druid camp, so you want to avoid them as much as possible. You also need to contend with plenty of Druids inside the cave. However, if you've completed Mother Jago's quests, you might be able to walk in without being harmed. The radio part you're looking for is located at the very far end of the winding paths near the entrance. Keep heading straight until you find a large wooden structure of a deity surrounded by blue plants. Head behind this wooden man and enter a small, blue-lit cave. Here, you'll find the radio part for Captain Sims on the ground near some potatoes. You can also find an audio log of Captain Sims talking about the "Atomfall Project." Return to Captain Sims with the part to proceed with the investigation.

Enter the Oberon Digsite

After returning the radio part to Captain Sims, you've gained his full trust. To proceed with Protocol's escape plan, you need to enter the Windscale Plant in the Interchange, find Oberon, and destroy it. This involves entering the Interchange, powering on all four Data Store Rooms, and turning on the Central Processing Unit.
Once the CPU has been activated from the central room in the Interchange, a new entrance opens up that leads straight into the Windscale Plant. Take the entrance and you'll find yourself in a new, rainy area surrounded by thrall enemies. Keep heading straight through the Windscale Plant, doing your best to avoid the thralls and other enemies, and eventually you'll reach the doors to the Oberon Digsite area. When you first enter the digsite, you're greeted with a view of the Oberon meteor, which is massive and glowing purple in the center of a large cavern.

Destroy Oberon and escape with Protocol

Now that you have entered the digsite, it's time to blow it to smithereens. To do this, head toward the Oberon meteor. Here, you'll find two robots patrolling both sides of the large rock. You might also see four cylinders with the word "Explosives" printed on a large tank at the top of them. You need to visit each of these cylinders and arm the bomb found at its base. There are two bombs each on the left and right sides of Oberon.

Once you've armed all four bombs, go back near the entrance to the digsite. On your left (if you're looking at Oberon from the entrance doors), you'll see a large building. Go toward the building and enter the hallways. Keep proceeding down the hallways, which contain a few haz-mat enemies, and find the digsite's command center. You'll know you're there when you see a ton of command stations and levers. One of the levers, located on the top-left side of the command center, is labeled "Self Destruct Console." If you have armed all four bombs below, pull the lever. You're now given three minutes to escape the Oberon Digsite and the Windscale Plant before everything is destroyed. Make your way back through the Windscale Plant, and then return to Wyndham Village once you're safely in the Interchange. The easiest route to the village is through Data Store Room A.
In the village, go to the Village Hall to find a note left behind by Captain Sims. He states that Druids attacked the village and Protocol retreated to Skethermoor. Go to Skethermoor, following your Captain Sims quest marker, and you'll find the captain ready with a UK-themed chopper right next to the prison. Talk to the captain, board the chopper, and escape the Quarantine Zone. Atomfall ends as soon as you board the chopper.
  • GAMERANT.COM
    The Switch 2 Direct Was a Blessing For Most Nintendo Fans, a Curse For Others
    The long-awaited Nintendo Switch 2 Direct finally delivered a detailed look at the next-gen console, from hardware specs to a packed lineup of upcoming games. But while many fans came away excited, others, particularly fans of Nintendo's cozier franchises, felt a little let down. Chief among the disappointments was the complete absence of Animal Crossing, one of Nintendo's most beloved and successful series. Given that it's been five years since the launch of Animal Crossing: New Horizons, many assumed a new entry might be revealed at the Switch 2 showcase, which would have made perfect sense: New Horizons was a major system-seller for the original Switch and remains a cultural touchstone. Yet Nintendo didn't mention the franchise at all, not even to announce an upgraded version of New Horizons for the new hardware.
  • GAMEDEV.NET
    Middle Unreal Engine developer looking for a job
    Hello! As an Unreal Engine developer, I'm looking for a job. My name is Danila Shalin, and I am a Middle+ Unreal Engine Developer with 3 years of experience. Since childhood, I have been passionate about game development, and I turned that passion into a professional activity, specializing in writing high-quality code, designing architecture, and solving complex problems. In my professional practice, I've been involved in developing and suppo
  • WWW.POLYGON.COM
    The Apple of Her Eye walkthrough for Monster Hunter Wilds
    The Apple of Her Eye is a new side quest in Monster Hunter Wilds where you help Alma find her missing glasses with some help from a Wudwud named Ayejack. When you complete this side mission, you'll be able to customize Alma's look, editing her clothes and accessories. This Monster Hunter Wilds guide provides a step-by-step walkthrough on how to complete The Apple of Her Eye. It will also show you the location of the sleeping Wudwud, Ayejack.

The Apple of Her Eye walkthrough
We couldn't test whether this quest unlocks sooner, but we were able to complete The Apple of Her Eye at Hunter Rank 24 (after rolling credits on the series of main story quests and unlocking High Rank). Before you start, make sure you have downloaded and applied the Title Update 1 patch. (The game suggests restarting your console or platform if your game isn't updated.) Once those requirements are taken care of, you can initiate the side mission by talking to Alma. Here is the full walkthrough on how to complete this quest.

Talk to Alma and she will tell you she lost her special glasses, and that you must investigate by talking to Wudwuds in the Scarlet Forest. Follow the green quest marker and talk to a Wudwud named Scampshroom in area two of the Scarlet Forest. (This region is marked on the map and is on the eastern side of the Scarlet Forest.) It will mention that "red good things" come here.

How to find a glasses-wearing Wudwud in Area 2
Next, the mission tells you to find a glasses-wearing Wudwud in Area 2. If you follow the narrow path where you found Scampshroom east, there is a little rock alcove where an unnamed Wudwud is digging. It can be hard to see the Wudwud at first because it's dark and the creature isn't marked, but you can see where it is below (it was next to an amber deposit for us). Walk up to it and a cutscene will play revealing that it has Alma's glasses, but it runs away.
Go back to the Wudwud hideout, and Thunk (the Wudwud located at the entrance of the Wudwud Hideout) will tell you to visit at night during a time of plenty. Go inside a tent to rest until the festivities. This next step is important: make sure you change the environment to plenty and the time to nighttime when you go to rest. It'll cost 500 Guild Points to do this. You then need to climb some vines to access another part of the Hideout. Talk to Thunk again, and he'll tell you to "aim for up." Walk past Thunk toward the Wudwud named Musharpye, the item trader. Use the exit just beside the item trader, but as you exit, turn around. There will be vines leading up high into the tree; you can see an image of it below. There will be a ton of Wudwuds at the top celebrating by a large fire. Talk to at least four of the Wudwuds with information, and they'll tell you that Ayejack, the Wudwud who found the red sparklies, aka Alma's glasses, is sleeping below.

Sleeping Ayejack location in The Apple of Her Eye
Go back down to the main Hideout area where the item trader and Thunk were. You can find Ayejack sleeping in a little cocoon-like bed made of vines and leaves hanging from the ceiling. He is in a room adjacent to the item trader, and you can see a few images of Ayejack's location below. When you wake Ayejack, the creature will request that you trade it Eastern honey for the glasses. If you don't already have some, you can get Eastern honey from giant vigorwasps in the Scarlet Forest. Once you have the item, Ayejack will give you the glasses back and you will unlock character customization for Alma in your tent. If you're looking for creatures to fight in Monster Hunter Wilds, our monster list provides a full rundown of every encounter, while we have specific guides for Doshaguma, Lala Barina, Nu Udra, Guardian Rathalos, Rey Dau, and Ajarakan.
  • LIFEHACKER.COM
    Trump Just Delayed the TikTok Ban (Again)
    Good news for TikTok fans: the app won't be banned on Saturday. On Friday, President Trump announced on Truth Social his intention to sign an executive order to "keep TikTok up and running for an additional 75 days." That gives the government and TikTok until June 18 to find a buyer, or else kick the can down the road another time. This is the second time Trump has delayed the effects of the so-called TikTok ban, officially known as the Protecting Americans from Foreign Adversary Controlled Applications Act. That act required TikTok's parent company, ByteDance, to find a United States-based buyer or else face a ban in the country entirely. ByteDance did not sell, and, as such, the app went dark in the U.S. shortly before the deadline. TikTok brought its services back online the next day, following assurances from Trump that the company would not face repercussions for doing so. In fact, Trump signed an executive order on the first day of his second term to delay the ban for 75 days. Since then, various companies, including Amazon, Oracle, and even Mr. Beast, have discussed buying TikTok in accordance with the law. But since no company has made an official deal with ByteDance, the company faced an April 5 deadline; without Trump's extension, it would have been banned again. To be clear, it's not evident that Trump even has the legal authority to delay the ban. The act was passed by Congress, signed into law by President Biden, and then affirmed by the Supreme Court. Presidents typically cannot defy the other branches of government and ignore laws they do not like. However, Trump's Justice Department is not enforcing the law, so here we are. In his announcement, Trump said his administration was working with China (where TikTok's parent company ByteDance is located) in good faith, acknowledging the country is not happy with Trump's "reciprocal tariffs."
The Trump administration's unexpectedly extreme tariffs have sent shockwaves through the global economy, triggering reciprocal tariffs from countries around the world, including China. It isn't clear how the current tariff situation will affect TikTok negotiations down the line.
  • WWW.ENGADGET.COM
    Microsoft's latest Copilot updates include a mobile version of the multimodal Vision tool
    Microsoft just announced several updates to its Copilot AI assistant, and some sound downright useful. It's bringing Copilot Vision to mobile, with some new features. For the uninitiated, this software originally launched for the Edge web browser and gave Copilot the ability to see and comment on the contents of websites.

The company is upping its game for the mobile version, adding some multimodal functionality. It'll be able to integrate with your phone's camera to enable an interactive experience with the real world. Microsoft says it can analyze both real-time video from the camera and photos stored on the device. Microsoft gives an example of Copilot Vision analyzing a video of plants to determine whether they are healthy and suggesting actions to take. We'll see if it can actually perform that kind of nuanced reasoning; modern AI companies love to promise the world and then, well, you know the rest. In any event, the mobile version of Vision is available today in the Copilot app for iOS and Android. The web version is also coming to Windows.

Microsoft is also bringing Copilot Search to Bing to "seamlessly blend the best of traditional and generative search together to help you find what you need." The company is now calling Bing your "AI-powered search and answer engine." Like most AI web search tools, this provides summaries to answer queries. Microsoft says these can take the form of a simple paragraph, like Gemini AI for Google searches, but that they can also include images and data from "your favorite publishers and content owners." Copilot Search is rolling out today.

The company also introduced something called Copilot Memory, Microsoft's attempt to bring more personalization to Copilot. After all, it's tough to have a true AI companion when it doesn't remember anything about you.
With this addition, Copilot will be able to remember specific details about your life, like your favorite food, the types of films you enjoy, and your nephew's birthday and interests. The company touts that the software will recommend actions based on what it remembers. To that end, Microsoft says Copilot will be able to do things like buy tickets to events, order flowers, and make dinner reservations. It says the service will work with most websites across the web. We'll see how that works out. The update brings some other tools to the table, like the ability to auto-generate podcasts based on specific topics and offer shopping advice based on sales history across the web. These updates begin rolling out today, but they may not reach every user for a bit. Microsoft says availability will expand in the coming weeks and months. This article originally appeared on Engadget at https://www.engadget.com/ai/microsofts-latest-copilot-updates-include-a-mobile-version-of-the-multimodal-vision-tool-182752162.html?src=rss
  • WWW.TECHRADAR.COM
    Chinese brand's $2,000 Ryzen AI Max+ mini PC set to go on sale, with the first unit personally signed by the CEO of AMD
    The GMKTec EVO-X2 sets new standards for mini PCs, with unmatched AI performance and a powerful Strix Halo APU.