• UNITY.COM
    Unite 2024: Celebrating with our community, a look ahead, and the 16th Unity Awards
    Unite 2024 in Barcelona has wrapped, and what an event it was! Over three action-packed days, more than 2,000 developers, studios, and partners came together to celebrate game development. From the high-energy Keynote to deep-dive breakout sessions anchored by the Unity Engine Roadmap, attendees had the chance to explore new products and features, learn from expert-led sessions, and connect with fellow developers.
    It was fantastic to meet so many of you in person. We’ll keep the momentum going by continuing to engage with you IRL and online – you can stay updated through our Events Hub and Unity Discussions. A huge thank you to everyone who participated and to our sponsors for making this event possible. Let’s keep building amazing games together!
    Our amazing community of Made with Unity creators continues to show all of us what’s possible by bringing their games and experiences to players around the world. This year at Unite, we showcased 20 developers in our U/Game area so attendees could try out their projects. Folks attending the event got early access to upcoming titles including Stampede: Racing Royale (Sumo Digital / Secret Mode), Starship Home (Creature), and Lost Skies (Bossa Studios). The teams behind Silica (Martin “Dram” Melichárek), Worldless (Noname Studios), Void Crew (Hutlihut Games), Synth Riders (Kluge Interactive), Lost Skies (Bossa Studios), Stampede: Racing Royale (Sumo Digital), and Phasmophobia (Kinetic Games) were also on hand to show off their games and chat with other developers. In our mobile showcase area, people were able to pick up and play games such as MONOPOLY GO! (Scopely), Paper Trails (Newfangled Games), and Marvel Contest of Champions (Kabam), as well as interact with some social impact creators (and former Unity for Humanity grant winners) who were showcasing The Light Within (Pomsky Games), KATOA: Oceans (Sankari Studios), Boddle Learning (Boddle), and Amaru (Six Wing Studios).
    To keep the party going for those who couldn’t join us live in Barcelona, we streamed live on Twitch throughout the entire show. We had a great time chatting with the community about new games and building in Unity. We also brought some experts in to share best practices and new developments in multiplayer creation, graphics, lighting, collaboration, web, AI, and more. Last but not least, we had a blast hanging out with you, playing games, and giving away complimentary keys to play later. Watch the replay and let us know where you’d like to learn more. Special thanks to our special guests from SLOW BROS., Scopely, Playable Worlds, Kinetic Studio, and Turbo Makes Games.
    We dove into future updates and releases of the Unity Engine in front of a packed auditorium. The Roadmap was a chance to return to our roots by giving you insight into what’s next for Unity. We introduced the concept of a generational release, which will allow us to predictably deliver new capabilities within a generation that can easily be incorporated into existing projects. We then went into more detail about the next version of the Unity Engine being released within the Unity 6 generation: the Unity 6.1 Update.
    The Unity 6.1 Update will include much-requested features like foldable and large screen support, optimized Deferred+ rendering in GPU Resident Drawer, a new build profile for Meta Quest, and a build target for Facebook Instant Games. We will share more as we get closer to its release in early 2025, and we will continue to support Unity 6 for as long as necessary to better serve our customers.
    While we’re excited about the possibilities Unity 6 opens up for you, we also revealed that we’re hard at work building the NEXT generation of the Unity Engine, working on core principles of simplicity, iteration speed, and power – all aimed at helping all of you make your visions real:
    Simplicity: Reduce the complexity within the Unity Editor and make the default choice the best choice. Whether it’s a single unified renderer, an easy-to-use set of UI tools, or streamlined multiplayer, our first priority is to make workflows simpler for you.
    Iteration speed: Implement your vision with fewer roadblocks and greater efficiency so you can go for those stretch targets that once felt out of reach.
    Power: New tooling to scale your game projects, hit your target framerate, reach your chosen platforms, and build larger, richer worlds for your players.
    We’re still a ways off, but we want to hear from you – what are you most excited about? Let us know by joining in on the Unite 2024 Roadmap Discussion, and be sure to watch the live session replay on YouTube.
    We’re prioritizing creating learning materials that help you maximize your success on Unity, and we’ve got resources galore to help you explore what’s possible with Unity 6. Learn more about how Unity 6 was used in Fantasy Kingdom in Unity 6, Megacity Metro, and Time Ghost. We’ll provide you with samples, demos, and scenes to show you how to get the most out of your project alongside the release of Unity 6 on October 17, 2024. And don’t forget to bookmark your favorite technical guides and e-books as we continue to update them to better reflect the new capabilities in Unity 6.
    Outside of the main event, we also had dedicated programming for customers and users across Appfest, the Industry Executive Summit, and the Education and Industry Mixer. These events let us connect with you based on what interests you most. A few highlights include:
    Top mobile game and app leaders gathered for Appfest, a two-day event, to network, share knowledge and best practices, and help advance the industry as a whole.
    The Industry Executive Summit brought together leaders from companies like Deutsche Bahn, Cincinnati Children’s Hospital, Icon Group, BMW, and Capgemini to share compelling stories of how they’re leveraging Unity’s platform to solve real-world challenges. Above all, the summit demonstrated how Unity’s evolving platform is meeting diverse industry needs to drive innovation and growth across sectors.
    We partnered with the local university, Universitat de Barcelona, to host the Education and Industry Mixer Meetup, where nearly 100 local students mingled with game developers to have fun, play games, and learn about careers in gaming.
    Your voice matters! Following the reveal of our 16th Unity Awards nominees, head over to the official Unity Awards voting page to support your favorite games, Asset Store publishers, and community creators. Each vote helps us shine a spotlight on the incredible talent and hard work within the Unity community. Voting closes on October 4 at 11 pm CET, so don’t miss your chance to vote for your favorites.
    Another reason this year’s awards ceremony is extra special is that it’s the first time the Unity Awards showcase will feature a livestream event. We’re thrilled to not only celebrate the achievements of our community but also share some news and updates from upcoming Unity games. Join us on October 23, 2024, at 7 pm CET for an event dedicated to the creators who have made some of your favorite games of this past year (and years to come).
    An amazing event like this would not be possible without our incredible customers, community, partners, and sponsors. Follow all the latest Unity happenings on Unity Discussions or on Discord, X, Facebook, LinkedIn, Instagram, YouTube, and Twitch. On-demand session recordings from Unite 2024 will be available soon on YouTube.
  • TECHCRUNCH.COM
    Google’s AI search numbers are growing, and that’s by design
    Google started testing AI Overviews, its AI-summarized results in Google Search, two years ago, and it continues to expand the feature to new regions and languages. By the company’s estimation, it’s been a big success: AI Overviews is now used by more than 1.5 billion users monthly across over 100 countries.
    AI Overviews compiles results from around the web to answer certain questions. When you search for something like “What is generative AI?” AI Overviews will show AI-generated text at the top of the Google Search results page. While the feature has dampened traffic to some publishers, Google sees it and other AI-powered search capabilities as potentially meaningful revenue drivers and ways to boost engagement on Search. Last October, the company launched ads in AI Overviews. More recently, it started testing AI Mode, which lets users ask complex questions and follow-ups in the flow of Google Search. The latter is Google’s attempt to take on chat-based search interfaces like ChatGPT search and Perplexity.
    During its Q1 2025 earnings call on Thursday, Google highlighted the growth of its other AI-based search products as well, including Circle to Search. Circle to Search, which lets you highlight something on your smartphone’s screen and ask questions about it, is now available on more than 250 million devices, Google said — up from around 200 million devices as of late last year. Circle to Search usage rose close to 40% quarter-over-quarter, according to the company.
    Google also noted in its call that visual searches on its platforms are growing at a steady clip. According to CEO Sundar Pichai, searches through Google Lens, Google’s multimodal AI-powered search technology, have increased by 5 billion since October. The number of people shopping on Lens was up over 10% in Q1, meanwhile.
    The growth comes amid intense regulatory scrutiny of Google’s search practices. The U.S. Department of Justice has been pressuring Google to spin off Chrome after the court found that the tech giant had an illegal online search monopoly. A federal judge has also ruled that Google has an adtech monopoly, opening the door to a potential breakup.
  • VENTUREBEAT.COM
    Subway Surfers and Crossy Road’s crossover brings together mobile classics
    Subway Surfers and Crossy Road recently crossed over, bringing together two of the longest-running titles in mobile gaming.
  • VENTUREBEAT.COM
    The new AI calculus: Google’s 80% cost edge vs. OpenAI’s ecosystem
    Explore the Google vs. OpenAI AI ecosystem battle post-o3: a deep dive into Google's huge cost advantage (TPU vs. GPU), agent strategies, and model risks for the enterprise.
  • WWW.THEVERGE.COM
    Gmail gets a slider on Android tablets, AI on the side
    Google is rolling out Gmail updates for mobile users across Android and iOS, with some design updates and new access to AI features. Android tablet and foldable owners will have a more flexible Gmail app interface that lets them drag the divider to adjust the list and conversation panes to whatever size they want in landscape view. You can also drag the divider all the way to one side to switch to a single pane view if preferred. The update is rolling out now for Workspace and personal accounts. Google is also pushing a Material Design 3 update to Gmail on iOS that puts it in parity with the Android and redesigned web versions, including the pill-shaped buttons on the bottom and a rounded search bar on top. The update is rolling out now to both Workspace and personal accounts. Additionally, Google Calendar on iOS will now let you create and modify birthday events like you already can on Android. Finally, Gemini’s image generator is coming to the Gmail app sidebar on both Android and iOS for Workspace users. Like in Google’s Workspace apps on the web, you can generate images within the Gmail app and then save them, copy them, or insert them directly into your email draft.
  • TOWARDSDATASCIENCE.COM
    Behind the Magic: How Tensors Drive Transformers
    Introduction
    Transformers have changed the way artificial intelligence works, especially in understanding language and learning from data. At the core of these models are tensors, a generalization of mathematical matrices that the model uses to process information. As data moves through the different parts of a Transformer, these tensors undergo different transformations that help the model make sense of things like sentences or images. Learning how tensors work inside Transformers can help you understand how today’s smartest AI systems actually work and think.
    What This Article Covers and What It Doesn’t
    This article IS about:
    - The flow of tensors from input to output within a Transformer model.
    - Ensuring dimensional coherence throughout the computational process.
    - The step-by-step transformations that tensors undergo in various Transformer layers.
    This article IS NOT about:
    - A general introduction to Transformers or deep learning.
    - Detailed architecture of Transformer models.
    - Training process or hyper-parameter tuning of Transformers.
    How Tensors Act Within Transformers
    A Transformer consists of two main components:
    - Encoder: Processes input data, capturing contextual relationships to create meaningful representations.
    - Decoder: Utilizes these representations to generate coherent output, predicting each element sequentially.
    Tensors are the fundamental data structures that pass through these components, undergoing multiple transformations that ensure dimensional coherence and proper information flow.
    [Image from the research paper: standard Transformer architecture]
    Input Embedding Layer
    Before entering the Transformer, raw input tokens (words, subwords, or characters) are converted into dense vector representations through the embedding layer. This layer functions as a lookup table that maps each token to a dense vector, capturing semantic relationships with other words.
    [Image by author: tensors passing through the embedding layer]
    For a batch of five sentences, each with a sequence length of 12 tokens and an embedding dimension of 768, the tensor shape is [batch_size, seq_len, embedding_dim] → [5, 12, 768]. After embedding, positional encoding is added, ensuring that order information is preserved without altering the tensor shape.
    [Modified image from the research paper: where this step sits in the workflow]
    Multi-Head Attention Mechanism
    One of the most critical components of the Transformer is the Multi-Head Attention (MHA) mechanism. It operates on three matrices derived from the input embeddings:
    - Query (Q)
    - Key (K)
    - Value (V)
    These matrices are generated using learnable weight matrices Wq, Wk, Wv of shape [embedding_dim, d_model] (e.g., [768, 512]). The resulting Q, K, V matrices have dimensions [batch_size, seq_len, d_model].
    [Image by author: table showing the shapes of the embedding, Q, K, and V tensors]
    Splitting Q, K, V into Multiple Heads
    For effective parallelization and improved learning, MHA splits Q, K, and V into multiple heads. Suppose we have 8 attention heads: each head operates on a subspace of size d_model / head_count.
    [Image by author: multi-head attention]
    The reshaped tensor dimensions are [batch_size, seq_len, head_count, d_model / head_count]. For example, [5, 12, 8, 64] is rearranged to [5, 8, 12, 64] so that each head receives a separate sequence slice.
    [Image by author: reshaping the tensors]
    So each head gets its own share of Qi, Ki, and Vi.
    [Image by author: each Qi, Ki, Vi sent to a different head]
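    To make the projection and head split concrete, here is a minimal NumPy sketch (not code from the article) using the example dimensions above; the random inputs, weight initialization, and variable names are illustrative assumptions.
```python
# Hypothetical illustration of the Q/K/V projections and the head split.
# Shapes follow the article's running example: batch_size=5, seq_len=12,
# embedding_dim=768, d_model=512, 8 heads of 64 dimensions each.
import numpy as np

batch_size, seq_len, embedding_dim = 5, 12, 768
d_model, n_heads = 512, 8
head_dim = d_model // n_heads  # 64

rng = np.random.default_rng(0)
x = rng.normal(size=(batch_size, seq_len, embedding_dim))  # embeddings + positional encoding

# Learnable projection matrices Wq, Wk, Wv of shape [embedding_dim, d_model]
Wq, Wk, Wv = (rng.normal(size=(embedding_dim, d_model)) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv  # each [5, 12, 512]

def split_heads(t):
    # [5, 12, 512] -> [5, 12, 8, 64] -> [5, 8, 12, 64]
    return t.reshape(batch_size, seq_len, n_heads, head_dim).transpose(0, 2, 1, 3)

Qh, Kh, Vh = map(split_heads, (Q, K, V))
print(Qh.shape)  # (5, 8, 12, 64): each head sees its own 64-dimensional slice of every token
```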
    Attention Calculation
    Each head computes attention using the scaled dot-product formula: Attention(Qi, Ki, Vi) = softmax(Qi·Kiᵀ / √d_k)·Vi. Once attention is computed for all heads, the outputs are concatenated and passed through a linear transformation, restoring the initial tensor shape.
    [Image by author: concatenating the output of all heads]
    [Modified image from the research paper: where this step sits in the workflow]
    Residual Connection and Normalization
    After the multi-head attention mechanism, a residual connection is added, followed by layer normalization:
    - Residual connection: Output = Embedding Tensor + Multi-Head Attention Output
    - Normalization: (Output − μ) / σ to stabilize training
    The tensor shape remains [batch_size, seq_len, embedding_dim].
    [Image by author: residual connection]
    Feed-Forward Network (FFN)
    The position-wise feed-forward network is applied to each token independently; its inner layers expand the representation and then project it back, so the tensor shape stays [batch_size, seq_len, embedding_dim].
    Masked Multi-Head Attention in the Decoder
    In the decoder, Masked Multi-Head Attention ensures that each token attends only to previous tokens, preventing leakage of future information.
    [Modified image from the research paper: masked multi-head attention]
    This is achieved using a lower triangular mask of shape [seq_len, seq_len] with -inf values in the upper triangle. Applying this mask ensures that the Softmax function nullifies future positions (see the sketch at the end of this article).
    [Image by author: mask matrix]
    Cross-Attention in Decoding
    Since the decoder does not fully understand the input sentence, it utilizes cross-attention to refine predictions. Here:
    - The decoder generates queries (Qd) from its input ([batch_size, target_seq_len, embedding_dim]).
    - The encoder output serves as keys (Ke) and values (Ve).
    - The decoder computes attention between Qd and Ke, extracting relevant context from the encoder’s output.
    [Modified image from the research paper: cross-attention]
    Conclusion
    Transformers use tensors to help them learn and make smart decisions. As the data moves through the network, these tensors go through different steps—like being turned into numbers the model can understand (embedding), focusing on important parts (attention), staying balanced (normalization), and being passed through layers that learn patterns (feed-forward). These changes keep the data in the right shape the whole time. By understanding how tensors move and change, we can get a better idea of how AI models work and how they can understand and create human-like language.
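    To tie the decoder-side steps together, here is a minimal NumPy sketch (again illustrative, not from the article) that builds the causal mask, applies scaled dot-product attention per head, concatenates the heads, and finishes with the residual connection and layer normalization; the helper names and random tensors are assumptions for the example.
```python
# Hypothetical end-to-end illustration of masked attention, head concatenation,
# and residual + layer normalization, using the article's example dimensions.
import numpy as np

batch_size, seq_len, embedding_dim = 5, 12, 768
d_model, n_heads = 512, 8
head_dim = d_model // n_heads

rng = np.random.default_rng(0)
x = rng.normal(size=(batch_size, seq_len, embedding_dim))  # decoder input embeddings
Qh, Kh, Vh = (rng.normal(size=(batch_size, n_heads, seq_len, head_dim)) for _ in range(3))

# Lower-triangular mask: 0 where attention is allowed, -inf above the diagonal
mask = np.triu(np.full((seq_len, seq_len), -np.inf), k=1)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Scaled dot-product attention per head: softmax(Q K^T / sqrt(d_k) + mask) V
scores = Qh @ Kh.transpose(0, 1, 3, 2) / np.sqrt(head_dim) + mask  # [5, 8, 12, 12]
heads_out = softmax(scores) @ Vh                                   # [5, 8, 12, 64]

# Concatenate heads and project back to the embedding dimension
concat = heads_out.transpose(0, 2, 1, 3).reshape(batch_size, seq_len, d_model)
Wo = rng.normal(size=(d_model, embedding_dim))                     # output projection
attn_out = concat @ Wo                                             # [5, 12, 768]

# Residual connection followed by layer normalization; the shape is unchanged
res = x + attn_out
out = (res - res.mean(axis=-1, keepdims=True)) / res.std(axis=-1, keepdims=True)
print(out.shape)  # (5, 12, 768)
```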
  • WWW.USINE-DIGITALE.FR
    Uber wants to deploy autonomous Volkswagen taxis in the United States as early as 2026
    Rides in an autonomous minivan could soon be available in the US version of the Uber app. The ride-hailing company and...
  • WWW.GAMESPOT.COM
    You Won't Find Switch 2 Consoles On Amazon, But You Can Get Games & Gear
    When Nintendo Switch 2 preorders opened on April 24, some interested buyers undoubtedly found themselves searching Amazon, wondering why in the world the biggest online retailer in the US seemed to forget about a new Nintendo console. But the Switch 2's absence from Amazon was an expected continuation of a story that has been unfolding over the past year. Amazon simply doesn't carry first-party Nintendo products these days, which means you'll only find third-party games and accessories for the upcoming console on Amazon's storefront.
    No Switch 2 preorders at Amazon? No surprise there.
    If you regularly buy first-party Nintendo Switch games, the Switch 2's absence on Amazon's US storefront probably wasn't too surprising. To be clear, this article is specifically about Amazon in the US. Amazon UK, for instance, still carries Nintendo products, including the Switch 2.
    Last spring, Amazon "sold out" of Paper Mario: The Thousand-Year Door preorders. But when the retailer never restocked the remastered version of the GameCube classic, it became clear something else was happening. From that point forward, Nintendo's 2024 and 2025 Switch games weren't listed on Amazon. The list of absences included numerous notable releases: Luigi's Mansion 2 HD, Zelda: Echoes of Wisdom, Mario & Luigi: Brothership, Nintendo World Championships: NES Edition, Super Mario Party Jamboree, Donkey Kong Country Returns HD, and Xenoblade Chronicles X: Definitive Edition. Amazon also stopped restocking older Nintendo Switch exclusives that were previously available. You also can't buy digital games, eShop gift cards, or Switch Online subscriptions from Amazon.
  • GAMERANT.COM
    One Piece: Most Clever Strategies Used by Luffy, Ranked
    The protagonist of One Piece, Monkey D. Luffy, is a man who acts first and thinks later. He is the type of person who trusts his instincts and dives headfirst into danger. He has a clear vision of what he wants to achieve, and he surrounds himself with people who can help him make those visions come true.
  • WWW.POLYGON.COM
    How to persuade in Oblivion Remastered
    There is a persuasion system in The Elder Scrolls 4: Oblivion Remastered that allows you to charm NPCs into giving you certain bonuses. If you successfully persuade an NPC, they can give you valuable information or discounts while bartering. It’s an important part of the game if you want to get intel and lower the price of items. This guide will explain how to persuade in Oblivion Remastered, alongside some tips on how to pass persuasion checks.
    How to persuade in Oblivion Remastered
    To persuade an NPC, first talk to them, then select the “Persuading” button on the bottom left-hand corner of the screen. This will bring you to a screen with a circular menu displaying the NPC’s disposition number, a value that determines how much that NPC likes you. The higher their disposition number is, the more that character likes you. Some characters have a minimum disposition threshold that you must exceed in order to reap certain benefits, like additional dialogue or discounts.
    One way to increase another character’s disposition toward you is by bribing them. If you bribe a character, you can increase their disposition toward you without playing the finicky minigame, but it also costs money, so keep that in mind as you go along.
    When you start the persuading minigame, you’ll see a circle with four options — Boast, Admire, Joke, and Coerce — and a number in the middle. You need to figure out which of these actions the character likes and dislikes. If a character likes one of these options, their disposition will increase. If they dislike one of these options, their disposition will decrease by a certain number (as indicated by the number highlighted in the graphic below). As you play, the number in the middle (the character’s disposition) will gradually decrease over time, so there is a bit of pressure.
    Each time you play the minigame, you must select each of the four options at least once. This means you will inevitably say something your target doesn’t like, decreasing their disposition. The goal of the game is to maximize what they do like and minimize what they don’t like, so you end with a net increase in disposition.
    Oblivion Remastered persuasion tips
    You’ll know what a character likes because their disposition will increase if they like a certain action and decrease if they don’t. You can also get an idea of what a character likes or dislikes by looking at their face when you hover over an option. If a character doesn’t like it when you admire them, they’ll have a frowny face, for instance. (The change in face can be subtle sometimes, so don’t worry if you make a mistake as you get to know an NPC.)
    As you play, you will see that each conversation option has rings lining it. The number of rings indicates the “strength” of your words, so to speak. So if an NPC likes it when you boast and you have four rings next to that option, they’ll be really happy in response to a boast. If an NPC dislikes it when you admire them and there are four rings next to that option, then selecting that option will decrease their disposition by a lot. Each time you select an option, the layered rings rotate clockwise, which can help you predict when it might be optimal to select certain actions.
    It’s complicated on paper, but you can keep it simple as you’re starting out. Do your best to have three to four rings on options an NPC really loves, and one to two when they really dislike a certain way of talking. NPCs can be pretty generous and you can play as much as you like, so pick someone you don’t care about to get some practice.