• GaLiTe and AGaLiTe: Efficient Transformer Alternatives for Partially Observable Online Reinforcement Learning
    www.marktechpost.com
In real-world settings, agents often face limited visibility of the environment, complicating decision-making. For instance, a car-driving agent must recall road signs from moments earlier to adjust its speed, yet storing all observations is unscalable due to memory limits. Instead, agents must learn compressed representations of observations. This challenge is compounded in ongoing tasks, where essential past information cannot always be retained efficiently. Incremental state construction is key in partially observable online reinforcement learning (RL), where recurrent neural networks (RNNs) like LSTMs handle sequences effectively, though they're tough to train. Transformers capture long-term dependencies but come with higher computational costs.

Various approaches have extended linear transformers to address their limitations in handling sequential data. One architecture uses a scalar gating method to accumulate values over time, while others add recurrence and non-linear updates to enhance learning from sequential dependencies, although this can reduce parallelization efficiency. Additionally, some models selectively calculate sparse attention or cache previous activations, allowing them to attend to longer sequences without significant memory cost. Other recent innovations reduce the complexity of self-attention, improving transformers' ability to process long contexts efficiently. Though transformers are commonly used in offline reinforcement learning, their application in model-free settings is still emerging.

Researchers from the University of Alberta and Amii developed two new transformer architectures tailored for partially observable online reinforcement learning, addressing the high inference costs and memory demands typical of traditional transformers. Their proposed models, GaLiTe and AGaLiTe, implement a gated self-attention mechanism to manage and update information efficiently, providing a context-independent inference cost and improved performance on long-range dependencies. Testing in 2D and 3D environments, like T-Maze and Craftax, showed these models outperformed or matched the state-of-the-art GTrXL, reducing memory and computation by over 40%, with AGaLiTe achieving up to 37% better performance on complex tasks.

The Gated Linear Transformer (GaLiTe) enhances linear transformers by addressing key limitations, particularly the lack of mechanisms to remove outdated information and the reliance on the choice of kernel feature map. GaLiTe introduces a gating mechanism to control information flow, allowing selective memory retention, and a parameterized feature map to compute key and query vectors without needing specific kernel functions. For further efficiency, the Approximate Gated Linear Transformer (AGaLiTe) uses a low-rank approximation to reduce memory demands, storing recurrent states as vectors rather than matrices. This approach achieves significant space and time savings compared to other architectures, especially in complex reinforcement learning tasks.
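To make the general idea concrete, here is a rough, hypothetical sketch in PyTorch of the kind of gated linear-attention recurrence described above. It is not the paper's actual parameterization: the feature map, gate form, and variable names are assumptions, and AGaLiTe additionally replaces the matrix-valued state with a low-rank approximation.

import torch
import torch.nn.functional as F

def gated_linear_attention_step(S, z, x, Wq, Wk, Wv, Wg):
    # S: (d_k, d_v) recurrent state matrix, z: (d_k,) normalizer, x: (d_model,) current input
    q = F.elu(x @ Wq) + 1            # parameterized feature map (assumed; GaLiTe learns its own)
    k = F.elu(x @ Wk) + 1
    v = x @ Wv
    g = torch.sigmoid(x @ Wg)        # gate in [0, 1] controls how much old information is kept
    S = g.unsqueeze(1) * S + torch.outer(k, v)   # decay stale content, write new key-value outer product
    z = g * z + k
    out = (q @ S) / (q @ z + 1e-6)   # attention output; per-step cost is independent of context length
    return S, z, out

The point of a recurrence like this is that the per-step cost depends only on the size of the state, not on how many past observations the agent has seen, which is what gives the architectures their context-independent inference cost.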
The study evaluates the proposed AGaLiTe model across several partially observable RL tasks. In these environments, agents require memory to handle different levels of partial observability, such as recalling single cues in T-Maze, integrating information over time in CartPole, or navigating through complex environments like Mystery Path, Craftax, and Memory Maze. AGaLiTe, equipped with a streamlined self-attention mechanism, achieves high performance, surpassing traditional models like GTrXL and GRU in effectiveness and computational efficiency. The results indicate that AGaLiTe's design significantly reduces operations and memory usage, offering advantages for RL tasks with extensive context requirements.

In conclusion, transformers are highly effective for sequential data processing but face limitations in online reinforcement learning due to high computational demands and the need to maintain all historical data for self-attention. This study introduces two efficient alternatives to transformer self-attention, GaLiTe and AGaLiTe, which are recurrence-based and designed for partially observable RL tasks. Both models perform competitively with or better than GTrXL, with over 40% lower inference costs and over 50% reduced memory usage. Future research may improve AGaLiTe with real-time learning updates and applications in model-based RL approaches like Dreamer V3.

Check out the Paper. All credit for this research goes to the researchers of this project.

Sana Hassan, a consulting intern at Marktechpost and dual-degree student at IIT Madras, is passionate about applying technology and AI to address real-world challenges. With a keen interest in solving practical problems, he brings a fresh perspective to the intersection of AI and real-life solutions.
  • Build the Smallest LLM From Scratch With PyTorch (And Generate Pokémon Names!)
    towardsai.net
Author(s): Tapan Babbar. Originally published on Towards AI.

So, there I was, toying with a bunch of Pokémon-inspired variations of my cat's name, trying to give it that unique, slightly mystical vibe. After cycling through names like Flarefluff and Nimblepawchu, it hit me: why not go full-on AI and let a character-level language model handle this? It seemed like the perfect mini-project, and what better way to dive into character-level models than creating a custom Pokémon name generator?

Beneath the vast complexity of large language models (LLMs) and generative AI lies a surprisingly simple core idea: predicting the next character. That's really it! Every incredible model, from conversational bots to creative writers, boils down to how well they anticipate what comes next. The magic of LLMs? It's in how they refine and scale this predictive ability. So, let's strip away the hype and get to the essence.

We're not building a massive model with millions of parameters in this guide. Instead, we're creating a character-level language model that can generate Pokémon-style names. Here's the twist: our dataset is tiny, with only 801 Pokémon names! By the end, you'll understand the basics of language modeling and have your own mini Pokémon name generator in hand.

Here's how each step is structured to help you follow along:
Goal: A quick overview of what we're aiming to achieve.
Intuition: The underlying idea, no coding required here.
Code: Step-by-step PyTorch implementation.
Code Explanation: Breaking down the code so it's clear what's happening.

If you're just here for the concepts, skip the code; you'll still get the big picture. No coding experience is necessary to understand the ideas. But if you're up for it, diving into the code will help solidify your understanding, so I encourage you to give it a go!

The Intuition: From Characters to Names
Imagine guessing a word letter by letter, where each letter gives you a clue about what's likely next. You see "Pi", and your mind jumps to "Pikachu" because "ka" often follows "Pi" in the Pokémon world. This is the intuition we'll teach our model, feeding it Pokémon names one character at a time. Over time, the model catches on to this naming style's quirks, helping it generate fresh names that sound Pokémon-like.

Ready? Let's build this from scratch in PyTorch!

Step 1: Teaching the Model Its First Alphabet
Goal: Define the alphabet of characters the model can use and assign each character a unique number.
Intuition: Right now, our model doesn't know anything about language, names, or even letters. To it, words are just a sequence of unknown symbols. And here's the thing: neural networks understand only numbers; it's non-negotiable! So, to make sense of our dataset, we need to assign a unique number to each character.
In this step, we're building the model's alphabet by identifying every unique character in the Pokémon names dataset. This will include all the letters, plus a special marker to signify the end of a name. Each character will be paired with a unique identifier, a number that lets the model understand each symbol in its own way.
This gives our model the basic building blocks for creating Pokémon names and helps it begin learning which characters tend to follow one another. With these numeric IDs in place, we're setting the foundation for our model to start grasping the sequences of characters in Pokémon names, all from the ground up!

import pandas as pd
import torch
import string
import numpy as np
import re
import torch.nn.functional as F
import matplotlib.pyplot as plt

data = pd.read_csv('pokemon.csv')["name"]
words = data.to_list()
print(words[:8])
# ['bulbasaur', 'ivysaur', 'venusaur', 'charmander', 'charmeleon', 'charizard', 'squirtle', 'wartortle']

# Build the vocabulary
chars = sorted(list(set(' '.join(words))))
stoi = {s: i + 1 for i, s in enumerate(chars)}
stoi['.'] = 0  # Dot represents the end of a word
itos = {i: s for s, i in stoi.items()}
print(stoi)
# {' ': 1, 'a': 2, 'b': 3, 'c': 4, 'd': 5, 'e': 6, 'f': 7, 'g': 8, 'h': 9, 'i': 10, 'j': 11, 'k': 12, 'l': 13, 'm': 14, 'n': 15, 'o': 16, 'p': 17, 'q': 18, 'r': 19, 's': 20, 't': 21, 'u': 22, 'v': 23, 'w': 24, 'x': 25, 'y': 26, 'z': 27, '.': 0}
print(itos)
# {1: ' ', 2: 'a', 3: 'b', 4: 'c', 5: 'd', 6: 'e', 7: 'f', 8: 'g', 9: 'h', 10: 'i', 11: 'j', 12: 'k', 13: 'l', 14: 'm', 15: 'n', 16: 'o', 17: 'p', 18: 'q', 19: 'r', 20: 's', 21: 't', 22: 'u', 23: 'v', 24: 'w', 25: 'x', 26: 'y', 27: 'z', 0: '.'}

Code Explanation:
We create stoi, which maps each character to a unique integer.
The itos dictionary reverses this mapping, allowing us to convert numbers back into characters.
We include a special end-of-word character (.) to indicate the end of each Pokémon name.

Step 2: Building Context with N-grams
Goal: Enable the model to guess the next character based on the context of preceding characters.
Intuition: Here, we're teaching the model by building a game: guess the next letter! The model will try to predict what comes next for each character in a name. For example, when it sees "Pi", it might guess "k" next, as in "Pikachu". We'll turn each name into sequences where each character points to its next one. Over time, the model will start spotting familiar patterns that define the style of Pokémon names. We'll also add a special end-of-name character after each name to let the model know when it's time to wrap up.

(Figure: Character N-grams.)

This example shows how we use a fixed context length of 3 to predict each next character in a sequence. As the model reads each character in a word, it remembers only the last three characters as context to make its next prediction. This sliding-window approach helps capture short-term dependencies, but feel free to experiment with shorter or longer context lengths to see how it affects the predictions.

block_size = 3  # Context length

def build_dataset(words):
    X, Y = [], []
    for w in words:
        context = [0] * block_size  # start with a blank context
        for ch in w + '.':
            ix = stoi[ch]
            X.append(context)
            Y.append(ix)
            context = context[1:] + [ix]  # Shift and append new character
    return torch.tensor(X), torch.tensor(Y)

X, Y = build_dataset(words[:int(0.8 * len(words))])
print(X.shape, Y.shape)  # Check shapes of training data

Code Explanation:
Set Context Length: block_size = 3 defines the context length, or the number of preceding characters used to predict the next one.
Create build_dataset Function: This function prepares X (context sequences) and Y (next character indices) from a list of words.
Initialize and Update Context: Each word starts with a blank context [0, 0, 0].
As characters are processed, the context shifts forward to maintain the 3-character length.
Store Input-Output Pairs: Each context (in X) is paired with the next character (in Y), building a dataset for model training.
Convert and Check Data: Converts X and Y to tensors, preparing them for training, and checks their dimensions. This dataset now captures patterns in character sequences for generating new names.

Step 3: Building the Neural Network
Goal: Train the model by predicting each next character and adjusting weights based on prediction accuracy.
Intuition: Here's where it gets interesting! We'll create a simple setup with three layers that work together to predict the next letter based on the previous three. Again, think of it like guessing letters in a word game: each time the model gets it wrong, it learns from the mistake and adjusts, improving with each try. As it practices on real Pokémon names, it gradually picks up the style and patterns that make these names unique. Eventually, after going over the list enough times, it can come up with new names that have that same Pokémon vibe!

# Initialize parameters
g = torch.Generator()
C = torch.randn((27, 10), generator=g)
W1 = torch.randn((30, 200), generator=g)
b1 = torch.randn(200, generator=g)
W2 = torch.randn((200, 27), generator=g)
b2 = torch.randn(27, generator=g)
parameters = [C, W1, b1, W2, b2]
for p in parameters:
    p.requires_grad = True

for i in range(100000):
    ix = torch.randint(0, X.shape[0], (32,))
    emb = C[X[ix]]
    h = torch.tanh(emb.view(-1, 30) @ W1 + b1)
    logits = h @ W2 + b2
    loss = F.cross_entropy(logits, Y[ix])
    for p in parameters:
        p.grad = None
    loss.backward()
    for p in parameters:
        p.data -= 0.1 * p.grad

Code Explanation:
We initialize weights and biases for the embedding layer (C) and two linear layers (W1, W2) with random values.
Each parameter is set to requires_grad=True, enabling backpropagation, which adjusts these parameters to minimize prediction errors.
We select a mini-batch of 32 random samples from the training data, allowing us to optimize the model more efficiently by processing multiple examples at once.
For each batch, we look up the embeddings, pass them through the hidden layer (W1) with tanh activation, and calculate logits for the output.
Using cross-entropy loss, the model learns to reduce errors and improve predictions with each step.

(Figure: Training the model.)

Step 4: Finding the Probability of the Next Character
Goal: Generate new Pokémon names by predicting one character at a time based on the input sequence, using the model's learned probabilities.
Intuition: During training, the model optimized its weights to capture the likelihood of each character following another in typical Pokémon names. Now, using these learned weights (W1, W2, b1, b2), we can generate entirely new names by predicting one character at a time. At this step, we're making our model guess the next letter that should follow a given sequence, such as "pik". The model doesn't directly understand letters, so the input characters are first converted into numbers representing each character. These numbers are then padded to match the required input size and fed into the model's layers. The layers are like filters trained to predict what typically follows each character. After passing through these layers, the model provides a list of probabilities for each possible character it might select next, based on what it's learned from the Pokémon names dataset.
This gives us a weighted list of potential next characters, ranked by likelihood.

(Figure: predicted next-character probabilities for the input "pik".)

In the example above, you can see that the characters "a" and "i" have a high likelihood of following the sequence "pik".

input_chars = "pik"  # Example input to get probabilities of next characters

# Convert input characters to indices based on stoi (character-to-index mapping)
context = [stoi.get(char, 0) for char in input_chars][-block_size:]  # Ensure context fits block size
context = [0] * (block_size - len(context)) + context  # Pad if shorter than block size

# Embedding the current context
emb = C[torch.tensor([context])]

# Pass through the network layers
h = torch.tanh(emb.view(1, -1) @ W1 + b1)
logits = h @ W2 + b2

# Compute the probabilities
probs = F.softmax(logits, dim=1).squeeze()  # Squeeze to remove unnecessary dimensions

# Print out the probabilities for each character
next_char_probs = {itos[i]: probs[i].item() for i in range(len(probs))}

Code Explanation:
We convert the context indices into an embedded representation, a numerical format that can be fed into the model layers.
We use the model's layers to transform the embedded context. The hidden layer (h) processes it, and the output layer (logits) computes scores for each possible character.
Finally, we apply the softmax function to the logits, giving us a list of probabilities. This probability distribution is stored in next_char_probs, mapping each character to its likelihood.

Step 5: Generating New Pokémon Names
Goal: Using the probabilities from Step 4, generate a complete name by selecting each next character sequentially until a special end-of-name marker appears.
Intuition: The model has learned typical character sequences from Pokémon names and now applies this by guessing each subsequent letter based on probabilities. It keeps selecting characters this way until it senses the name is complete. Some generated names will fit the Pokémon style perfectly, while others might be more whimsical, capturing the creative unpredictability that fascinates generative models. Here are a few names generated by our model: dwebblesimikyubaltarillpupidonburrsolapatranmeowomankwormantisbuneglisawhirlixhydolaudinjadiglerskipedenneon

context = [0] * block_size
for _ in range(20):
    out = []
    while True:
        emb = C[torch.tensor([context])]
        h = torch.tanh(emb.view(1, -1) @ W1 + b1)
        logits = h @ W2 + b2
        probs = F.softmax(logits, dim=1)
        ix = torch.multinomial(probs, num_samples=1, generator=g).item()
        context = context[1:] + [ix]
        out.append(ix)
        if ix == 0:
            break
    print(''.join(itos[i] for i in out))

Code Explanation:
Using softmax on the logits, we get probabilities for each character.
torch.multinomial chooses a character based on these probabilities, adding variety and realism to generated names.

That's it! You can even experiment by starting with your name as a prefix and watch the model transform it into a Pokémon-style name.

Future Improvements
This model offers a basic approach to generating character-level text, such as Pokémon names, but it's far from production-ready. I've intentionally simplified the following aspects to focus on building intuition, with plans to expand on these concepts in a follow-up article.
Dynamic Learning Rate: Our current training setup uses a fixed learning rate of 0.1, which might limit convergence efficiency. Experimenting with a dynamic learning rate (e.g., reducing it as the model improves) could yield faster convergence and better final accuracy.
Overfitting Prevention: With a relatively small dataset of 801 Pokémon names, the model may start to memorize patterns rather than generalize. We could introduce techniques like dropout or L2 regularization to reduce overfitting, allowing the model to better generalize to unseen sequences.
Expanding Context Length: Currently, the model uses a fixed block_size (context window) that may limit it from capturing dependencies over long sequences. Increasing this context length would allow it to better understand patterns over longer sequences, creating names that feel more complex and nuanced.
Larger Dataset: The model's ability to generalize and create more diverse names is limited by the small dataset. Training on a larger dataset, possibly including more fictional names from different sources, could help it learn broader naming conventions and improve its creative range.
Temperature Adjustment: Experiment with the temperature setting, which controls the randomness of the model's predictions. A lower temperature will make the model more conservative, choosing the most likely next character, while a higher temperature encourages creativity by allowing more varied and unexpected choices. Fine-tuning this can help balance between generating predictable and unique Pokémon-like names (see the sketch after this list).
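As an illustration of that last point, here is a minimal sketch of temperature-scaled sampling. It reuses the C, W1, b1, W2, b2, itos, and block_size defined in the earlier steps; the temperature value itself is just an example to experiment with.

temperature = 0.8  # < 1.0 makes sampling more conservative, > 1.0 more adventurous

context = [0] * block_size
out = []
while True:
    emb = C[torch.tensor([context])]
    h = torch.tanh(emb.view(1, -1) @ W1 + b1)
    logits = h @ W2 + b2
    probs = F.softmax(logits / temperature, dim=1)  # scale logits by temperature before softmax
    ix = torch.multinomial(probs, num_samples=1).item()
    context = context[1:] + [ix]
    out.append(ix)
    if ix == 0:  # end-of-name marker
        break
print(''.join(itos[i] for i in out))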
Final Thoughts: Gotta Generate 'Em All!
This is one of the simplest character-level language models, and it's a great starting point. By adding more layers, using larger datasets, or increasing the context length, you can improve the model and generate even more creative names. But don't stop here! Try feeding it a different set of names, think dragons, elves, or mystical creatures, and watch how it learns to capture those vibes. With just a bit of tweaking, this model can become your go-to generator for names straight out of fantasy worlds. Happy training, and may your creations sound as epic as they look!

The full source code and the Jupyter Notebook are available in the GitHub repository. Feel free to reach out if you have ideas for improvements or any other observations.

Published via Towards AI
  • Build a Local CSV Query Assistant Using Gradio and LangChain
    towardsai.net
Author(s): Vikram Bhat. Originally published on Towards AI. Last updated on November 15, 2024 by the Editorial Team.

In this blog, we'll walk through creating an interactive Gradio application that allows users to upload a CSV file and query its data using a conversational AI model powered by LangChain's create_pandas_dataframe_agent and Ollama's Llama 3.2. This guide will focus on building a local application where the user can upload CSVs, ask questions about the data, and receive answers in real time. You can find the complete code for this application in the GitHub repository.

Gradio is a powerful alternative to Streamlit, offering many new features that make building machine learning applications easy. Gradio excels with simple interfaces and impressive integration capabilities. Some standout features include native support for various data types (such as images, audio, and text), dynamic UI updates, and easy integration with popular libraries like TensorFlow, PyTorch, and LangChain.

In this tutorial, we leverage LangChain's experimental create_pandas_dataframe_agent, which allows us to analyze simple CSVs without the need to implement complex Retrieval-Augmented Generation (RAG) systems. This makes it ideal for users who want to quickly query CSV data in a conversational manner without the overhead of building a full-fledged RAG system. Additionally, Ollama enables us to run the entire system locally, using... Read the full blog for free on Medium.

Published via Towards AI
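Since the full walkthrough sits behind the Medium paywall, here is a minimal sketch of the kind of app the post describes, not the author's actual code: a Gradio interface wired to LangChain's experimental pandas-dataframe agent with a local Ollama model. The component labels, the allow_dangerous_code flag, and the model tag are assumptions, and it requires an Ollama server with a llama3.2 model pulled locally.

import pandas as pd
import gradio as gr
from langchain_community.llms import Ollama
from langchain_experimental.agents import create_pandas_dataframe_agent

llm = Ollama(model="llama3.2")  # local model served by Ollama

def answer(csv_path, question):
    # Load the uploaded CSV and build a one-off dataframe agent over it
    df = pd.read_csv(csv_path)
    agent = create_pandas_dataframe_agent(llm, df, allow_dangerous_code=True, verbose=False)
    return agent.invoke({"input": question})["output"]

demo = gr.Interface(
    fn=answer,
    inputs=[gr.File(label="Upload a CSV", type="filepath"),
            gr.Textbox(label="Ask a question about the data")],
    outputs=gr.Textbox(label="Answer"),
    title="Local CSV Query Assistant",
)

if __name__ == "__main__":
    demo.launch()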
  • Half-Life 2 20th Anniversary Update Includes Developer Commentary, Improved Graphics, and a 2-Hour Documentary
    www.ign.com
Valve is honoring the 20th anniversary of Half-Life 2 with a celebration that includes a brand-new two-hour documentary, a developer commentary update, and more. The studio updated fans on the festivities on its website today, revealing a gift basket full of goodies for fans new and old. While it's not quite the Half-Life 3 announcement fans have waited two decades for, it's still more than enough to help with the wait.

Half-Life 2 20th Anniversary Celebration Includes a Documentary and a Major Game Update
Topping the list of announcements is a two-hour making-of video from the Secret Tape team behind the Half-Life 1 documentary. Valve says die-hard fans can start watching the Half-Life 2 documentary today for a behind-the-scenes look at how the studio handled running out of money, getting hacked, building its PC storefront Steam, and more. It's an in-depth peek behind the curtain at a pivotal moment in gaming history, but the festivities don't stop there.

Half-Life 2 owners can also now enjoy a new update that adds everything from additional content to ironed-out wrinkles. This includes access to the original Episode One and Episode Two expansions, which have been bundled into the base experience at no additional cost. Valve adds that it reassembled the original Half-Life 2 team to provide commentary tracks for the entire game. Steam Workshop support and Steam game recording are just two more bonuses that have been added to help celebrate Half-Life 2's 20th anniversary, with even more included in the form of general touch-ups and fixes. Some of the update's changes include rebalanced lighting, cleaner horizons, new graphics settings for things like blood and fire effects, and an overhaul for aim assist.

"Every map in Half-Life 2 has been looked over by Valve level designers to fix longstanding bugs, restore content and features lost to time, and improve the quality of a few things like lightmap resolution and fog," the studio explains.

If you're interested in Half-Life 2 and want to check out the 20th anniversary update, you're in luck; Valve has made the game free to own until November 18. That means you get the entire sequel experience, including its two expansions, at no cost whatsoever as long as you click that download button in the next few days.

Half-Life 2 has long been hailed as one of Valve's best projects and one of the most important video games ever made. In our original review, we called it amazing and gave it a 9.7/10. Fans have combed over every detail of the first-person sequel since its release on November 16, 2004, and once the 20th anniversary update is live, there will be even more to uncover. While it's true that Half-Life 3 still hasn't been confirmed to be in the works, the goodies revealed today should help with the wait until a sequel is actually, really revealed. There's also the VR-exclusive Half-Life: Alyx, which we gave a 10/10.

For more on the world of Half-Life, you can read about Nvidia's upcoming Half-Life 2 RTX remaster. The company released a trailer for the project just yesterday to help celebrate the 20th anniversary, revealing how its team is creating something that stays true to the original vision while giving it a fresh coat of paint.

Michael Cripe is a freelance contributor with IGN.
He started writing in the industry in 2017 and is best known for his work at outlets such as The Pitch, The Escapist, OnlySP, and Gameranx. Be sure to give him a follow on Twitter @MikeCripe.
  • The Best Buy Early Black Friday 3-Day Sale, Round 2 Starts Today: The Deals You Shouldn't Miss
    www.ign.com
Best Buy just kicked off its second round of Early Black Friday deals. This time around, the best deals are limited to gaming PCs and laptops, TVs, and some gaming peripherals. The final round of Early Black Friday deals should go live on Friday, November 22. One significant advantage that Best Buy has over online-only retailers like Amazon is that you have the option of buying online and picking it up in store. Chances are there's a retail location within driving distance of your home. Not only do you avoid the hassle of shipping, especially during the busy holiday season, but it's also easy to return your purchase if you're dissatisfied.

Black Friday TV Deals
The 65" Samsung OLED TV is definitely the best TV deal currently available, but it has been at this price for a few days now. The best new deal is the 48" LG B4 OLED TV for just $599.99. This deal was previously only available to My Best Buy Plus members, but not anymore. This would make a great gaming monitor thanks to its HDMI 2.1 inputs with up to 120Hz refresh rate. Rtings mentions that in Game Optimizer mode, the image quality is comparable to the much more expensive LG C4.
65" Samsung S90C 4K OLED Smart TV
48" LG B4 4K OLED Smart TV
75" Insignia F50 4K Fire TV
75" Samsung DU6950 4K Smart TV
85" Hisense QD7 4K 144Hz QLED Smart TV
85" Hisense QD6 4K QLED Google TV
85" TCL Q5 4K QLED Fire TV

Black Friday Gaming PC Deals
Best Buy completely refreshed its selection of early Black Friday gaming PC deals. If you're on a budget, the iBuyPower Scale RTX 4060 gaming PC for $699.99 is a tremendous deal and a great candidate for 1080p gaming. Another gaming PC deserving of mention is the CyberPowerPC Gamer Supreme, which features the AMD Ryzen 7 7800X3D CPU (predecessor to the new 9800X3D) and a powerful RTX 4070 Super GPU. This rig can handle 1440p and 4K gaming.
iBuyPower Scale Intel Core i5-14400F RTX 4060 Gaming PC
iBuyPower Y40 Intel Core i7-14700F RTX 4070 Gaming PC
CyberPowerPC Gamer Supreme AMD Ryzen 7 7800X3D RTX 4070 Super Gaming PC
ASUS ROG Intel Core i7-14700KF RTX 4070 SUPER Gaming PC
ASUS ROG Intel Core i7-14700F RTX 4060 Ti Gaming PC
HP OMEN 35L AMD Ryzen 7 8700G RTX 4060 Ti Gaming PC
For more early Black Friday deals on gaming laptops and PCs, check out both the Dell Early Black Friday Sale and the HP Early Black Friday Sale. These computers are made to order and usually take longer to ship out, but the consistency in build quality and customer service makes them worth it.

Black Friday Gaming Laptop Deals
Best Buy has also refreshed its early Black Friday gaming laptop deals. The Acer Nitro V 15" laptop boasts an RTX 4050 GPU for a budget price of $699.99. On the other end of the spectrum, the Acer Predator Helios, with a massive 18" display, powerful i9-14900HX CPU, and best-in-class RTX 4090 GPU, is available for $2,399.99.
That's the best price we've seen so far for an RTX 4090 gaming laptop for Black Friday.
Acer Nitro V 15" AMD Ryzen 5 7535HS RTX 4050 Gaming Laptop
Acer Predator Helios 18 QHD+ Intel Core i9-14900HX RTX 4090 Gaming Laptop
ASUS ROG Strix 18" QHD Intel Core i9-14900HX RTX 4080 Gaming Laptop
HP OMEN 16" Intel Core i7-13620H RTX 4060 Gaming Laptop
HP OMEN Transcend 14" 3K OLED Intel Core Ultra 9-185H RTX 4070 Gaming Laptop
Acer Predator Helios Neo 16 Intel Core i9-14900HX RTX 4060 Gaming Laptop
ASUS TUF 15" Intel Core i7-13620H RTX 4070 Gaming Laptop
ASUS ROG Zephyrus G16 16" Intel Core i7-13620H RTX 4070 Gaming Laptop
ASUS ROG Zephyrus G16 16" QHD+ OLED Intel Core Ultra 9 185H RTX 4090 Gaming Laptop
MSI Crosshair 16" Intel Core i7-14650HX RTX 4070 Gaming Laptop

Black Friday Gaming Hardware Deals
If you're a My Best Buy Plus member, you get exclusive savings on the Asus ROG Ally Ryzen Z1 Extreme edition gaming handheld. It's currently on sale for $499.99 for everyone, but My Best Buy Plus members get an additional $50 off on top, making it $449.99. This extra discount may be unlocked for non-members starting November 29. In our Asus ROG Ally review, Robert Anderson wrote that "the ROG Ally has the potential to be a serious contender against the Steam Deck. While it's not perfect, there's a lot to love about this powerful new handheld."
Asus ROG Ally AMD Ryzen Z1 Extreme 512GB Gaming Handheld ($449.99 for My Best Buy Plus members)
Logitech G923 Racing Wheel and Pedals (PS5, PC)
Backbone One Mobile Gaming Controller (Lightning)

Black Friday Sony Headphone Deals
The Sony headphone and earbud deals have started early for Black Friday. These are manufacturer discounts, so the discounted pricing should be available at other retailers if you prefer them, like Amazon, Walmart, or Target. The Sony WH-1000XM4 is the best deal here. Its current price of $199.99 is $30 cheaper than last week. Although we haven't reviewed this specific headphone, we have reviewed its XM5 successor. In our Sony WH-1000XM5 review, Kevin Lee wrote that it's "hands down the best sounding and most impressive noise-cancelling headphones." Although the XM5 is also on sale, its price is actually higher than last week ($299.99), so I would recommend waiting for another price drop.

Should You Buy Now or Wait for Black Friday?
At this point, Black Friday is only a few weeks away, and in most cases it's better to wait. There are some deals here that are probably as low as they will go, even on Black Friday. However, it's very likely that the same deals you see now will also be available on November 29, even if they go out of stock in the meantime.

Eric Song is the IGN commerce manager in charge of finding the best gaming and tech deals every day. When Eric isn't hunting for deals for other people at work, he's hunting for deals for himself during his free time.
  • Link Tank: Free Action Film Lineup in November Only on Vizio WatchFree+
    www.denofgeek.com
Our friends from Vizio are turning up the heat on the holiday season with a curated list of the hottest free action titles on WatchFree+ in November! WatchFree+ has thousands of free premium movies and shows that can be streamed on demand on VIZIO TVs, or anyone can enjoy titles like these and more using the VIZIO mobile app! All you need is a free VIZIO account to start enjoying WatchFree+. Current offerings include Black Hawk Down (2001), Training Day (2001), The Crow (1994), Cloverfield (2008), Charlie's Angels: Full Throttle (2003), Angels & Demons (2009), Sicario: Day of the Soldado (2018), Behind Enemy Lines (2001), Behind Enemy Lines II: Axis of Evil (2006), Behind Enemy Lines: Colombia (2009), and Vampire in Brooklyn (1995). Read more at Vizio.com

To celebrate the 40th anniversary of The Terminator, Den of Geek has an exclusive reveal of two incredible new posters from Vice Press inspired by the film! This wonderful artwork was created by Florey.

Sometimes the fact that the Nintendo Switch is a hybrid console and handheld works against it, such as when playing Mario & Luigi: Brothership. "For the vast majority of my Switch gaming, the console stays safely ensconced in its dock by the TV, with my hands on a Pro Controller, playing it exactly like I play my PlayStation 5. On some level, I know I'm Doing It Wrong by not taking advantage of the system's hybrid nature and mobility, but playing Brothership, an unlikely return to one of Nintendo's best handheld franchises, really brought that disconnect into focus." Read more at The A.V. Club

Silo showrunner Graham Yost shares how the Apple TV+ show has broadened in season 2, offering a new plot to go along with the additional setting. "Season 1 was a tense, claustrophobic paranoid conspiracy drama that ended with mechanic-turned-sheriff Juliette (Rebecca Ferguson) venturing out into the wasteland. Now, in Season 2, the story has fractured into two separate stories: one following Juliette as she tries to find her way home and another following the world Juliette left behind, where she has gone from leader to symbol. But this latter storyline, following an uprising in the silo, is far bigger than its counterpart." Read more at Inverse

Love Actually director Richard Curtis jokes that Heretic star Hugh Grant had to "work harder in my movies" when he was pretending to be nice. Curtis said that for over a decade, no one took any risks with Grant, but he finds it lovely that the actor has carved out a new niche for himself in the later stages of his career. The two worked together on romantic comedies like Four Weddings and a Funeral, Notting Hill, and Love Actually. However, the Emmy Award-winning director said he would still like to see Grant return to the charming rom-com roles of his earlier years. Read more at The Wrap

The Boys will be ending after its upcoming fifth season, but showrunner Eric Kripke doesn't necessarily see that as a total mistake. For some, the news came as a gut punch, while others were happy that it would be bowing out peacefully, not pushing the storyline further and becoming the very thing that it has been dunking on after all these years. And, although he will be sad to see it go, series creator Eric Kripke seems to align more with the latter camp, certain that he doesn't want to push the show too far over the edge and disappoint the fandom in that way. Read more at Collider
  • Predator Badlands Will Make a Big Change to the Predator Movie Formula
    www.denofgeek.com
By this point, the interstellar hunters known as Yautja have appeared in several movies, comics, and video games. Yet each Predator movie follows the same basic structure: we're introduced to human protagonists, we see them get killed by a mysterious force, and it's not until late in the film that we finally see the ugly motherf-r in its full glory. Not so for upcoming new entry Predator: Badlands, which is seemingly changing up the usual formula. This film puts the Predator front and center, leading the charge, according to director Dan Trachtenberg in an interview with Empire.

"He's still badass, but there's something there that touches you emotionally, too," Trachtenberg teased. "Creating a character you connect with, but are also super-intimidated by, has been challenging. But exciting."

That last part stands out, as the Predator franchise isn't historically known for the emotional depth of its main characters. The heroes of the original 1987 film worked better as hunks of American exceptionalism who get torn apart by a bigger threat than they do as people with interiority. Danny Glover finds bits of depth to his overwhelmed L.A. cop in Predator 2 (1990), but both 2010's Predators and 2018's The Predator go back to macho men spouting one-liners and sneering at one another.

In fact, the only properly rich hero in the franchise thus far came from Trachtenberg's first outing in the series, Prey (2022). As played by Amber Midthunder, Comanche warrior Naru wants to prove herself to her tribe and move out from under the shadow of her brother. Her plight gets even richer when European colonizers arrive, a sign of the battle to come.

For Trachtenberg, the complexity of Prey's hero meant that he had to continue pushing the franchise forward in its next entry. The director needed to find another essential piece of cinema that does what Prey did spiritually, pushing the franchise's boundaries and letting us root for a hero we rarely get to root for, but in a different way. And that transformed into this big idea of rooting for the Predator.

With a change in focus also comes a change in setting. At this point, we still don't know much about the plot of Predator: Badlands. However, we do know that it takes place on a wasted world in the far future and will star Elle Fanning. That description might bring to mind Predators, which followed a group of human tough guys stranded on a planet used as a Yautja hunting ground. But where that film had an ensemble cast with character actors such as Walton Goggins and Mahershala Ali, Badlands will involve a pair of sisters.

"She faced intense challenges on this movie: dramatically, physically, logistically," Trachtenberg says of his lead human. He doesn't say more, but we can't help but wonder if those challenges relate to the movie's shift in focus. If the Predator is the hero here, does that mean Fanning will have to do the stuff usually done by the antagonist, ripping out spines and skinning victims? Well, probably not. But it will be exciting to see how these human sisters challenge our Yautja protagonist.

Predator: Badlands releases on Nov. 7, 2025.
  • Explore the Dark Narrative of Shines Over: The Damned on Xbox
    news.xbox.com
Immerse yourself in a world filled with mystery and darkness. Developed by Juan-Mod Studio, Shines Over: The Damned features a unique horror narrative that grips you from the start. Explore vast lands with your faithful dog, a German Shepherd. Face the horror and experience a unique atmosphere. Your loyal German Shepherd is more than just a companion; it plays a crucial role in navigating the world and providing emotional depth during key moments in the story.

Shines Over: The Damned is now accessible to a whole new audience, bringing its unique experience to Xbox players. Whether you're an Xbox Series X|S player or enjoying the game on Xbox One, you can dive into this immersive world on November 26, and preorder today.

Innovative Design Pulls You In
The art style of Shines Over: The Damned stands out for its dark and immersive atmosphere, complemented by a soundtrack that heightens the experience. Every visual element contributes to the narrative and world-building. Shines Over: The Damned has been developed with Megascans technology, a unique technology for hyper-realistic environments. Being an environmental game, all of its environments have been made with realistic, very-high-definition textures. This gives the game a realistic look and feel all the way through.

A Game That Hits the Ground Running
Since its initial launch in March 2024 on PlayStation 5, Shines Over: The Damned has received great reviews from around the gaming community as a pleasant surprise with a thought-provoking story. It's hard to define within a genre, and we agree with Softpedia's review that Shines Over: The Damned deserves its own category. We are so excited to bring the game to Xbox and watch Xbox fans make it their next must-play Xbox title.

Shines Over: The Damned is a journey you won't want to miss. This game redefines the horror experience with its rich narrative and captivating art style. Whether you're a fan of psychological thrillers, survival horror, or simply enjoy exploring rich and beautifully crafted game worlds, Shines Over: The Damned offers something for everyone. Don't wait, preorder today and join the ever-growing community of players who have already been drawn into this unforgettable tale when it launches November 26.

Shines Over: The Damned (Firenut Games, $14.99, pre-order): Shines Over: The Damned will take you into a unique experimental horror experience in first person, where your senses will be tested and your reality questioned. You have no name, no memory, no weapons. There are no friends to protect you. You are alone except for your faithful German Shepherd dog, who will guide you through this dark world and stay by your side. Immerse yourself in a mysterious and terrifying walking simulator with your faithful companion, where jump scares, tension, and environmental puzzles surround you, and danger lies in wait. Beware of the horrors that await you in Shines Over: The Damned. You have been warned.
  • How to Ensure a Fun, Safe and Welcoming Gaming Experience for Your Kids on Xbox This Holiday Season
    news.xbox.com
The holiday season is fast approaching, and with it comes the excitement of finding the perfect gift for the kid(s) in your life. For some, this means considering an Xbox, an experience that can offer endless hours of fun, connection, creativity, and learning.

But navigating the gaming world can seem daunting to some, especially those who are new to gaming. From questions around online gaming communities to regulating screen time, content appropriateness, and how to maintain safety, it can feel overwhelming at times. Fortunately, Xbox makes it easy for adults to make the right choices for their families.

The single most important thing you can do when setting up a new Xbox console is to select "Yes" when asked "Will this console be used by kids?" By spending just a few minutes setting up an Xbox Child Account (versus having them use your existing account), you're building a safer, more tailored experience for them and one that's right for your family (and that's easy to adjust as your little one grows up). Don't worry, you won't have to buy your games again to allow your child to play on the same console*. You will, however, have control over which of these games they're allowed to play, how long they can play, who they can communicate with, and more.

Once you've set up your Xbox console, you can keep the experience safe by downloading the Xbox Family Settings app for your Android or iOS device. This powerful tool allows you to easily tailor your child's gaming experience from anywhere, even if you're nowhere near your Xbox. Here's just some of what it can do:

Screen Time Management: You can set time limits to manage how long your child can play each day. Parents (and grandparents, aunts and uncles!) tell us one of their favorite features is the ability for kids to request extra playtime, which you can easily approve or reject right from the app. You can even pause game time temporarily from the app; the perfect way to say dinner's ready!
Content Filters: The app allows you to tailor the content your child can access, blocking games that are too mature and ensuring they only see age-appropriate content that you approve.
Purchase Limits and Approvals: Prevent unexpected purchases by approving each purchase through the ask-to-buy feature. To reward good behavior, you can easily add funds to a child's account for future purchases.
Communication Controls: Decide who your child can communicate with during their gaming sessions through robust controls. You can manage your child's friend list and review incoming requests, and can restrict communication to just friends, or block it entirely.
Activity Reporting: Stay informed with weekly activity reports that give you an overview of what your child is playing, how long they've played, and any requests they've made for additional time or new games. Pretty cool!

Additionally, the Xbox Series X|S offers numerous features designed to make gaming accessible for everyone. You can find the full range of accessibility settings on Xbox's Accessibility site.

Once you've curated the experience you'd like your children to have, the only question is, what should you play together? Xbox Game Pass offers great value with access to hundreds of games for one monthly price, including a diverse range of family-friendly games that everyone can enjoy.
From action-packed adventures to educational games, there's something for everyone; check out an updated list of titles here: just click the Family & Kids box underneath Genres on the left rail, or look in the Game Pass area on your Xbox console.

Playing games can stimulate creativity, improve hand-eye coordination, and offer a sense of achievement. Games can also be a fantastic way for children to unwind and connect with friends, or be the center of your family's game night, especially in today's digital age. With the robust features built into your Xbox console and managed by the Xbox Family Settings app, you can ensure that your child's gaming experience is as safe and controlled as you need it to be.

For more information about the Xbox Family Settings app, visit Xbox.com/family-app.

*Entitlements are shared only if child accounts are using the same console set as the Home console. More details on designating a Home console can be found here.
  • From Dated Red Brick Rancher to Dreamy Southern Farmhouse
    www.countryliving.com
Overwhelmed. That was the word that came to mind the very first time David Bowen laid eyes on his family's vacation-home-to-be in central Georgia. He and his wife, Melissa, had been on the hunt for a rural property where they could gather with friends, family, and one very lucky grandchild to enjoy a slice of the simple life. "I grew up hunting, fishing, and hiking on my family's farm, and Melissa and I wanted a place where we could pass those loves on to new generations," says David. They bought some land about two hours away from their hometown of Suwanee, Georgia, and were delighted when, a few years later, the house next door went up for sale. "Surprisingly, I had never seen it, and there wasn't a lot of information about the property online, so I did my fair share of snooping on Google Earth before driving over," David says. When he finally arrived, he discovered a scenic long driveway, rolling fields, bosky tracts, a picture-perfect lake, and (cue the screeching sound) a 1980s-era rancher with a low-slung roof and so much red brick. It could not have been more at odds with its picturesque and pastoral surroundings.

Knowing that the setting called for a classic Southern farmhouse, David called in for reinforcements, including Perry, Georgia-based designer James Farmer. This was a renovation that was going to take vision and a lot of work. "I asked him to come down, and I laid it all out. I told him, 'We're going to turn this thing into an old, Low Country-inspired house, and we're going to do it right with a big front porch, dormer windows, a metal roof, white siding, wood floors, blue ceilings, all of it,'" says David. "And to his credit, James said, 'I see where you're going with this, and I think it's going to be fantastic.'"

With a plan in place, they took on their respective roles on the project. David concentrated on construction ("I know just enough about architecture to be dangerous," he says) and James took on the challenges that come with Reagan-era interior design choices (carpet in the bathrooms, a bad-linoleum galley kitchen), making way for layers of old-fashioned charm. "The goal was to create a traditional farmhouse and hunting lodge, not some old, stuffy antebellum mansion," says James.

(Photo: Helen Norman; floral design by James Farmer. Designer James Farmer decked the front door out for fall with a garland using magnolia branches and other trimmings found on the property.)

Fortunately, James was delighted to find original heart-pine floors beneath the shag carpet and linoleum floors. He then chose paint-grade lumber to create classic Southern millwork like beadboard and shiplap throughout the house. "The woodwork instantly gave it a cozier and older feel," he says. As did wallpaper, which played a large part in the overall transformation. One of James's first selections was a classic toile (Royal Oak by Lewis and Wood), which he paired with painted shiplap wainscoting in the home's entry. The paper's putty hue also inspired the home's overall color palette. "I love what I call the un-colors," says James. "They're not green, they're not gray, they're not brown; they're these hues that are beautiful because they're a little ambiguous."

When it came to furnishings, the Bowens were keen to create spaces that felt collected. Trouble was, they didn't actually have enough family heirlooms to spare for their weekend getaway.
That's where James's enthusiasm for antiques came in handy, with the designer layering in items like oil paintings, demijohns, and French baskets, not to mention a few of his personal possessions, including a "heavy as lead" console table in the living room and a family bench that David says is still on loan from the designer. And while James is passionate about patina, he is also a champion of what he calls "tomorrow's antiques," referring to artfully made modern-day items modeled after pieces with provenance. One example: the primitive-looking sideboard in the entry, which James says takes after an old Southern hunt board.

In the end, it only took about two years of work to make the house feel a full century older, and the process was significantly less daunting than David had anticipated. "Normally, remodeling is not fun. This was fun," he says. "That's the good thing about working with James. He trusted me, and I trusted him, and we got that classic Southern farmhouse we wanted all along."

Keep reading below to see more of the beautiful results.

A Kitchen for Congregating
(Photo: Helen Norman, styling by Natalie Warady)
Previously a dark galley layout with laminate cabinetry, the now open and airy kitchen was conceived with relaxation in mind. "After fighting Atlanta traffic, you want a place you can put the groceries down and take a breath," says James. While he chose Carrara marble for the island, good ol' Georgia heart-pine countertops line the perimeter. Melissa's uncle built the cabinets (Noah's Custom Cabinets; 770-945-9824). Brass lighting adds warmth.
Get the Look: Paint Color: Linen White by Benjamin Moore. Island Light Fixture: Sloane by Circa Lighting.
RELATED: 60+ More Farmhouse Kitchen Ideas To Give Your Kitchen Charming, Timeless Style

A Relaxed Dining Room
(Photo: Helen Norman; floral design by James Farmer)
A mix of black Windsor chairs and upholstered French side chairs provides a more relaxed look around the English oak dining table. The open dining room is flanked by equally cozy seating areas at each end, one for TV watching, the other for conversation.

Tawny Hues and Textures
(Photo: Helen Norman, styling by Natalie Warady)
The home, used for both guys' hunting weekends and girls' getaways, is rich with tailored neutrals. Throughout the living room, James embraced a beige grasscloth above shiplap wainscoting and a rusty brown windowpane plaid fabric on the chairs and pillows. "It's the perfect color of terrible tobacco," he says.
Get the Look: Wallpaper: Ramie Bay by Thibaut
RELATED: 30+ Cozy Living Rooms Full of Decorating Ideas to Make You Want to Snuggle Up Forever

Layers of Texture
(Photo: Helen Norman, styling by Natalie Warady)
In the primary bedroom, another grasscloth wallcovering complements assorted checked fabrics atop the ebony black four-poster bed. Curtains of a more feminine tan and white toile fabric add pattern and softness, while the painted V-groove wood ceiling adds age and warmth.
Get the Look: Wallpaper: Windward Sisal in Beige by Thibaut
RELATED: More Wood Ceiling Ideas to Bring Your Room Country Charm

Cozy Guest Rooms
(Photo: Helen Norman, styling by Natalie Warady)
A neutral grasscloth wallcovering, taupe checked curtains, and pretty patterned pillows add color to the guest room. Above the woven rush headboard, a collection of baskets and farm landscape paintings combine for an unexpected gallery wall.
Get the Look: Wallpaper: Costa Stripe in Gray by Thibaut
RELATED: 30+ Guest Bedroom Ideas to Create a Cozy and Welcoming Space

Double the Porch Space
(Photo: Helen Norman)
When a house has views on all sides, one porch simply won't do. Equipped with a long dining table and lots of sit-a-spell seating, the large porch along the backside of the house beckons long and lingering family gatherings.

Seasonal Touches
(Photo: Helen Norman; floral design by James Farmer)
Come fall, seasonal arrangements (featuring peachy Campanella and Free Spirit roses) offer an autumnal-hued alternative to mums. "My mom called a peachy-apricot color 'Carl,' like the man's name, instead of coral," says James. "So Carl is my favorite fall color." Cozy layered blankets welcome those who wander up from the boathouse.
RELATED: The Best Fall Porch Decor Ideas to Celebrate Autumn

Complementary Outbuildings
(Photo: Helen Norman, styling by Natalie Warady)