• Plug and Play: Build a G-Assist Plug-In Today

    Project G-Assist — available through the NVIDIA App — is an experimental AI assistant that helps tune, control and optimize NVIDIA GeForce RTX systems.
    NVIDIA’s Plug and Play: Project G-Assist Plug-In Hackathon — running virtually through Wednesday, July 16 — invites the community to explore AI and build custom G-Assist plug-ins for a chance to win prizes and be featured on NVIDIA social media channels.

    G-Assist allows users to control their RTX GPU and other system settings using natural language, thanks to a small language model that runs on device. It can be used from the NVIDIA Overlay in the NVIDIA App without needing to tab out or switch programs. Users can expand its capabilities via plug-ins and even connect it to agentic frameworks such as Langflow.
    Below, find popular G-Assist plug-ins, hackathon details and tips to get started.
    Plug-In and Win
    Join the hackathon by registering and checking out the curated technical resources.
    G-Assist plug-ins can be built in several ways, including with Python for rapid development, with C++ for performance-critical apps and with custom system interactions for hardware and operating system automation.
For those who prefer vibe coding, the G-Assist Plug-In Builder — a ChatGPT-based app that allows no-code or low-code development with natural language commands — makes it easy for enthusiasts to start creating plug-ins.
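Whichever route a builder takes, a Python plug-in ultimately needs a small entry point that accepts a structured command and returns a structured result. The sketch below is a hypothetical illustration of that shape only: the command name, JSON message format and stdin/stdout transport are assumptions made for readability, not the official G-Assist plug-in API, which is documented in NVIDIA's GitHub samples referenced later in this post.

```python
# plugin.py -- hypothetical skeleton only. The command name, JSON message shape and
# stdin/stdout transport are illustrative assumptions, not the official G-Assist
# plug-in API; see NVIDIA's GitHub samples for the real manifest schema and protocol.
import json
import sys

def recommend_settings(params):
    # Placeholder capability a plug-in might expose; replace with real logic.
    game = params.get("game", "your game")
    return {"message": f"Suggested preset for {game}: balanced quality and latency."}

COMMANDS = {"recommend_settings": recommend_settings}

def main():
    # Read one JSON request, dispatch it to a handler, write a JSON response.
    request = json.loads(sys.stdin.read() or "{}")
    handler = COMMANDS.get(request.get("command"))
    if handler is None:
        response = {"success": False, "error": "unknown command"}
    else:
        response = {"success": True, "data": handler(request.get("params", {}))}
    sys.stdout.write(json.dumps(response))

if __name__ == "__main__":
    main()
```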
To submit an entry, participants must provide a GitHub repository including the source code file (plugin.py), requirements.txt, manifest.json, config.json (if applicable), a plug-in executable file and a README.
    Then, submit a video — between 30 seconds and two minutes — showcasing the plug-in in action.
    Finally, hackathoners must promote their plug-in using #AIonRTXHackathon on a social media channel: Instagram, TikTok or X. Submit projects via this form by Wednesday, July 16.
    Judges will assess plug-ins based on three main criteria: 1) innovation and creativity, 2) technical execution and integration, reviewing technical depth, G-Assist integration and scalability, and 3) usability and community impact, aka how easy it is to use the plug-in.
    Winners will be selected on Wednesday, Aug. 20. First place will receive a GeForce RTX 5090 laptop, second place a GeForce RTX 5080 GPU and third a GeForce RTX 5070 GPU. These top three will also be featured on NVIDIA’s social media channels, get the opportunity to meet the NVIDIA G-Assist team and earn an NVIDIA Deep Learning Institute self-paced course credit.
Project G-Assist requires a GeForce RTX 50, 40 or 30 Series Desktop GPU with at least 12GB of VRAM, Windows 11 or 10, a compatible CPU (Intel Pentium G Series, Core i3, i5, i7 or higher; AMD FX, Ryzen 3, 5, 7, 9, Threadripper or higher), sufficient disk space and a recent GeForce Game Ready Driver or NVIDIA Studio Driver.
Plug-In(spiration)
Explore open-source plug-in samples available on GitHub, which showcase the diverse ways on-device AI can enhance PC and gaming workflows.

    Popular plug-ins include:

Google Gemini: Enables real-time search-based queries through Google Search integration and large language model-based queries through Gemini, all from the convenience of the NVIDIA App Overlay, without needing to switch programs.
    Discord: Enables users to easily share game highlights or messages directly to Discord servers without disrupting gameplay.
    IFTTT: Lets users create automations across hundreds of compatible endpoints to trigger IoT routines — such as adjusting room lights and smart shades, or pushing the latest gaming news to a mobile device.
    Spotify: Lets users control Spotify using simple voice commands or the G-Assist interface to play favorite tracks and manage playlists.
    Twitch: Checks if any Twitch streamer is currently live and can access detailed stream information such as titles, games, view counts and more.

Get G-Assist(ance)
    Join the NVIDIA Developer Discord channel to collaborate, share creations and gain support from fellow AI enthusiasts and NVIDIA staff.
Save the date for NVIDIA’s How to Build a G-Assist Plug-In webinar on Wednesday, July 9, from 10-11 a.m. PT, to learn more about Project G-Assist capabilities, discover the fundamentals of building, testing and deploying Project G-Assist plug-ins, and participate in a live Q&A session.
    Explore NVIDIA’s GitHub repository, which provides everything needed to get started developing with G-Assist, including sample plug-ins, step-by-step instructions and documentation for building custom functionalities.
    Learn more about the ChatGPT Plug-In Builder to transform ideas into functional G-Assist plug-ins with minimal coding. The tool uses OpenAI’s custom GPT builder to generate plug-in code and streamline the development process.
    NVIDIA’s technical blog walks through the architecture of a G-Assist plug-in, using a Twitch integration as an example. Discover how plug-ins work, how they communicate with G-Assist and how to build them from scratch.
    Each week, the RTX AI Garage blog series features community-driven AI innovations and content for those looking to learn more about NVIDIA NIM microservices and AI Blueprints, as well as building AI agents, creative workflows, digital humans, productivity apps and more on AI PCs and workstations. 
    Plug in to NVIDIA AI PC on Facebook, Instagram, TikTok and X — and stay informed by subscribing to the RTX AI PC newsletter.
    Follow NVIDIA Workstation on LinkedIn and X. 
    See notice regarding software product information.
  • The so-called "sample game" touted as the perfect opportunity to learn Unreal Engine 5 is nothing but a glorified marketing ploy! Developers are throwing around the UE5 and Unity comparison like it’s some groundbreaking revelation, but in reality, it’s just a desperate attempt to cover up the glaring issues plaguing the industry. Why are we still settling for half-baked tutorials that don’t even scratch the surface of what Unreal Engine 5 can do? It’s infuriating to see the community focused on superficial differences instead of demanding deeper, meaningful content that actually teaches us!

    Enough is enough! We deserve better than this mediocrity!

    #UnrealEngine5 #GameDevelopment #ParrotGame #Unity #TechCritique
    WWW.CREATIVEBLOQ.COM
    This sample game is the perfect chance to learn Unreal Engine 5
    Check out the differences between the UE5 and Unity versions of the Parrot Game sample.
  • The Best Hidden-Gem Etsy Shops for Fans of Farmhouse Style

Becky Luigart-Stayner for Country Living
Country Living editors select each product featured. If you buy from a link, we may earn a commission.
Like a well-made quilt, a classic farmhouse aesthetic comes together gradually—a little bit of this, a touch of that. Each addition is purposeful and personal—and isn’t that what home is all about, really? If this type of slowed-down style speaks to you, you're probably already well aware that Etsy is a treasure trove of finds both new and old to fit your timeless farmhouse aesthetic. But with more than eight million active sellers on its marketplace, sometimes the possibilities—vintage feed sacks! primitive pie safes! galvanized grain scoops!—can quickly go from enticing to overwhelming. To better guide your search for the finest farmhouse furnishings, we’ve gathered a go-to list of editor- and designer-beloved Etsy shops which, time and again, turn out hardworking, homespun pieces of heirloom quality. From beautiful antique bureaus to hand-block-printed table linens, the character-rich wares from these sellers will help you design the farmhouse of your dreams, piece by precious piece.
For Antique Americana: Acorn and Alice
Every good old-fashioned farmhouse could use some traditional Americana to set the tone, and this Pennsylvania salvage shop offers rustic touches loaded with authentic antique allure. Aged wooden wares abound (think vintage milk crates, orchard fruit baskets, and berry boxes), as well as a grab bag of cotton and burlap feed sacks, perfect for framing as sets or crafting into footstool covers or throw pillows.
For French Country Textiles: Forest and Linen
There’s nothing quite like breezy natural fabrics to make you want to throw open all the windows and let that country air in while the pie cools. Unfussy and lightweight, the hand-crafted curtains, bedding, and table linens from these Lithuanian textile experts have a classic understated quality that would be right at home in the coziest guest room or most bustling kitchen. Warm, welcoming hues range from marigold yellow to cornflower blue, but soft gingham checkers and timeless French ticking feel especially farm-fresh. Our current favorite? These cherry-striped country cafe curtains.
Becky Luigart-Stayner for Country Living: Vintage red torchons feel right at home in a farmhouse kitchen.
For Rustic Rugs: Old New House
Whether or not you’re lucky enough to have gorgeous wide-plank floors, an antique area rug or runner can work wonders for giving a room instant character and warmth. This fifth-generation family-run retailer specializes in importing heirloom hand-knotted carpets dating back to the 1800s, with a focus on traditional designs from the masters in Turkey, India, Persia, and more. Their vast variety of sizes and styles offers something for every aesthetic, with one-of-a-kind patterns ranging from distressed neutrals to chain-stitched florals to ornate arabesques.
For Pillows and Provisions: Habitation Boheme
In true farmhouse fashion, this Indiana shop has curated an enticing blend of handcrafted and vintage homewares that work effortlessly well together. A line of cozy hand-stitched linen pillow covers (patterned with everything from block-printed blossoms to provincial pinstripes) sits prettily alongside a mix of found objects, from patinated brass candlesticks and etched cloisonné vases to sturdy stoneware crockery and woven wicker baskets.
For Elegant Everyday Dishware: Convivial Production
Simple, yet undeniably stunning, the handcrafted dinnerware from this Missouri-based ceramist is designed with durability in mind. Produced in a single, time-tested shade of ivory white glaze, these practical stoneware cups, bowls, and plates make the perfect place settings for lively farm-to-table feasts with friends and family. Beautifully balancing softness and heft, each dish is meant to feel comfortable when being held and passed, but also to look attractive when stacked upon open shelving.
For English Country Antiques: 1100 West Co.
This Illinois antiques shop is stocked with all manner of versatile vintage vessels culled from the English countryside, from massive stoneware crocks to charming little escargot pots. Their collection of neutral containers can be adapted for nearly any provincial purpose (envision white ironstone pitchers piled high with fresh-picked hyacinths, or glass canning jars holding your harvest grains), but we especially love their assortment of old advertising—from toothpaste pots to marmalade jars and ginger beer bottles galore—for a nice little nod to the quintessential country practice of repurposing what you’ve got.
Brian Woodcock/Country Living: Pretty English ironstone will always have our heart.
For a Cozy Glow: Olde Brick Lighting
Constructed by hand from cord to shade, the vintage-inspired lighting produced by this Pennsylvania retailer is a tribute to the iconic quality and character of old American fixtures. Nostalgic design elements include hand-blown glass (crafted using cast-iron molds from over 80 years ago) and finishes ranging from matte black to brushed nickel and antique brass. To create an authentic farmhouse ambiance, check out their gooseneck sconces, enameled red and blue barn lights, and milky white striped schoolhouse flush mounts.
For Enduring Artifacts: Through the Porthole
The weathered, artisan-made wares curated by this California husband-and-wife duo have been hand-selected from around the globe for their time-etched character. From gorgeous gray-black terracotta vases and rust-colored Turkish clay pots to patinated brass cow bells and rustic reclaimed elm stools, each item is a testament to the lasting beauty of classic materials, with storied sun-bleaching and scratches befitting the most beloved, lived-in rooms.
For Winsome Wall Art: Eugenia Ciotola Art
Through graceful brushstrokes and textural swirls of paint, Maryland-based artist Eugenia Ciotola has captured the natural joy of a life that’s simple and sweet. Her pieces celebrate quiet scenes of bucolic beauty, from billowing bouquets of peonies to stoic red barns sitting in fields of wavy green. For a parlor gallery or gathering space, we gravitate toward her original oils on canvas—an impasto still life, perhaps, or a plainly frocked maiden carrying a bountiful bowl of lemons—while her stately farm animal portraits (regal roosters! ruff-collared geese!) would look lovely in a child’s nursery.
For Time-Tested Storage Solutions: Materials Division
Function is forefront for this farmhouse supplier operating out of New York, whose specialized selection of vintage provisions has lived out dutiful lives of purpose. Standouts include a curated offering of trusty antique tool boxes and sturdy steel-clad trunks whose rugged patina tells the story of many a household project. Meanwhile, a hardworking mix of industrial wire and woven wood gathering baskets sits handsomely alongside heavy-duty galvanized garbage bins and antique fireplace andirons.
For Pastoral Primitives: Comfort Work Room
Full of history and heritage, the old, hand-fabricated furnishings and primitive wooden tools in this unique Ukrainian antique shop are rural remnants of simpler times gone by. Quaint kitchen staples like chippy chiseled spoons, scoops, and cutting boards make an accessible entry point for the casual collector, while scuffed-up dough troughs, butter churns, washboards, and barrels are highly desirable conversation pieces for any antique enthusiast who’s dedicated to authentic detail.
Becky Luigart-Stayner for Country Living: Antique washboards make for on-theme wall art in a laundry room.
For Heirloom-Quality Coverlets: Bluegrass Quilts
No layered farmhouse look would be complete without the homey, tactile touch of a hand-pieced quilt or two draped intentionally about the room. From harvest-hued sawtooth stars to playful patchwork pinwheels, each exquisite blanket from this Kentucky-based artisan is slow-crafted in traditional fashion from 100% cotton materials, and can even be custom stitched from scratch to match your personal color palette and decorative purpose. For a classic country aesthetic, try a log cabin, double diamond, or star patch pattern.
For Hand-Crafted Gifts: Selsela
Featuring a busy barnyard’s worth of plucky chickens, cuddly sheep, and happy little Holstein cows, this Illinois woodworker’s whimsical line of farm figurines and other giftable goodies (think animal wine stoppers, keychains, fridge magnets, and cake toppers) is chock-full of hand-carved charm. Crafted from 100% recycled birch and painted in loving detail, each creature has a deliberately rough-hewn look and feel worthy of any cozy and collected home.
For Open-Concept Cabinetry: Folkhaus
A hallmark of many modern farmhouses, open-concept shelving has become a stylish way to show that the practical wares you use every day are the same ones you’re proud to put on display. With their signature line of bracketed wall shelves, Shaker-style peg shelves, and raw steel kitchen rails, the team at Folkhaus has created a range of open storage solutions that beautifully balances elevated design and rustic utility. Rounding out their collection is a selection of open-shelved accent pieces like bookcases, benches, and console tables—each crafted from character-rich kiln-dried timber and finished in your choice of stain.
For Antique Farmhouse Furniture: Cottage Treasures LV
The foundation of a well-furnished farmhouse often begins with a single prized piece. Whether it’s a slant-front desk, a primitive jelly cabinet, or a punched-tin pie safe, this established New York-based dealer has a knack for sourcing vintage treasures with the personality and presence to anchor an entire space. Distressed cupboards and cabinets may be their bread and butter (just look at this two-piece pine hutch!), but you’ll also find a robust roundup of weathered farm tables, Windsor chairs, and blanket chests—and currently, even a rare 1500s English bench.
For Lively Table Linens: Moontea Studio
As any devotee of slow decorating knows, sometimes it’s the little details that really bring a look home. For a spot of cheer along with your afternoon tea, we love the hand-stamped table linens from this Washington-based printmaker, which put a peppy, modern spin on farm-fresh produce. Patterned with lush illustrations of bright red tomatoes, crisp green apples, and golden sunflowers—then neatly finished with a color-coordinated hand-stitched trim—each tea towel, placemat, and napkin pays homage to the hours we spend doting over our gardens.
For Traditional Transferware: Prior Time
There’s lots to love about this Massachusetts antiques shop, which admittedly skews slightly cottagecore (the pink Baccarat perfume bottles! the hobnail milk glass vases! the huge primitive bread boards!), but the standout, for us, is the seller’s superior selection of dinner and serving ware. In addition to a lovely lot of mottled white ironstone platters and pitchers, you’ll find a curated mix of Ridgeway and Wedgwood transferware dishes in not only classic cobalt blue, but beautiful browns, greens, and purples, too.
Becky Luigart-Stayner for Country Living: Pretty brown transferware could be yours with one quick "add to cart."
For Folk Art for Your Floors: KinFolk Artwork
Designed by a West Virginia watercolor and oils artist with a penchant for painting the past, these silky chenille floor mats feature an original cast of colonial characters and folksy scenes modeled after heirloom textiles from the 18th and 19th centuries. Expect lots of early American and patriotic motifs, including old-fashioned flags, Pennsylvania Dutch fraktur, equestrian vignettes, and colonial house samplers—each made to mimic a vintage hooked rug for that cozy, homespun feeling. (We have to admit, the folk art-inspired cow and chicken is our favorite.)
For Historical Reproductions: Schooner Bay Co.
Even in the most painstakingly appointed interior, buying antique originals isn’t always an option (don’t ask how many times we’ve been outbid at an estate auction). And that’s where this trusted Pennsylvania-based retailer for historical reproductions comes in. Offering a colossal collection of framed art prints, decorative trays, and brass objects (think magnifying glasses, compasses, paperweights, and letter openers), these connoisseurs of the classics have decor for every old-timey aesthetic, whether it’s fox hunt prints for your cabin, Dutch landscapes for your cottage, or primitive animal portraits for your farmstead.
For General Store Staples: Farmhouse Eclectics
Hand-plucked from New England antique shops, estate sales, and auctions, the salvaged sundries from this Massachusetts-based supplier (who grew up in an 1850s farmhouse himself) are the type you might spy in an old country store—wooden crates emblazoned with the names of local dairies, antique apple baskets, seed displays, signs, and scales. Whether you’re setting up your farmstand or styling your entryway, you’ll have plenty of storage options and authentic accents to pick from here.
Becky Luigart-Stayner for Country Living: So many food scales, so little time.
Jackie Buddie is a freelance writer with more than a decade of editorial experience covering lifestyle topics including home decor how-tos, fashion trend deep dives, seasonal gift guides, and in-depth profiles of artists and creatives around the globe. She holds a degree in journalism from the University of North Carolina at Chapel Hill and received her M.F.A. in creative writing from Boston University. Jackie is, among other things, a collector of curiosities, Catskills land caretaker, dabbling DIYer, day hiker, and mom. She lives in the hills of Bovina, New York, with her family and her sweet-as-pie rescue dog.
    #best #hiddengem #etsy #shops #fans
    The Best Hidden-Gem Etsy Shops for Fans of Farmhouse Style
    Becky Luigart-Stayner for Country LivingCountry Living editors select each product featured. If you buy from a link, we may earn a commission. Why Trust Us?Like a well-made quilt, a classic farmhouse aesthetic comes together gradually—a little bit of this, a touch of that. Each addition is purposeful and personal—and isn’t that what home is all about, really? If this type of slowed-down style speaks to you, you're probably already well aware that Etsy is a treasure trove of finds both new and old to fit your timeless farmhouse aesthetic. But with more than eight million active sellers on its marketplace, sometimes the possibilities—vintage feed sacks! primitive pie safes! galvanized grain scoops!—can quickly go from enticing to overwhelming.To better guide your search for the finest farmhouse furnishings, we’ve gathered a go-to list of editor-and designer-beloved Etsy shops which, time and again, turn out hardworking, homespun pieces of heirloom quality. From beautiful antique bureaus to hand-block-printed table linens, the character-rich wares from these sellers will help you design the farmhouse of your dreams, piece by precious piece. Related Stories For Antique AmericanaAcorn and Alice Every good old-fashioned farmhouse could use some traditional Americana to set the tone, and this Pennsylvania salvage shop offers rustic touches loaded with authentic antique allure. Aged wooden wares abound, as well as a grab bag of cotton and burlap feed sacks, perfect for framing as sets or crafting into footstool covers or throw pillows. For French Country TextilesForest and LinenThere’s nothing quite like breezy natural fabrics to make you want to throw open all the windows and let that country air in while the pie cools. Unfussy and lightweight, the hand-crafted curtains, bedding, and table linens from these Lithuanian textile experts have a classic understated quality that would be right at home in the coziest guest room or most bustling kitchen. Warm, welcoming hues range from marigold yellow to cornflower blue, but soft gingham checkers and timeless French ticking feel especially farm-fresh. Our current favorite? These cherry-striped country cafe curtains. Becky Luigart-Stayner for Country LivingVintage red torchons feel right at home in a farmhouse kitchenFor Rustic RugsOld New HouseWhether or not you’re lucky enough to have gorgeous wide-plank floors, an antique area rug or runner can work wonders for giving a room instant character and warmth. This fifth-generation family-run retailer specializes in importing heirloom hand-knotted carpets dating back to the 1800s, with a focus on traditional designs from the masters in Turkey, India, Persia, and more. Their vast variety of sizes and styles offers something for every aesthetic, with one-of-a-kind patterns ranging from distressed neutrals to chain-stitched florals to ornate arabesques. For Pillows and ProvisionsHabitation BohemeIn true farmhouse fashion, this Indiana shop has curated an enticing blend of handcrafted and vintage homewares that work effortlessly well together. A line of cozy hand-stitched linen pillow coverssits prettily alongside a mix of found objects, from patinated brass candlesticks and etched cloisonné vases to sturdy stoneware crockery and woven wicker baskets. For Elegant Everyday DishwareConvivial ProductionSimple, yet undeniably stunning, the handcrafted dinnerware from this Missouri-based ceramist is designed with durability in mind. 
Produced in a single, time-tested shade of ivory white glaze, these practical stoneware cups, bowls, and plates make the perfect place settings for lively farm-to-table feasts with friends and family. Beautifully balancing softness and heft, each dish is meant to feel comfortable when being held and passed, but also to look attractive when stacked upon open shelving. For English Country Antiques1100 West Co.This Illinois antiques shop is stocked with all manner of versatile vintage vessels culled from the English countryside, from massive stoneware crocks to charming little escargot pots. Their collection of neutral containers can be adapted for nearly any provincial purpose, but we especially love their assortment of old advertising—from toothpaste pots to marmalade jars and ginger beer bottles galore—for a nice little nod to the quintessential country practice of repurposing what you’ve got. Brian Woodcock/Country LivingPretty English ironstone will always have our heart.For a Cozy GlowOlde Brick LightingConstructed by hand from cord to shade, the vintage-inspired lighting produced by this Pennsylvania retailer is a tribute to the iconic quality and character of old American fixtures. Nostalgic design elements include hand-blown glassand finishes ranging from matte black to brushed nickel and antique brass. To create an authentic farmhouse ambiance, check out their gooseneck sconces, enameled red and blue barn lights, and milky white striped schoolhouse flush mounts. For Enduring ArtifactsThrough the PortholeThe weathered, artisan-made wares curated by this California husband-and-wife duo have been hand-selected from around the globe for their time-etched character. From gorgeous gray-black terracotta vases and rust-colored Turkish clay pots to patinated brass cow bells and rustic reclaimed elm stools, each item is a testament to the lasting beauty of classic materials, with storied sun-bleaching and scratches befitting the most beloved, lived-in rooms. For Winsome Wall ArtEugenia Ciotola ArtThrough graceful brushstrokes and textural swirls of paint, Maryland-based artist Eugenia Ciotola has captured the natural joy of a life that’s simple and sweet. Her pieces celebrate quiet scenes of bucolic beauty, from billowing bouquets of peonies to stoic red barns sitting in fields of wavy green. For a parlor gallery or gathering space, we gravitate toward her original oils on canvas—an impasto still life, perhaps, or a plainly frocked maiden carrying a bountiful bowl of lemons—while her stately farm animal portraitswould look lovely in a child’s nursery.For Time-Tested Storage SolutionsMaterials DivisionFunction is forefront for this farmhouse supplier operating out of New York, whose specialized selection of vintage provisions have lived out dutiful lives of purpose. Standouts include a curated offering of trusty antique tool boxes and sturdy steel-clad trunks whose rugged patina tells the story of many-a household project. Meanwhile, a hardworking mix of industrial wire and woven wood gathering baskets sits handsomely alongside heavy-duty galvanized garbage bins and antique fireplace andirons.For Pastoral PrimitivesComfort Work RoomFull of history and heritage, the old, hand-fabricated furnishings and primitive wooden tools in this unique Ukrainian antique shop are rural remnants of simpler times gone by. 
Quaint kitchen staples like chippy chiseled spoons, scoops, and cutting boards make an accessible entry point for the casual collector, while scuffed up dough troughs, butter churns, washboards, and barrels are highly desirable conversation pieces for any antique enthusiast who’s dedicated to authentic detail. Becky Luigart-Stayner for Country LivingAntique washboards make for on-theme wall art in a laundry roomFor Heirloom-Quality CoverletsBluegrass QuiltsNo layered farmhouse look would be complete without the homey, tactile touch of a hand-pieced quilt or two draped intentionally about the room. From harvest-hued sawtooth stars to playful patchwork pinwheels, each exquisite blanket from this Kentucky-based artisan is slow-crafted in traditional fashion from 100% cotton materials, and can even be custom stitched from scratch to match your personal color palette and decorative purpose. For a classic country aesthetic, try a log cabin, double diamond, or star patch pattern. For Hand-Crafted GiftsSelselaFeaturing a busy barnyard’s worth of plucky chickens, cuddly sheep, and happy little Holstein cows, this Illinois woodworker’s whimsical line of farm figurines and other giftable goodiesis chock-full of hand-carved charm. Crafted from 100% recycled birch and painted in loving detail, each creature has a deliberately rough-hewn look and feel worthy of any cozy and collected home. For Open-Concept CabinetryFolkhausA hallmark of many modern farmhouses, open-concept shelving has become a stylish way to show that the practical wares you use everyday are the same ones you’re proud to put on display. With their signature line of bracketed wall shelves, Shaker-style peg shelves, and raw steel kitchen rails, the team at Folkhaus has created a range of open storage solutions that beautifully balances elevated design and rustic utility. Rounding out their collection is a selection of open-shelved accent pieces like bookcases, benches, and console tables—each crafted from character-rich kiln-dried timber and finished in your choice of stain.Related StoryFor Antique Farmhouse FurnitureCottage Treasures LVThe foundation of a well-furnished farmhouse often begins with a single prized piece. Whether it’s a slant-front desk, a primitive jelly cabinet, or a punched-tin pie safe, this established New York-based dealer has a knack for sourcing vintage treasures with the personality and presence to anchor an entire space. Distressed cupboards and cabinets may be their bread and butterbut you’ll also find a robust roundup of weathered farm tables, Windsor chairs, and blanket chests—and currently, even a rare 1500s English bench. For Lively Table LinensMoontea StudioAs any devotee of slow decorating knows, sometimes it’s the little details that really bring a look home. For a spot of cheer along with your afternoon tea, we love the hand-stamped table linens from this Washington-based printmaker, which put a peppy, modern spin on farm-fresh produce. Patterned with lush illustrations of bright red tomatoes, crisp green apples, and golden sunflowers—then neatly finished with a color-coordinated hand-stitched trim—each tea towel, placemat, and napkin pays homage to the hours we spend doting over our gardens. For Traditional TransferwarePrior TimeThere’s lots to love about this Massachusetts antiques shop, which admittedly skews slightly cottagecorebut the standout, for us, is the seller’s superior selection of dinner and serving ware. 
In addition to a lovely lot of mottled white ironstone platters and pitchers, you’ll find a curated mix of Ridgeway and Wedgwood transferware dishes in not only classic cobalt blue, but beautiful browns, greens, and purples, too.Becky Luigart-Stayner for Country LivingPretty brown transferware could be yours with one quick "add to cart."For Folk Art for Your FloorsKinFolk ArtworkDesigned by a West Virginia watercolor and oils artist with a penchant for painting the past, these silky chenille floor mats feature an original cast of colonial characters and folksy scenes modeled after heirloom textiles from the 18th and 19th centuries. Expect lots of early American and patriotic motifs, including old-fashioned flags, Pennsylvania Dutch fraktur, equestrian vignettes, and colonial house samplers—each made to mimic a vintage hooked rug for that cozy, homespun feeling.For Historical ReproductionsSchooner Bay Co.Even in the most painstakingly appointed interior, buying antique originals isn’t always an option. And that’s where this trusted Pennsylvania-based retailer for historical reproductions comes in. Offering a colossal collection of framed art prints, decorative trays, and brass objects, these connoisseurs of the classics have decor for every old-timey aesthetic, whether it’s fox hunt prints for your cabin, Dutch landscapes for your cottage, or primitive animal portraits for your farmstead.For General Store StaplesFarmhouse EclecticsHand-plucked from New England antique shops, estate sales, and auctions, the salvaged sundries from this Massachusetts-based supplierare the type you might spy in an old country store—wooden crates emblazoned with the names of local dairies, antique apple baskets, seed displays, signs, and scales. Whether you’re setting up your farmstand or styling your entryway, you’ll have plenty of storage options and authentic accents to pick from here. Becky Luigart-Stayner for Country LivingSo many food scales, so little time.Related StoriesJackie BuddieJackie Buddie is a freelance writer with more than a decade of editorial experience covering lifestyle topics including home decor how-tos, fashion trend deep dives, seasonal gift guides, and in-depth profiles of artists and creatives around the globe. She holds a degree in journalism from the University of North Carolina at Chapel Hill and received her M.F.A. in creative writing from Boston University. Jackie is, among other things, a collector of curiosities, Catskills land caretaker, dabbling DIYer, day hiker, and mom. She lives in the hills of Bovina, New York, with her family and her sweet-as-pie rescue dog. #best #hiddengem #etsy #shops #fans
    WWW.COUNTRYLIVING.COM
    The Best Hidden-Gem Etsy Shops for Fans of Farmhouse Style
    Becky Luigart-Stayner for Country LivingCountry Living editors select each product featured. If you buy from a link, we may earn a commission. Why Trust Us?Like a well-made quilt, a classic farmhouse aesthetic comes together gradually—a little bit of this, a touch of that. Each addition is purposeful and personal—and isn’t that what home is all about, really? If this type of slowed-down style speaks to you, you're probably already well aware that Etsy is a treasure trove of finds both new and old to fit your timeless farmhouse aesthetic. But with more than eight million active sellers on its marketplace, sometimes the possibilities—vintage feed sacks! primitive pie safes! galvanized grain scoops!—can quickly go from enticing to overwhelming.To better guide your search for the finest farmhouse furnishings, we’ve gathered a go-to list of editor-and designer-beloved Etsy shops which, time and again, turn out hardworking, homespun pieces of heirloom quality. From beautiful antique bureaus to hand-block-printed table linens, the character-rich wares from these sellers will help you design the farmhouse of your dreams, piece by precious piece. Related Stories For Antique AmericanaAcorn and Alice Every good old-fashioned farmhouse could use some traditional Americana to set the tone, and this Pennsylvania salvage shop offers rustic touches loaded with authentic antique allure. Aged wooden wares abound (think vintage milk crates, orchard fruit baskets, and berry boxes), as well as a grab bag of cotton and burlap feed sacks, perfect for framing as sets or crafting into footstool covers or throw pillows. For French Country TextilesForest and LinenThere’s nothing quite like breezy natural fabrics to make you want to throw open all the windows and let that country air in while the pie cools. Unfussy and lightweight, the hand-crafted curtains, bedding, and table linens from these Lithuanian textile experts have a classic understated quality that would be right at home in the coziest guest room or most bustling kitchen. Warm, welcoming hues range from marigold yellow to cornflower blue, but soft gingham checkers and timeless French ticking feel especially farm-fresh. Our current favorite? These cherry-striped country cafe curtains. Becky Luigart-Stayner for Country LivingVintage red torchons feel right at home in a farmhouse kitchenFor Rustic RugsOld New HouseWhether or not you’re lucky enough to have gorgeous wide-plank floors, an antique area rug or runner can work wonders for giving a room instant character and warmth. This fifth-generation family-run retailer specializes in importing heirloom hand-knotted carpets dating back to the 1800s, with a focus on traditional designs from the masters in Turkey, India, Persia, and more. Their vast variety of sizes and styles offers something for every aesthetic, with one-of-a-kind patterns ranging from distressed neutrals to chain-stitched florals to ornate arabesques. For Pillows and ProvisionsHabitation BohemeIn true farmhouse fashion, this Indiana shop has curated an enticing blend of handcrafted and vintage homewares that work effortlessly well together. A line of cozy hand-stitched linen pillow covers (patterned with everything from block-printed blossoms to provincial pinstripes) sits prettily alongside a mix of found objects, from patinated brass candlesticks and etched cloisonné vases to sturdy stoneware crockery and woven wicker baskets. 
For Elegant Everyday DishwareConvivial ProductionSimple, yet undeniably stunning, the handcrafted dinnerware from this Missouri-based ceramist is designed with durability in mind. Produced in a single, time-tested shade of ivory white glaze, these practical stoneware cups, bowls, and plates make the perfect place settings for lively farm-to-table feasts with friends and family. Beautifully balancing softness and heft, each dish is meant to feel comfortable when being held and passed, but also to look attractive when stacked upon open shelving. For English Country Antiques1100 West Co.This Illinois antiques shop is stocked with all manner of versatile vintage vessels culled from the English countryside, from massive stoneware crocks to charming little escargot pots. Their collection of neutral containers can be adapted for nearly any provincial purpose (envision white ironstone pitchers piled high with fresh-picked hyacinths, or glass canning jars holding your harvest grains), but we especially love their assortment of old advertising—from toothpaste pots to marmalade jars and ginger beer bottles galore—for a nice little nod to the quintessential country practice of repurposing what you’ve got. Brian Woodcock/Country LivingPretty English ironstone will always have our heart.For a Cozy GlowOlde Brick LightingConstructed by hand from cord to shade, the vintage-inspired lighting produced by this Pennsylvania retailer is a tribute to the iconic quality and character of old American fixtures. Nostalgic design elements include hand-blown glass (crafted using cast-iron molds from over 80 years ago) and finishes ranging from matte black to brushed nickel and antique brass. To create an authentic farmhouse ambiance, check out their gooseneck sconces, enameled red and blue barn lights, and milky white striped schoolhouse flush mounts. For Enduring ArtifactsThrough the PortholeThe weathered, artisan-made wares curated by this California husband-and-wife duo have been hand-selected from around the globe for their time-etched character. From gorgeous gray-black terracotta vases and rust-colored Turkish clay pots to patinated brass cow bells and rustic reclaimed elm stools, each item is a testament to the lasting beauty of classic materials, with storied sun-bleaching and scratches befitting the most beloved, lived-in rooms. For Winsome Wall ArtEugenia Ciotola ArtThrough graceful brushstrokes and textural swirls of paint, Maryland-based artist Eugenia Ciotola has captured the natural joy of a life that’s simple and sweet. Her pieces celebrate quiet scenes of bucolic beauty, from billowing bouquets of peonies to stoic red barns sitting in fields of wavy green. For a parlor gallery or gathering space, we gravitate toward her original oils on canvas—an impasto still life, perhaps, or a plainly frocked maiden carrying a bountiful bowl of lemons—while her stately farm animal portraits (regal roosters! ruff collared geese!) would look lovely in a child’s nursery.For Time-Tested Storage SolutionsMaterials DivisionFunction is forefront for this farmhouse supplier operating out of New York, whose specialized selection of vintage provisions have lived out dutiful lives of purpose. Standouts include a curated offering of trusty antique tool boxes and sturdy steel-clad trunks whose rugged patina tells the story of many-a household project. 
Meanwhile, a hardworking mix of industrial wire and woven wood gathering baskets sits handsomely alongside heavy-duty galvanized garbage bins and antique fireplace andirons.For Pastoral PrimitivesComfort Work RoomFull of history and heritage, the old, hand-fabricated furnishings and primitive wooden tools in this unique Ukrainian antique shop are rural remnants of simpler times gone by. Quaint kitchen staples like chippy chiseled spoons, scoops, and cutting boards make an accessible entry point for the casual collector, while scuffed up dough troughs, butter churns, washboards, and barrels are highly desirable conversation pieces for any antique enthusiast who’s dedicated to authentic detail. Becky Luigart-Stayner for Country LivingAntique washboards make for on-theme wall art in a laundry roomFor Heirloom-Quality CoverletsBluegrass QuiltsNo layered farmhouse look would be complete without the homey, tactile touch of a hand-pieced quilt or two draped intentionally about the room. From harvest-hued sawtooth stars to playful patchwork pinwheels, each exquisite blanket from this Kentucky-based artisan is slow-crafted in traditional fashion from 100% cotton materials, and can even be custom stitched from scratch to match your personal color palette and decorative purpose. For a classic country aesthetic, try a log cabin, double diamond, or star patch pattern. For Hand-Crafted GiftsSelselaFeaturing a busy barnyard’s worth of plucky chickens, cuddly sheep, and happy little Holstein cows, this Illinois woodworker’s whimsical line of farm figurines and other giftable goodies (think animal wine stoppers, keychains, fridge magnets, and cake toppers) is chock-full of hand-carved charm. Crafted from 100% recycled birch and painted in loving detail, each creature has a deliberately rough-hewn look and feel worthy of any cozy and collected home. For Open-Concept CabinetryFolkhausA hallmark of many modern farmhouses, open-concept shelving has become a stylish way to show that the practical wares you use everyday are the same ones you’re proud to put on display. With their signature line of bracketed wall shelves, Shaker-style peg shelves, and raw steel kitchen rails, the team at Folkhaus has created a range of open storage solutions that beautifully balances elevated design and rustic utility. Rounding out their collection is a selection of open-shelved accent pieces like bookcases, benches, and console tables—each crafted from character-rich kiln-dried timber and finished in your choice of stain.Related StoryFor Antique Farmhouse FurnitureCottage Treasures LVThe foundation of a well-furnished farmhouse often begins with a single prized piece. Whether it’s a slant-front desk, a primitive jelly cabinet, or a punched-tin pie safe, this established New York-based dealer has a knack for sourcing vintage treasures with the personality and presence to anchor an entire space. Distressed cupboards and cabinets may be their bread and butter (just look at this two-piece pine hutch!) but you’ll also find a robust roundup of weathered farm tables, Windsor chairs, and blanket chests—and currently, even a rare 1500s English bench. For Lively Table LinensMoontea StudioAs any devotee of slow decorating knows, sometimes it’s the little details that really bring a look home. For a spot of cheer along with your afternoon tea, we love the hand-stamped table linens from this Washington-based printmaker, which put a peppy, modern spin on farm-fresh produce. 
Patterned with lush illustrations of bright red tomatoes, crisp green apples, and golden sunflowers—then neatly finished with a color-coordinated hand-stitched trim—each tea towel, placemat, and napkin pays homage to the hours we spend doting over our gardens. For Traditional TransferwarePrior TimeThere’s lots to love about this Massachusetts antiques shop, which admittedly skews slightly cottagecore (the pink Baccarat perfume bottles! the hobnail milk glass vases! the huge primitive bread boards!) but the standout, for us, is the seller’s superior selection of dinner and serving ware. In addition to a lovely lot of mottled white ironstone platters and pitchers, you’ll find a curated mix of Ridgeway and Wedgwood transferware dishes in not only classic cobalt blue, but beautiful browns, greens, and purples, too.Becky Luigart-Stayner for Country LivingPretty brown transferware could be yours with one quick "add to cart."For Folk Art for Your FloorsKinFolk ArtworkDesigned by a West Virginia watercolor and oils artist with a penchant for painting the past, these silky chenille floor mats feature an original cast of colonial characters and folksy scenes modeled after heirloom textiles from the 18th and 19th centuries. Expect lots of early American and patriotic motifs, including old-fashioned flags, Pennsylvania Dutch fraktur, equestrian vignettes, and colonial house samplers—each made to mimic a vintage hooked rug for that cozy, homespun feeling. (We have to admit, the folk art-inspired cow and chicken is our favorite.)For Historical ReproductionsSchooner Bay Co.Even in the most painstakingly appointed interior, buying antique originals isn’t always an option (don’t ask how many times we’ve been outbid at an estate auction). And that’s where this trusted Pennsylvania-based retailer for historical reproductions comes in. Offering a colossal collection of framed art prints, decorative trays, and brass objects (think magnifying glasses, compasses, paperweights, and letter openers), these connoisseurs of the classics have decor for every old-timey aesthetic, whether it’s fox hunt prints for your cabin, Dutch landscapes for your cottage, or primitive animal portraits for your farmstead.For General Store StaplesFarmhouse EclecticsHand-plucked from New England antique shops, estate sales, and auctions, the salvaged sundries from this Massachusetts-based supplier (who grew up in an 1850s farmhouse himself) are the type you might spy in an old country store—wooden crates emblazoned with the names of local dairies, antique apple baskets, seed displays, signs, and scales. Whether you’re setting up your farmstand or styling your entryway, you’ll have plenty of storage options and authentic accents to pick from here. Becky Luigart-Stayner for Country LivingSo many food scales, so little time.Related StoriesJackie BuddieJackie Buddie is a freelance writer with more than a decade of editorial experience covering lifestyle topics including home decor how-tos, fashion trend deep dives, seasonal gift guides, and in-depth profiles of artists and creatives around the globe. She holds a degree in journalism from the University of North Carolina at Chapel Hill and received her M.F.A. in creative writing from Boston University. Jackie is, among other things, a collector of curiosities, Catskills land caretaker, dabbling DIYer, day hiker, and mom. She lives in the hills of Bovina, New York, with her family and her sweet-as-pie rescue dog.
  • EPFL Researchers Unveil FG2 at CVPR: A New AI Model That Slashes Localization Errors by 28% for Autonomous Vehicles in GPS-Denied Environments

    Navigating the dense urban canyons of cities like San Francisco or New York can be a nightmare for GPS systems. The towering skyscrapers block and reflect satellite signals, leading to location errors of tens of meters. For you and me, that might mean a missed turn. But for an autonomous vehicle or a delivery robot, that level of imprecision is the difference between a successful mission and a costly failure. These machines require pinpoint accuracy to operate safely and efficiently. Addressing this critical challenge, researchers from the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland have introduced a groundbreaking new method for visual localization, presented at CVPR 2025.
    Their new paper, “FG2: Fine-Grained Cross-View Localization by Fine-Grained Feature Matching,” presents a novel AI model that significantly enhances the ability of a ground-level system, like an autonomous car, to determine its exact position and orientation using only a camera and a corresponding aerial (or satellite) image. The new approach has demonstrated a remarkable 28% reduction in mean localization error compared to the previous state-of-the-art on a challenging public dataset.
    Key Takeaways:

    Superior Accuracy: The FG2 model reduces the average localization error by a significant 28% on the VIGOR cross-area test set, a challenging benchmark for this task.
    Human-like Intuition: Instead of relying on abstract descriptors, the model mimics human reasoning by matching fine-grained, semantically consistent features—like curbs, crosswalks, and buildings—between a ground-level photo and an aerial map.
    Enhanced Interpretability: The method allows researchers to “see” what the AI is “thinking” by visualizing exactly which features in the ground and aerial images are being matched, a major step forward from previous “black box” models.
    Weakly Supervised Learning: Remarkably, the model learns these complex and consistent feature matches without any direct labels for correspondences. It achieves this using only the final camera pose as a supervisory signal.

    Challenge: Seeing the World from Two Different Angles
    The core problem of cross-view localization is the dramatic difference in perspective between a street-level camera and an overhead satellite view. A building facade seen from the ground looks completely different from its rooftop signature in an aerial image. Existing methods have struggled with this. Some create a general “descriptor” for the entire scene, but this is an abstract approach that doesn’t mirror how humans naturally localize themselves by spotting specific landmarks. Other methods transform the ground image into a Bird’s-Eye-View (BEV) but are often limited to the ground plane, ignoring crucial vertical structures like buildings.

    FG2: Matching Fine-Grained Features
    The EPFL team’s FG2 method introduces a more intuitive and effective process. It aligns two sets of points: one generated from the ground-level image and another sampled from the aerial map.

    Here’s a breakdown of their innovative pipeline:

    Mapping to 3D: The process begins by taking the features from the ground-level image and lifting them into a 3D point cloud centered around the camera. This creates a 3D representation of the immediate environment.
    Smart Pooling to BEV: This is where the magic happens. Instead of simply flattening the 3D data, the model learns to intelligently select the most important features along the vertical (height) dimension for each point. It essentially asks, “For this spot on the map, is the ground-level road marking more important, or is the edge of that building’s roof the better landmark?” This selection process is crucial, as it allows the model to correctly associate features like building facades with their corresponding rooftops in the aerial view.
    Feature Matching and Pose Estimation: Once both the ground and aerial views are represented as 2D point planes with rich feature descriptors, the model computes the similarity between them. It then samples a sparse set of the most confident matches and uses a classic geometric algorithm called Procrustes alignment to calculate the precise 3-DoF (x, y, and yaw) pose. A minimal, illustrative sketch of this alignment step follows this list.
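    To make that last step concrete, below is a minimal Java sketch of 2D Procrustes-style rigid alignment: given matched ground/BEV and aerial points, it recovers a 3-DoF pose (x, y and yaw) in closed form. This is an illustration only, not the authors' implementation; the class name, method names and toy landmark coordinates are assumptions.

    // Illustrative sketch only (not the FG2 authors' code): closed-form 2D
    // Procrustes/rigid alignment recovering translation (x, y) and yaw from
    // matched point pairs. All names and sample points are assumptions.
    public class Procrustes2DSketch {

        // Returns {tx, ty, yawRadians} mapping ground/BEV points onto aerial points.
        public static double[] align(double[][] ground, double[][] aerial) {
            int n = ground.length;
            double gx = 0, gy = 0, ax = 0, ay = 0;
            for (int i = 0; i < n; i++) {          // centroids of both point sets
                gx += ground[i][0]; gy += ground[i][1];
                ax += aerial[i][0]; ay += aerial[i][1];
            }
            gx /= n; gy /= n; ax /= n; ay /= n;

            double a = 0, b = 0;                   // terms that determine the yaw
            for (int i = 0; i < n; i++) {
                double px = ground[i][0] - gx, py = ground[i][1] - gy;
                double qx = aerial[i][0] - ax, qy = aerial[i][1] - ay;
                a += px * qx + py * qy;            // cosine component
                b += px * qy - py * qx;            // sine component
            }
            double yaw = Math.atan2(b, a);

            // Translation carries the rotated ground centroid onto the aerial centroid.
            double tx = ax - (Math.cos(yaw) * gx - Math.sin(yaw) * gy);
            double ty = ay - (Math.sin(yaw) * gx + Math.cos(yaw) * gy);
            return new double[] { tx, ty, yaw };
        }

        public static void main(String[] args) {
            // Three matched landmarks (e.g. curb corner, crosswalk, building edge),
            // with the aerial copies rotated 90 degrees and shifted by (5, 3).
            double[][] ground = { { 0, 0 }, { 1, 0 }, { 0, 2 } };
            double[][] aerial = { { 5, 3 }, { 5, 4 }, { 3, 3 } };
            double[] pose = align(ground, aerial);
            System.out.printf("t = (%.2f, %.2f), yaw = %.1f deg%n",
                    pose[0], pose[1], Math.toDegrees(pose[2]));
        }
    }

    In FG2's setting, the inputs would be the sparse, confident feature matches described above; practical pipelines typically use a weighted variant of this closed-form solution.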

    Unprecedented Performance and Interpretability
    The results speak for themselves. On the challenging VIGOR dataset, which includes images from different cities in its cross-area test, FG2 reduced the mean localization error by 28% compared to the previous best method. It also demonstrated superior generalization capabilities on the KITTI dataset, a staple in autonomous driving research.

    Perhaps more importantly, the FG2 model offers a new level of transparency. By visualizing the matched points, the researchers showed that the model learns semantically consistent correspondences without being explicitly told to. For example, the system correctly matches zebra crossings, road markings, and even building facades in the ground view to their corresponding locations on the aerial map. This interpretability is extremely valuable for building trust in safety-critical autonomous systems.
    “A Clearer Path” for Autonomous Navigation
    The FG2 method represents a significant leap forward in fine-grained visual localization. By developing a model that intelligently selects and matches features in a way that mirrors human intuition, the EPFL researchers have not only shattered previous accuracy records but also made the decision-making process of the AI more interpretable. This work paves the way for more robust and reliable navigation systems for autonomous vehicles, drones, and robots, bringing us one step closer to a future where machines can confidently navigate our world, even when GPS fails them.

    Check out the Paper. All credit for this research goes to the researchers of this project.
    Jean-marc Mommessin is a successful AI business executive. He leads and accelerates growth for AI-powered solutions and started a computer vision company in 2006. He is a recognized speaker at AI conferences and has an MBA from Stanford.
  • Inside the thinking behind Frontify Futures' standout brand identity

    Who knows where branding will go in the future? However, for many of us working in the creative industries, it's our job to know. So it's something we need to start talking about, and Frontify Futures wants to be the platform where that conversation unfolds.
    This ambitious new thought leadership initiative from Frontify brings together an extraordinary coalition of voices—CMOs who've scaled global brands, creative leaders reimagining possibilities, strategy directors pioneering new approaches, and cultural forecasters mapping emerging opportunities—to explore how effectiveness, innovation, and scale will shape tomorrow's brand-building landscape.
    But Frontify Futures isn't just another content platform. Excitingly, from a design perspective, it's also a living experiment in what brand identity can become when technology meets craft, when systems embrace chaos, and when the future itself becomes a design material.
    Endless variation
    What makes Frontify Futures' typography unique isn't just its custom foundation: it's how that foundation enables endless variation and evolution. This was primarily achieved, reveals developer and digital art director Daniel Powell, by building bespoke tools for the project.

    "Rather than rely solely on streamlined tools built for speed and production, we started building our own," he explains. "The first was a node-based design tool that takes our custom Frame and Hairline fonts as a base and uses them as the foundations for our type generator. With it, we can generate unique type variations for each content strand—each article, even—and create both static and animated type, exportable as video or rendered live in the browser."
    Each of these tools included what Daniel calls a "chaos element: a small but intentional glitch in the system. A microstatement about the nature of the future: that it can be anticipated but never fully known. It's our way of keeping gesture alive inside the system."
    One of the clearest examples of this is the colour palette generator. "It samples from a dynamic photo grid tied to a rotating colour wheel that completes one full revolution per year," Daniel explains. "But here's the twist: wind speed and direction in St. Gallen, Switzerland—Frontify's HQ—nudges the wheel unpredictably off-centre. It's a subtle, living mechanic; each article contains a log of the wind data in its code as a kind of Easter Egg."
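    As a rough illustration of how such a "living" wheel might be computed (our own sketch, not the Frontify team's code; the nudge formula, cap and wind inputs are assumptions), the hue can be driven by the day of the year and then offset by a wind reading:

    import java.time.LocalDate;

    // Illustrative sketch only: one full hue revolution per year, nudged by wind.
    // The nudge formula, cap and units are assumptions, not Frontify's implementation.
    public class LivingColourWheelSketch {

        // Maps today's date plus a wind reading to a hue in degrees [0, 360).
        public static double hueForToday(double windSpeedMs, double windDirectionDeg) {
            LocalDate today = LocalDate.now();
            double baseHue = 360.0 * (today.getDayOfYear() - 1) / today.lengthOfYear();

            // Wind direction sets the sign of the nudge, speed its size (capped so it stays subtle).
            double nudge = Math.min(windSpeedMs, 15.0) * Math.cos(Math.toRadians(windDirectionDeg));

            return ((baseHue + nudge) % 360.0 + 360.0) % 360.0;
        }

        public static void main(String[] args) {
            // A made-up St. Gallen reading: 4.2 m/s from 45 degrees.
            System.out.printf("Today's hue: %.1f deg%n", hueForToday(4.2, 45.0));
        }
    }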

    Another favourite of Daniel's—yet to be released—is an expanded version of Conway's Game of Life. "It's been running continuously for over a month now, evolving patterns used in one of the content strand headers," he reveals. "The designer becomes a kind of photographer, capturing moments from a petri dish of generative motion."
    Core Philosophy
    In developing this unique identity, two phrases stood out to Daniel as guiding lights from the outset. The first was, 'We will show, not tell.'
    "This became the foundation for how we approached the identity," recalls Daniel. "It had to feel like a playground: open, experimental, and fluid. Not overly precious or prescriptive. A system the Frontify team could truly own, shape, and evolve. A platform, not a final product. A foundation, just as the future is always built on the past."

    The second guiding phrase, pulled directly from Frontify's rebrand materials, felt like "a call to action," says Daniel. "'Gestural and geometric. Human and machine. Art and science.' It's a tension that feels especially relevant in the creative industries today. As technology accelerates, we ask ourselves: how do we still hold onto our craft? What does it mean to be expressive in an increasingly systemised world?"
    Stripped back and skeletal typography
    The identity that Daniel and his team created reflects these themes through typography that literally embodies the platform's core philosophy. "It really started from this idea of the future being built upon the 'foundations' of the past," he explains. "At the time Frontify Futures was being created, Frontify itself was going through a rebrand. With that, they'd started using a new variable typeface called Cranny, a custom cut of Azurio by Narrow Type."
    Daniel's team took Cranny and "pushed it into a stripped-back and almost skeletal take". The result was Cranny-Frame and Cranny-Hairline. "These fonts then served as our base scaffolding," he continues. "They were never seen in design, but instead, we applied decoration to them to produce new typefaces for each content strand, giving the identity the space to grow and allow new ideas and shapes to form."

    As Daniel saw it, the demands on the typeface were pretty simple. "It needed to set an atmosphere. We needed it to feel alive. We wanted it to be something shifting and repositioning. And so, while we have a bunch of static cuts of each base style, we rarely use them; the typefaces you see on the website and social only exist at the moment as a string of parameters to create a general style that we use to create live animating versions of the font generated on the fly."
    In addition to setting the atmosphere, it needed to be extremely flexible and feature live inputs, as a significant part of the branding is about the unpredictability of the future. So Daniel's team built in those aforementioned "chaos moments where everything from user interaction to live wind speeds can affect the font."
    Design Process
    The process of creating the typefaces is a fascinating one. "We started by working with the custom cut of Azurio (Cranny) from Narrow Type. We then redrew it to take inspiration from how a frame and a hairline could be produced from this original cut. From there, we built a type generation tool that uses them as a base.
    "It's a custom node-based system that lets us really get in there and play with the overlays for everything from grid-sizing, shapes and timing for the animation," he outlines. "We used this tool to design the variants for different content strands. We weren't just designing letterforms; we were designing a comprehensive toolset that could evolve in tandem with the content.
    "That became a big part of the process: designing systems that designers could actually use, not just look at; again, it was a wider conversation and concept around the future and how designers and machines can work together."

    In short, the evolution of the typeface system reflects the platform's broader commitment to continuous growth and adaptation. "The whole idea was to make something open enough to keep building on," Daniel stresses. "We've already got tools in place to generate new weights, shapes and animated variants, and the tool itself still has a ton of unused functionality.
    "I can see that growing as new content strands emerge; we'll keep adapting the type with them," he adds. "It's less about version numbers and more about ongoing movement. The system's alive; that's the point.
    A provocation for the industry
    In this context, the Frontify Futures identity represents more than smart visual branding; it's also a manifesto for how creative systems might evolve in an age of increasing automation and systematisation. By building unpredictability into their tools, embracing the tension between human craft and machine precision, and creating systems that grow and adapt rather than merely scale, Daniel and the Frontify team have created something that feels genuinely forward-looking.
    For creatives grappling with similar questions about the future of their craft, Frontify Futures offers both inspiration and practical demonstration. It shows how brands can remain human while embracing technological capability, how systems can be both consistent and surprising, and how the future itself can become a creative medium.
    This clever approach suggests that the future of branding lies not in choosing between human creativity and systematic efficiency but in finding new ways to make them work together, creating something neither could achieve alone.
  • How to Implement Insertion Sort in Java: Step-by-Step Guide

    Posted on June 13, 2025 by Tech World Times

    Sorting is important in programming. It helps organize data. Sorting improves performance in searching, analysis, and reporting. There are many sorting algorithms. One of the simplest is Insertion Sort.
    In this article, we will learn how to implement Insertion Sort in Java. We will explain each step in simple words. You will see examples and understand how it works.
    What Is Insertion Sort?
    Insertion Sort is a simple sorting algorithm. It works like how you sort playing cards. You take one card at a time and place it in the right position. It compares the current element with those before it. If needed, it shifts elements to the right. Then, it inserts the current element at the correct place.
    How Insertion Sort Works
    Let’s understand with a small list:
    Example List: [8, 3, 5, 1]
    Steps:

    The first element (8) is already sorted.
    Compare 3 with 8. Move 8 right. Insert 3 before it → [3, 8, 5, 1]
    Compare 5 with 8. Move 8 right. Insert 5 after 3 → [3, 5, 8, 1]
    Compare 1 with 8, 5, 3. Move them right. Insert 1 at start → [1, 3, 5, 8]
    Now the list is sorted!
    Why Use Insertion Sort?
    Insertion Sort is simple and easy to code. It works well for:

    Small datasets
    Nearly sorted lists
    Educational purposes and practice

    However, it is not good for large datasets. It has a time complexity of O(n²).
    Time Complexity of Insertion Sort

    Best Case (already sorted): O(n)
    Average Case: O(n²)
    Worst Case (reversed list): O(n²)

    It performs fewer steps in nearly sorted data.
    How to Implement Insertion Sort in Java
    Now let’s write the code for Insertion Sort in Java. We will explain each part.
    Step 1: Define a Class
    public class InsertionSortExample {
    // Code goes here
    }

    We create a class named InsertionSortExample.
    Step 2: Create the Sorting Method
    public static void insertionSort(int[] arr) {
        int n = arr.length;
        for (int i = 1; i < n; i++) {
            int key = arr[i];      // the value being inserted this pass
            int j = i - 1;

            // Shift larger elements one position to the right
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j = j - 1;
            }
            arr[j + 1] = key;      // drop the key into its correct slot
        }
    }

    Let’s break it down:

    arr[i] is the current value (stored in key).
    j starts from the previous index.
    While arr[j] > key, shift arr[j] one position to the right.
    Insert the key at the correct position.

    This logic sorts the array step by step.
    Step 3: Create the Main Method
    Now we test the code.
    public static void main(String[] args) {
        int[] numbers = {9, 5, 1, 4, 3};

        System.out.println("Before sorting:");
        printArray(numbers);

        insertionSort(numbers);

        System.out.println("After sorting:");
        printArray(numbers);
    }

    This method:

    Creates an array of numbers
    Prints the array before sorting
    Calls the sort method
    Prints the array after sorting

    Step 4: Print the Array
    Let’s add a helper method to print the array.
    public static void printArray(int[] arr) {
        for (int number : arr) {
            System.out.print(number + " ");
        }
        System.out.println();
    }

    Now you can see how the array changes before and after sorting.
    Full Code Example
    public class InsertionSortExample {

        public static void insertionSort(int[] arr) {
            int n = arr.length;
            for (int i = 1; i < n; i++) {
                int key = arr[i];
                int j = i - 1;

                while (j >= 0 && arr[j] > key) {
                    arr[j + 1] = arr[j];
                    j = j - 1;
                }
                arr[j + 1] = key;
            }
        }

        public static void printArray(int[] arr) {
            for (int number : arr) {
                System.out.print(number + " ");
            }
            System.out.println();
        }

        public static void main(String[] args) {
            int[] numbers = {9, 5, 1, 4, 3};

            System.out.println("Before sorting:");
            printArray(numbers);

            insertionSort(numbers);

            System.out.println("After sorting:");
            printArray(numbers);
        }
    }

    Sample Output
    Before sorting:
    9 5 1 4 3
    After sorting:
    1 3 4 5 9

    This confirms that the sorting works correctly.
    Advantages of Insertion Sort in Java

    Easy to implement
    Works well with small inputs
    Stable sort (keeps equal items in order)
    Good for educational use

    When Not to Use Insertion Sort
    Avoid Insertion Sort when:

    The dataset is large
    Performance is critical
    Better algorithms like Merge Sort or Quick Sort are available

    Real-World Uses

    Sorting small records in a database
    Teaching algorithm basics
    Handling partially sorted arrays

    Even though it is not the fastest, it is useful in many simple tasks.
    Final Tips

    Practice with different inputs
    Add print statements to see how it works
    Try sorting strings or objects
    Use Java’s built-in sort methods for large arrays (see the sketch just below)
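    Following up on the last two tips, here is a small sketch showing Java's built-in sorting on strings and on objects with a custom order. The class name and sample data are made up for illustration; for large arrays, these tuned library sorts are the better choice.

    import java.util.Arrays;
    import java.util.Comparator;

    // Small sketch of the last two tips above; the data is made up for illustration.
    public class BuiltInSortDemo {
        public static void main(String[] args) {
            // Sorting strings with the JDK's built-in sort (natural, alphabetical order)
            String[] names = { "banana", "apple", "cherry" };
            Arrays.sort(names);
            System.out.println(Arrays.toString(names));   // [apple, banana, cherry]

            // Sorting objects with a custom order instead of the natural one
            Integer[] numbers = { 9, 5, 1, 4, 3 };
            Arrays.sort(numbers, Comparator.reverseOrder());
            System.out.println(Arrays.toString(numbers)); // [9, 5, 4, 3, 1]
        }
    }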

    Conclusion
    Insertion Sort in Java is a great way to learn sorting. It is simple and easy to understand. In this guide, we showed how to implement it step-by-step. We covered the logic, code, and output. We also explained when to use it. Now you can try it yourself. Understanding sorting helps in coding interviews and software development. Keep practicing and exploring other sorting methods too. The more you practice, the better you understand algorithms.
    Tech World Times (TWT) is a global collective focusing on the latest tech news and trends in blockchain, Fintech, Development & Testing, AI and Startups. If you would like to contribute a guest post, contact techworldtimes@gmail.com.
    #how #implement #insertion #sort #java
    How to Implement Insertion Sort in Java: Step-by-Step Guide
    Posted on : June 13, 2025 By Tech World Times Uncategorized  Rate this post Sorting is important in programming. It helps organize data. Sorting improves performance in searching, analysis, and reporting. There are many sorting algorithms. One of the simplest is Insertion Sort. In this article, we will learn how to implement Insertion Sort in Java. We will explain each step in simple words. You will see examples and understand how it works. What Is Insertion Sort? Insertion Sort is a simple sorting algorithm. It works like how you sort playing cards. You take one card at a time and place it in the right position. It compares the current element with those before it. If needed, it shifts elements to the right. Then, it inserts the current element at the correct place. How Insertion Sort Works Let’s understand with a small list: Example List:Steps: First elementis already sorted. Compare 3 with 8. Move 8 right. Insert 3 before it →Compare 5 with 8. Move 8 right. Insert 5 after 3 →Compare 1 with 8, 5, 3. Move them right. Insert 1 at start →Now the list is sorted! Why Use Insertion Sort? Insertion Sort is simple and easy to code. It works well for: Small datasets Nearly sorted lists Educational purposes and practice However, it is not good for large datasets. It has a time complexity of O. Time Complexity of Insertion Sort Best Case: OAverage Case: OWorst Case: OIt performs fewer steps in nearly sorted data. How to Implement Insertion Sort in Java Now let’s write the code for Insertion Sort in Java. We will explain each part. Step 1: Define a Class javaCopyEditpublic class InsertionSortExample { // Code goes here } We create a class named InsertionSortExample. Step 2: Create the Sorting Method javaCopyEditpublic static void insertionSort{ int n = arr.length; for{ int key = arr; int j = i - 1; while{ arr= arr; j = j - 1; } arr= key; } } Let’s break it down: arris the current value. j starts from the previous index. While arr> key, shift arrto the right. Insert the key at the correct position. This logic sorts the array step by step. Step 3: Create the Main Method Now we test the code. javaCopyEditpublic static void main{ intnumbers = {9, 5, 1, 4, 3}; System.out.println; printArray; insertionSort; System.out.println; printArray; } This method: Creates an array of numbers Prints the array before sorting Calls the sort method Prints the array after sorting Step 4: Print the Array Let’s add a helper method to print the array. javaCopyEditpublic static void printArray{ for{ System.out.print; } System.out.println; } Now you can see how the array changes before and after sorting. Full Code Example javaCopyEditpublic class InsertionSortExample { public static void insertionSort{ int n = arr.length; for{ int key = arr; int j = i - 1; while{ arr= arr; j = j - 1; } arr= key; } } public static void printArray{ for{ System.out.print; } System.out.println; } public static void main{ intnumbers = {9, 5, 1, 4, 3}; System.out.println; printArray; insertionSort; System.out.println; printArray; } } Sample Output yamlCopyEditBefore sorting: 9 5 1 4 3 After sorting: 1 3 4 5 9 This confirms that the sorting works correctly. 
Advantages of Insertion Sort in Java Easy to implement Works well with small inputs Stable sortGood for educational use When Not to Use Insertion Sort Avoid Insertion Sort when: The dataset is large Performance is critical Better algorithms like Merge Sort or Quick Sort are available Real-World Uses Sorting small records in a database Teaching algorithm basics Handling partially sorted arrays Even though it is not the fastest, it is useful in many simple tasks. Final Tips Practice with different inputs Add print statements to see how it works Try sorting strings or objects Use Java’s built-in sort methods for large arrays Conclusion Insertion Sort in Java is a great way to learn sorting. It is simple and easy to understand. In this guide, we showed how to implement it step-by-step. We covered the logic, code, and output. We also explained when to use it. Now you can try it yourself. Understanding sorting helps in coding interviews and software development. Keep practicing and exploring other sorting methods too. The more you practice, the better you understand algorithms. Tech World TimesTech World Times, a global collective focusing on the latest tech news and trends in blockchain, Fintech, Development & Testing, AI and Startups. If you are looking for the guest post then contact at techworldtimes@gmail.com #how #implement #insertion #sort #java
    TECHWORLDTIMES.COM
    How to Implement Insertion Sort in Java: Step-by-Step Guide
    Posted on : June 13, 2025 By Tech World Times Uncategorized  Rate this post Sorting is important in programming. It helps organize data. Sorting improves performance in searching, analysis, and reporting. There are many sorting algorithms. One of the simplest is Insertion Sort. In this article, we will learn how to implement Insertion Sort in Java. We will explain each step in simple words. You will see examples and understand how it works. What Is Insertion Sort? Insertion Sort is a simple sorting algorithm. It works like how you sort playing cards. You take one card at a time and place it in the right position. It compares the current element with those before it. If needed, it shifts elements to the right. Then, it inserts the current element at the correct place. How Insertion Sort Works Let’s understand with a small list: Example List: [8, 3, 5, 1] Steps: First element (8) is already sorted. Compare 3 with 8. Move 8 right. Insert 3 before it → [3, 8, 5, 1] Compare 5 with 8. Move 8 right. Insert 5 after 3 → [3, 5, 8, 1] Compare 1 with 8, 5, 3. Move them right. Insert 1 at start → [1, 3, 5, 8] Now the list is sorted! Why Use Insertion Sort? Insertion Sort is simple and easy to code. It works well for: Small datasets Nearly sorted lists Educational purposes and practice However, it is not good for large datasets. It has a time complexity of O(n²). Time Complexity of Insertion Sort Best Case (already sorted): O(n) Average Case: O(n²) Worst Case (reversed list): O(n²) It performs fewer steps in nearly sorted data. How to Implement Insertion Sort in Java Now let’s write the code for Insertion Sort in Java. We will explain each part. Step 1: Define a Class javaCopyEditpublic class InsertionSortExample { // Code goes here } We create a class named InsertionSortExample. Step 2: Create the Sorting Method javaCopyEditpublic static void insertionSort(int[] arr) { int n = arr.length; for (int i = 1; i < n; i++) { int key = arr[i]; int j = i - 1; while (j >= 0 && arr[j] > key) { arr[j + 1] = arr[j]; j = j - 1; } arr[j + 1] = key; } } Let’s break it down: arr[i] is the current value (called key). j starts from the previous index. While arr[j] > key, shift arr[j] to the right. Insert the key at the correct position. This logic sorts the array step by step. Step 3: Create the Main Method Now we test the code. javaCopyEditpublic static void main(String[] args) { int[] numbers = {9, 5, 1, 4, 3}; System.out.println("Before sorting:"); printArray(numbers); insertionSort(numbers); System.out.println("After sorting:"); printArray(numbers); } This method: Creates an array of numbers Prints the array before sorting Calls the sort method Prints the array after sorting Step 4: Print the Array Let’s add a helper method to print the array. javaCopyEditpublic static void printArray(int[] arr) { for (int number : arr) { System.out.print(number + " "); } System.out.println(); } Now you can see how the array changes before and after sorting. 
    Full Code Example

        public class InsertionSortExample {
            public static void insertionSort(int[] arr) {
                int n = arr.length;
                for (int i = 1; i < n; i++) {
                    int key = arr[i];
                    int j = i - 1;
                    while (j >= 0 && arr[j] > key) {
                        arr[j + 1] = arr[j];
                        j = j - 1;
                    }
                    arr[j + 1] = key;
                }
            }

            public static void printArray(int[] arr) {
                for (int number : arr) {
                    System.out.print(number + " ");
                }
                System.out.println();
            }

            public static void main(String[] args) {
                int[] numbers = {9, 5, 1, 4, 3};
                System.out.println("Before sorting:");
                printArray(numbers);
                insertionSort(numbers);
                System.out.println("After sorting:");
                printArray(numbers);
            }
        }

    Sample Output

        Before sorting:
        9 5 1 4 3
        After sorting:
        1 3 4 5 9

    This confirms that the sorting works correctly.
    Advantages of Insertion Sort in Java
        Easy to implement
        Works well with small inputs
        Stable sort (keeps equal items in order)
        Good for educational use
    When Not to Use Insertion Sort
    Avoid Insertion Sort when:
        The dataset is large
        Performance is critical
        Better algorithms like Merge Sort or Quick Sort are available
    Real-World Uses
        Sorting small records in a database
        Teaching algorithm basics
        Handling partially sorted arrays
    Even though it is not the fastest, it is useful in many simple tasks.
    Final Tips
        Practice with different inputs
        Add print statements to see how it works
        Try sorting strings or objects (see the sketch at the end of this article)
        Use Java’s built-in sort methods for large arrays
    Conclusion
    Insertion Sort in Java is a great way to learn sorting. It is simple and easy to understand. In this guide, we showed how to implement it step by step. We covered the logic, code, and output. We also explained when to use it. Now you can try it yourself. Understanding sorting helps in coding interviews and software development. Keep practicing and exploring other sorting methods too. The more you practice, the better you understand algorithms.
    Tech World Times (TWT) is a global collective focusing on the latest tech news and trends in blockchain, Fintech, Development & Testing, AI and Startups. For guest post inquiries, contact techworldtimes@gmail.com.
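    Following up on the last two tips, here is a hedged sketch that goes beyond the original article: a generic insertion sort for Comparable types such as String, plus Java’s built-in Arrays.sort for large arrays. The class name InsertionSortExtras and the sample data are made up for this illustration.

        import java.util.Arrays;

        public class InsertionSortExtras {

            // Generic insertion sort: works for any Comparable type (String, Integer, ...).
            // Same shifting logic as the int version, but compares with compareTo.
            public static <T extends Comparable<T>> void insertionSort(T[] arr) {
                for (int i = 1; i < arr.length; i++) {
                    T key = arr[i];
                    int j = i - 1;
                    while (j >= 0 && arr[j].compareTo(key) > 0) {
                        arr[j + 1] = arr[j];
                        j--;
                    }
                    arr[j + 1] = key;
                }
            }

            public static void main(String[] args) {
                String[] names = {"Maya", "Alex", "Chris", "Bo"};
                insertionSort(names);
                System.out.println(Arrays.toString(names)); // [Alex, Bo, Chris, Maya]

                // For large arrays, the standard library sort is the practical choice.
                int[] big = {9, 5, 1, 4, 3};
                Arrays.sort(big); // O(n log n) on average, much faster than O(n²) for big inputs
                System.out.println(Arrays.toString(big)); // [1, 3, 4, 5, 9]
            }
        }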
  • Making a killing: The playful 2D terror of Psycasso®

    A serial killer is stalking the streets, and his murders are a work of art. That’s more or less the premise behind Psycasso®, a tongue-in-cheek 2D pixel art game from Omni Digital Technologies that’s debuting a demo at Steam Next Fest this week, with plans to head into Early Access later this year. Playing as the killer, you get a job and build a life by day, then hunt the streets by night to find and torture victims, paint masterpieces with their blood, then sell them to fund operations.
    I sat down with lead developer Benjamin Lavender and Omni, designer and producer, to talk about this playfully gory game that gives a classic retro style and a fresh (if gruesome) twist.
    Let’s start with a bit of background about the game.
    Omni: We wanted to make something that stands out. We know a lot of indie studios are releasing games and the market is ever growing, so we wanted to make something that’s not just fun to play, but catches people’s attention when others tell them about it. We’ve created an open-world pixel art game about an artist who spends his day getting a job, trying to fit into society. Then at nighttime, things take a more sinister turn and he goes around and makes artwork out of his victim's blood.
    We didn’t want to make it creepy and gory. We kind of wanted it to be cutesy and fun, just to make it ironic. Making it was a big challenge. We basically had to create an entire city with functioning shops and NPCs who have their own lives, their own hobbies. It was a huge challenge.
    So what does the actual gameplay look like?
    Omni: There’s a day cycle and a night cycle that breaks up the gameplay. During the day, you can get a job, level up skills, buy properties and furniture upgrades. At nighttime, the lighting completely changes, the vibe completely changes, there’s police on the street and the flow of the game shifts. The idea is that you can kidnap NPCs using a whole bunch of different weapons – guns, throwable grenades, little traps and cool stuff that you can capture people with.
    Once captured on the street, you can either harvest their blood and body parts there, or buy a specialist room to keep them in a cage and put them in various equipment like hanging chains or torture chairs. The player gets better rewards for harvesting blood and body parts this way.
    On the flip side, there’s a whole other element to the game where the player is given missions each week from galleries around the city. They come up on your phone menu, and you can accept them and do either portrait or landscape paintings, with all of the painting being done using only shades of red. We've got some nice drip effects and splat sounds to make it feel like you’re painting with blood. Then you can give your creation a name, submit it to a gallery, then it goes into a fake auction, people will bid on the artwork and you get paid a large amount of in-game money so you can then buy upgrades for the home, upgrade painting tools like bigger paint brushes, more selection tools, stuff like that.
    Ben: There’s definitely nothing like it. And that was the aim, is when you are telling people about it, they’re like, “Oh, okay. Right. We’re not going to forget about this.”
    Let’s dig into the 2D tools you used to create this world.
    Ben: It’s using the 2D Renderer. The Happy Harvest 2D sample project that you guys made was kind of a big starting point, from a lighting perspective, and doing the normal maps of the 2D and getting the lighting to look nice. Our night system is a very stripped-down, then added-on version of the thing that you guys made. I was particularly interested by its shadows. The building’s shadows aren’t actually shadows – it’s a black light. We tried to recreate that with all of our buildings in the entire open world – so it does look beautiful for a 2D game, if I do say so myself.
    Can you say a bit about how you’re using AI or procedural generation in NPCs?
    Ben: I don’t know how many actually made it into the demo to be fair, number-wise. Every single NPC has a unique identity, as in they all have a place of work that they go to on a regular schedule. They have hobbies, they have spots where they prefer to loiter, a park bench or whatever. So you can get to know everyone’s individual lifestyle. So, the old man that lives in the same building as me might love to go to the casino at nighttime or go consistently on a Monday and a Friday, that kind of vibe.
    It uses the A* Pathfinding Project, because we knew we wanted to have a lot of AIs. We’ve locked off most of the city for the demo, but the actual size of the city is huge. The police mechanics are currently turned off, but there’s 80% police mechanics in there as well. If you punch someone or hurt someone, that’s a crime, and if anyone sees it, they can go and report to the police and then things happen. That’s a feature that’s there but not demo-ready yet.
    How close would you say you are to a full release?
    Omni: We should be scheduled for October for early access. By that point we’ll have the stealth mechanics and the policing systems polished and in and get some of the other upcoming features buttoned up. We’re fairly close.
    Ben: Lots of it’s already done, it’s just turned off for the demo. We don’t want to overwhelm people because there’s just so much for the player to do.
    Tell me a bit about the paint mechanics – how did you build that?
    Ben: It is custom. We built it ourselves completely from scratch. But I can't take responsibility for that one – someone else did the whole thing – that was their baby. It is really, really cool though.
    Omni: It’s got a variety of masking tools, the ability to change opacity and spacing, you can undo, redo. It’s a really fantastic feature that gives people the opportunity to express themselves and make some great art.
    Ben: And it's gamified, so it doesn’t feel like you’ve just opened up Paint in Windows.
    Omni: Best of all is when you make a painting, it gets turned into an inventory item so you physically carry it around with you and can sell it or treasure it.
    What’s the most exciting part of Psycasso for you?
    Omni: Stunning graphics. I think graphically, it looks really pretty.
    Ben: Visually, you could look at it and go, “Oh, that’s Psycasso.”
    Omni: What we’ve done is taken a cozy retro-style game, and we’ve brought modern design, logic, and technology into it. So you're playing what feels like a nostalgic game, but you're getting the experience of a much newer project.
    Check out the Psycasso demo on Steam, and stay tuned for more NextFest coverage.
    UNITY.COM
  • How a US agriculture agency became key in the fight against bird flu

    A dangerous strain of bird flu is spreading in US livestock (MediaMedium/Alamy)
    Since Donald Trump assumed office in January, the leading US public health agency has pulled back preparations for a potential bird flu pandemic. But as it steps back, another government agency is stepping up.

    While the US Department of Health and Human Services (HHS) previously held regular briefings on its efforts to prevent a wider outbreak of a deadly bird flu virus called H5N1 in people, it largely stopped once Trump took office. It has also cancelled funding for a vaccine that would have targeted the virus. In contrast, the US Department of Agriculture (USDA) has escalated its fight against H5N1’s spread in poultry flocks and dairy herds, including by funding the development of livestock vaccines.
    This particular virus – a strain of avian influenza called H5N1 – poses a significant threat to humans, having killed about half of the roughly 1000 people worldwide who tested positive for it since 2003. While the pathogen spreads rapidly in birds, it is poorly adapted to infecting humans and isn’t known to transmit between people. But that could change if it acquires mutations that allow it to spread more easily among mammals – a risk that increases with each mammalian infection.
    The possibility of H5N1 evolving to become more dangerous to people has grown significantly since March 2024, when the virus jumped from migratory birds to dairy cows in Texas. More than 1,070 herds across 17 states have been affected since then.
    H5N1 also infects poultry, placing the virus in closer proximity to people. Since 2022, nearly 175 million domestic birds have been culled in the US due to H5N1, and almost all of the 71 people who have tested positive for it had direct contact with livestock.

    “We need to take this seriously because when [H5N1] constantly is spreading, it’s constantly spilling over into humans,” says Seema Lakdawala at Emory University in Georgia. The virus has already killed a person in the US and a child in Mexico this year.
    Still, cases have declined under Trump. The last recorded human case was in February, and the number of affected poultry flocks fell 95 per cent between then and June. Outbreaks in dairy herds have also stabilised.
    It isn’t clear what is behind the decline. Lakdawala believes it is partly due to a lull in bird migration, which reduces opportunities for the virus to spread from wild birds to livestock. It may also reflect efforts by the USDA to contain outbreaks on farms. In February, the USDA unveiled a $1 billion plan for tackling H5N1, including strengthening farmers’ defences against the virus, such as through free biosecurity assessments. Of the 150 facilities that have undergone assessment, only one has experienced an H5N1 outbreak.
    Under Trump, the USDA also continued its National Milk Testing Strategy, which mandates farms provide raw milk samples for influenza testing. If a farm is positive for H5N1, it must allow the USDA to monitor livestock and implement measures to contain the virus. The USDA launched the programme in December and has since ramped up participation to 45 states.
    “The National Milk Testing Strategy is a fantastic system,” says Erin Sorrell at Johns Hopkins University in Maryland. Along with the USDA’s efforts to improve biosecurity measures on farms, milk testing is crucial for containing the outbreak, says Sorrell.

    But while the USDA has bolstered its efforts against H5N1, the HHS doesn’t appear to have followed suit. In fact, the recent drop in human cases may reflect decreased surveillance due to workforce cuts, says Sorrell. In April, the HHS laid off about 10,000 employees, including 90 per cent of staff at the National Institute for Occupational Safety and Health, an office that helps investigate H5N1 outbreaks in farm workers.
    “There is an old saying that if you don’t test for something, you can’t find it,” says Sorrell. Yet a spokesperson for the US Centers for Disease Control and Prevention (CDC) says its guidance and surveillance efforts have not changed. “State and local health departments continue to monitor for illness in persons exposed to sick animals,” they told New Scientist. “CDC remains committed to rapidly communicating information as needed about H5N1.”
    The USDA and HHS also diverge on vaccination. While the USDA has allocated $100 million toward developing vaccines and other solutions for preventing H5N1’s spread in livestock, the HHS cancelled $776 million in contracts for influenza vaccine development. The contracts – terminated on 28 May – were with the pharmaceutical company Moderna to develop vaccines targeting flu subtypes, including H5N1, that could cause future pandemics. The news came the same day Moderna reported nearly 98 per cent of the roughly 300 participants who received two doses of the H5 vaccine in a clinical trial had antibody levels believed to be protective against the virus.
    The US has about five million H5N1 vaccine doses stockpiled, but these are made using eggs and cultured cells, which take longer to produce than mRNA-based vaccines like Moderna’s. The Moderna vaccine would have modernised the stockpile and enabled the government to rapidly produce vaccines in the event of a pandemic, says Sorrell. “It seems like a very effective platform and would have positioned the US and others to be on good footing if and when we needed a vaccine for our general public,” she says.

    The HHS cancelled the contracts due to concerns about mRNA vaccines, which Robert F Kennedy Jr – the country’s highest-ranking public health official – has previously cast doubt on. “The reality is that mRNA technology remains under-tested, and we are not going to spend taxpayer dollars repeating the mistakes of the last administration,” said HHS communications director Andrew Nixon in a statement to New Scientist.
    However, mRNA technology isn’t new. It has been in development for more than half a century and numerous clinical trials have shown mRNA vaccines are safe. While they do carry the risk of side effects – the majority of which are mild – this is true of almost every medical treatment. In a press release, Moderna said it would explore alternative funding paths for the programme.
    “My stance is that we should not be looking to take anything off the table, and that includes any type of vaccine regimen,” says Lakdawala.
    “Vaccines are the most effective way to counter an infectious disease,” says Sorrell. “And so having that in your arsenal and ready to go just give you more options.”
    WWW.NEWSCIENTIST.COM
  • How to delete your 23andMe data

    DNA testing service 23andMe has undergone serious upheaval in recent months, creating concerns for the 15 million customers who entrusted the company with their personal biological information. After filing for Chapter 11 bankruptcy protection in March, the company became the center of a bidding war that ended Friday when co-founder Anne Wojcicki said she’d successfully reacquired control through her nonprofit TTAM Research Institute for $305 million.
    The bankruptcy proceedings had sent shockwaves through the genetic testing industry and among privacy advocates, with security experts and lawmakers urging customers to take immediate action to safeguard their data. The company’s interim CEO revealed this week that 1.9 million people, around 15% of 23andMe’s customer base, have already requested their genetic data be deleted from the company’s servers.
    The situation became even more complex last week after more than two dozen states filed lawsuits challenging the sale of customers’ private data, arguing that 23andMe must obtain explicit consent before transferring or selling personal information to any new entity.
    While the company’s policies mean you cannot delete all traces of your genetic data — particularly information that may have already been shared with research partners or stored in backup systems — if you’re one of the 15 million people who shared their DNA with 23andMe, there are still meaningful steps you can take to protect yourself and minimize your exposure.
    How to delete your 23andMe data
    To delete your data from 23andMe, you need to log in to your account and then follow these steps:

    Navigate to the Settings section of your profile.
    Scroll down to the selection labeled 23andMe Data. 
    Click the View option and scroll to the Delete Data section.
    Select the Permanently Delete Data button.

    You will then receive an email from 23andMe with a link that will allow you to confirm your deletion request. 
    You can choose to download a copy of your data before deleting it.
    There is an important caveat, as 23andMe’s privacy policy states that the company and its labs “will retain your Genetic Information, date of birth, and sex as required for compliance with applicable legal obligations.”
    The policy continues: “23andMe will also retain limited information related to your account and data deletion request, including but not limited to, your email address, account deletion request identifier, communications related to inquiries or complaints and legal agreements for a limited period of time as required by law, contractual obligations, and/or as necessary for the establishment, exercise or defense of legal claims and for audit and compliance purposes.”
    This essentially means that 23andMe may keep some of your information for an unspecified amount of time. 
    How to destroy your 23andMe test sample and revoke permission for your data to be used for research
    If you previously opted to have your saliva sample and DNA stored by 23andMe, you can change this setting.
    To revoke your permission, go into your 23andMe account settings page and then navigate to Preferences. 
    In addition, if you previously agreed to 23andMe and third-party researchers using your genetic data and sample for research, you can withdraw consent from the Research and Product Consents section in your account settings. 
    While you can reverse that consent, there’s no way for you to delete that information.
    Check in with your family members
    Once you have requested the deletion of your data, it’s important to check in with your family members and encourage them to do the same because it’s not just their DNA that’s at risk of sale — it also affects people they are related to. 
    And while you’re at it, it’s worth checking in with your friends to ensure that all of your loved ones are taking steps to protect their data. 
    This story originally published on March 25 and was updated June 11 with new information.
    TECHCRUNCH.COM