• Thinking Machines Lab is ex-OpenAI CTO Mira Murati's new startup
    techcrunch.com
    Former OpenAI CTO Mira Murati has announced her new startup. Unsurprisingly, it's focused on AI.

Called Thinking Machines Lab, the startup, which came out of stealth today, intends to build tooling to make AI work for "[people's] unique needs and goals," and to create AI systems that are "more widely understood, customizable, and generally capable" than those currently available.

Murati is heading up Thinking Machines Lab as CEO. OpenAI co-founder John Schulman is the company's chief scientist, and Barret Zoph, OpenAI's ex-chief research officer, is CTO.

In a blog post shared with TechCrunch, Thinking Machines Lab wrote that while AI capabilities have advanced dramatically, key gaps remain. "The scientific community's understanding of frontier AI systems lags behind rapidly advancing capabilities," read the blog post. "Knowledge of how these systems are trained is concentrated within the top research labs, limiting both the public discourse on AI and people's abilities to use AI effectively. And, despite their potential, these systems remain difficult for people to customize to their specific needs and values."

Thinking Machines Lab plans to focus on building multimodal systems that work with people collaboratively, according to the blog post, and that can adapt to "the full spectrum of human expertise" and enable "a broader spectrum of applications."

"[W]e are building models at the frontier of capabilities in domains like science and programming," read the blog post. "Ultimately, the most advanced models will unlock the most transformative applications and benefits, such as enabling novel scientific discoveries and engineering breakthroughs."

AI safety will be another core tenet of Thinking Machines Lab's work. The company said that it plans to contribute to safety by preventing misuse of the models it releases, sharing best practices and recipes for how to build safe AI systems with the industry, and supporting external research on alignment by sharing code, datasets, and model specifications.

"We'll focus on understanding how our systems create genuine value in the real world," Thinking Machines Lab wrote in its blog post. "The most important breakthroughs often come from rethinking our objectives, not just optimizing existing metrics."

Murati left OpenAI last October after six years at the company. At the time, she said she was stepping away to do her "own exploration."

Murati came to OpenAI in 2018 as VP of applied AI and partnerships. After being promoted to CTO in 2022, she led the company's work on ChatGPT, the text-to-image AI DALL-E, and the code-generating system Codex, which powered early versions of GitHub's Copilot programming assistant.

Murati was briefly OpenAI's interim CEO after CEO Sam Altman's abrupt firing. Altman has described her as a close ally. For months, rumors have flown of Murati hiring high-profile AI researchers and staffers for an AI venture. Per a recent Wired piece, Murati has poached employees from rivals such as OpenAI, Character AI, and Google DeepMind.

Thinking Machines Lab's blog lists 29 employees, including Murati. The company is hiring machine learning scientists and engineers and a research program manager. At one point, Murati was said to be in talks to raise over $100 million from unnamed VC firms.

Prior to OpenAI, Murati spent three years at Tesla as a senior product manager of the Model X, the automaker's crossover SUV, during which Tesla released early versions of Autopilot, its AI-enabled driver-assistance software.
She also was VP of product and engineering at Leap Motion, a startup building hand- and finger-tracking motion sensors for PCs. Murati joins a growing list of former OpenAI execs launching startups, including rivals such as Ilya Sutskever's Safe Superintelligence and Anthropic.
  • VC giant Insight Partners confirms January cyberattack
    techcrunch.com
    U.S.-based venture capital firm Insight Partners has confirmed that hackers breached its systems in January.

When reached by TechCrunch, Insight Partners confirmed the cybersecurity incident in a statement published Tuesday. Calcalist first reported the breach earlier in the day.

In its statement, Insight Partners said it detected that an unauthorized third party had accessed certain Insight information systems "through a sophisticated social engineering attack" on January 16. The company did not specify what kind of social engineering attack it experienced.

"As soon as this incident was detected, we moved quickly to contain, remediate, and start an investigation within a matter of hours," the statement said.

The company, which has more than $90 billion in assets under management and has invested in various cybersecurity companies, including Armis and Wiz, says it has notified stakeholders about the breach.

Insight Partners hasn't confirmed the nature of the security incident and hasn't said whether any data was stolen from its systems. However, the company says it's encouraging its partners to employ tightened security protocols "irrespective of having shared data compromised," suggesting data may have been accessed.

Insight Partners spokesperson Natalie Sawyer declined to answer TechCrunch's questions, including whether the incident caused any disruptions to its business. Insight said in its published statement that it expects no additional disruption to its operations.
Wicked: Jonathan Fawkner, Visual Effects Supervisor & Creative Director, Framestore
    www.artofvfx.com
    By Vincent Frei - 18/02/2025

In 2021, Jonathan Fawkner shared insights into Framestore's visual effects for No Time To Die. Today, he returns to discuss his latest work on Wicked.

How did you and Framestore get involved on this show?

This film has been long in gestation. Framestore had worked with Universal and Mark Platt to pitch and greenlight Wicked as far back as 2020.

How did it feel to enter into the Wicked universe?

On our first meeting with Jon Chu he was immediately inclusive. He shared his casting videos of Cynthia Erivo and Ariana Grande; he shared his love of the genre and his passion for this story. He was also clear about how he would be relying on us; he put his trust in us, and it was inspiring. From the get-go we were fully invested. I knew there was common ground with the work I had done before and felt it was absolutely the right fit.

What are the sequences made by Framestore?

Framestore's sequences are interwoven throughout the film; we extended Nathan Crowley's sets for Munchkinland, including hectares of CG tulips, floating bubbles, and crowd expansion, plus the Shiz University extensions and wider environment, and the buildings of Emerald City as seen from the ground. Doctor Dillamond and the other animal faculty were all made at Framestore, including a caged lion cub and the Ozdust band. Framestore Pre-Production Services (FPS) were heavily involved with the film, and supplied previs and postvis on all of our sequences as well as the Defying Gravity number, which went to ILM for final VFX.

What is your role on set and how do you work with other departments?

We started shooting both parts in December 2022. Pablo Helman, Production VFX Supervisor, and I worked side by side, largely splitting the work across the sequences that ILM and Framestore would handle. I spent a lot of time with 2nd Unit, who were providing a lot of VFX plates. I often found that I was the one member of the 2nd unit crew present for the main unit shoot, providing valuable eyes and ears. It also meant that I could add my own priorities to the shot list, knowing what I would need. We finished shooting in January 2024 after a small hiatus for the actors' strikes.

Can you tell us about the creative process behind designing the magical landscapes of Wicked? How did you approach capturing the unique aesthetic of Oz?

Nathan Crowley had already established that Munchkinland would be the heart of the textile dye industry of Oz (the colour comes from tulips). He planted 9 million tulips, which provided the backdrop for closeup and medium shots of the cast. For the wides we needed to blend the flat Norfolk landscape into the rolling hills of Munchkinland. This meant our Montreal team had to not only build what must have been billions of our own digital tulips but also match the cadence of the subtle wind interaction, to effect a seamless blend from the Norfolk location to the Hertfordshire backlot. Then it was a case of designing a pleasing pattern for the fields and landscape cues to reinforce the epic scale.

What were the biggest challenges in creating the environments for Wicked? Were there any specific scenes that required particularly complex solutions?

The sets Nathan Crowley built were enormous and richly detailed, and so the extensions had to follow suit.
Our design team came on in post-production, so channelling the style and craftsmanship in camera is always a challenge: we need to honour the aesthetic while also motivating the camera after the fact. It's retrospective.

Big as the sets were, they still left a lot to top up. The sort of light you get on an exterior set that stops at 50 is very different to one that is surrounded by tall buildings, so extensions are not simply a case of copy and paste. Shadows and ambient light are completely different, camera flares are no longer motivated, etc. Keeping it photoreal requires some sleight of hand on each shot. Throughout the film we manipulated the plate to add direct light, extend shadows, and design architectural glass to reflect light into the lens to replicate set lights placed by Alice Brooks (Director of Photography). What we inherited from production design was a beautiful architect's sketch. What we needed was to act like an architectural designer: to assign the actualities. All the details, from window frames to gutters to sculptures, had to have little Ozian touches and genuine appeal. To keep it cost effective, the designs needed to work across multiple shots that were not necessarily framed with that in mind. As ever with set design in post, we don't have the luxury of letting the camera react to the set. We have to design the set to work with ALL the cameras after the fact.

How did Framestore collaborate with the production team to ensure the environments complemented the storytelling and characters?

The sets for certain events were physically separate at the studio but connected in Oz, and across two movies too, so we were keen to offer environment solutions that kept everything connected and consistent while remaining compositionally satisfying. Jon Chu is very good at appreciating the implications of any particular design choice, and we made sure to walk him through many options before embarking on long digital builds. We always start with architectural solutions, and he's excellent at distilling the essence of the design issues but also reminding us of the story points that they serve.

Were there any practical sets blended with your CG environments, and how did you achieve seamless integration?

This is the case on nearly every set. Our sets were huge and in most cases work continued on them even after the unit started there. We were careful to match materials from the set, but we also had shots where we were able to achieve a richer finish than the painted fibreglass, so our work replaced areas of the set that were perhaps not finished, or were never expected to feature, but did.

Did you use any new technologies or techniques to enhance the realism and magic of the environments in Wicked?

Wicked is fantastical, and while we always strive to replicate the reality of a physical scene, no one on this production ever pretended that Oz was a real place. That encourages the film-makers to allow themselves license to lean into design, even at the expense of realism. That said, our work succeeds when it looks as hyper-real as the physical production design or lighting and brings that same sense of delight that Jon is always striving for.

For the digital crowd I used a technique that I had not previously encountered. Partnering with Dimension Studios, we photographed the background cast volumetrically. This enabled richly detailed cloth and facial performances without the need for complex rigs or crowd sims. Relighting was made easy by splitting geometry and texture.
What was harder was separating different materials. On Wicked we utilised machine learning techniques to help with the separation, and this meant thousands of frames of dancing crowds did not pose quite so much of a problem as they might have done in years past. The limitation with this approach ended up being logistical. With a small window where cast, wardrobe, make-up and catering were available, there was a limit on data storage and bandwidth available on Dimension's mobile studio setup. This meant we could only store one minute of performance per agent. We had at least one minute of action we wanted from our crowd, so I had to compose a concise menu of actions to be performed without breaks and without error. To make sure it was even possible, I put myself in costume and became one of the crowd's munchkins.

Doctor Dillamond is such a unique character. How did you approach his design and animation to balance his animalistic traits with his human-like qualities?

Our Art Department had produced the concepts for Doctor Dillamond, so in VFX we began to cast about for a real goat to form the base for our character. Bizarrely, I happen to have a goat sanctuary down the road from where I live, and they had a goat that fit the bill with the right size and shape. This meant we had a basis for believability, which was always Jon's starting point.

Dale Newton, our Animation Supervisor, and the animation team constantly adapted what they wanted out of facial muscles and groom from the asset team to capture a particular expression. Out of this Dillamond slowly appeared. We also worked with the costume department to get samples of the different fabrics and designs of the human professors at Shiz, so that we could do our best to match Dillamond's attire to that of his colleagues.

What specific challenges did you face in bringing Doctor Dillamond to life, both visually and in terms of performance?

Doctor Dillamond does not have much screen time, but in many ways he embodies the emotional MacGuffin for Wicked. Elphaba's empathy with animals has to be quickly established, and her sympathy with the plight of this animal in particular really has to land in order to achieve that.

Characters based on animals with monocular vision are often anthropomorphised for cinema to bring their eyes and the emotional read closer to that of a human. I was very glad that Jon was not interested in this, because it gave us an opportunity to deliver a performance on a creature that did not require too great a leap of faith from the audience. I mean, he talks and sings, but he wasn't especially designed visually for that.

When Peter Dinklage delivered his read, it gave us waypoints to hit in terms of expression, but the animation team were also constantly proposing alternatives to Jon. You read a lot more subtlety in a human face than you do on a goat, so we couldn't just lift from Peter directly. We had to draw on smaller physical traits of his character to further enhance the overall performance. His ears are lovely signifiers, his eyes can look particularly askance, and his glasses are a useful prop to overtly gaze over. This is the stuff that was particularly useful in his reaction shots, when there is no vocal performance to match.

Can you elaborate on how Framestore worked with the actors and directors to ensure Doctor Dillamond's movements and expressions aligned with the narrative?

Whenever we had Doctor Dillamond or any of his faculty in a scene, we had performers on set who would provide blocking with the actors.
This enabled Jon to see his scene quite vividly and respond to the cast and the decisions they made, even if it was a tiny monkey on a stick. Of course, we tried to get clean passes with very minimal markers, relying on the muscle memory of the actors, but inevitably some of the best takes had the puppeteers still in the frame, which called for some quite heroic paint-out work.

When patience and time allowed, usually on 2nd unit, we would record the output of the camera during the puppet pass and, in an act of impressive low-tech efficiency, simply play it back on a phone mounted on the wheels to achieve the same sense of reaction on an empty plate.

Were there other creatures in Wicked that posed unique challenges for Framestore? Could you share some insights into their creation?

The most fun we had with a lot of the creatures was finding what aspect of them, to a greater or lesser extent, made them Ozian. It involved a huge amount of design work. Deviating from nature is always a delicate balance, which by and large Jon was attuned to, so giving a giraffe a mullet or a snow leopard a Dali moustache was lots of fun. It's not easy to know what should come first when you are casting about for animals for an animal band: visual interest, or musical viability. Who knew chickens would play an Ozian piano, or a sugar glider the drums? And not just any chickens either. They had to be the most elaborate show chickens we could find! (They had some at the goat sanctuary, which was handy!)

Were there any unexpected technical or creative challenges encountered during the production?

We don't have many massive full-CG shots. Some, but not many. Even most of our establishers feature an element of main unit photography. With a singing cast and often a huge background chorus, it really did not leave a lot of time or headspace for the film-makers to pause to consider visual effects. We had to fight for attention next to the other crafts, who were all operating at capacity.

My back garden became an unofficial backlot, and now that my iPhone can shoot at 24fps underwater in log, I was able to supplement the effects. We had a pyro shoot for fireworks, my daughter played Glinda's hand cutting through the surface of the water, and I played the Wizard projection emerging from a cloud of mist during the Wizomania show. It was low-tech solutions that were made possible by high-tech advances.

Looking back on the project, what aspects of the visual effects are you most proud of?

I got pretty involved in this film. Thanks to so many generous people, from Pablo and his team, to Jon, the 2nd unit team, our amazing data wranglers led by Chris Lynch, the Clear Angle team, as well as my own wonderful Framestore and FPS teams, I am so honoured to have my fingerprints everywhere. There's no one aspect that stands out.

How long have you worked on this show?

It's about 2.5 years to date.

What's the VFX shot count?

Framestore delivered 870 shots for the film.

What is your next project?

We'll be going back to Oz for Part Two!

A big thanks for your time.

WANT TO KNOW MORE?
Framestore: Dedicated page about Wicked on the Framestore website.

© Vincent Frei - The Art of VFX - 2025
• Timing is Everything in Eric Kogan's Coincidental Photos of New York City
    www.thisiscolossal.com
    On daily walks around New York City, Eric Kogan has a knack for finding unexpected moments of humor and happenstance. His playful photographs (previously) capture visual coincidences and interactions between his urban surroundings and nature. From clouds seemingly cradled by electrical wires to the moon balanced precariously on the corner of a building, Kogan's scenes highlight how perspective, light, and excellent timing can capture a lighthearted, even mischievous view of the city. Find more on his website and Instagram. Do stories and artists like this matter to you? Become a Colossal Member today and support independent arts publishing for as little as $7 per month.
  • DeepSeek-R1: Budgeting challenges for on-premise deployments
    www.computerweekly.com
    DeepSeek-R1: Budgeting challenges for on-premise deployments

The availability of the DeepSeek-R1 large language model shows it's possible to deploy AI on modest hardware. But that's only half the story.

By Cliff Saran, Managing Editor. Published: 18 Feb 2025 17:00

Until now, IT leaders have needed to consider the cyber security risks posed by allowing users to access large language models (LLMs) like ChatGPT directly via the cloud. The alternative has been to use open source LLMs that can be hosted on-premise or accessed via a private cloud.

The artificial intelligence (AI) model needs to run in-memory and, when using graphics processing units (GPUs) for AI acceleration, this means IT leaders need to consider the costs associated with purchasing banks of GPUs to build up enough memory to hold the entire model. Nvidia's high-end AI acceleration GPU, the H100, is configured with 80Gbytes of random-access memory (RAM), and its specification shows it's rated at 350W in terms of energy use.

China's DeepSeek has been able to demonstrate that its R1 LLM can rival US artificial intelligence without the need to resort to the latest GPU hardware. It does, however, benefit from GPU-based AI acceleration.

Nevertheless, deploying a private version of DeepSeek still requires significant hardware investment. Running the entire 671 billion-parameter DeepSeek-R1 model in-memory requires 768Gbytes of memory. With Nvidia H100 GPUs, which are configured with 80Gbytes of video memory each, 10 would be required to ensure the entire DeepSeek-R1 model can run in-memory. IT leaders may well be able to negotiate volume discounts, but the cost of just the AI acceleration hardware to run DeepSeek is around $250,000.

Less powerful GPUs can be used, which may help to reduce this figure. But given current GPU prices, a server capable of running the complete 671 billion-parameter DeepSeek-R1 model in-memory is going to cost over $100,000.

The server could be run on public cloud infrastructure. Azure, for instance, offers access to the Nvidia H100 with 900Gbytes of memory for $27.167 per hour, which, on paper, should easily be able to run the 671 billion-parameter DeepSeek-R1 model entirely in-memory. If this model is used every working day, and assuming a 35-hour week and four weeks a year of holidays and downtime, the annual Azure bill would be almost $46,000. This figure could be reduced significantly, to $16.63 per hour ($23,000 per year), with a three-year commitment.

Less powerful GPUs will clearly cost less, but it's the memory costs that make these prohibitive. For instance, looking at current Google Cloud pricing, the Nvidia T4 GPU is priced at $0.35 per GPU per hour and is available with up to four GPUs, giving a total of 64Gbytes of memory for $1.40 per hour; 12 such instances would be needed to fit the 671 billion-parameter DeepSeek-R1 model entirely in-memory, which works out at $16.80 per hour. (A back-of-envelope sketch of this arithmetic appears below.)
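To make the arithmetic above easy to check, here is a minimal Python sketch of the memory and cost figures quoted in the article. The 35-hour week and 48 working weeks are the article's own usage assumptions; the helper function name is mine, and this is illustrative rather than a procurement calculator.

```python
import math

# Back-of-envelope sketch of the article's sizing and cost arithmetic.
# All input figures come from the article itself.

WORKING_SET_GB = 768   # memory the article budgets to hold DeepSeek-R1 in-memory
H100_GB = 80           # video memory per Nvidia H100

# GPUs needed to hold the working set entirely in video memory
print("H100s needed:", math.ceil(WORKING_SET_GB / H100_GB))  # -> 10

def annual_cost(rate_per_hour: float, hours_per_week: int = 35, weeks: int = 48) -> float:
    """Annual cloud bill, assuming a 35-hour week and 48 working weeks
    (the article's 'four weeks a year of holidays and downtime')."""
    return rate_per_hour * hours_per_week * weeks

print(f"Azure H100 on-demand:  ${annual_cost(27.167):,.0f}/yr")  # ~$45,640 ("almost $46,000")
print(f"GCP 12x 4-T4 on-demand: ${annual_cost(16.80):,.0f}/yr")
print(f"GCP T4, 3-yr commit:   ${annual_cost(7.68):,.0f}/yr")    # ~$12,902 ("just under $13,000")
```

Note that the article's $23,000-per-year figure for the committed Azure rate implies somewhat different usage assumptions than the 1,680 hours used here; the T4 figures, by contrast, reproduce the article's numbers exactly.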
With a three-year commitment, that T4 figure comes down to $7.68 per hour, which works out at just under $13,000 per year.

A cheaper approach

IT leaders can reduce costs further by avoiding expensive GPUs altogether and relying entirely on general-purpose central processing units (CPUs). This setup is really only suitable when DeepSeek-R1 is used purely for AI inference.

A recent tweet from Matthew Carrigan, machine learning engineer at Hugging Face, suggests such a system could be built using two AMD Epyc server processors and 768Gbytes of fast memory. The system he presented in a series of tweets could be put together for about $6,000.

Responding to comments on the setup, Carrigan said he is able to achieve a processing rate of six to eight tokens per second, depending on the specific processor and memory speed installed. It also depends on the length of the natural language query, but his tweet includes a video showing near-real-time querying of DeepSeek-R1 on the hardware he built based on the dual AMD Epyc setup and 768Gbytes of memory.

Carrigan acknowledges that GPUs will win on speed, but they are expensive. In his series of tweets, he points out that the amount of memory installed has a direct impact on performance, due to the way DeepSeek caches previous context to get to answers quicker. The technique is called key-value (KV) caching. "In testing with longer contexts, the KV cache is actually bigger than I realised," he said, suggesting that the hardware configuration would require 1Tbyte of memory instead of 768Gbytes when huge volumes of text or context are pasted into the DeepSeek-R1 query prompt. (A rough KV-cache sizing sketch appears below.)

Buying a prebuilt Dell, HPE or Lenovo server to do something similar is likely to be considerably more expensive, depending on the processor and memory configurations specified.

A different way to address memory costs

Among the approaches that can be taken to reduce memory costs is using multiple tiers of memory controlled by a custom chip. This is what California startup SambaNova has done using its SN40L Reconfigurable Dataflow Unit (RDU) and a proprietary dataflow architecture for three-tier memory.

"DeepSeek-R1 is one of the most advanced frontier AI models available, but its full potential has been limited by the inefficiency of GPUs," said Rodrigo Liang, CEO of SambaNova.

The company, which was founded in 2017 by a group of ex-Sun/Oracle engineers and has an ongoing collaboration with Stanford University's electrical engineering department, claims the RDU chip collapses the hardware requirements to run DeepSeek-R1 efficiently from 40 racks down to one rack configured with 16 RDUs.

Earlier this month at the Leap 2025 conference in Riyadh, SambaNova signed a deal to introduce Saudi Arabia's first sovereign LLM-as-a-service cloud platform. Saud AlSheraihi, vice-president of digital solutions at Saudi Telecom Company, said: "This collaboration with SambaNova marks a significant milestone in our journey to empower Saudi enterprises with sovereign AI capabilities. By offering a secure and scalable inferencing-as-a-service platform, we are enabling organisations to unlock the full potential of their data while maintaining complete control."

This deal with the Saudi Arabian telco provider illustrates how governments need to consider all options when building out sovereign AI capacity.
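Returning to Carrigan's KV-cache observation: a rough way to see why long contexts blow past the weight budget is the standard transformer KV-cache formula sketched below. The layer and head counts are illustrative assumptions for a generic dense model, not DeepSeek-R1's actual configuration.

```python
# Rough transformer KV-cache sizing: 2 entries (keys + values) per layer,
# per KV head, per token. Layer/head numbers below are illustrative
# assumptions only, not DeepSeek-R1's real architecture.

def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                seq_len: int, bytes_per_elem: int = 2) -> float:
    return 2 * layers * kv_heads * head_dim * seq_len * bytes_per_elem / 1e9

# Hypothetical 60-layer model with 8 KV heads of dim 128, fp16 cache
for ctx in (4_096, 131_072):
    print(f"{ctx:>7} tokens -> {kv_cache_gb(60, 8, 128, ctx):.1f} GB")
# The cache grows linearly with context length, which is why Carrigan's
# long-context tests needed far more memory than he initially budgeted.
```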
DeepSeek has demonstrated that there are alternative approaches that can be just as effective as the tried and tested method of deploying immense and costly arrays of GPUs. And while DeepSeek-R1 does run better when GPU-accelerated AI hardware is present, SambaNova is claiming there is also an alternative way to achieve the same performance for running models like DeepSeek-R1 on-premise, in-memory, without the cost of acquiring GPUs fitted with the memory the model needs.
  • Download your Kindle books ASAP - before Amazon kills this feature next week
    www.zdnet.com
    The clock is ticking for Kindle users. After February 2025, a long-standing feature disappears. Will this change how you buy and store digital books? Read on to find out.
  • I've tested many Ring cameras but one of the best ones is half off for a limited time
    www.zdnet.com
    You can save $90 at Amazon on what is arguably one of the best Ring cameras available -- the Ring Stick Up Cam Pro.
  • AOHi Reveals 10,000mAh Starship Pro Power Bank With 210W Output
    www.forbes.com
    The Future Starship Pro is a 10,000mAh power bank with an output of up to 210W. (Image: AOHi)

AOHi has launched the Future Starship Pro, an advanced 10,000mAh power bank with an unprecedented 210W total power output, redefining what is possible in mobile charging. Designed for travelers, professionals and tech enthusiasts, the Starship Pro is compact and powerful.

Built for efficiency, speed and reliability, the Starship Pro has dual 140W USB-C ports that support simultaneous high-power device charging. There is a 90W ultra-fast input for rapid recharging of the Starship Pro and an integrated smart display that provides real-time power monitoring.

Breakthrough Power and Speed in a Compact Form

The Starship Pro offers an industry-first 210W dual power output, ensuring users can charge laptops, gaming consoles, tablets and smartphones at full speed. The inclusion of two 140W USB-C ports means that even power-intensive devices like two MacBooks can be charged simultaneously without experiencing reduced performance. This charging capability is particularly useful for anyone who relies on high-powered technology throughout the working day, as well as gamers and travelers who need reliable, on-the-go charging.

Equipped with a 90W rapid input system, the Starship Pro recharges to 60% capacity in just 15 minutes and achieves a full charge in under 30 minutes when paired with a compatible fast charger. This improved charging time drastically reduces downtime, making it handy when quick and efficient power top-ups are needed throughout the day.

This new power bank can charge and power two laptops at once. (Image: AOHi)

Designed for Portability Without Compromise

Balancing high performance with portability, the Starship Pro is still lightweight and compact enough for traveling. The device is fully compliant with airline regulations, enabling users to bring it on board during flights without restriction. This makes it a useful accessory for frequent travelers who need a dependable power source during a flight.

As well as being portable, the Starship Pro has an intelligent display system, which provides real-time input and output power readings as well as dynamic visual animations. This enables precise monitoring of charging activity.

The Future Starship Pro is powered by automotive-grade INR18650-25P lithium-ion cells, a battery technology that is both durable and efficient. Even after three years of regular use, AOHi claims the battery retains over 80% of its original capacity, sustaining performance over time.

The Future Starship Pro power bank can slip into a pocket and provide power anywhere. (Image: AOHi)

To ensure safety and regulatory compliance, the power bank is backed by FCC, CE, CCC and PSE certifications for electrical safety, electromagnetic compatibility and overheating protection. These safety measures protect the device and the user from potential hazards.

Additionally, the inclusion of a high-performance 140W USB-C cable makes the charger more convenient to use. Some power banks fail to deliver their maximum wattage because most USB cables are rated too low. With the provided EPR-certified cable, users can take full advantage of the Starship Pro's maximum output without worrying about inefficiencies or compatibility.

AOHi says the Future Starship Pro is engineered to make portable charging fuss-free.
Many users experience slow charge times, limited compatibility and bulky designs with some power banks, but AOHi says its latest innovation eliminates these barriers, offering a fast, compact and intelligent charging solution.

Pricing and Availability

The Future Starship Pro is available on Kickstarter, where early backers can access launch discounts from $79. For more information and campaign updates, visit AOHi's official Kickstarter page. As always with any crowdfunding campaign, be sure to carry out due diligence.
  • Meet The Metaverse: Practical Business And Social Applications
    www.forbes.com
    Through immersive and realistic simulations, the metaverse may soon open up global experiences, unlock innovative business strategies and foster new communities.
  • OnePlus Watch 3
    www.techspot.com
    Reviewers Liked
    - Battery life better than ever
    - Nice design with high-end build
    - Beautiful LTPO display
    - Comprehensive health and fitness features
    - Well-priced compared to rival smartwatches

    Reviewers Didn't Like
    - No cellular or alternative size options
    - Cheap, awkward charging puck
    - Lacks some common safety features for sleep apnea, falling or crashes
    - Only two years of software updates
    - Wellness score can be hit-and-miss
    - Weak speaker and haptics
    - No ECG in US and Canada

    Expert reviews and ratings

    80 - Tom's Guide (February 18, 2025): OnePlus has bumped the specs of its smartwatch across the board to make an already impressive wearable one of the greatest. It doesn't mind what phone you use (as long as it's not from Apple), or if you need to spend days away from a charger while still tracking dozens of health metrics; the OnePlus Watch 3 can handle it. As long as you can handle missing out on a few features found on Apple, Pixel and Galaxy Watches, and the large, single size option.

    90 - DigitalTrends (February 18, 2025): The OnePlus Watch 3 perfectly blends high-quality design and materials with superb performance and technical ability, then adds in fantastic battery life for a killer product.

    90 - Wired (February 18, 2025): It's too bad that OnePlus is only committing to two years of software updates. Samsung and Apple support their smartwatches for longer, and this would pair well with the OnePlus battery longevity story. Still, this is a sacrifice I'm willing to let slide because, guess what? I don't have to charge my damn watch every day.

    90 - TrustedReviews (February 18, 2025): The OnePlus Watch 3 not only fixes the few complaints we had about last year's Watch 2, but adds a swathe of new features and functionality, including a more durable design, improved tracking, multiband GPS and a boosted 120-hour battery life that perfects the Wear OS 5 experience on offer.

    90 - HotHardware (February 18, 2025): The OnePlus Watch 3 is one of the very best smartwatches on the market, and the new Wear OS battery life champion.

    80 - PCMag (February 18, 2025): The OnePlus Watch 3 catches up to Wear OS competitors with features like AI-powered health assessments and fall detection while offering class-leading battery life.

    70 - Wareable (February 18, 2025): The OnePlus Watch 3 is a solid upgrade, offering outstanding battery life and smooth software. Yet it fails to address the key shortcomings of its predecessor. The design remains bulky, LTE is absent, and health tracking, while much improved, is still largely behind Samsung and Google. At this price, it's hard to justify over the Pixel Watch 3 or Galaxy Watch 7, which both offer better features for the same or lower cost. If battery life is your top priority, the Watch 3 is worth considering; otherwise, better options exist.

    100 - Stuff (February 18, 2025): A more refined take on what was already a fantastic Wear OS watch. The OnePlus Watch 3 lasts longer, looks slicker, and is even more fitness-focused.