• MSI introduces "latency killer" to improve DDR5 RAM performance
    In brief: DDR5 memory chips generally have higher latency than DDR4, but this usually doesn't pose a problem under normal usage. However, MSI has released a firmware update to address a mild latency increase introduced last month. Chinese website Uniko's Hardware recently spotted a new feature in MSI firmware for AM5 motherboards called "Latency Killer," which is designed to improve RAM latency. According to its description, Latency Killer can improve memory latency but may have a negative impact on CPU performance. However, most users are unlikely to notice any significant difference in everyday use.

The Latency Killer feature offers three settings: Enabled, Disabled, or Auto. The option is intended to restore latency levels to those seen on AM5 motherboards before the recent AGESA 1202a update, which increased latency by around 10 nanoseconds. Some users had been flashing their motherboards with older AGESA versions to regain the lost performance.

Corsair explains that a DRAM module's latency refers to how many clock cycles it takes to access a specific set of data stored in RAM. CAS (Column Address Strobe) latency is a key specification that indicates how many clock cycles are needed for the DRAM to provide the requested data to the CPU. For example, a CAS latency of 34 means the module needs 34 RAM clock cycles to fulfill a data request from the processor.

DDR5 RAM generally has higher CAS latency than DDR4 memory, but it still offers significant performance benefits thanks to higher power efficiency, greater storage density, and much faster clock speeds. In practical terms, a 10-nanosecond reduction in latency is unlikely to have a noticeable impact on everyday tasks or gaming performance. The new Latency Killer option, then, is essentially a way to claw back the latency lost to the AGESA 1202a update.
However, AMD is working on a new firmware revision that will address the latency increase without requiring users to enable or disable any BIOS options.

The AGESA 1.2.0.2a microcode update, released a few weeks ago, added support for Ryzen 9000X3D "Granite Ridge" CPUs on AM5 motherboards. The Granite Ridge architecture, based on Zen 5, is a powerful CPU design with a generous amount of integrated L3 cache, which should help offset the additional latency introduced by AMD's microcode.
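The CAS latency figures in the article can be translated into absolute time with a bit of arithmetic: a DDR module's true latency in nanoseconds is CL x 2000 / data rate (in MT/s), because DDR's internal clock runs at half its transfer rate. A minimal sketch, using illustrative memory kits that are assumptions rather than examples from the article:

```python
# Rough sketch: converting CAS latency (clock cycles) to nanoseconds.
# true latency (ns) = CL * 2000 / data_rate, because a DDR module's
# internal clock runs at half its MT/s rate (two transfers per cycle).

def cas_latency_ns(cl: int, data_rate_mts: int) -> float:
    """Absolute CAS latency in nanoseconds for a DDR module."""
    return cl * 2000 / data_rate_mts

# Illustrative kits (assumed, not from the article):
ddr4 = cas_latency_ns(16, 3200)   # DDR4-3200 CL16
ddr5 = cas_latency_ns(30, 6000)   # DDR5-6000 CL30

print(f"DDR4-3200 CL16: {ddr4:.2f} ns")   # 10.00 ns
print(f"DDR5-6000 CL30: {ddr5:.2f} ns")   # 10.00 ns
```

This illustrates the article's point: despite a much higher CAS number, a fast DDR5 kit can land at the same absolute latency as DDR4, and a 10 ns swing either way is small in practice.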
  • WWW.DIGITALTRENDS.COM
    The 6 best OLED TVs for 2024
    Editor's note: The biggest shopping day of the year, Black Friday, is almost here, and there are already TV deals you can grab. You can get Samsung's 65-inch S90C for only $1,000. It's a deal that we think makes it the best OLED to buy right now. You can save on more than just TVs on Black Friday; check out our other Black Friday deals for more sales on headphones, streaming devices, phones, and more.

If you're on the hunt for the best TV for whatever you're into (movies and TV, sports, or gaming; check out our list of the best gaming TVs), sitting at the top of the TV pyramid are OLED TVs. In lieu of traditional LED backlighting, OLED TVs use millions of self-emissive pixels (which create their own light and are not backlit) to create rich colors, inky blacks, and zero light blooming: in short, arguably the best picture in the business. Over the last several years, LG has been the predominant manufacturer of OLED TVs, but now the company competes with Samsung and Sony in the OLED marketplace.

With numerous models and sizes to choose from, three of the best TV brands make OLEDs, and they're all worth considering. To help you home in on the best TVs, we've created this list of the best OLED TVs for 2024.

Our picks:
LG OLED Evo G4: Best overall OLED TV
LG OLED Evo C4: Runner-up
LG M3 Wireless OLED: A wire-free flagship
Sony XR A95L: The best OLED for videophiles
Sony Bravia 8 Series: A runner-up Sony OLED
Samsung S95D QD-OLED: Best Samsung OLED

Best overall OLED TV: LG OLED Evo G4
Pros: Outstanding brightness; astounding accuracy; unprecedented flexibility; awesome gaming performance; five-year warranty
Cons: Hit-or-miss sound; frustrating remote

LG has done it again, folks.
One of our favorite TVs of 2023 was the amazing LG G3 Series, and now that set has been ousted from its throne only to be replaced by the LG G4.

Much like its older brother, the G4 leverages Micro Lens Array (MLA) technology to achieve the kind of brightness levels we'd expect from an LED-LCD. We've put the G4 through several picture tests and were absolutely blown away by its peak luminance, breathtaking colors, and next-level HDR performance. LG has long been a TV brand one gravitates toward for picture accuracy, and thanks to features like Filmmaker Mode on the G4, that reputation continues.

As part of LG's Gallery (what the "G" in G4 stands for) lineup, this OLED is designed to look like a piece of artwork when mounted to a wall. And if you purchase the 77-, 83-, or 97-inch version of the G4, the low-profile wall mount comes in the box (the 55- and 65-inch versions come with a tabletop stand). Each of the TV's four HDMI ports is 2.1-certified, with resolution and refresh rate capped at 4K/120Hz. Action movies, sports, and gaming will look particularly great on the G4, and lower-res sources get the 4K upscaling treatment.

While webOS 24 and the included LG Magic Remote may not be for everyone, we have no doubt that if you're looking for the best OLED picture that money can buy in 2024, you'll be hard-pressed to do better than the LG G4 Series.

Runner-up: LG OLED Evo C4
Pros: Great black levels and color accuracy; stellar HDR highlights and vibrant colors; solid gaming optimizations
Cons: Lackluster off-angle viewing

One of our favorite things about LG is that even though brands like Sony and Samsung have started getting very competitive with models like the A95L and S95D (these are QD-OLED TVs), LG hasn't gone back to the drawing board to attempt to come up with picture tech that's bolder and better.
Instead, the company is sticking to its OLED roots, which leads us to the LG C4 OLED Evo.

Available in sizes ranging from 42 to 83 inches, the LG C4 is a great TV for everything from movies and shows to video games. And thanks to features like Brightness Booster Max, spectacular highlights and HDR colors really stand out on this set, even in a brightly lit room. We do recommend watching from as centered a seating position as possible, though, as there's a bit of a green overcast to the image when viewing off-angle.

Calling all gamers: Not only does the LG C4 have four HDMI 2.1 ports and 144Hz refresh rate capabilities, but features like VRR support lead to some of the best response times and lag-free performance we've ever seen on an OLED. The LG C4 is also equipped with LG's webOS 24 for accessing streaming service apps like Netflix and Disney+, or streaming tunes via AirPlay 2.

Historically, OLED TVs have always looked best in dark rooms, so if your living room is often bathed in sunlight, the C4 Series may not be the best fit for your home. For that, have a look at our list of the best QLED TVs instead.

A wire-free flagship: LG M3 Wireless OLED
Pros: Gorgeous OLED picture quality; low-latency wireless for gaming; solid wireless signal connection; easy setup; good sound
Cons: Wireless box needs true line of sight

It was only a matter of time before a TV company decided to go wireless, at least in terms of AV connectivity. Sure, Samsung's One Connect tech is great for consolidating wire clutter, but the LG M3 Series goes a step further by eliminating cable leads altogether. That's right: The M3 Series wirelessly beams picture and sound to the TV via its Zero Connect Box.

A line-of-sight peripheral that should be placed no farther than 30 feet from the M3 Series, the Zero Connect Box accepts everything from streaming devices to gaming consoles.
Caleb Denison, our editor-at-large, has had the privilege of seeing this bad boy in action, and reported that there is no discernible picture difference between a tried-and-true HDMI cable and the M3's wireless tech.

As for overall picture quality, the LG M3 is simply a beast. As we've come to expect from this kind of OLED, LG's M3 Series delivers excellent color and contrast, on top of class-leading brightness for an OLED display. LG's phenomenal a9 AI Processor Gen 6 is responsible for the TV's image upscaling, and it does a terrific job of further enhancing visuals in each and every frame.

The M3 is also optimized for next-gen gaming, with resolution and motion clarity topping out at 4K/120Hz. You'll be able to grab this model in 77-, 83-, and 98-inch sizes, and even though this is LG's first foray into wireless TV tech, the execution is already excellent, and we can't wait to see what future iterations of the M3 Series will look and sound like.

The best OLED for videophiles: Sony XR A95L
Pros: Outstanding color accuracy and brightness; excellent contrast and luminance; great sound; gorgeous game mode picture quality; incredibly good upscaling
Cons: Some deep features not available at launch

We finished our review of the Sony XR A95L as very happy campers. In fact, we awarded the latest Sony QD-OLED flagship a perfect five out of five stars. This TV is seriously good, and because we're dealing with a QD-OLED display, there are plenty of brightness, color, and contrast perks thanks to those onboard quantum dots. But we're really just scratching the surface with those highlights, so let's take a look under the hood.

As we've come to expect from most Sony sets, the formidable A95L is equipped with the company's Cognitive Processor XR (CPXR) as the main brains behind the picture.
It handles everything from frame-to-frame improvements to 4K upscaling; we've always been major fans of Sony's picture engine, but when you combine the CPXR with the A95L's native panel tech and XR Triluminos Max, you get movies, shows, and video games that look like they could leap right off the screen.

This is also the first of Sony's QD-OLEDs to be equipped with the Pentonic 1000 HDMI chipset, which allows for Dolby Vision gameplay at up to 120Hz. That's on top of other gaming features like VRR support and HDMI 2.1 connectivity.

As for the latter, only two of the TV's four ports support the latest HDMI standard, and as our own Caleb Denison points out in his review, one of these ports is also the TV's HDMI ARC/eARC port. This may pose a little trouble for those who want to use the eARC connection for Dolby Atmos or HDMI CEC purposes, but one can always invest in an HDMI switcher if push comes to shove.

Beyond that one minor hitch, the Sony A95L lives right on the cutting edge of TV tech, and other manufacturers should definitely be taking notes.

If you're in the market for a 55-inch model, you can take advantage of Best Buy's $200 price cut on the Sony A95L.

A runner-up Sony OLED: Sony Bravia 8 Series
Pros: Unbeatable black levels and rich colors; fantastic HDR performance; low input lag and fast response time; solid picture upscaling
Cons: HDMI 2.1 is limited to two ports; not as bright as other OLEDs

For 2024, Sony decided to part ways with the medley of random letters and numbers that generally make up its TV model names. Instead, we're getting simplified Bravia 3, 7, 8, and 9 Series labeling, and only one of these models is a 2024 Sony OLED: the Bravia 8 Series. And unlike the flagship Sony A95L QD-OLED, the Bravia 8 uses a WOLED panel.
While this doesn't really affect the TV's contrast performance, the loss of quantum dots does make for a less vibrant image overall.

OLED TVs aren't renowned for brightness to begin with (that's where the best QLED TVs come into play), and the Bravia 8 doesn't bring exceptional SDR and HDR brightness to the table. Still, it does an excellent job of standing up to ambient lighting and has a wide viewing angle, so every seat in the living room is the right seat. We're also glad to see that Sony included a few gaming optimizations on the Bravia 8, including VRR support and an automatic Game Mode. In fact, the only drawback for gamers is that HDMI 2.1 is confined to two of the TV's four HDMI ports.

Available in 55-, 65-, and 77-inch sizes, the Bravia 8 has a 50-watt, 2.1-channel speaker system, which produces a surprising amount of audio for a TV. As for all things apps, games, and smart home, Google TV is the OS and streaming hub for the job. Indulge in Netflix, cast media using AirPlay 2, or call upon Google Assistant to dim your smart lights.

The Sony Bravia 8 Series may not be the best TV for every Sony fan, but it's still one of our favorite OLED sets of the year. And if you go with one of the smaller sizes, there's a good chance you'll find this TV on our list of the best TVs under $1,000 eventually.

Best Samsung OLED: Samsung S95D QD-OLED
Pros: Excellent brightness; eye-popping color; snappy operation; great gaming features; awesome for bright rooms
Cons: Anti-glare treatment is polarizing; blacks may appear lifted in bright rooms

While we couldn't help but leave the Samsung S95C in our roundup for the time being, 2024 has given us the incredible Samsung S95D.
This is the third generation of Samsung's QD-OLED technology, combining quantum dots and self-emissive pixels to deliver a TV with plenty of picture detail, especially when it comes time to dig into its color palette.

At one time, brightness was a major concern when buying an OLED TV, but sets like the S95D lay those worries to rest. In our own picture tests, editor-at-large Caleb Denison clocked 1,450 nits of peak brightness. While this isn't exactly on par with QLEDs like the Hisense U8N, this kind of brightness is a huge step forward for OLEDs, not to mention the TV's anti-glare matte screen, which helps cut down on how much ambient lighting interferes with your picture.

The S95D is equipped with four HDMI 2.1 inputs and supports 4K/144Hz with Variable Refresh Rate. Whether you plan on hooking up your console or gaming PC, you'll get near-instant response time and super-low input lag with this Samsung QD-OLED. We were also quite impressed by the S95D's built-in speakers, which brought far bigger sound to the table than we're used to from any modern TV.

Available in 55-, 65-, and 75-inch sizes, the Samsung S95D QD-OLED is further proof that Samsung knows exactly what it's doing with OLED technology.

What is an OLED TV?
OLED stands for organic light-emitting diode, and its signature characteristic is that each pixel on the screen of an OLED TV emits its own light and color, and can be turned completely off to deliver true black.

OLED versus QLED: Which is better?
Be sure to check out our QLED versus OLED explainer. In general, OLED TVs produce a higher-quality image than QLED TVs, but there are caveats. QLED TVs get brighter, so they're the better choice for brightly lit rooms.

Is OLED better than 4K?
Trick question! OLED is a description of a kind of TV display, whereas 4K refers to a TV's native resolution. Like LED and QLED TVs, you can buy 4K and even 8K OLED TVs; the choice is yours.
What should I look for in an OLED TV?
If you're a gamer, make sure your OLED TV supports HDMI 2.1 at the least; in an ideal world, it should have Nvidia G-Sync and AMD FreeSync, too. If you're a cinephile or video maven, find a model with the best picture processing.

Is OLED the best TV technology?
We think OLED TVs still currently produce the best overall image quality, but newer formats such as QD-OLED, microLED, and mini-LED-based QLED TVs are starting to threaten OLED TVs' crown.

Do OLED TVs have HDR?
Yes, all OLED TVs are compatible with at least HDR10 (the most common HDR format), while most offer support for HLG and Dolby Vision, too.

Do OLED TVs have problems with burn-in?
For the vast majority of buyers, burn-in will not be a problem, but it can happen. When it occurs, it's usually because someone has set their OLED TV to show a TV channel or a video game with on-screen graphics that don't move much, or at all, and left it there for many hours each day, for many days in a row.

Who makes the best OLED TV?
We believe that LG Electronics makes the best overall OLED TV: the G4 Gallery Series. That said, Sony's image processing is slightly better, so if image perfection is your main yardstick, a Sony OLED TV is a great way to go. While you'll be hard-pressed to find an OLED on our list of the best TVs under $500, this is a perfect example of getting what you pay for (at least when it comes to TV technology).

Is an OLED TV worth it?
Yes. OLED TVs are definitely expensive compared to some other options, but their black levels, contrast, and color make for an awesome viewing experience.

How do we test TVs?
We've tested a lot of TVs. A lot. Our editor-at-large and resident TV expert, Caleb Denison, has been covering the TV and home theater space for decades. He's one of the best in the business. But rather than trying to detail his reviewing process here, he's laid it all out in his excellent explainer on how he tests TVs.
Here's a rundown of some of the most common terms associated with today's TV technology.

4K Ultra HD
This refers to a display resolution that is four times that of 1080p HD. A 4K Ultra HD TV's pixel resolution is a 3,840 x 2,160 grid in a 16:9 aspect ratio, resulting in nearly 8.3 million pixels. This increase in density adds striking detail and realism to an image and allows larger screens to be viewed from closer distances without individual pixels becoming visible.

High dynamic range (HDR)
High dynamic range is probably most familiar to people through the HDR mode on their digital cameras. It's designed to deliver a picture that has greater detail in the shadows and highlights, plus a wider range of colors. HDR in televisions pursues the same goal. The color palette is wider, blacks are deeper, and whites are brighter. Presently, there are two major HDR formats, HDR10 and Dolby Vision, with a third, HDR10+, beginning to show up on new models, particularly those from Samsung. The first is the HDR standard, but Dolby Vision offers a premium experience. Consider a TV that supports both. HLG (hybrid log gamma) is another recent addition to the HDR collection, which supports over-the-air (OTA) broadcast content with HDR.

Full-array local dimming (FALD)
This refers to an LED TV's backlighting system. A FALD display contains an array of LEDs spread out in a grid behind an LCD panel, rather than just at the edges of the TV. This LED array is broken up into zones that can be dimmed when necessary to achieve better black levels. Another benefit is more uniform brightness across the screen.

Wide color gamut (WCG)
These are the expanded color reproduction abilities of a 4K Ultra HD TV, which are closer than ever to what we see in a digital cinema.
By approaching (or sometimes exceeding) the Digital Cinema Initiatives (DCI) P3 color specification, a 4K UHD TV can produce billions more colors than a 1080p HD TV.

Quantum dots
A layer of film loaded with tiny nanocrystal semiconductors is placed in a TV's display panel to help produce a more accurate array of colors. Quantum dots work by producing a purer form of white light from a TV's backlighting system, which helps the TV's color filter perform more accurately.

Phosphor-coated LED
An alternative to quantum dots, phosphor-coated LEDs have a chemical coating that alters the light's output. When used in a TV, this results in a purer backlight that's more easily manipulated by a TV's color filter, resulting in a wide color gamut and increased color accuracy.

HDMI 2.1
The latest version of the HDMI spec. It offers new enhancements for video games, like variable refresh rate (VRR) and automatic low-latency mode (ALLM), and the ability to pass 4K signals to the TV at up to 120Hz for ultra-smooth motion. HDMI 2.1 is a requirement for 8K video and for sources like the PlayStation 5 and Xbox Series X. For most non-gamers, HDMI 2.1 is a nice way to future-proof yourself, but it's nowhere near a necessity yet.

HDCP 2.3
The latest version of the High-bandwidth Digital Content Protection technology, which provides copy prevention specifically for 4K Ultra HD and 8K content. Any source device that requires HDCP 2.3 will need a TV with an HDCP 2.3-compliant HDMI port for a compatible connection.

HEVC (H.265)
Stands for High Efficiency Video Coding, a compression technology developed to make large 4K UHD video files smaller and, therefore, easier to stream over broadband internet connections. HEVC is said to double the data compression ratio of H.264 (the predominant encoding technology used today for 1080p videos) while retaining the same video quality.
A smart TV or streaming set-top box must be able to decode HEVC to play back 4K Ultra HD video from sites like Netflix and Amazon Prime Video.

VP9
An alternative to HEVC developed by Google and used predominantly for encoding 4K Ultra HD YouTube videos. For a smart TV or streaming set-top box to play 4K Ultra HD YouTube videos, it must be able to decode VP9.
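As a quick sanity check on the glossary's figures, the 4K pixel count and HEVC's claimed bitrate savings both come down to simple arithmetic. A minimal sketch; the H.264 streaming bitrate is an assumed illustrative figure, not a spec value:

```python
# Sketch of the arithmetic behind the glossary's figures.

uhd = 3840 * 2160        # 4K Ultra HD pixel grid
fhd = 1920 * 1080        # 1080p Full HD pixel grid

print(uhd)               # 8294400 pixels, i.e. roughly 8.3 million
print(uhd // fhd)        # 4, i.e. four times the pixels of 1080p

# HEVC is said to roughly double H.264's compression ratio,
# meaning similar quality at about half the bitrate:
h264_1080p_mbps = 8.0                    # assumed H.264 1080p bitrate
hevc_1080p_mbps = h264_1080p_mbps / 2    # same quality at half the rate
print(hevc_1080p_mbps)                   # 4.0
```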
  • WWW.DIGITALTRENDS.COM
    Holiday Gaming Hub
    Every fall, video game publishers kick into high gear to get their biggest releases out in time for holiday shopping. We're here to make sure you don't get overwhelmed. Our Holiday Gaming hub will give you all the news on the season's hottest games and catalog our reviews to help make your holiday shopping (or list-making) easier.
  • WWW.WSJ.COM
    Nvidia's Sales Soar as AI Spending Boom Barrels Ahead
    The AI chip giant gave a strong outlook, pointing to healthy demand for next-generation chips.
  • WWW.WSJ.COM
    Elon Musk's xAI Startup Is Valued at $50 Billion in New Funding Round
    The artificial-intelligence company has told investors it raised $5 billion in its latest funding round.
  • WWW.WSJ.COM
    '1001 Movie Posters' Review: The Art of the Screen Teaser
    A movie poster is made to promote a film. Hung on a wall at home, it becomes an advertisement for a sense of self.
  • WWW.WSJ.COM
    'Wicked' Review: A Movie Musical of Old-Fashioned Magic
    Cynthia Erivo and Ariana Grande star in director Jon M. Chu's lavishly entertaining adaptation of the Broadway hit set in the land of Oz.
  • ARSTECHNICA.COM
    Qubit that makes most errors obvious now available to customers
    Qubits on rails: Can a small machine that makes error correction easier upend the market? By John Timmer, Nov 20, 2024 3:58 pm

A graphic representation of the two resonance cavities that can hold photons, along with a channel that lets the photon move between them. Credit: Quantum Circuits

We're nearing the end of the year, and there is typically a flood of announcements regarding quantum computers around now, in part because some companies want to live up to promised schedules. Most of these involve evolutionary improvements on previous generations of hardware. But this year, we have something new: the first company to market with a new qubit technology.

The technology is called a dual-rail qubit, and it is intended to make the most common form of error trivially easy to detect in hardware, thus making error correction far more efficient. And while tech giant Amazon has been experimenting with them, a startup called Quantum Circuits is the first to give the public access to dual-rail qubits via a cloud service.

While the tech is interesting on its own, it also provides us with a window into how the field as a whole is thinking about getting error-corrected quantum computing to work.

What's a dual-rail qubit?
Dual-rail qubits are variants of the hardware used in transmons, the qubits favored by companies like Google and IBM. The basic hardware unit links a loop of superconducting wire to a tiny cavity that allows microwave photons to resonate. This setup allows the presence of microwave photons in the resonator to influence the behavior of the current in the wire and vice versa. In a transmon, microwave photons are used to control the current.
But other companies have hardware that does the reverse, controlling the state of the photons by altering the current.

Dual-rail qubits use two of these systems linked together, allowing a photon to move from one resonator to the other. Using the superconducting loops, it's possible to control the probability that the photon will end up in the left or right resonator. The actual location of the photon remains unknown until it's measured, allowing the system as a whole to hold a single bit of quantum information: a qubit.

This has an obvious disadvantage: You have to build twice as much hardware for the same number of qubits. So why bother? Because the vast majority of errors involve the loss of the photon, and that's easily detected. "It's about 90 percent or more [of the errors]," said Quantum Circuits' Andrei Petrenko. "So it's a huge advantage that we have with photon loss over other errors. And that's actually what makes the error correction a lot more efficient: the fact that photon losses are by far the dominant error."

Petrenko said that, without doing a measurement that would disrupt the storage of the qubit, it's possible to determine whether there is an odd number of photons in the hardware. If that isn't the case, you know an error has occurred, most likely a photon loss (gains of photons are rare but do occur). For simple algorithms, this would be a signal to simply start over.

But it does not eliminate the need for error correction if we want to do more complex computations that can't make it to completion without encountering an error. There's still the remaining 10 percent of errors, which are primarily phase flips, an error type distinct to quantum systems. (Bit flips are even more rare in dual-rail setups.) Finally, simply knowing that a photon was lost doesn't tell you everything you need to know to fix the problem; error-correction measurements of other parts of the logical qubit are still needed.
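A toy Monte Carlo sketch can show why flagging photon loss helps so much. This is not Quantum Circuits' software, and the error rates are assumed illustrative values; the point is only that if roughly 90 percent of errors are detectable losses, discarding flagged runs removes most errors:

```python
import random

# Toy model (assumed numbers, not from Quantum Circuits): each shot either
# succeeds, suffers a detectable photon loss (~90% of errors, flagged by
# the hardware's parity check), or suffers an undetected error such as a
# phase flip (~10% of errors).
random.seed(0)

ERROR_RATE = 0.05        # assumed per-shot error probability
LOSS_FRACTION = 0.9      # fraction of errors that are detectable photon loss

shots = 100_000
kept = bad_kept = 0
for _ in range(shots):
    if random.random() < ERROR_RATE:
        if random.random() < LOSS_FRACTION:
            continue          # loss detected -> discard and restart the run
        bad_kept += 1         # undetected error slips through
    kept += 1

print(f"kept {kept}/{shots} shots")
print(f"undetected error rate among kept shots: {bad_kept / kept:.4f}")
```

With these assumed numbers, the effective error rate among kept shots drops from 5 percent to roughly half a percent, which is the intuition behind "photon loss detection makes error correction far more efficient."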
The layout of the new machine. Each qubit (gray square) involves a left and right resonance chamber (blue dots) that a photon can move between. Each of the qubits has connections that allow entanglement with its nearest neighbors. Credit: Quantum Circuits

In fact, the initial hardware that's being made available is too small to even approach useful computations. Instead, Quantum Circuits chose to link eight qubits with nearest-neighbor connections in order to allow it to host a single logical qubit that enables error correction. Put differently, this machine is meant to let people learn how to use the unique features of dual-rail qubits to improve error correction.

One consequence of having this distinctive hardware is that the software stack that controls operations needs to take advantage of its error-detection capabilities. None of the other hardware on the market can be directly queried to determine whether it has encountered an error. So Quantum Circuits has had to develop its own software stack to allow users to actually benefit from dual-rail qubits. Petrenko said the company also chose to provide access to its hardware via its own cloud service because it wanted to connect directly with early adopters in order to better understand their needs and expectations.

Numbers or noise?
Given that a number of companies have already released multiple revisions of their quantum hardware and have scaled them into hundreds of individual qubits, it may seem a bit strange to see a company enter the market now with a machine that has just a handful of qubits. But amazingly, Quantum Circuits isn't alone in planning a relatively late entry into the market with hardware that hosts only a few qubits.

Having talked with several of these companies, there is a logic to what they're doing.
What follows is my attempt to convey that logic in a general form, without focusing on any single company's case.

Everyone agrees that the future of quantum computation is error correction, which requires linking together multiple hardware qubits into a single unit termed a logical qubit. To get really robust, error-free performance, you have two choices. One is to devote lots of hardware qubits to the logical qubit, so you can handle multiple errors at once. Or you can lower the error rate of the hardware, so that you can get a logical qubit with equivalent performance while using fewer hardware qubits. (The two options aren't mutually exclusive, and everyone will need to do a bit of both.)

The two options pose very different challenges. Improving the hardware error rate means diving into the physics of individual qubits and the hardware that controls them: getting lasers that have fewer of the inevitable fluctuations in frequency and energy, or figuring out how to manufacture loops of superconducting wire with fewer defects, or handling stray charges on the surface of electronics. These are relatively hard problems.

By contrast, scaling qubit count largely involves being able to consistently do something you already know how to do. If you already know how to make good superconducting wire, you simply need to make a few thousand instances of that wire instead of a few dozen. The electronics that trap an atom can be designed so that making thousands of them is straightforward. These are mostly engineering problems, and generally of similar complexity to problems we've already solved to make the electronics revolution happen.

In other words, within limits, scaling is a much easier problem to solve than errors. It's still going to be extremely difficult to get the millions of hardware qubits we'd need to error-correct complex algorithms on today's hardware.
But if we can get the error rate down a bit, we can use smaller logical qubits and might only need 10,000 hardware qubits, which will be more approachable.

Errors first
And there's evidence that even the early entries in quantum computing have reasoned the same way. Google has been working on iterations of the same chip design since its 2019 quantum supremacy announcement, focusing on understanding the errors that occur on improved versions of that chip. IBM made hitting the 1,000-qubit mark a major goal but has since focused on reducing the error rate in smaller processors. Someone at a quantum computing startup once told us it would be trivial to trap more atoms in its hardware and boost the qubit count, but there wasn't much point given the error rates of the qubits on the then-current generation of machines.

The new companies entering the market now argue that they have a technology that will either radically reduce the error rate or make handling the errors that do occur much easier. Quantum Circuits clearly falls into the latter category, as dual-rail qubits are entirely about making the most common form of error trivial to detect. The former category includes companies like Oxford Ionics, which has indicated it can perform single-qubit gates with a fidelity of over 99.9991 percent, and Alice & Bob, which stores qubits in the behavior of multiple photons in a single resonance cavity, making them very robust to the loss of individual photons.

These companies are betting that their distinct technology will let them handle error-rate issues more effectively than established players. That will lower the total scaling they need to do, and scaling will be an easier problem overall, and one they may already have the pieces in place to handle.
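The tradeoff between lowering error rates and scaling up can be made concrete with the standard surface-code heuristic, where the logical error rate scales as p_L ~ A * (p / p_th)^((d+1)/2) for code distance d, and a logical qubit needs roughly 2d^2 physical qubits. None of these numbers come from the article; the threshold, prefactor, and physical error rates below are assumed textbook-style values, purely to illustrate the scaling argument:

```python
# Back-of-the-envelope surface-code sketch (standard heuristic, assumed
# parameters): logical error rate p_L ~ A * (p / p_th)^((d+1)/2), with
# roughly 2*d^2 physical qubits per logical qubit at code distance d.

P_TH = 0.01      # assumed threshold error rate
A = 0.1          # assumed prefactor
TARGET = 1e-12   # target logical error rate

def qubits_needed(p_phys: float) -> tuple[int, int]:
    """Smallest odd distance d hitting TARGET, and physical qubits per logical."""
    d = 3
    while A * (p_phys / P_TH) ** ((d + 1) / 2) > TARGET:
        d += 2   # surface-code distances are odd
    return d, 2 * d * d

for p in (2e-3, 5e-4):
    d, n = qubits_needed(p)
    print(f"p={p:g}: distance {d}, ~{n} physical qubits per logical qubit")
```

Under these assumptions, cutting the physical error rate by 4x shrinks the required distance from 31 to 17 and the per-logical-qubit overhead by more than 3x, which is exactly the argument the new entrants are making: better hardware error rates mean far less scaling.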
Quantum Circuits' Petrenko, for example, told Ars, "I think that we're at the point where we've gone through a number of iterations of this qubit architecture where we've de-risked a number of the engineering roadblocks." And Oxford Ionics told us that if it could make the electronics it uses to trap ions in its hardware once, it would be easy to mass manufacture them.

None of this should imply that these companies will have it easy compared to a startup that already has experience with both reducing errors and scaling, or a giant like Google or IBM that has the resources to do both. But it does explain why, even at this stage in quantum computing's development, we're still seeing startups enter the field.

John Timmer, Senior Science Editor
John is Ars Technica's science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.
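To see why gate fidelities like the 99.9991 percent figure quoted above matter so much, note that errors compound over a circuit: with an error-free probability f per gate and independent errors, an N-gate circuit runs cleanly with probability roughly f^N. The circuit depth and the 99.9% comparison point below are illustrative assumptions.

```python
def circuit_success(fidelity, n_gates):
    """Probability that every gate in an n_gates-deep circuit executes
    without error, assuming independent per-gate errors."""
    return fidelity ** n_gates

# Compare a 99.9% gate against the ~99.9991% figure cited above,
# over an illustrative 100,000-gate circuit.
for f in (0.999, 0.999991):
    print(f"fidelity {f}: P(no error in 100,000 gates) = "
          f"{circuit_success(f, 100_000):.3g}")
```

At useful circuit depths, a seemingly tiny per-gate improvement changes the success probability by dozens of orders of magnitude, which is why these startups lead with fidelity claims.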
  • ARSTECHNICA.COM
    Google stops letting sites like Forbes rule search for Best CBD Gummies
Best Hail-Mary Revenue for Publishers 2024

If you've noticed strange sites on "Best" product searches, so has Google.

Kevin Purdy | Nov 20, 2024 2:47 pm
Credit: Getty Images

"Updating our site reputation abuse policy" is how Google, in almost wondrously opaque fashion, announced yesterday that big changes have come to some big websites, especially those that rely on their domain authority to promote lucrative third-party product recommendations.

If you've searched for reviews and seen results that make you ask why so many old-fashioned news sites seem to be "reviewing" products lately, especially products outside that site's expertise, that's what Google is targeting.

"This is a tactic where third-party content is published on a host site in an attempt to take advantage of the host's already-established ranking signals," Google's post on its Search Central blog reads. "The goal of this tactic is for the content to rank better than it could otherwise on a different site, and leads to a bad search experience for users."

Search firm Sistrix cited the traffic lost to the third-party review content inside Forbes, The Wall Street Journal, CNN, Fortune, and Time as worth $7.5 million last week, according to AdWeek. Search rankings dropped by up to 97 percent at Time's affiliate review site, Time Stamped, and 43 percent at Forbes Advisor. The drops are isolated to the affiliate subdomains of the sites, so their news-minded primary URLs still rank where relevant.

Trusted names in CBD gummies and pet insurance

The "site reputation abuse" Google is targeting takes many forms, but it has one common theme: using an established site's domain history to quietly sell things.
Forbes, a well-established business news site, has an ownership stake in Forbes Marketplace (named Forbes Advisor in site copy) but does not fully own it. Under the strength of Forbes' long-existing and well-linked site, Forbes Marketplace/Advisor has dominated the search term "best cbd gummies" for "an eternity," according to SEO analyst Lily Ray. Forbes has similarly dominated "best pet insurance," and long came up as the second result for "how to get rid of roaches," as detailed in a blog post by Lars Lofgren. If people click on this high-ranking result, and then click on a link to buy a product or request a roach removal consultation, Forbes typically gets a cut.

Forbes Marketplace had seemingly also provided SEO-minded review services to CNN and USA Today, as detailed by Lofgren. Lofgren's term for this business, "Parasite SEO," took hold in corners critical of the trend. Ars has contacted Forbes for comment and will update this post with any response.

The unfair, exploitative nature of parasite SEO

Google writes that it had reviewed "situations where there might be varying degrees of first-party involvement" (most publishers' review sites indicate some kind of oversight or editorial standards linked to the primary site). But however arranged, "no amount of first-party involvement alters the fundamental third-party nature of the content or the unfair, exploitative nature of attempting to take advantage of the host sites' ranking signals."

As such, using third-party content to take advantage of a high search quality ranking, outside the site's primary focus, is considered spam. That delivers a major hit to a site's Google ranking, and the impact is already being felt.

The SEO reordering does not affect more established kinds of third-party content, like wire service reports, syndication, or well-marked sponsored content, as detailed in Google's spam policy section about site reputation abuse.
As seen on the SEO subreddit and on social media, Google has given sites running afoul of its updated policy a "Manual Action," rather than relying only on its algorithm to catch the often opaque arrangements.

Kevin Purdy, Senior Technology Reporter
Kevin is a senior technology reporter at Ars Technica, covering open-source software, PC gaming, home automation, repairability, e-bikes, and tech history. He has previously worked at Lifehacker, Wirecutter, iFixit, and Carbon Switch.
  • WWW.INFORMATIONWEEK.COM
    Help Wanted: IT Hiring Trends in 2025
Lisa Morgan, Freelance Writer | November 20, 2024 | 8 Min Read
Image: Egor Kotenko via Alamy Stock

Digital transformation changed the nature of the IT/business partnership. Specifically, IT has become a driving force in reducing operating costs, making the workforce more productive, and improving value streams. These shifts are also reflected in the way IT is structured.

"When it comes to recruiting and attracting IT talent, it is time for IT leadership to shine. Their involvement in the process needs to be much more active to find the resources that teams need right now. And more than anything, it's not the shiny new roles we are struggling to hire for. It's [the] on-prem network engineer and cloud architect you need to drive business outcomes right now. It's the cybersecurity analyst," says Brittany Lutes, research director at Info-Tech Research Group, in an email interview.

Most organizations aren't sunsetting roles, she says. Instead, they're more focused on retaining talent and ensuring that talent has the right skills and degree of competency in those skills.

"It takes time to hire new resources, ensure the institutional knowledge is understood, and then get those people to continue learning new skills or applications of the skills they were hired for," says Lutes. "We are better off to retain people, explore opportunities to bring in new levels or job titles with HR to satisfy development desires, and understand what new foundational and technical skills exist that we need to grow in our organization. We have opportunities to use technology in exciting new ways to make every role from CIO to the service desk analyst more efficient and more engaging. This year I think many organizations will work to embrace that."

Business and Technology Shifts Mean IT Changes

Julia Stalnaya, CEO and founder of B2B hiring platform Unbench, believes IT hiring in 2025 is poised for significant transformation, shaped by technological advancements, evolving workforce expectations, and changing business needs.

"The 2024 layoffs across tech industries have introduced new dynamics into the hiring process for 2025. Companies [are] adapting to leaner staffing models [and] increasingly turn to subcontracting and flexible hiring solutions," says Stalnaya.

There are several drivers behind these changes. They include technological advancements such as data-driven recruitment, AI, and automation. As a result of the pandemic, remote work expanded the talent pool beyond geographical boundaries, allowing companies to hire top talent from diverse locations. This trend necessitates more flexible work arrangements and a shift in how companies handle employee engagement and collaboration.

"Skills-based hiring will focus more on specific skills and less on traditional qualifications. This reflects the need for targeted competencies aligned with business objectives," says Stalnaya. "This trend is significant for roles in rapidly evolving fields like AI, cloud engineering, and cybersecurity."

Some traditional IT roles will continue to decline as AI takes on more routine tasks while other roles grow. She anticipates demand for the following:

- AI specialists who work across departments to deploy intelligent systems that enhance productivity and innovation
- Cybersecurity experts, including ethical hackers, cybersecurity analysts, and cloud security specialists. In addition to protecting data, they will also help ensure compliance with security standards and develop strategies to safeguard against emerging threats.
- Data analysts and scientists who help the business leverage insights for strategic decision-making
- Blockchain developers able to build decentralized solutions

However, organizations must invest in training and development and embrace flexible work options if they want to attract and keep talent, which may conflict with mandatory return-to-office (RTO) policies.

"The 2024 layoffs have had a profound impact on the IT hiring landscape. With increased competition for fewer roles, companies now have access to a larger talent pool. Still, they must adapt their recruitment strategies to attract top candidates who are selective about company culture, flexibility, and growth opportunities," says Stalnaya. "This environment also highlights the importance of subcontracting."

Greg Goodin, managing director of talent solutions company EXOS TALENT, expects companies to start hiring to get new R&D projects off the ground and to become more competitive.

"Don't expect it to bounce back to pandemic or necessarily pre-pandemic levels," says Goodin. "IT as a career and industry has reached a maturation point where hypergrowth will be more of an outlier and more consistent 3% to 5% year-over-year growth [the norm]. Fiscal responsibility will become the expectation. Hiring trends will most likely run in parallel with this new cycle, with compensation leveling out."

What's Changing, Why, and How?

Interest rates are higher than they have been in recent history, which has directly influenced companies' hiring practices.
Not surprisingly, AI has also had an impact, making workforces more productive and reducing costs. Meanwhile, hiring has become more data-driven, enabling organizations to better understand what full-time and contingent labor they need.

During the pandemic, companies continued to hire, even if they didn't have a plan for what the new talent would be doing, according to Goodin.

"This led to a hoarding of employees and spending countless unnecessary dollars to have people essentially doing nothing," says Goodin. "This was one of many reasons companies started to reset their workforce with mass layoffs. Expect more thoughtful, data-driven hiring practices to make sure an ROI is being realized for each employee [hired]."

The IT talent shortage persists, so universities and bootcamps have been attempting to churn out talent that's aligned with market needs. Companies have also had more options, such as hiring internationally, including through H-1B visas.

"Technology moves at a rapid pace, so it is important to maintain an open mind to new ways of solving problems, while not jumping the gun on a passing fad," says Goodin. "Continue to invest in your existing workforce and upskill them, when possible. This will lead to better employee engagement [and] decreased costs associated with hiring and training up new talent into your organization."

Soft skills such as communication, character, and emotional quotient will all be that much more coveted in a world utilizing AI and automation to supplement human beings, he says.

IT and the Business

IT has always supported the business, but its role is now more of a partnership and a thought leader when it comes to succeeding in an increasingly tech-fueled business environment.

"By 2025, I believe IT hiring will reflect a new paradigm as the line between IT and other business functions continues to blur, driven by AI's growing role in daily operations. Instead of being confined to back-office support, IT will become a foundational aspect of strategic business operations, blending into departments like marketing, finance, and HR. This blur will likely accelerate next year, with roles and responsibilities traditionally managed by IT, like data security, process automation, and analytics, becoming collaborative efforts with other departments," says Etoulia Salas-Burnett, director of the Center for Digital Business at Howard University, in an email interview. "This shift demands IT professionals who can bridge technical expertise with business strategy, making the boundary between IT and other business functions increasingly indistinct."

In 2025, she believes several newer roles will become more common, including AI integration specialist, AI ethics and compliance officer, digital transformation strategist, and automation success manager. Waning titles include help desk technician and network administrator, she says.

Stephen Thompson, former VP of Talent at Docusign, says the expansion of cloud services and serverless architectures has driven costs up, absorbing a growing portion of IT budgets. In some cases, server expenses rival the total cost of all employees at certain companies.

"Enterprise organizations are actively seeking integrations with platforms like Salesforce, ServiceNow, and SAP. The serverless shift and the continuous need for integration engineers have required IT departments to evolve, becoming stronger engineering partners and application developers for critical in-house systems in sales, marketing, and HR," says Thompson in an email interview. "As a result, 2025 may resemble the 2012 to 2015 period, with new technologies promising growth, and a high demand for scalable engineering expertise. Companies will seek software engineers who not only maintain but also optimize system performance, ensuring a significant return on investment. These professionals turn the seemingly impossible into reality, saving IT departments millions in the process."

Green Tech Will Become More Popular

From smaller AI models to biodegradable and recycled packaging, tech is necessarily becoming greener.

"We are already seeing many companies review their carbon footprint and prioritize sustainability projects, in response to climate change [and] customer and client demand. CIOs and other tech leaders will likely face more pressure to prove their sustainability and green plans within their IT projects," says Matt Collingwood, founder and managing director at VIQU IT Recruitment. "This may include legacy systems needing to be phased out, tracking energy consumption across the business and supply chain, and more. In turn, this will create an increasing demand for IT roles within infrastructure, systems engineering, and development."

In the meantime, organizations should be mindful about algorithmic and human bias in hiring.

"Organizations need to make sure that they are hiring inclusively," says Collingwood. "This means anonymizing CVs to reduce the chances of unconscious bias, as well as putting job adverts through a gender decoder to ensure the business is not inadvertently putting off great female tech professionals."

About the Author
Lisa Morgan, Freelance Writer
Lisa Morgan is a freelance writer who covers business and IT strategy and emerging technology for InformationWeek. She has contributed articles, reports, and other types of content to many technology, business, and mainstream publications and sites, including tech pubs, The Washington Post, and The Economist Intelligence Unit. Frequent areas of coverage include AI, analytics, cloud, cybersecurity, mobility, software development, and emerging cultural issues affecting the C-suite.