• Dungeons & Dragons, Wizards of the Coast, Giant Skull, video games, RPG, TTRPG, Baldur's Gate 3, gaming news, video game development

    ## Introduction

    Recently, Wizards of the Coast (WOTC) announced a collaboration with Giant Skull to develop a new Dungeons & Dragons game. It will be a single-player title and is being billed as "epic". The news comes a year after WOTC and Larian Studios ended their collaboration, which produced the hit...
    Wizards of the Coast teams up with Giant Skull for an 'epic' single-player Dungeons & Dragons game
  • trans health care, trans joy, resources for trans people, hormones, surgery, US Supreme Court, trans rights, healthcare access, Jules Gill-Peterson

    ---

    ## The Silent Struggles of Trans Individuals

    In the shadow of societal norms, where acceptance often feels like an unreachable dream, the journey of trans individuals is riddled with pain and longing. Each day, they navigate a world that frequently denies them their basic rights, subjecting them to the torment of invisibility and neglect. The ...
    This Historian Has Seen the Future of Trans Health Care: A Call for Change
  • MedTech AI, hardware, and clinical application programmes

    Modern healthcare innovation spans AI, devices, software, imaging, and regulatory frameworks, all requiring stringent coordination. Generative AI arguably has the strongest transformative potential in healthcare technology programmes and is already being applied across domains such as R&D, commercial operations, and supply chain management.

    Traditional models, such as face-to-face appointments and paper-based processes, may not be sufficient for today's fast-paced, data-driven medical landscape. Healthcare professionals and patients are therefore seeking more convenient and efficient ways to access and share information while meeting the complex standards of modern medical science.

    According to McKinsey, medtech companies are at the forefront of healthcare innovation and could capture between $14 billion and $55 billion annually in productivity gains, with an additional $50 billion plus in revenue estimated from product and service innovations enabled by GenAI adoption. A 2024 McKinsey survey found that around two thirds of medtech executives have already implemented GenAI, with approximately 20% scaling their solutions and reporting substantial productivity benefits.

    While advanced technology implementation is growing across the medical industry, challenges persist. Organisations face hurdles such as data integration issues, decentralised strategies, and skill gaps, which together highlight the need for a more streamlined approach to GenAI deployment.

    Of all the medtech domains, R&D is leading the way in GenAI adoption. As the function most comfortable with new technologies, R&D uses GenAI tools to streamline work, such as summarising research papers or scientific articles, and the trend is often grassroots: individual researchers use AI to enhance productivity even where no formal company-wide strategy is in place. While AI tools automate and accelerate R&D tasks, human review is still required to ensure final submissions are correct. GenAI is reducing time spent on administrative tasks and improving research accuracy and depth, with some companies reporting 20% to 30% gains in research productivity.

    ## KPIs for success in healthcare product programmes

    Measuring business performance is essential in the healthcare sector. The number one goal is, of course, to deliver high-quality care while maintaining efficient operations. By measuring and analysing KPIs, healthcare providers are better positioned to improve patient outcomes through data-based decisions; KPIs also improve resource allocation and encourage continuous improvement across all areas of care.

    Healthcare product programmes are structured initiatives that prioritise the development, delivery, and continual optimisation of medical products. To succeed, they require cross-functional coordination of clinical, technical, regulatory, and business teams. Time to market is critical, ensuring a product moves from concept to launch as quickly as possible. Labelling and documentation deserve particular emphasis: McKinsey notes that AI-assisted labelling has delivered a 20%-30% improvement in operational efficiency. Resource utilisation rates also matter, showing how efficiently time, budget, and headcount are used during product development.

    In the healthcare sector, KPIs ought to cover operational efficiency, patient outcomes, the financial health of the business, and patient satisfaction. For a comprehensive view of performance, they can be categorised into financial, operational, clinical quality, and patient experience metrics.
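
    To make that categorisation concrete, here is a minimal, illustrative Python sketch of how a programme team might compute and group a few of the metrics the article names (time to market, resource utilisation) into those four categories. All metric names, dates, and values below are hypothetical, not figures from the article.

```python
from datetime import date

# Illustrative only: metric names, dates, and values are hypothetical.

def time_to_market_days(concept_approved: date, launched: date) -> int:
    """Days from concept approval to market launch."""
    return (launched - concept_approved).days

def resource_utilisation(hours_used: float, hours_budgeted: float) -> float:
    """Share of budgeted effort actually consumed during development."""
    return hours_used / hours_budgeted

# The four KPI categories named in the article, each with example metrics.
kpi_dashboard = {
    "financial": {"r_and_d_cost_vs_budget": 0.92},
    "operational": {
        "time_to_market_days": time_to_market_days(date(2024, 1, 15), date(2025, 6, 1)),
        "resource_utilisation": round(resource_utilisation(8_400, 9_000), 2),
    },
    "clinical_quality": {"complaints_per_1k_units": 0.7},
    "patient_experience": {"patient_satisfaction_score": 4.5},
}

for category, metrics in kpi_dashboard.items():
    print(category, metrics)
```
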
    ## Bridging user experience with technical precision – design awards

    Innovation is no longer judged solely on technical performance; user experience (UX) is equally important. Some of the latest innovations in healthcare are recognised at the UX Design Awards, which celebrate products exemplifying the best in user experience as well as technical precision. Top products prioritise the needs and experiences of both patients and healthcare professionals while meeting the sector's rigorous clinical and regulatory standards. One example is the CIARTIC Move by Siemens Healthineers, a self-driving 3D C-arm imaging system that surgeons can control wirelessly from within the sterile field. Computer hardware company ASUS has also received accolades for its HealthConnect App and VivoWatch Series, showcasing the fusion of AIoT-driven smart healthcare solutions with user-friendly interfaces – sometimes in what are essentially consumer devices. This demonstrates how technical innovation is being made accessible and increasingly intuitive as patients gain technical fluency.

    ## Navigating regulatory and product development pathways simultaneously

    Establishing clinical and regulatory pathways in parallel is important because it enables healthcare teams to feed a twin stream of findings back into development. GenAI has become a transformative approach here, automating the production and refinement of complex documents, mixed data sets, and structured and unstructured data. By integrating regulatory considerations early and adopting technologies like GenAI as part of agile practices, healthcare product programmes help teams navigate a regulatory landscape that can often shift. Baking a regulatory mindset into a team early helps ensure compliance and continued innovation.
    WWW.ARTIFICIALINTELLIGENCE-NEWS.COM
  • Newspaper Club makes headlines with first-ever publication and bold print campaign

    In a confident nod to the enduring power of print, Glasgow-based Newspaper Club has launched The Printing Press, its first-ever self-published newspaper. Known for helping designers, brands, and artists print their own publications, Newspaper Club is now telling its own story through a medium it knows best.
    "We're always sharing the brilliant things people print with us – usually online, through our blog and Instagram," explains CMO Kaye Symington. "Our customers have some great stories behind their projects, and it just made sense for a newspaper printing company to have a newspaper of its own!"
    Teaming up with their brilliant design partner Euan Gallacher at D8 Studio, Kaye said they also wanted to show what's possible with the format: "A lot of people just think of newspapers as something for breaking news, but there's so much more you can do with them."

    The tabloid-style publication explores the creative resurgence of newspapers as branding tools and storytelling devices, which is music to our ears. Inside, readers will find thoughtful features on how modern brands are embracing print, including interviews with Papier's head of brand on narrative design, Cubitts' in-house designer on developing a tactile, analogue campaign, and Vocal Type's Tré Seals on transforming a museum exhibition into a printed experience.
    Why the mighty turnaround? "There's just nothing quite like newsprint," says Kaye. "It slows you down in the best way, especially when there's so much competing for your attention online. A newspaper isn't trying to go viral, which is refreshing."
    She adds: "Putting together a newspaper makes you think differently. It's scrappy and democratic, which makes it a great space to play around and tell stories more creatively. And at the end of it, you've got something real to hand someone instead of just sending them a link."

    To celebrate this almighty launch, Newspaper Club is going beyond the page with a striking national ad campaign. In partnership with Build Hollywood, the company has installed billboards in Glasgow, Birmingham, Brighton, and Cardiff, all proudly showcasing the work of Newspaper Club customers. These include colourful pieces from artist Supermundane and independent homeware designer Sophie McNiven, highlighting the creative range of projects that come to life through their press.
    In London, the celebration continues with a special collaboration with News & Coffee at Holborn Station. For two weeks, the kiosk has been transformed into a shrine to print — complete with stacks of The Printing Press and complimentary coffee for the first 20 early birds each weekday until 17 June.
    The timing feels deliberate. As digital fatigue sets in, social media continues to disappoint, and brands look for fresh ways to stand out in a 'post-search' world, newspapers are experiencing a quiet renaissance. But they're being used not just for news but also as limited-edition catalogues, keepsakes for events, and props in photo shoots. It's this playful, flexible nature of newsprint that The Printing Press aims to explore and celebrate.

    Since 2009, Newspaper Club has built its reputation on making newspaper printing accessible to all — from major brands like Adobe and Spotify to indie creators, students and storytellers. This campaign marks a new chapter: a chance to turn the lens inward, shine a spotlight on the creative possibilities of print, and reassert the joy of ink on paper. As Kaye puts it, "We want people to see that newspapers can be a really creative format. It might be a traditional medium, but that's exactly what makes it stand out in a digital world.
    "Sometimes the hardest part is just knowing where to start with a new project, so we hope this campaign helps spark ideas and inspire people to print something they're excited about!"
    As The Printing Press hits streets and kiosks across the UK, one thing is clear: print isn't dead. It's just getting started.
    WWW.CREATIVEBOOM.COM
  • IBM Plans Large-Scale Fault-Tolerant Quantum Computer by 2029


    By John P. Mello Jr.
    June 11, 2025 5:00 AM PT

    IBM unveiled its plan to build IBM Quantum Starling, shown in this rendering. Starling is expected to be the first large-scale, fault-tolerant quantum system. (Image Credit: IBM)

    IBM revealed Tuesday its roadmap for bringing a large-scale, fault-tolerant quantum computer, IBM Quantum Starling, online by 2029, which is significantly earlier than many technologists thought possible.
    The company predicts that when its new Starling computer is up and running, it will be capable of performing 20,000 times more operations than today’s quantum computers — a computational state so vast it would require the memory of more than a quindecillion (10⁴⁸) of the world’s most powerful supercomputers to represent.
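
    For a sense of where a figure like that comes from, here is a hedged back-of-envelope sketch (not IBM's own calculation): brute-force simulation of n entangled qubits stores 2^n complex amplitudes, so the memory needed grows exponentially with qubit count. The supercomputer size below (10 PB) is an assumed round number for illustration only.

```python
# Back-of-envelope only: assumes a full state-vector simulation
# (2**n complex amplitudes, 16 bytes each) and a hypothetical 10 PB
# supercomputer. IBM's own estimate rests on its own assumptions, so
# the exact figure differs; the point is the exponential scaling.
def supercomputers_needed(n_qubits: int, supercomputer_bytes: float = 10e15) -> float:
    amplitudes = 2 ** n_qubits          # size of the state vector
    bytes_needed = amplitudes * 16      # complex128 = 16 bytes per amplitude
    return bytes_needed / supercomputer_bytes

for n in (50, 100, 200):
    print(f"{n} qubits -> ~{supercomputers_needed(n):.1e} supercomputers")
# 50 qubits  -> ~1.8e+00 supercomputers
# 100 qubits -> ~2.0e+15 supercomputers
# 200 qubits -> ~2.6e+45 supercomputers
```
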
    “IBM is charting the next frontier in quantum computing,” Big Blue CEO Arvind Krishna said in a statement. “Our expertise across mathematics, physics, and engineering is paving the way for a large-scale, fault-tolerant quantum computer — one that will solve real-world challenges and unlock immense possibilities for business.”
    IBM’s plan to deliver a fault-tolerant quantum system by 2029 is ambitious but not implausible, especially given the rapid pace of its quantum roadmap and past milestones, observed Ensar Seker, CISO at SOCRadar, a threat intelligence company in Newark, Del.
    “They’ve consistently met or exceeded their qubit scaling goals, and their emphasis on modularity and error correction indicates they’re tackling the right challenges,” he told TechNewsWorld. “However, moving from thousands to millions of physical qubits with sufficient fidelity remains a steep climb.”
    A qubit is the fundamental unit of information in quantum computing, capable of representing a zero, a one, or both simultaneously due to quantum superposition. In practice, fault-tolerant quantum computers use clusters of physical qubits working together to form a logical qubit — a more stable unit designed to store quantum information and correct errors in real time.
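
    As a rough illustration of superposition (a sketch of the textbook maths, not of IBM's hardware): a single qubit is described by two complex amplitudes whose squared magnitudes give the measurement probabilities.

```python
import numpy as np

# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# An equal superposition measures as 0 or 1 with probability 0.5 each.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

probabilities = np.abs(state) ** 2      # Born rule: |amplitude|^2
print(probabilities)                    # [0.5 0.5]

# Simulated measurements collapse the superposition to definite outcomes.
samples = np.random.choice([0, 1], size=1_000, p=probabilities)
print(samples.mean())                   # close to 0.5
```
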
    Realistic Roadmap
    Luke Yang, an equity analyst with Morningstar Research Services in Chicago, believes IBM’s roadmap is realistic. “The exact scale and error correction performance might still change between now and 2029, but overall, the goal is reasonable,” he told TechNewsWorld.
    “Given its reliability and professionalism, IBM’s bold claim should be taken seriously,” said Enrique Solano, co-CEO and co-founder of Kipu Quantum, a quantum algorithm company with offices in Berlin and Karlsruhe, Germany.
    “Of course, it may also fail, especially when considering the unpredictability of hardware complexities involved,” he told TechNewsWorld, “but companies like IBM exist for such challenges, and we should all be positively impressed by its current achievements and promised technological roadmap.”
    Tim Hollebeek, vice president of industry standards at DigiCert, a global digital security company, added: “IBM is a leader in this area, and not normally a company that hypes their news. This is a fast-moving industry, and success is certainly possible.”
    “IBM is attempting to do something that no one has ever done before and will almost certainly run into challenges,” he told TechNewsWorld, “but at this point, it is largely an engineering scaling exercise, not a research project.”
    “IBM has demonstrated consistent progress, has committed $30 billion over five years to quantum computing, and the timeline is within the realm of technical feasibility,” noted John Young, COO of Quantum eMotion, a developer of quantum random number generator technology, in Saint-Laurent, Quebec, Canada.
    “That said,” he told TechNewsWorld, “fault-tolerant in a practical, industrial sense is a very high bar.”
    Solving the Quantum Error Correction Puzzle
    To make a quantum computer fault-tolerant, errors need to be corrected so large workloads can be run without faults. In a quantum computer, errors are reduced by clustering physical qubits to form logical qubits, which have lower error rates than the underlying physical qubits.
    “Error correction is a challenge,” Young said. “Logical qubits require thousands of physical qubits to function reliably. That’s a massive scaling issue.”
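
    As a hedged illustration of why that redundancy pays off, using a classical analogy rather than the code IBM actually uses: store one logical bit as three physical copies and read it by majority vote, and the logical value is only wrong when at least two copies fail, so the logical error rate falls roughly with the square of the physical error rate.

```python
# Classical three-copy repetition code (an analogy, not IBM's quantum code):
# a majority vote over 3 copies fails only if 2 or 3 of them flip.
def logical_error_rate(p: float) -> float:
    """Failure probability of a 3-copy majority vote with independent per-copy error p."""
    return 3 * p**2 * (1 - p) + p**3    # exactly two flips, or all three

for p in (1e-2, 1e-3, 1e-4):
    print(f"physical {p:.0e} -> logical {logical_error_rate(p):.1e}")
# physical 1e-02 -> logical 3.0e-04
# physical 1e-03 -> logical 3.0e-06
# physical 1e-04 -> logical 3.0e-08
```
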
    IBM explained in its announcement that creating increasing numbers of logical qubits capable of executing quantum circuits with as few physical qubits as possible is critical to quantum computing at scale. Until today, a clear path to building such a fault-tolerant system without unrealistic engineering overhead has not been published.

    Alternative and previous gold-standard, error-correcting codes present fundamental engineering challenges, IBM continued. To scale, they would require an unfeasible number of physical qubits to create enough logical qubits to perform complex operations — necessitating impractical amounts of infrastructure and control electronics. This renders them unlikely to be implemented beyond small-scale experiments and devices.
    In two research papers released with its roadmap, IBM detailed how it will overcome the challenges of building the large-scale, fault-tolerant architecture needed for a quantum computer.
    One paper outlines the use of quantum low-density parity check (qLDPC) codes to reduce physical qubit overhead. The other describes methods for decoding errors in real time using conventional computing.
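
    As a purely classical analogy of what parity checks and a real-time decoder do (the [7,4] Hamming code below is a textbook example, not the qLDPC construction from IBM's paper): parity checks on the received bits yield a syndrome, and a conventional computer uses that syndrome to locate and flip back the erroneous bit.

```python
import numpy as np

# Parity-check matrix of the classical [7,4] Hamming code. Column j
# (1-indexed) is the binary representation of j, so a single-bit error
# at position j produces a syndrome that spells out j.
H = np.array([
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
])

received = np.zeros(7, dtype=int)   # start from the all-zeros codeword...
received[4] = 1                     # ...and corrupt bit 5 (index 4)

syndrome = H @ received % 2                            # run the parity checks
error_position = int("".join(map(str, syndrome)), 2)   # read the syndrome as binary
received[error_position - 1] ^= 1                      # correct the flagged bit

print(syndrome, "-> corrected bit", error_position)    # [1 0 1] -> corrected bit 5
print(received)                                        # back to the valid codeword
```
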
    According to IBM, a practical fault-tolerant quantum architecture must:

    Suppress enough errors for useful algorithms to succeed
    Prepare and measure logical qubits during computation
    Apply universal instructions to logical qubits
    Decode measurements from logical qubits in real time and guide subsequent operations
    Scale modularly across hundreds or thousands of logical qubits
    Be efficient enough to run meaningful algorithms using realistic energy and infrastructure resources

    Aside from the technological challenges that quantum computer makers are facing, there may also be some market challenges. “Locating suitable use cases for quantum computers could be the biggest challenge,” Morningstar’s Yang maintained.
    “Only certain computing workloads, such as random circuit sampling [RCS], can fully unleash the computing power of quantum computers and show their advantage over the traditional supercomputers we have now,” he said. “However, workloads like RCS are not very commercially useful, and we believe commercial relevance is one of the key factors that determine the total market size for quantum computers.”
    Q-Day Approaching Faster Than Expected
    For years now, organizations have been told they need to prepare for “Q-Day” — the day a quantum computer will be able to crack all the encryption they use to keep their data secure. This IBM announcement suggests the window for action to protect data may be closing faster than many anticipated.
    “This absolutely adds urgency and credibility to the security expert guidance on post-quantum encryption being factored into their planning now,” said Dave Krauthamer, field CTO of QuSecure, maker of quantum-safe security solutions, in San Mateo, Calif.
    “IBM’s move to create a large-scale fault-tolerant quantum computer by 2029 is indicative of the timeline collapsing,” he told TechNewsWorld. “A fault-tolerant quantum computer of this magnitude could be well on the path to crack asymmetric ciphers sooner than anyone thinks.”

    “Security leaders need to take everything connected to post-quantum encryption as a serious measure and work it into their security plans now — not later,” he said.
    Roger Grimes, a defense evangelist with KnowBe4, a security awareness training provider in Clearwater, Fla., pointed out that IBM is just the latest in a surge of quantum companies announcing quickly forthcoming computational breakthroughs within a few years.
    “It leads to the question of whether the U.S. government’s original PQC [post-quantum cryptography] preparation date of 2030 is still a safe date,” he told TechNewsWorld.
    “It’s starting to feel a lot more risky for any company to wait until 2030 to be prepared against quantum attacks. It also flies in the face of the latest cybersecurity EO [Executive Order] that relaxed PQC preparation rules as compared to Biden’s last EO PQC standard order, which told U.S. agencies to transition to PQC ASAP.”
    “Most US companies are doing zero to prepare for Q-Day attacks,” he declared. “The latest executive order seems to tell U.S. agencies — and indirectly, all U.S. businesses — that they have more time to prepare. It’s going to cause even more agencies and businesses to be less prepared during a time when it seems multiple quantum computing companies are making significant progress.”
    “It definitely feels that something is going to give soon,” he said, “and if I were a betting man, and I am, I would bet that most U.S. companies are going to be unprepared for Q-Day on the day Q-Day becomes a reality.”

    John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News. Email John.

  • A short history of the roadblock

    Barricades, as we know them today, are thought to date back to the European wars of religion. According to most historians, the first barricade went up in Paris in 1588; the word derives from the French barriques, or barrels, spontaneously put together. They have been assembled from the most diverse materials, from cobblestones, tyres, newspapers, dead horses and bags of ice (during Kyiv’s Euromaidan in 2013–14), to omnibuses and e‑scooters. Their tactical logic is close to that of guerrilla warfare: the authorities have to take the barricades in order to claim victory; all that those manning them have to do to prevail is to hold them.
    The 19th century was the golden age for blocking narrow, labyrinthine streets. Paris had seen barricades go up nine times in the period before the Second Empire; during the July 1830 Revolution alone, 4,000 barricades had been erected (roughly one for every 200 Parisians). These barricades would not only stop, but also trap troops; people would then throw stones from windows or pour boiling water onto the streets. Georges‑Eugène Haussmann, Napoleon III’s prefect of Paris, famously created wide boulevards to make blocking by barricade more difficult and moving the military easier, and replaced cobblestones with macadam – a surface of crushed stone. As Flaubert observed in his Dictionary of Accepted Ideas: ‘Macadam: has cancelled revolutions. No more means to make barricades. Nevertheless rather inconvenient.’
    Lead image: Barricades, as we know them today, are thought to have originated in early modern France. A colour engraving attributed to Achille‑Louis Martinet depicts the defence of a barricade during the 1830 July Revolution. Credit: Paris Musées / Musée Carnavalet – Histoire de Paris. Above: the socialist political thinker and activist Louis Auguste Blanqui – who was imprisoned by every regime that ruled France between 1815 and 1880 – drew instructions for how to build an effective barricade

    Under Napoleon III, Baron Haussmann widened Paris’s streets in his 1853–70 renovation of the city, making barricading more difficult
    Credit: Old Books Images / Alamy
    ‘On one hand, [the authorities] wanted to favour the circulation of ideas,’ reactionary intellectual Louis Veuillot observed apropos the ambiguous liberalism of the latter period of Napoleon III’s Second Empire. ‘On the other, to ensure the circulation of regiments.’ But ‘anti‑insurgency hardware’, as Justinien Tribillon has called it, also served to chase the working class out of the city centre: Haussmann’s projects amounted to a gigantic form of real-estate speculation, and the 1871 Paris Commune that followed constituted not just a short‑lived anarchist experiment featuring enormous barricades; it also signalled the return of the workers to the centre and, arguably, revenge for their dispossession.
    By the mid‑19th century, observers questioned whether barricades still had practical meaning. Gottfried Semper’s barricade, constructed for the 1849 Dresden uprising, had proved unconquerable, but Friedrich Engels, one‑time ‘inspector of barricades’ in the Elberfeld insurrection of the same year, already suggested that the barricades’ primary meaning was now moral rather than military – a point to be echoed by Leon Trotsky in the subsequent century. Barricades symbolised bravery and the will to hold out among insurrectionists, and, not least, determination rather to destroy one’s possessions – and one’s neighbourhood – than put up with further oppression.  
    Not only self‑declared revolutionaries viewed things this way: the reformist Social Democrat leader Eduard Bernstein observed that ‘the barricade fight as a political weapon of the people has been completely eliminated due to changes in weapon technology and cities’ structures’. Bernstein was also picking up on the fact that, in the era of industrialisation, contention happened at least as much on the factory floor as on the streets. The strike, not the food riot or the defence of workers’ quartiers, became the paradigmatic form of conflict. Joshua Clover has pointed out in his 2016 book Riot. Strike. Riot: The New Era of Uprisings that the price of labour, rather than the price of goods, caused people to confront the powerful. Blocking production grew more important than blocking the street.
    ‘The only weapons we have are our bodies, and we need to tuck them in places so wheels don’t turn’
    Today, it is again blocking – not just people streaming along the streets in large marches – that is prominently associated with protests. Disrupting circulation is not only an important gesture in the face of climate emergency; blocking transport is a powerful form of protest in an economic system focused on logistics and just‑in‑time distribution. Members of Insulate Britain and Germany’s Last Generation super‑glue themselves to streets to stop car traffic to draw attention to the climate emergency; they have also attached themselves to airport runways. They form a human barricade of sorts, immobilising traffic by making themselves immovable.  
    Today’s protesters have made themselves consciously vulnerable. They in fact follow the advice of US civil rights activist Bayard Rustin, who explained: ‘The only weapons we have are our bodies, and we need to tuck them in places so wheels don’t turn.’ Making oneself vulnerable might increase the chances of a majority of citizens seeing the importance of the cause which those engaged in civil disobedience are pursuing. Demonstrations – even large, unpredictable ones – are no longer sufficient. They draw too little attention and do not compel a reaction. Naomi Klein proposed the term ‘blockadia’ as ‘a roving transnational conflict zone’ in which people block extraction – be it open‑pit mines, fracking sites or tar sands pipelines – with their bodies. More often than not, these blockades are organised by local people opposing the fossil fuel industry, not environmental activists per se. Blockadia came to denote resistance to the Keystone XL pipeline as well as Canada’s First Nations‑led movement Idle No More.
    In cities, blocking can be accomplished with highly mobile structures. Like the barricade of the 19th century, they can be quickly assembled, yet are difficult to move; unlike old‑style barricades, they can also be quickly disassembled, removed and hidden (by those who have the engineering and architectural know‑how). Think of super tripods, intricate ‘protest beacons’ based on tensegrity principles, as well as inflatable cobblestones, pioneered by the artist‑activists of Tools for Action (and as analysed in Nick Newman’s recent volume Protest Architecture).
    As recently as 1991, newly independent Latvia defended itself against Soviet tanks with the popular construction of barricades, in a series of confrontations that became known as the Barikādes
    Credit: Associated Press / Alamy
    Conversely, roadblocks can be used by police authorities to stop demonstrations and gatherings from taking place – protesters are seen removing such infrastructure in Dhaka during a general strike in 1999
    Credit: REUTERS / Rafiqur Rahman / Bridgeman
    These inflatable objects are highly flexible, but can also be protective against police batons. They pose an awkward challenge to the authorities, who often end up looking ridiculous when dealing with them, and, as one of the inventors pointed out, they are guaranteed to create a media spectacle. This was also true of the 19th‑century barricade: people posed for pictures in front of them. As Wolfgang Scheppe, a curator of Architecture of the Barricade (currently on display at the Arsenale Institute for Politics of Representation in Venice), explains, these images helped the police to find Communards and mete out punishments after the end of the anarchist experiment.
    Much simpler structures can also be highly effective. In 2019, protesters in Hong Kong filled streets with little archways made from just three ordinary bricks: two standing upright, one resting on top. When touched, the falling top one would buttress the other two, and effectively block traffic. In line with their imperative of ‘be water’, protesters would retreat when the police appeared, but the ‘mini‑Stonehenges’ would remain and slow down the authorities.
    Today, elaborate architectures of protest, such as Extinction Rebellion’s ‘tensegrity towers’, are used to blockade roads and distribution networks – in this instance, Rupert Murdoch’s News UK printworks in Broxbourne, for the media group’s failure to report the climate emergency accurately
    Credit: Extinction Rebellion
    In June 2025, protests erupted in Los Angeles against the Trump administration’s deportation policies. Demonstrators barricaded downtown streets using various objects, including the pink public furniture designed by design firm Rios for Gloria Molina Grand Park. LAPD are seen advancing through tear gas
    Credit: Gina Ferazzi / Los Angeles Times via Getty Images
    Roads which radicals might want to target are not just ones in major metropoles and fancy post‑industrial downtowns. Rather, they might block the arteries leading to ‘fulfilment centres’ and harbours with container shipping. The model is not only Occupy Wall Street, which had initially called for the erection of ‘peaceful barricades’, but also the Occupy that led to the Oakland port shutdown in 2011. In short, such roadblocks disrupt what Phil Neel has called a ‘hinterland’ that is often invisible, yet crucial for contemporary capitalism. More recently, Extinction Rebellion targeted Amazon distribution centres in three European countries in November 2021; in the UK, they aimed to disrupt half of all deliveries on a Black Friday.  
    Will such blockades just anger consumers who, after all, are not present but are impatiently waiting for packages at home? One of the hopes associated with the traditional barricade was always that they might create spaces where protesters, police and previously indifferent citizens get talking; French theorists even expected them to become ‘a machine to produce the people’. That could be why military technology has evolved so that the authorities do not have to get close to the barricade: tear gas was first deployed against those on barricades before it was used in the First World War; so‑called riot control vehicles can ever more easily crush barricades. The challenge, then, for anyone who wishes to block is also how to get in other people’s faces – in order to have a chance to convince them of their cause.       

    2025-06-11
    Kristina Rapacki

  • A shortage of high-voltage power cables could stall the clean energy transition

    In a nutshell: As nations set ever more ambitious targets for renewable energy and electrification, the humble high-voltage cable has emerged as a linchpin – and a potential chokepoint – in the race to decarbonize the global economy. A Bloomberg interview with Claes Westerlind, CEO of NKT, a leading cable manufacturer based in Denmark, explains why.
    A global surge in demand for high-voltage electricity cables is threatening to stall the clean energy revolution, as the world's ability to build new wind farms, solar plants, and cross-border power links increasingly hinges on a supply chain bottleneck few outside the industry have considered. At the center of this challenge is the complex, capital-intensive process of manufacturing the giant cables that transport electricity across hundreds of miles, both over land and under the sea.
    Despite soaring demand, cable manufacturers remain cautious about expanding capacity, raising questions about whether the pace of electrification can keep up with climate ambitions, geopolitical tensions, and the practical realities of industrial investment.
    High-voltage cables are the arteries of modern power grids, carrying electrons from remote wind farms or hydroelectric dams to the cities and industries that need them. Unlike the thin wires that run through a home's walls, these cables are engineering marvels – sometimes as thick as a person's torso, armored to withstand the crushing pressure of the ocean floor, and designed to last for decades under extreme electrical and environmental stress.

    "If you look at the very high voltage direct current cable, able to carry roughly two gigawatts through two pairs of cables – that means that the equivalent of one nuclear power reactor is flowing through one cable," Westerlind told Bloomberg.
    The process of making these cables is as specialized as it is demanding. At the core is a conductor, typically made of copper or aluminum, twisted together like a rope for flexibility and strength. Around this, manufacturers apply multiple layers of insulation in towering vertical factories to ensure the cable remains perfectly round and can safely contain the immense voltages involved. Any impurity in the insulation, even something as small as an eyelash, can cause catastrophic failure, potentially knocking out power to entire cities.

    As the world rushes to harness new sources of renewable energy, the demand for high-voltage direct current (HVDC) cables has skyrocketed. HVDC technology, initially pioneered by NKT in the 1950s, has become the backbone of long-distance power transmission, particularly for offshore wind farms and intercontinental links. In recent years, approximately 80 to 90 percent of new large-scale cable projects have utilized HVDC, reflecting its efficiency in transmitting electricity over vast distances with minimal losses.

    But this surge in demand has led to a critical bottleneck. Factories that produce these cables are booked out for years, Westerlind reports, and every project requires custom engineering to match the power needs, geography, and environmental conditions of its route. According to the International Energy Agency, meeting global clean energy goals will require building the equivalent of 80 million kilometers (around 49.7 million miles) of new grid infrastructure by 2040 – essentially doubling what has been constructed over the past century, but in just 15 years.
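    To put the IEA figure in perspective, the back-of-the-envelope calculation below uses only the numbers quoted above: 80 million kilometers of new grid by 2040, a window of roughly 15 years, and the fact that building the existing equivalent took about a century.

```python
# Back-of-the-envelope comparison using the IEA figures cited above.
NEW_GRID_KM = 80_000_000   # km of new grid needed by 2040
YEARS_REMAINING = 15       # roughly 2025 to 2040
HISTORICAL_YEARS = 100     # the past century of grid building

required_rate = NEW_GRID_KM / YEARS_REMAINING     # km per year from now on
historical_rate = NEW_GRID_KM / HISTORICAL_YEARS  # rough km per year to date

print(f"Required build rate:   {required_rate:,.0f} km/year")
print(f"Historical build rate: {historical_rate:,.0f} km/year")
print(f"Speed-up needed:       {required_rate / historical_rate:.1f}x")
```

    In other words, the implied pace is roughly six to seven times the historical average, which is the context for the factory bottleneck described next.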
    Despite the clear need, cable makers have been slow to add capacity due to reasons that are as much economic and political as technical. Building a new cable factory can cost upwards of a billion euros, and manufacturers are wary of making such investments without long-term commitments from utilities or governments. "For a company like us to do investments in the realm of €1 or 2 billion, it's a massive commitment... but it's also a massive amount of demand that is needed for this investment to actually make financial sense over the next not five years, not 10 years, but over the next 20 to 30 years," Westerlind said. The industry still bears scars from a decade ago, when anticipated demand failed to materialize and expensive new facilities sat underused.
    Some governments and transmission system operators are trying to break the logjam by making "anticipatory investments" – committing to buy cable capacity even before specific projects are finalized. This approach, backed by regulators, gives manufacturers the confidence to expand, but it remains the exception rather than the rule.
    Meanwhile, the industry's structure itself creates barriers to rapid expansion, according to Westerlind. The expertise, technology, and infrastructure required to make high-voltage cables are concentrated in a handful of companies, creating what analysts describe as a "deep moat" that is difficult for new entrants to cross.
    Geopolitical tensions add another layer of complexity. China has built more HVDC lines than any other country, although Western manufacturers, such as NKT, maintain a technical edge in the most advanced cable systems. Still, there is growing concern in Europe and the US about becoming dependent on foreign suppliers for such critical infrastructure, especially in light of recent global conflicts and trade disputes. "Strategic autonomy is very important when it comes to the core parts and the fundamental parts of your society, where the grid backbone is one," Westerlind noted.
    The stakes are high. Without a rapid and coordinated push to expand cable manufacturing, the world's clean energy transition could be slowed not by a lack of wind or sun but by a shortage of the cables needed to connect them to the grid. As Westerlind put it, "We all know it has to be done... These are large investments. They are very expensive investments. So also the governments have to have a part in enabling these anticipatory investments, and making it possible for the TSOs to actually carry forward with them."