• Anker’s Soundcore Sleep earbuds finally feature active noise canceling

    Anker has announced a new version of its wireless sleep buds that could be even more effective at delivering a peaceful slumber by blocking out disturbing noises using active noise cancellation. Previous versions of the Soundcore Sleep earbuds blocked external sounds passively using just a snug fit inside the ear, but the new Sleep A30 finally add ANC while still offering enough battery life to last the night.

    As with previous versions, Anker is making its new Soundcore Sleep A30 available for preorder through a Kickstarter crowdfunding campaign that’s launching today, while full availability of the earbuds is expected sometime in August 2025 through Amazon and Soundcore’s online store. At $229.99, the Sleep A30 are quite a bit more expensive than last year’s $149.99 Sleep A20, but the earliest Kickstarter backers can get the A30 discounted to $139.

    The Sleep A30 are slimmer and smaller than previous versions, potentially making them more comfortable to wear overnight. Image: Anker

    The Sleep A30 earbuds are now 7 percent slimmer and feature a smaller design that ensures they don’t protrude from your ears, so there’s reduced pressure while wearing them and lying on a pillow if you’re a side sleeper. To help you find a snug fit, Anker includes four sizes of silicone ear tips, three sizes of memory foam tips, and three sizes of ear wings.

    Anker claims the new Sleep A30 block up to 30dB of external noise, but the added ANC, which uses two mics positioned inside and outside your ears, does result in reduced battery life. The A20 could run for up to 14 hours on a single charge, but the A30 max out at up to nine hours on their own, or up to 45 hours with their charging case. However, that’s only when listening to white noise or other sounds designed to help you fall asleep that are stored on the buds themselves. When streaming music or podcasts from a phone, battery life is further reduced to up to 6.5 hours, or 35 hours with the case.

    The Sleep A30’s charging case has been upgraded to detect snoring sounds and generate audio to mask them. Image: Anker

    The Sleep A30’s charging case has been upgraded with what Anker is calling “Adaptive Snore Masking technology.” If it detects the sounds of snoring from another person nearby, it analyzes the volume and frequency of the sounds and generates “noise masking audio” that’s sent to the buds to help block it out.

    The new earbuds also feature sleep monitoring and sleep position tracking, allowing you to see how restful or eventful your night was through the Soundcore mobile app; a private repeatable alarm with snooze functionality; and a Find My Earbud feature should they fall out in the night and get lost in the sheets.
  • Rethinking AI: DeepSeek’s playbook shakes up the high-spend, high-compute paradigm

    When DeepSeek released its R1 model this January, it wasn’t just another AI announcement. It was a watershed moment that sent shockwaves through the tech industry, forcing industry leaders to reconsider their fundamental approaches to AI development.
    What makes DeepSeek’s accomplishment remarkable isn’t that the company developed novel capabilities; rather, it was how it achieved comparable results to those delivered by tech heavyweights at a fraction of the cost. In reality, DeepSeek didn’t do anything that hadn’t been done before; its innovation stemmed from pursuing different priorities. As a result, we are now experiencing rapid-fire development along two parallel tracks: efficiency and compute. 
    As DeepSeek prepares to release its R2 model, and as it concurrently faces the potential of even greater chip restrictions from the U.S., it’s important to look at how it captured so much attention.
    Engineering around constraints
    DeepSeek’s arrival, as sudden and dramatic as it was, captivated us all because it showcased the capacity for innovation to thrive even under significant constraints. Faced with U.S. export controls limiting access to cutting-edge AI chips, DeepSeek was forced to find alternative pathways to AI advancement.
    While U.S. companies pursued performance gains through more powerful hardware, bigger models and better data, DeepSeek focused on optimizing what was available. It implemented known ideas with remarkable execution — and there is novelty in executing what’s known and doing it well.
    This efficiency-first mindset yielded incredibly impressive results. DeepSeek’s R1 model reportedly matches OpenAI’s capabilities at just 5 to 10% of the operating cost. According to reports, the final training run for DeepSeek’s V3 predecessor cost a mere $6 million — which was described by former Tesla AI scientist Andrej Karpathy as “a joke of a budget” compared to the tens or hundreds of millions spent by U.S. competitors. More strikingly, while OpenAI reportedly spent $500 million training its recent “Orion” model, DeepSeek achieved superior benchmark results for just $5.6 million — less than 1.2% of OpenAI’s investment.
    If you get starry-eyed believing these incredible results were achieved even as DeepSeek was at a severe disadvantage based on its inability to access advanced AI chips, I hate to tell you, but that narrative isn’t entirely accurate (even though it makes a good story). Initial U.S. export controls focused primarily on compute capabilities, not on memory and networking — two crucial components for AI development.
    That means that the chips DeepSeek had access to were not poor quality chips; their networking and memory capabilities allowed DeepSeek to parallelize operations across many units, a key strategy for running their large model efficiently.
    This, combined with China’s national push toward controlling the entire vertical stack of AI infrastructure, resulted in accelerated innovation that many Western observers didn’t anticipate. DeepSeek’s advancements were an inevitable part of AI development, but they brought known advancements forward a few years earlier than would have been possible otherwise, and that’s pretty amazing.
    Pragmatism over process
    Beyond hardware optimization, DeepSeek’s approach to training data represents another departure from conventional Western practices. Rather than relying solely on web-scraped content, DeepSeek reportedly leveraged significant amounts of synthetic data and outputs from other proprietary models. This is a classic example of model distillation, or the ability to learn from really powerful models. Such an approach, however, raises questions about data privacy and governance that might concern Western enterprise customers. Still, it underscores DeepSeek’s overall pragmatic focus on results over process.
    The effective use of synthetic data is a key differentiator. Synthetic data can be very effective when it comes to training large models, but you have to be careful; some model architectures handle synthetic data better than others. For instance, transformer-based models with mixture of experts (MoE) architectures like DeepSeek’s tend to be more robust when incorporating synthetic data, while more traditional dense architectures like those used in early Llama models can experience performance degradation or even “model collapse” when trained on too much synthetic content.
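    To make the architectural contrast concrete, here is a minimal, illustrative mixture-of-experts layer in PyTorch. It is a generic sketch of top-k expert routing, not DeepSeek’s architecture; the class name, layer sizes and expert count are invented for the example.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoELayer(nn.Module):
    """Illustrative mixture-of-experts layer: a router picks the top-k experts per token.

    Generic sketch of the MoE idea, not DeepSeek's implementation.
    """
    def __init__(self, d_model: int = 64, n_experts: int = 4, k: int = 2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)   # scores every expert for each token
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                              # x: (batch, seq, d_model)
        scores = self.router(x)                        # (batch, seq, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)     # keep only the top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):  # dense loop for clarity, not efficiency
                mask = (idx[..., slot] == e).unsqueeze(-1)          # tokens routed to expert e
                out = out + mask * weights[..., slot:slot + 1] * expert(x)
        return out
```
    A dense model would instead push every token through a single large feed-forward block; the routing above is what lets an MoE model activate only a fraction of its parameters per token.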
    This architectural sensitivity matters because synthetic data introduces different patterns and distributions compared to real-world data. When a model architecture doesn’t handle synthetic data well, it may learn shortcuts or biases present in the synthetic data generation process rather than generalizable knowledge. This can lead to reduced performance on real-world tasks, increased hallucinations or brittleness when facing novel situations. 
    Still, DeepSeek’s engineering teams reportedly designed their model architecture specifically with synthetic data integration in mind from the earliest planning stages. This allowed the company to leverage the cost benefits of synthetic data without sacrificing performance.
    Market reverberations
    Why does all of this matter? Stock market aside, DeepSeek’s emergence has triggered substantive strategic shifts among industry leaders.
    Case in point: OpenAI. Sam Altman recently announced plans to release the company’s first “open-weight” language model since 2019. This is a pretty notable pivot for a company that built its business on proprietary systems. It seems DeepSeek’s rise, on top of Llama’s success, has hit OpenAI’s leader hard. Just a month after DeepSeek arrived on the scene, Altman admitted that OpenAI had been “on the wrong side of history” regarding open-source AI. 
    With OpenAI reportedly spending $7 to $8 billion annually on operations, the economic pressure from efficient alternatives like DeepSeek has become impossible to ignore. As AI scholar Kai-Fu Lee bluntly put it: “You’re spending $7 billion or $8 billion a year, making a massive loss, and here you have a competitor coming in with an open-source model that’s for free.” This necessitates change.
    This economic reality prompted OpenAI to pursue a massive $40 billion funding round that valued the company at an unprecedented $300 billion. But even with a war chest of funds at its disposal, the fundamental challenge remains: OpenAI’s approach is dramatically more resource-intensive than DeepSeek’s.
    Beyond model training
    Another significant trend accelerated by DeepSeek is the shift toward “test-time compute” (TTC). As major AI labs have now trained their models on much of the available public data on the internet, data scarcity is slowing further improvements in pre-training.
    To get around this, DeepSeek announced a collaboration with Tsinghua University to enable “self-principled critique tuning” (SPCT). This approach trains AI to develop its own rules for judging content and then uses those rules to provide detailed critiques. The system includes a built-in “judge” that evaluates the AI’s answers in real-time, comparing responses against core rules and quality standards.
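    As a rough illustration of that kind of self-judging loop (not DeepSeek’s actual SPCT pipeline; the `llm` callable, the prompts and the 1-10 scoring scheme below are invented for the sketch), the flow might look like this:
```python
# Illustrative sketch of an inference-time "self-principled critique" loop.
# NOT DeepSeek's SPCT implementation: `llm` stands in for any text-in/text-out
# model call, and the prompts and scoring scheme are invented for this example.
from typing import Callable

def self_principled_critique(llm: Callable[[str], str], question: str,
                             max_rounds: int = 3, threshold: int = 8) -> str:
    # 1. The model writes its own judging principles for this question.
    principles = llm(f"List the principles a good answer to this question must satisfy:\n{question}")
    answer = llm(f"Answer the question:\n{question}")

    for _ in range(max_rounds):
        # 2. A "judge" pass scores the answer against the self-derived principles.
        critique = llm(
            f"Principles:\n{principles}\n\nQuestion:\n{question}\n\nAnswer:\n{answer}\n\n"
            "Critique the answer against the principles and end with 'SCORE: <1-10>'."
        )
        tail = critique.rsplit("SCORE:", 1)[-1].strip()
        score = int(tail) if tail.isdigit() else 0
        if score >= threshold:     # good enough: stop refining
            break
        # 3. Revise the answer using the critique, then judge again.
        answer = llm(
            f"Question:\n{question}\n\nPrevious answer:\n{answer}\n\nCritique:\n{critique}\n\n"
            "Rewrite the answer to address the critique."
        )
    return answer
```
    The risk the article raises a few paragraphs below is visible in step 2: nothing outside the loop checks whether the self-derived principles themselves are sound.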
    The development is part of a movement towards autonomous self-evaluation and improvement in AI systems in which models use inference time to improve results, rather than simply making models larger during training. DeepSeek calls its system “DeepSeek-GRM” (generalist reward modeling). But, as with its model distillation approach, this could be considered a mix of promise and risk.
    For example, if the AI develops its own judging criteria, there’s a risk those principles diverge from human values, ethics or context. The rules could end up being overly rigid or biased, optimizing for style over substance and/or reinforcing incorrect assumptions or hallucinations. Additionally, without a human in the loop, issues could arise if the “judge” is flawed or misaligned. It’s a kind of AI talking to itself, without robust external grounding. On top of this, users and developers may not understand why the AI reached a certain conclusion — which feeds into a bigger concern: Should an AI be allowed to decide what is “good” or “correct” based solely on its own logic? These risks shouldn’t be discounted.
    At the same time, this approach is gaining traction, as again DeepSeek builds on the body of work of others (think OpenAI’s “critique and revise” methods, Anthropic’s constitutional AI or research on self-rewarding agents) to create what is likely the first full-stack application of SPCT in a commercial effort.
    This could mark a powerful shift in AI autonomy, but there is still a need for rigorous auditing, transparency and safeguards. It’s not just about models getting smarter, but about ensuring they remain aligned, interpretable and trustworthy as they begin critiquing themselves without human guardrails.
    Moving into the future
    So, taking all of this into account, the rise of DeepSeek signals a broader shift in the AI industry toward parallel innovation tracks. While companies continue building more powerful compute clusters for next-generation capabilities, there will also be intense focus on finding efficiency gains through software engineering and model architecture improvements to offset the challenges of AI energy consumption, which is growing far faster than power generation capacity.
    Companies are taking note. Microsoft, for example, has halted data center development in multiple regions globally, recalibrating toward a more distributed, efficient infrastructure approach. While still planning to invest approximately $80 billion in AI infrastructure this fiscal year, the company is reallocating resources in response to the efficiency gains DeepSeek introduced to the market.
    Meta has also responded.
    With so much movement in such a short time, it becomes somewhat ironic that the U.S. sanctions designed to maintain American AI dominance may have instead accelerated the very innovation they sought to contain. By constraining access to materials, the sanctions forced DeepSeek to blaze a new trail.
    Moving forward, as the industry continues to evolve globally, adaptability for all players will be key. Policies, people and market reactions will continue to shift the ground rules — whether it’s eliminating the AI diffusion rule, a new ban on technology purchases or something else entirely. It’s what we learn from one another and how we respond that will be worth watching.
    Jae Lee is CEO and co-founder of TwelveLabs.

  • How a planetarium show discovered a spiral at the edge of our solar system

    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half mile long spiral of material drifting along the edge of our solar system.

    “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist.

    Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune and extends to around one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years. 
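
    Those travel times can be sanity-checked with round numbers that are not from the article: Voyager 1 cruises at roughly 17 km/s, and the inner edge of the Oort cloud is often placed around 1,000 astronomical units out. Under those assumptions, a quick back-of-the-envelope calculation looks like this:
```python
# Back-of-the-envelope check on the Voyager travel times quoted above.
# Assumed round numbers (not from the article): Voyager 1 cruises at ~17 km/s,
# the inner Oort cloud starts around ~1,000 AU, and 1 light year is ~63,241 AU.
KM_PER_AU = 1.496e8
SECONDS_PER_YEAR = 3.156e7
AU_PER_LIGHT_YEAR = 63_241

speed_au_per_year = 17 * SECONDS_PER_YEAR / KM_PER_AU            # ~3.6 AU per year

years_to_inner_edge = 1_000 / speed_au_per_year                  # ~280 years
years_to_far_side = 1.5 * AU_PER_LIGHT_YEAR / speed_au_per_year  # ~26,000 years

print(round(years_to_inner_edge), round(years_to_far_side))
```
    Both numbers land in the same ballpark as the 250-year and roughly 30,000-year figures above.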

    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?” 

    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data.

    “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says. 

    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars.

    “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.”

    She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’” 

    While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show.

    More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves. 
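
    For a sense of the physics involved, here is a toy version of that kind of calculation: a single test comet integrated under the Sun’s gravity plus a simplified galactic-tide term. It is a minimal sketch, not the researchers’ simulation code; the constants, starting orbit, timestep and 10-million-year span are chosen only for illustration, and the tide is reduced to the commonly used approximation of a pull toward the galactic midplane proportional to height.
```python
# Toy version of the physics in the Oort-cloud simulations described above:
# one test comet feels the Sun's gravity plus a simplified galactic tide.
# NOT the researchers' code; units are AU, years and solar masses, and the tide
# is reduced to a vertical pull toward the galactic midplane (a_z = -4*pi*G*rho0*z).
import numpy as np

GM_SUN = 4 * np.pi**2                    # G * M_sun in AU^3 / yr^2 (Kepler's third law)
RHO_LOCAL = 0.1 / 206_265.0**3           # ~0.1 M_sun per cubic parsec, in M_sun / AU^3
TIDE = 4 * np.pi * GM_SUN * RHO_LOCAL    # 4*pi*G*rho0, in 1 / yr^2

def acceleration(pos):
    """Point-mass solar gravity plus the dominant (vertical) galactic-tide term."""
    r = np.linalg.norm(pos)
    acc = -GM_SUN * pos / r**3
    acc[2] -= TIDE * pos[2]              # tide pulls the comet toward the midplane
    return acc

def integrate(pos, vel, dt=50.0, n_steps=200_000):
    """Leapfrog (kick-drift-kick) integration; returns the trajectory in AU."""
    pos, vel = pos.astype(float).copy(), vel.astype(float).copy()
    traj = np.empty((n_steps, 3))
    acc = acceleration(pos)
    for i in range(n_steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = acceleration(pos)
        vel += 0.5 * dt * acc
        traj[i] = pos
    return traj

# A comet scattered onto a wide, eccentric orbit a few thousand AU from the Sun.
trajectory = integrate(np.array([3_000.0, 0.0, 1_000.0]), np.array([0.0, 0.05, 0.01]))
```
    Run over long enough spans with millions of comets, it is this competition between the Sun’s inward pull and the galaxy’s tide that the researchers credit with twisting part of the cloud into a spiral.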

    In each simulation, the spiral persisted.

    “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’” 

    An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system.

    As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system.

    “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
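
    The mileage figure follows directly from the quoted length; a quick conversion, taking 1 astronomical unit as roughly 93 million miles:
```python
# Converting the paper's quoted spiral length from astronomical units to miles.
MILES_PER_AU = 9.296e7                                   # 1 AU is about 92.96 million miles
spiral_length_au = 15_000
print(f"{spiral_length_au * MILES_PER_AU:.2e} miles")    # ~1.39e12, i.e. roughly 1.4 trillion miles
```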

    “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.”

    It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”

    The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems.

    Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

    In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, the two systems will swap icy comets, flinging some out on new paths.

    Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show.

    “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tilt of the solar system to the Milky Way is about 60 degrees. And I said, ‘what are we gonna do when we get outside the Milky Way?’”

    Then Neil deGrasse Tyson “goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.’”

    “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds.

    The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way, in collaboration with the National Center for Supercomputing Applications. “That was it,” he says, “and that seemed fantastical.”

    By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies.

    To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought unprecedented precision to our astronomical map, plotting the distances to 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX.

    The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” 

    The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

    3D visuals aren’t just powerful tools for communicating science, but increasingly crucial for science itself. Software like OpenSpace, an open source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data.

    “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical data in archives that can be reanalyzed in new ways, leading to new discoveries.”

    As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

    Our Oort cloud, a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud.

    “New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent. 

    More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. 

    Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.” 

    The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud. 

    For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud.  Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.”  The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud.  For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park. #how #planetarium #show #discovered #spiral
    WWW.FASTCOMPANY.COM
    How a planetarium show discovered a spiral at the edge of our solar system
    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half-mile-long spiral of material drifting along the edge of our solar system.
    “So this is a really fun thing that happened,” says Jackie Faherty, a senior scientist at the museum. Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune and extends as far as one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.
    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?”
    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data. “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says.
    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars. “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.” She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’”
    While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show (curving, dusty S-shape behind the Sun). [Image: © AMNH]
    More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. 
The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves. In each simulation, the spiral persisted. “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’”
An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system. [Image: NASA]
As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system. “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
“The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.” It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”
The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems. Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden also houses six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”
In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths. [Image: © AMNH]
Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show. “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. 
I was drawing pictures of this just to get our heads around it and noting the tilt of the solar system to the Milky Way is about 60 degrees. And I said, ‘what are we gonna do when we get outside the Milky Way?’” Then Neil deGrasse Tyson, the planetarium’s director, “goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.’”
“And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds. The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way, in collaboration with the National Center for Supercomputing Applications. “That was it,” he says, “and that seemed fantastical.” By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies. To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought an unprecedented precision to our astronomical map, plotting the distances to 1.7 billion stars.
To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX. The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”
3D visuals aren’t just powerful tools for communicating science; they are increasingly crucial for science itself. Software like OpenSpace, an open-source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data. “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical data in archives that can be reanalyzed in new ways, leading to new discoveries.”
As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.
Our Oort cloud (center), a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. 
The more massive the star, the larger its Oort cloud. [Image: © AMNH]
“New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.
More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.”
The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud.
For us, here and now, the 1.4 trillion-mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
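The distances quoted above are easy to sanity-check with a few lines of arithmetic. The short Python sketch below is only a back-of-the-envelope aid, not anything from the show's pipeline or the paper; the Voyager figures rest on assumed round numbers (a cruise speed of roughly 3.6 AU per year, a present distance of about 166 AU, and an inner Oort boundary near 1,000 AU) that the article does not state.

# Back-of-the-envelope check of the figures quoted in the article.
AU_IN_MILES = 92_955_807          # one astronomical unit, in miles
AU_PER_LIGHT_YEAR = 63_241        # astronomical units per light year

# The paper's spiral: roughly 15,000 AU end to end.
spiral_au = 15_000
spiral_miles = spiral_au * AU_IN_MILES
print(f"Spiral length: {spiral_miles:.2e} miles")        # ~1.39e+12, i.e. ~1.4 trillion

# The Oort cloud's outer edge: about 1.5 light years from the Sun.
outer_edge_au = 1.5 * AU_PER_LIGHT_YEAR
print(f"Outer edge: about {outer_edge_au:,.0f} AU")      # ~95,000 AU

# Voyager timing (assumed round numbers, not from the article).
voyager_speed_au_per_year = 3.6   # assumption: roughly Voyager 1's outbound speed
voyager_distance_au = 166         # assumption: approximate distance from the Sun today
inner_boundary_au = 1_000         # assumption: a commonly cited inner-edge figure

years_to_inner = (inner_boundary_au - voyager_distance_au) / voyager_speed_au_per_year
years_to_outer = (outer_edge_au - voyager_distance_au) / voyager_speed_au_per_year
print(f"Years to inner boundary: ~{years_to_inner:,.0f}")  # a couple of centuries
print(f"Years to far side: ~{years_to_outer:,.0f}")        # tens of millennia

Run as-is, it lands on the same order of magnitude as the quoted figures: roughly 1.4 trillion miles for the 15,000 AU spiral, an outer edge near 95,000 AU, a couple of centuries for Voyager to reach the inner boundary, and tens of thousands of years to cross the whole cloud.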
  • Venice Biennale 2025 round-up: what else to see?

    This edition of the Venice Biennale includes 65 national pavilions, 11 collateral events, and over 750 participants in the international exhibition curated by Italian architect and engineer Carlo Ratti.
    Entitled Intelligens: Natural Artificial Collective, its stated aim is to make Venice a ‘living laboratory’. But Ratti’s exhibition in the Arsenale has been hit by mixed reviews. The AJ’s Rob Wilson described it as ‘a bit of a confusing mess’, while other media outlets have called the robot-heavy exhibit of future-facing building-focused solutions to the climate crisis a ‘tech-bro fever dream’ and a ‘mind-boggling rollercoaster’ to mention a few.
    It is a distinct shift away from the biennale of two years ago, when Ghanaian-Scottish architect Lesley Lokko curated the main exhibitions, including 89 participants – of which more than half were from Africa or the African diaspora – in a convincing reset of the architectural conversation.

    This year’s National Pavilions and collateral exhibits, by contrast, have tackled the largest themes in architecture and the world right now in a less constrained way than the main exhibitions. The exhibits are radical and work as a useful gauge for understanding what’s important in each country: decarbonisation, climate resilience, the reconstruction of Gaza, and an issue more prevalent in politics closer to home: gender wars.
    What's not to miss in the Giardini?
    UK Pavilion
    The British Pavilion this year, which won a special mention from the Venetian jury, is housing a show by a British-Kenyan collab titled GBR – Geology of Britannic Repair. In it, the curators explore the links between colonialism, the built environment and geological extraction.
    Focusing on the Rift Valley, which runs from east Africa to the Middle East, including Palestine, the exhibition was curated by the Nairobi-based studio cave_bureau, UK-based curator, writer and Farrell Centre director Owen Hopkins and Queen Mary University professor Kathryn Yusoff.
    The pavilion’s façade is cloaked by a beaded veil of agricultural waste briquettes and clay and glass beads, produced in Kenya and India, echoing both Maasai practices and beads once made on Venice’s Murano, as currency for the exchange of metals, minerals and slaves.
    The pavilion’s six gallery spaces include multisensory installations such as the Earth Compass, a series of celestial maps connecting London and Nairobi; the Rift Room, tracing one of humans’ earliest migration routes; and the Shimoni Slave Cave, featuring a large-scale bronze cast of a valley cave historically used as a holding pen for enslaved people.

    The show also includes Objects of Repair, a project by design-led research group Palestine Regeneration Team, looking at how salvaged materials could help rebuild war-torn Gaza. It is the only exhibit anywhere in the Biennale that tackles the reconstruction of Gaza face-on – doing so impressively, both politically and sensitively.
    Denmark Pavilion
    A firm favourite for most this year, the Danish exhibition Build of Site, curated by Søren Pihlmann of Pihlmann Architects, transforms the pavilion, which requires renovation anyway, into both a renovation site and an archive of materials.
    Clever, simple and very methodical, the building is being renewed while at the same time showcasing innovative methods to reuse surplus materials uncovered during the construction process – as an alternative to using new resources to build a temporary exhibition.
    The renovation of the 1950s Peter Koch-designed section of the pavilion began in December 2024 and will be completed following the biennale, having been suspended for its duration. On display are archetypal elements including podiums, ramps, benches and tables – all constructed from the surplus materials unearthed during the renovation, such as wood, limestone, concrete, stone, sand, silt and clay.
    Belgium Pavilion
    If you need a relaxing break from the intensity of the biennale, then the oldest national pavilion in the Giardini is the one for you. Belgium’s Building Biospheres: A New Alliance between Nature and Architecture brings ‘plant intelligence’ to the fore.
    Commissioned by the Flanders Architecture Institute and curated by landscape architect Bas Smets and neurobiologist Stefano Mancuso, the exhibit investigates how the natural ‘intelligence’ of plants can be used to produce an indoor climate – elevating the role of landscape design and calling for it to no longer serve as a backdrop for architecture.
    Inside, more than 200 plants occupy the central area beneath the skylight, becoming the pavilion’s centrepiece, with the rear space visualising ‘real-time’ data on the prototype’s climate control performance.
    Spain Pavilion
    One for the pure architecture lovers out there, models, installations, photographs and timber structures fill the Spanish Pavilion in abundance. Neatly curated by architects Roi Salgueiro Barrio and Manuel Bouzas Barcala, Internalities shows a series of existing and research projects that have contributed to decarbonising construction in Spain.
    The outcome? An extensive collection of work exploring the use of very local and very specific regenerative and low-carbon construction and materials – including stone, wood and soil. The joy of this pavilion comes from the 16 beautiful timber frames constructed from wood from communal forests in Galicia.
    Poland Pavilion
    Poland’s pavilion was like Marmite this year. Some loved its playful approach while others found it silly. Lares and Penates, taking its name from ancient Roman deities of protection, has been curated by Aleksandra Kędziorek and looks at what it means and takes to have a sense of security in architecture.
    Speaking to many different anxieties, it refers to the unspoken assumption of treating architecture as a safe haven against the elements, catastrophes and wars – showcasing and elevating the mundane solutions and signage derived from building, fire and health regulations. The highlight? An ornate niche decorated with tiles and stones just for … a fire extinguisher.
    Netherlands Pavilion
    Punchy and straight to the point, SIDELINED: A Space to Rethink Togetherness takes sports as a lens for looking at how spatial design can both reveal and disrupt the often-exclusionary dynamics of everyday environments. Within the pavilion, the exhibit looks beyond the large-scale arena of the stadium and gymnasium to investigate the more localised and intimate context of the sports bar, as well as three alternative sports – a site of both social production and identity formation – as a metaphor for uniting diverse communities.
    The pavilion-turned-sports bar, designed by Koos Breen and Jeannette Slütter and inspired by Asger Jorn’s three-sided sports field, is a space for fluidity and experimentation where binary oppositions, social hierarchies and cultural values are contested and reshaped – complete with jerseys and football scarves worn by players in the alternative Anonymous Allyship lining the walls. Read Derin Fadina’s review for the AJ here.
    Nordic Countries Pavilion
    Probably the most impactful national pavilion this year, the Nordic Countries have presented an installation with performance work. Curated by Kaisa Karvinen, Industry Muscle: Five Scores for Architecture continues Finnish artist Teo Ala-Ruona’s work on trans embodiment and ecology by considering the trans body as a lens through which to examine modern architecture and the built environment.
    The three-day exhibition opening featured a two-hour performance each day with Ala-Ruona and his troupe crawling, climbing and writhing around the space, creating a bodily dialogue with the installations and pavilion building itself, which was designed by celebrated Modernist architect Sverre Fehn.
    The American pavilion next door loudly turns its back on what’s going on in its own country by just celebrating the apathetic porch, making the Nordic Countries seem even more relevant in this crucial time. Read Derin Fadina’s review for the AJ here.
    Germany Pavilion
    An exhibit certainly grabbing the issue of climate change by its neck is the German contribution, Stresstest. Curated by Nicola Borgmann, Elisabeth Endres, Gabriele G Kiefer and Daniele Santucci, the pavilion has turned climate change into a literal physical and psychological experience for visitors by creating contrasting ‘stress’ and ‘de-stress’ rooms.
    In the dark stress room, a large metal sculpture creates a cramped and hot space using heating mats hung from the ceiling and powered by PVs. Opposite is a calmer space demonstrating strategies that could be used to reduce the heat of cities, and between the two spaces is a film focusing on the impacts of cities becoming hotter. If this doesn’t highlight the urgency of the situation, I’m not sure what will.
    Best bits of the Arsenale outside the main exhibitions
    Bahrain Pavilion
    Overall winner of this year’s Golden Lion for best national participation, Bahrain’s pavilion in the historic Artiglierie of the Arsenale is a proposal for living and working through heat conditions. Heatwave, curated by architect Andrea Faraguna, reimagines public space design by exploring passive cooling strategies rooted in the Arab country’s climate, as well as cultural context.
    A geothermal well and solar chimney are connected through a thermo-hygrometric axis that links underground conditions with the air outside. The inhabitable space that hosts visitors is thus compressed and defined by its earth-covered floor and suspended ceiling, and is surrounded by memorable sandbags, highlighting its scalability for particularly hot construction sites in the Gulf where a huge amount of construction is taking place.
    In the Arsenale’s exhibition space, where excavation wasn’t feasible, this system has been adapted into mechanical ventilation, bringing in air from the canal side and channelling it through ductwork to create a microclimate.
    Slovenia Pavilion
    The AJ’s Rob Wilson’s top pavilion tip this year provides an enjoyable take on the theme of the main exhibition, highlighting how the tacit knowledge and on-site techniques and skills of construction workers and craftspeople are still the key constituent in architectural production despite all the heat and light about robotics, prefabrication, artificial intelligence and 3D printing.
    Master Builders, curated by Ana Kosi and Ognen Arsov and organised by the Museum of Architecture and Design in Ljubljana, presents a series of ‘totems’ – accumulative sculpture-like structures formed of conglomerations of differently worked materials, finishes and building elements. These are stacked up into crazy tower forms, which showcase various on-site construction skills and techniques, their construction documented in accompanying films.
    Uzbekistan Pavilion
    Uzbekistan’s contribution explores the Soviet-era solar furnace and Modernist legacy. Architecture studio GRACE, led by curators Ekaterina Golovatyuk and Giacomo Cantoni, has curated A Matter of Radiance. The focus is the Sun Institute of Material Science – originally known as the Sun Heliocomplex – an incredible large-scale scientific structure built in 1987 on a natural, seismic-free foundation near Tashkent and one of only two that study material behaviour under extreme temperatures. The exhibition examines the site’s historical and contemporary significance while reflecting on its scientific legacy and influence beyond national borders.
    V&A Applied Arts Pavilion
    Diller Scofidio + Renfro is having a moment. The US-based practice, in collaboration with V&A chief curator Brendan Cormier, has curated On Storage, which aptly explores global storage architectures in a pavilion that strongly links to the V&A’s recent opening of Storehouse, its new collections archive in east London.
    Featured is a six-channel film entitled Boxed: The Mild Boredom of Order, directed by the practice itself and following a toothbrush, as a metaphor for an everyday consumer product, on its journey through different forms of storage across the globe – from warehouse to distribution centre to baggage handlers down to the compact space of a suitcase.
    Also on display are large-format photographs of V&A East Storehouse, DS+R’s original architectural model and sketchbook and behind-the-scenes photography of Storehouse at work, taken by emerging east London-based photographers.
    Canal Café
    Golden Lion for the best participation in the actual exhibition went to Canal Café, an intervention designed by V&A East Storehouse’s architect DS+R with Natural Systems Utilities, SODAI, Aaron Betsky and Davide Oldani.
    Serving up canal-water espresso, the installation is a demonstration of how Venice itself can be a laboratory to understand how to live on the water in a time of water scarcity. The structure, located on the edge of the Arsenale’s building complex, draws water from its lagoon before filtering it onsite via a hybrid of natural and artificial methods, including a mini wetland with grasses.
    The project was recognised for its persistence, having started almost 20 years ago, just showing how water scarcity, contamination and flooding are still major concerns both globally and, more locally, in the tourist-heavy city of Venice.
    And what else?
    The Holy See
    Much like the Danish Pavilion, the Pavilion of the Holy See is also taking an approach of renewal this year. Over the next six months, Opera Aperta will breathe new life into the Santa Maria Ausiliatrice Complex in the Castello district of Venice. Founded as a hospice for pilgrims in 1171, the building later became the city’s oldest hospital and was converted into a school in the 18th century. In 2001, the City of Venice allocated it for cultural use, and for the next four years it will be managed by the Dicastery for Culture and Education of the Holy See to oversee its restoration.
    Curated by architect, curator and researcher Marina Otero Verzier and artistic director of Fondaco Italia, Giovanna Zabotti, the complex has been turned into a constant ‘living laboratory’ of collective repair – and received a special mention in the biennale awards.
    The restoration works, open from Tuesday to Friday, are being carried out by local artisans and specialised restorers with expertise in recovering stone, marble, terracotta, mural and canvas painting, stucco, wood and metal artworks.
    The beauty, however, lies in the photogenic fabrics, lit by a warm yellow glow, hanging from the walls within, gently wrapping the building’s surfaces, leaving openings that allow movement and offer glimpses of the ongoing restoration. Mobile scaffolding, used to support the works, also doubles up as furniture, providing space for equipment and subdividing the interior.
    Togo Pavilion
    The Republic of Togo has presented its first pavilion ever at the biennale this year with the project Considering Togo’s Architectural Heritage, which sits intriguingly at the back of a second-hand furniture shop. The inaugural pavilion is curated by Lomé and Berlin-based Studio NEiDA and is in Venice’s Squero Castello.
    Exploring Togo’s architectural narratives from the early 20th century, and key ongoing restoration efforts, it documents key examples of the west African country’s heritage, highlighting both traditional and more modern building techniques – from Nôk cave dwellings to Afro-Brazilian architecture developed by freed slaves to post-independence Modernist buildings. Some of the buildings showcased are in disrepair, even though most of the modern structures, including the Hotel de la Paix and the Bourse du Travail, remain in use today – suggestive of a future of repair and celebration.
    Estonia Pavilion
    Another firm favourite this year is the Estonian exhibition on Riva dei Sette Martiri, on the waterfront between Corso Garibaldi and the Giardini. The Guardian’s Oliver Wainwright said that, outside the Giardini, it packed ‘the most powerful punch of all.’
    Simple and effective, Let Me Warm You, curated by a trio of architects, Keiti Lige, Elina Liiva and Helena Männa, asks whether current insulation-driven renovations are merely a ‘checkbox’ to meet European energy targets or ‘a real chance’ to enhance the spatial and social quality of mass housing.
    The façade of the historic Venetian palazzetto in which it is housed is clad with fibre-cement insulation panels in the same process used in Estonia itself for its mass housing – a powerful visual statement showcasing a problematic disregard for the character and potential of typical habitable spaces. Inside, the ground floor is wrapped in plastic and exhibits how the dynamics between different stakeholders influence spatial solutions, including named stickers to encourage discussion among your peers.
    SMAC
    Timed to open to the public at the same time as the biennale, SMAC is a new permanent arts institution in Piazza San Marco, on the second floor of the Procuratie, which is owned by Generali. The exhibition space, open to the public for the first time in 500 years, comprises 16 galleries arranged along a continuous corridor stretching over 80m, recently restored by David Chipperfield Architects.
    Visitors can expect access through a private courtyard leading on to a monumental staircase and experience a typically sensitive Chipperfield restoration, which has revived the building’s original details: walls covered in a light grey Venetian marmorino made from crushed marble and floors of white terrazzo.
    During the summer, its inaugural programme features two solo exhibitions dedicated to Australian modern architect Harry Seidler and Korean landscape designer Jung Youngsun.
    Holcim x Elemental
    Concrete manufacturer Holcim makes an appearance for a third time at Venice, this time partnering with Chilean Pritzker Prize-winning Alejandro Aravena’s practice Elemental – curator of the 2016 biennale – to launch a resilient housing prototype that follows on from the Norman Foster-designed Essential Homes Project.
    The ‘carbon-neutral’ structure incorporates Holcim’s range of low-carbon concrete ECOPact and is on display as part of the Time Space Existence exhibition organised by the European Cultural Centre in their gardens.
    It also applies Holcim’s ‘biochar’ technology for the first time, a concrete mix with 100 per cent recycled aggregates, in a full-scale Basic Services Unit. This follows an incremental design approach, which could entail fast and efficient construction via the provision of only essential housing components, and via self-build.
    The Next Earth
    At Palazzo Diedo’s incredible dedicated Berggruen Arts and Culture space, MIT’s department of architecture and think tank Antikythera have come together to create the exhibition The Next Earth: Computation, Crisis, Cosmology, which questions how philosophy and architecture must and can respond to various planet-wide crises.
    Antikythera’s The Noocene: Computation and Cosmology from Antikythera to AI looks at the evolution of ‘planetary computation’ as an ‘accidental’ megastructure through which systems, from the molecular to atmospheric scales, become both comprehensible and composable. What is actually on display is an architectural scale video monolith and short films on AI, astronomy and artificial life, as well as selected artefacts. MIT’s Climate Work: Un/Worlding the Planet features 37 works-in-progress, each looking at material supply chains, energy expenditure, modes of practice and deep-time perspectives. Take from it what you will.
    The 19th International Venice Architecture Biennale remains open until Sunday, 23 November 2025.
    #venice #biennale #roundup #what #else
    Venice Biennale 2025 round-up: what else to see?
    This edition of the Venice Biennale includes 65 national pavilions, 11 collateral events, and over 750 participants in the international exhibition curated by Italian architect and engineer Carlo Ratti. Entitled Intelligens: Natural Artificial Collective, its stated aim is to make Venice a ‘living laboratory’. But Ratti’s exhibition in the Arsenale has been hit by mixed reviews. The AJ’s Rob Wilson described it as ‘a bit of a confusing mess’, while other media outlets have called the robot-heavy exhibit of future-facing building-focused solutions to the climate crisis a ‘tech-bro fever dream’ and a ‘mind-boggling rollercoaster’ to mention a few. It is a distinct shift away from the biennale of two years ago twhen Ghanaian-Scottish architect Lesley Lokko curated the main exhibitions, including 89 participants – of which more than half were from Africa or the African diaspora – in a convincing reset of the architectural conversation.Advertisement This year’s National Pavilions and collateral exhibits, by contrast, have tackled the largest themes in architecture and the world right now in a less constrained way than the main exhibitions. The exhibits are radical and work as a useful gauge for understanding what’s important in each country: decarbonisation, climate resilience, the reconstruction of Gaza, and an issue more prevalent in politics closer to home: gender wars. What's not to miss in the Giardini? British PavilionUK Pavilion The British Pavilion this year, which won a special mention from the Venetian jury, is housing a show by a British-Kenyan collab titled GBR – Geology of Britannic Repair. In it, the curators explore the links between colonialism, the built environment and geological extraction. Focusing on the Rift Valley, which runs from east Africa to the Middle East, including Palestine, the exhibition was curated by the Nairobi-based studio cave_bureau, UK-based curator, writer and Farrell Centre director Owen Hopkins and Queen Mary University professor Kathryn Yusoff. The pavilion’s façade is cloaked by a beaded veil of agricultural waste briquettes and clay and glass beads, produced in Kenya and India, echoing both Maasai practices and beads once made on Venice’s Murano, as currency for the exchange of metals, minerals and slaves. The pavilion’s six gallery spaces include multisensory installations such as the Earth Compass, a series of celestial maps connecting London and Nairobi; the Rift Room, tracing one of humans’ earliest migration routes; and the Shimoni Slave Cave, featuring a large-scale bronze cast of a valley cave historically used as a holding pen for enslaved people.Advertisement The show also includes Objects of Repair, a project by design-led research group Palestine Regeneration Team, looking at how salvaged materials could help rebuild war-torn Gaza, the only exhibit anywhere in the Biennale that tackled the reconstruction of Gaza face-on – doing so impressively, both politically and sensitively. here. Danish PavilionDemark Pavilion A firm favourite by most this year, the Danish exhibition Build of Site, curated by Søren Pihlmann of Pihlmann Architects, transforms the pavilion, which requires renovation anyway, into both a renovation site and archive of materials. Clever, simple and very methodical, the building is being both renewed while at the same time showcasing innovative methods to reuse surplus materials uncovered during the construction process – as an alternative to using new resources to build a temporary exhibition. 
The renovation of the 1950s Peter Koch-designed section of the pavilion began in December 2024 and will be completed following the biennale, having been suspended for its duration. On display are archetypal elements including podiums, ramps, benches and tables – all constructed from the surplus materials unearthed during the renovation, such as wood, limestone, concrete, stone, sand, silt and clay. Belgian PavilionBelgium Pavilion If you need a relaxing break from the intensity of the biennale, then the oldest national pavilion in the Giardini is the one for you. Belgium’s Building Biospheres: A New Alliance between Nature and Architecture brings ‘plant intelligence’ to the fore. Commissioned by the Flanders Architecture Institute and curated by landscape architect Bas Smets and neurobiologist Stefano Mancuso, the exhibit investigates how the natural ‘intelligence’ of plants can be used to produce an indoor climate – elevating the role of landscape design and calling for it to no longer serve as a backdrop for architecture. Inside, more than 200 plants occupy the central area beneath the skylight, becoming the pavilion’s centrepiece, with the rear space visualising ‘real-time’ data on the prototype’s climate control performance. Spanish PavilionSpain Pavilion One for the pure architecture lovers out there, models, installations, photographs and timber structures fill the Spanish Pavilion in abundance. Neatly curated by architects Roi Salgueiro Barrio and Manuel Bouzas Barcala, Internalities shows a series of existing and research projects that have contributed to decarbonising construction in Spain. The outcome? An extensive collection of work exploring the use of very local and very specific regenerative and low-carbon construction and materials – including stone, wood and soil. The joy of this pavilion comes from the 16 beautiful timber frames constructed from wood from communal forests in Galicia. Polish PavilionPoland Pavilion Poland’s pavilion was like Marmite this year. Some loved its playful approach while others found it silly. Lares and Penates, taking its name from ancient Roman deities of protection, has been curated by Aleksandra Kędziorek and looks at what it means and takes to have a sense of security in architecture. Speaking to many different anxieties, it refers to the unspoken assumption of treating architecture as a safe haven against the elements, catastrophes and wars – showcasing and elevating the mundane solutions and signage derived from building, fire and health regulations. The highlight? An ornate niche decorated with tiles and stones just for … a fire extinguisher. Dutch PavilionNetherlands Pavilion Punchy and straight to the point, SIDELINED: A Space to Rethink Togetherness takes sports as a lens for looking at how spatial design can both reveal and disrupt the often-exclusionary dynamics of everyday environments. Within the pavilion, the exhibit looks beyond the large-scale arena of the stadium and gymnasium to investigate the more localised and intimate context of the sports bar, as well as three alternative sports – a site of both social production and identity formation – as a metaphor for uniting diverse communities. 
The pavilion-turned-sports bar, designed by Koos Breen and Jeannette Slütter and inspired by Asger Jorn’s three-sided sports field, is a space for fluidity and experimentation where binary oppositions, social hierarchies and cultural values are contested and reshaped – complete with jerseys and football scarfsworn by players in the alternative Anonymous Allyship aligning the walls. Read Derin Fadina’s review for the AJ here. Performance inside the Nordic Countries PavilionNordic Countries Pavilion Probably the most impactful national pavilion this year, the Nordic Countries have presented an installation with performance work. Curated by Kaisa Karvinen, Industry Muscle: Five Scores for Architecture continues Finnish artist Teo Ala-Ruona’s work on trans embodiment and ecology by considering the trans body as a lens through which to examine modern architecture and the built environment. The three-day exhibition opening featured a two-hour performance each day with Ala-Ruona and his troupe crawling, climbing and writhing around the space, creating a bodily dialogue with the installations and pavilion building itself, which was designed by celebrated Modernist architect Sverre Fehn. The American pavilion next door, loudlyturns its back on what’s going on in its own country by just celebrating the apathetical porch, making the Nordic Countries seem even more relevant in this crucial time. Read Derin Fadina’s review for the AJ here. German PavilionGermany Pavilion An exhibit certainly grabbing the issue of climate change by its neck is the German contribution, Stresstest. Curated by Nicola Borgmann, Elisabeth Endres, Gabriele G Kiefer and Daniele Santucci, the pavilion has turned climate change into a literal physical and psychological experience for visitors by creating contrasting ‘stress’ and ‘de-stress’ rooms. In the dark stress room, a large metal sculpture creates a cramped and hot space using heating mats hung from the ceiling and powered by PVs. Opposite is a calmer space demonstrating strategies that could be used to reduce the heat of cities, and between the two spaces is a film focusing on the impacts of cities becoming hotter. If this doesn’t highlight the urgency of the situation, I’m not sure what will. Best bits of the Arsenale outside the main exhibitions Bahrain PavilionBahrain Pavilion Overall winner of this year’s Golden Lion for best national participation, Bahrain’s pavilion in the historic Artiglierie of the Arsenale is a proposal for living and working through heat conditions. Heatwave, curated by architect Andrea Faraguna, reimagines public space design by exploring passive cooling strategies rooted in the Arab country’s climate, as well as cultural context. A geothermal well and solar chimney are connected through a thermo-hygrometric axis that links underground conditions with the air outside. The inhabitable space that hosts visitors is thus compressed and defined by its earth-covered floor and suspended ceiling, and is surrounded by memorable sandbags, highlighting its scalability for particularly hot construction sites in the Gulf where a huge amount of construction is taking place. In the Arsenale’s exhibition space, where excavation wasn’t feasible, this system has been adapted into mechanical ventilation, bringing in air from the canal side and channelling it through ductwork to create a microclimate. 
    • Venice Biennale 2025 round-up: what else to see?
    This edition of the Venice Biennale includes 65 national pavilions, 11 collateral events, and over 750 participants in the international exhibition curated by Italian architect and engineer Carlo Ratti. Entitled Intelligens: Natural Artificial Collective, its stated aim is to make Venice a ‘living laboratory’. But Ratti’s exhibition in the Arsenale has been hit by mixed reviews. The AJ’s Rob Wilson described it as ‘a bit of a confusing mess’, while other media outlets have called the robot-heavy exhibit of future-facing, building-focused solutions to the climate crisis a ‘tech-bro fever dream’ and a ‘mind-boggling rollercoaster’, to mention a few. It is a distinct shift away from the biennale of two years ago, when Ghanaian-Scottish architect Lesley Lokko curated the main exhibition, including 89 participants – of which more than half were from Africa or the African diaspora – in a convincing reset of the architectural conversation.

This year’s National Pavilions and collateral exhibits, by contrast, have tackled the largest themes in architecture and the world right now in a less constrained way than the main exhibition. The exhibits are radical and work as a useful gauge for understanding what’s important in each country: decarbonisation, climate resilience, the reconstruction of Gaza, and an issue more prevalent in politics closer to home: gender wars.

What's not to miss in the Giardini?

British Pavilion (photography: Chris Lane) UK Pavilion The British Pavilion this year, which won a special mention from the Venetian jury, is housing a show by a British-Kenyan collaboration titled GBR – Geology of Britannic Repair. In it, the curators explore the links between colonialism, the built environment and geological extraction. Focusing on the Rift Valley, which runs from east Africa to the Middle East, including Palestine, the exhibition was curated by the Nairobi-based studio cave_bureau, UK-based curator, writer and Farrell Centre director Owen Hopkins and Queen Mary University professor Kathryn Yusoff. The pavilion’s façade is cloaked by a beaded veil of agricultural waste briquettes and clay and glass beads, produced in Kenya and India, echoing both Maasai practices and beads once made on Venice’s Murano, as currency for the exchange of metals, minerals and slaves. The pavilion’s six gallery spaces include multisensory installations such as the Earth Compass, a series of celestial maps connecting London and Nairobi; the Rift Room, tracing one of humans’ earliest migration routes; and the Shimoni Slave Cave, featuring a large-scale bronze cast of a valley cave historically used as a holding pen for enslaved people.

The show also includes Objects of Repair, a project by design-led research group Palestine Regeneration Team (PART), looking at how salvaged materials could help rebuild war-torn Gaza – the only exhibit anywhere in the Biennale that tackles the reconstruction of Gaza face-on, doing so impressively, both politically and sensitively. Read more here.

Danish Pavilion (photography: Hampus Berndtson) Denmark Pavilion A firm favourite by most this year, the Danish exhibition Build of Site, curated by Søren Pihlmann of Pihlmann Architects, transforms the pavilion, which requires renovation anyway, into both a renovation site and an archive of materials.
Clever, simple and very methodical, the building is being renewed while at the same time showcasing innovative methods to reuse surplus materials uncovered during the construction process – as an alternative to using new resources to build a temporary exhibition. The renovation of the 1950s Peter Koch-designed section of the pavilion began in December 2024 and will be completed following the biennale, having been suspended for its duration. On display are archetypal elements including podiums, ramps, benches and tables – all constructed from the surplus materials unearthed during the renovation, such as wood, limestone, concrete, stone, sand, silt and clay.

Belgian Pavilion (photography: Michiel De Cleene) Belgium Pavilion If you need a relaxing break from the intensity of the biennale, then the oldest national pavilion in the Giardini is the one for you. Belgium’s Building Biospheres: A New Alliance between Nature and Architecture brings ‘plant intelligence’ to the fore. Commissioned by the Flanders Architecture Institute and curated by landscape architect Bas Smets and neurobiologist Stefano Mancuso, the exhibit investigates how the natural ‘intelligence’ of plants can be used to produce an indoor climate – elevating the role of landscape design and calling for it to no longer serve as a backdrop for architecture. Inside, more than 200 plants occupy the central area beneath the skylight, becoming the pavilion’s centrepiece, with the rear space visualising ‘real-time’ data on the prototype’s climate control performance.

Spanish Pavilion (photography: Luca Capuano) Spain Pavilion One for the pure architecture lovers out there: models (32!), installations, photographs and timber structures fill the Spanish Pavilion in abundance. Neatly curated by architects Roi Salgueiro Barrio and Manuel Bouzas Barcala, Internalities shows a series of existing and research projects that have contributed to decarbonising construction in Spain. The outcome? An extensive collection of work exploring the use of very local and very specific regenerative and low-carbon construction and materials – including stone, wood and soil. The joy of this pavilion comes from the 16 beautiful timber frames constructed from wood from communal forests in Galicia.

Polish Pavilion (photography: Luca Capuano) Poland Pavilion Poland’s pavilion was like Marmite this year. Some loved its playful approach while others found it silly. Lares and Penates, taking its name from ancient Roman deities of protection, has been curated by Aleksandra Kędziorek and looks at what it means and takes to have a sense of security in architecture. Speaking to many different anxieties, it refers to the unspoken assumption of treating architecture as a safe haven against the elements, catastrophes and wars – showcasing and elevating the mundane solutions and signage derived from building, fire and health regulations. The highlight? An ornate niche decorated with tiles and stones just for … a fire extinguisher.

Dutch Pavilion (photography: Cristiano Corte) Netherlands Pavilion Punchy and straight to the point, SIDELINED: A Space to Rethink Togetherness takes sports as a lens for looking at how spatial design can both reveal and disrupt the often-exclusionary dynamics of everyday environments.
Within the pavilion, the exhibit looks beyond the large-scale arena of the stadium and gymnasium to investigate the more localised and intimate context of the sports bar, as well as three alternative sports – a site of both social production and identity formation – as a metaphor for uniting diverse communities. The pavilion-turned-sports bar, designed by Koos Breen and Jeannette Slütter and inspired by Asger Jorn’s three-sided sports field, is a space for fluidity and experimentation where binary oppositions, social hierarchies and cultural values are contested and reshaped – complete with jerseys and football scarves (currently a must-have fashion item) worn by players in the alternative Anonymous Allyship, lining the walls. Read Derin Fadina’s review for the AJ here.

Performance inside the Nordic Countries Pavilion (photography: Venla Helenius) Nordic Countries Pavilion Probably the most impactful national pavilion this year (and with the best tote bag by far), the Nordic Countries have presented an installation with performance work. Curated by Kaisa Karvinen, Industry Muscle: Five Scores for Architecture continues Finnish artist Teo Ala-Ruona’s work on trans embodiment and ecology by considering the trans body as a lens through which to examine modern architecture and the built environment. The three-day exhibition opening featured a two-hour performance each day, with Ala-Ruona and his troupe crawling, climbing and writhing around the space, creating a bodily dialogue with the installations and the pavilion building itself, which was designed by celebrated Modernist architect Sverre Fehn. The American pavilion next door, loudly (country music!), turns its back on what’s going on in its own country by just celebrating the apathetic porch, making the Nordic Countries seem even more relevant in this crucial time. Read Derin Fadina’s review for the AJ here.

German Pavilion (photography: Luca Capuano) Germany Pavilion An exhibit certainly grabbing the issue of climate change by its neck is the German contribution, Stresstest. Curated by Nicola Borgmann, Elisabeth Endres, Gabriele G Kiefer and Daniele Santucci, the pavilion has turned climate change into a literal physical and psychological experience for visitors by creating contrasting ‘stress’ and ‘de-stress’ rooms. In the dark stress room, a large metal sculpture creates a cramped and hot space using heating mats hung from the ceiling and powered by PVs. Opposite is a calmer space demonstrating strategies that could be used to reduce the heat of cities, and between the two spaces is a film focusing on the impacts of cities becoming hotter. If this doesn’t highlight the urgency of the situation, I’m not sure what will.

Best bits of the Arsenale outside the main exhibitions

Bahrain Pavilion (photography: Andrea Avezzù) Bahrain Pavilion Overall winner of this year’s Golden Lion for best national participation, Bahrain’s pavilion in the historic Artiglierie of the Arsenale is a proposal for living and working through heat conditions. Heatwave, curated by architect Andrea Faraguna, reimagines public space design by exploring passive cooling strategies rooted in the Arab country’s climate, as well as cultural context. A geothermal well and solar chimney are connected through a thermo-hygrometric axis that links underground conditions with the air outside.
The inhabitable space that hosts visitors is thus compressed and defined by its earth-covered floor and suspended ceiling, and is surrounded by memorable sandbags, highlighting its scalability for particularly hot construction sites in the Gulf, where a huge amount of construction is taking place. In the Arsenale’s exhibition space, where excavation wasn’t feasible, this system has been adapted into mechanical ventilation, bringing in air from the canal side and channelling it through ductwork to create a microclimate.

Slovenian Pavilion (photography: Andrea Avezzù) Slovenia Pavilion The AJ’s Rob Wilson’s top pavilion tip this year provides an enjoyable take on the theme of the main exhibition, highlighting how the tacit knowledge and on-site techniques and skills of construction workers and craftspeople are still the key constituent in architectural production, despite all the heat and light about robotics, prefabrication, artificial intelligence and 3D printing. Master Builders, curated by Ana Kosi and Ognen Arsov and organised by the Museum of Architecture and Design (MAO) in Ljubljana, presents a series of ‘totems’ – accumulative, sculpture-like structures formed of conglomerations of differently worked materials, finishes and building elements. These are stacked up into crazy tower forms, which showcase various on-site construction skills and techniques, their construction documented in accompanying films.

Uzbekistan Pavilion (photography: Luca Capuano) Uzbekistan Pavilion Uzbekistan’s contribution explores the Soviet-era solar furnace and Modernist legacy. Architecture studio GRACE, led by curators Ekaterina Golovatyuk and Giacomo Cantoni, has curated A Matter of Radiance. The focus is the Sun Institute of Material Science – originally known as the Sun Heliocomplex – an incredible large-scale scientific structure built in 1987 on a natural, seismic-free foundation near Tashkent, and one of only two that study material behaviour under extreme temperatures. The exhibition examines the solar oven site’s historical and contemporary significance while reflecting on its scientific legacy and influence beyond national borders.

Applied Arts Pavilion (photography: Andrea Avezzù) V&A Applied Arts Pavilion Diller Scofidio + Renfro (DS+R) is having a moment. The US-based practice, in collaboration with V&A chief curator Brendan Cormier, has curated On Storage, which aptly explores global storage architectures in a pavilion that strongly links to the V&A’s recent opening of Storehouse, its new (and free) collections archive in east London. Featured is a six-channel (and screen) film entitled Boxed: The Mild Boredom of Order, directed by the practice itself and following a toothbrush, as a metaphor for an everyday consumer product, on its journey through different forms of storage across the globe – from warehouse to distribution centre to baggage handlers down to the compact space of a suitcase. Also on display are large-format photographs of V&A East Storehouse, DS+R’s original architectural model and sketchbook, and behind-the-scenes photography of Storehouse at work, taken by emerging east London-based photographers.

Canal Café (photography: Marco Zorzanello) Canal café The Golden Lion for the best participation in the actual exhibition went to Canal Café, an intervention designed by V&A East Storehouse’s architect DS+R with Natural Systems Utilities, SODAI, Aaron Betsky and Davide Oldani.
Serving up canal-water espresso, the installation is a demonstration of how Venice itself can be a laboratory to understand how to live on the water in a time of water scarcity. The structure, located on the edge of the Arsenale’s building complex, draws water from its lagoon before filtering it onsite via a hybrid of natural and artificial methods, including a mini wetland with grasses. The project was recognised for its persistence, having started almost 20 years ago, showing how water scarcity, contamination and flooding are still major concerns both globally and, more locally, in the tourist-heavy city of Venice.

And what else?

Holy See Pavilion (photography: Andrea Avezzù) The Holy See Much like the Danish Pavilion, the Pavilion of the Holy See is also taking an approach of renewal this year. Over the next six months, Opera Aperta will breathe new life into the Santa Maria Ausiliatrice Complex in the Castello district of Venice. Founded as a hospice for pilgrims in 1171, the building later became the oldest hospital and was converted into a school in the 18th century. In 2001, the City of Venice allocated it for cultural use, and for the next four years it will be managed by the Dicastery for Culture and Education of the Holy See to oversee its restoration. Curated by architect, curator and researcher Marina Otero Verzier and artistic director of Fondaco Italia, Giovanna Zabotti, the complex has been turned into a constant ‘living laboratory’ of collective repair – and received a special mention in the biennale awards. The restoration works, open from Tuesday to Friday, are being carried out by local artisans and specialised restorers with expertise in recovering stone, marble, terracotta, mural and canvas painting, stucco, wood and metal artworks. The beauty, however, lies in the photogenic fabrics, lit by a warm yellow glow, hanging from the walls within, gently wrapping the building’s surfaces, leaving openings that allow movement and offer glimpses of the ongoing restoration. Mobile scaffolding, used to support the works, also doubles up as furniture, providing space for equipment and subdividing the interior.

Togo Pavilion (photography: Andrea Avezzù) Togo Pavilion The Republic of Togo has presented its first-ever pavilion at the biennale this year with the project Considering Togo’s Architectural Heritage, which sits intriguingly at the back of a second-hand furniture shop. The inaugural pavilion is curated by Lomé and Berlin-based Studio NEiDA and is in Venice’s Squero Castello. Exploring Togo’s architectural narratives from the early 20th century, and key ongoing restoration efforts, it documents key examples of the west African country’s heritage, highlighting both traditional and more modern building techniques – from Nôk cave dwellings to Afro-Brazilian architecture developed by freed slaves to post-independence Modernist buildings. Some buildings showcased are in disrepair, despite most of the modern structures remaining in use today, including Hotel de la Paix and the Bourse du Travail – suggestive of a future of repair and celebration.

Estonian Pavilion (photography: Joosep Kivimäe) Estonia Pavilion Another firm favourite this year is the Estonian exhibition on Riva dei Sette Martiri, on the waterfront between Corso Garibaldi and the Giardini.
The Guardian’s Olly Wainwright said that outside the Giardini, it packed ‘the most powerful punch of all.’ Simple and effective, Let Me Warm You, curated by a trio of architects – Keiti Lige, Elina Liiva and Helena Männa – asks whether current insulation-driven renovations are merely a ‘checkbox’ to meet European energy targets or ‘a real chance’ to enhance the spatial and social quality of mass housing. The façade of the historic Venetian palazzetto in which it is housed is clad with fibre-cement insulation panels, using the same process applied to mass housing in Estonia itself – a powerful visual statement showcasing a problematic disregard for the character and potential of typical habitable spaces. Inside, the ground floor is wrapped in plastic and exhibits how the dynamics between different stakeholders influence spatial solutions, including named stickers to encourage discussion among your peers.

Venice Procuratie (photography: Mike Merkenschlager) SMAC (San Marco Art Centre) Timed to open to the public at the same time as the biennale, SMAC is a new permanent arts institution in Piazza San Marco, on the second floor of the Procuratie, which is owned by Generali. The exhibition space, open to the public for the first time in 500 years, comprises 16 galleries arranged along a continuous corridor stretching over 80m, recently restored by David Chipperfield Architects. Visitors can expect access through a private courtyard leading on to a monumental staircase and experience a typically sensitive Chipperfield restoration, which has revived the building’s original details: walls covered in a light grey Venetian marmorino made from crushed marble and floors of white terrazzo. During the summer, its inaugural programme features two solo exhibitions dedicated to Australian modern architect Harry Seidler and Korean landscape designer Jung Youngsun.

Holcim's installation (photography: Celestia Studio) Holcim x Elemental Concrete manufacturer Holcim makes an appearance for a third time at Venice, this time partnering with Chilean Pritzker Prize-winning Alejandro Aravena’s practice Elemental – curator of the 2016 biennale – to launch a resilient housing prototype that follows on from the Norman Foster-designed Essential Homes Project. The ‘carbon-neutral’ structure incorporates Holcim’s range of low-carbon concrete ECOPact and is on display as part of the Time Space Existence exhibition organised by the European Cultural Centre in its gardens. It also applies Holcim’s ‘biochar’ technology for the first time – a concrete mix with 100 per cent recycled aggregates – in a full-scale Basic Services Unit. This follows an incremental design approach, which could entail fast and efficient construction via the provision of only essential housing components, and via self-build.

The Next Earth at Palazzo Diedo (photography: Joan Porcel) The Next Earth At Palazzo Diedo’s incredible dedicated Berggruen Arts and Culture space, MIT’s department of architecture and think tank Antikythera (apparently taking its name from the first-known computer) have come together to create the exhibition The Next Earth: Computation, Crisis, Cosmology, which questions how philosophy and architecture must and can respond to various planet-wide crises. Antikythera’s The Noocene: Computation and Cosmology from Antikythera to AI looks at the evolution of ‘planetary computation’ as an ‘accidental’ megastructure through which systems, from the molecular to atmospheric scales, become both comprehensible and composable.
What is actually on display is an architectural scale video monolith and short films on AI, astronomy and artificial life, as well as selected artefacts. MIT’s Climate Work: Un/Worlding the Planet features 37 works-in-progress, each looking at material supply chains, energy expenditure, modes of practice and deep-time perspectives. Take from it what you will. The 19th International Venice Architecture Biennale remains open until Sunday, 23 November 2025.
  • FROM SET TO PIXELS: CINEMATIC ARTISTS COME TOGETHER TO CREATE POETRY

    By TREVOR HOGG

    Denis Villeneuve finds that the difficulty of working with visual effects is sometimes the intermediaries between him and the artists, and therefore the need to be precise with directions to keep things on track. If post-production has any chance of going smoothly, there must be a solid on-set relationship between the director, cinematographer and visual effects supervisor. “It’s my job to have a vision and to bring it to the screen,” notes Denis Villeneuve, director of Dune: Part Two. “That’s why working with visual effects requires a lot of discipline. It’s not like you work with a keyboard and can change your mind all the time. When I work with a camera, I commit to a mise-en-scène. I’m trying to take the risk, move forward in one direction and enhance it with visual effects. I push it until it looks perfect. It takes a tremendous amount of time and preparation. Paul Lambert is a perfectionist, and I love that about him. We will never put a shot on the screen that we don’t feel has a certain level of quality. It needs to look as real as the face of my actor.”

    A legendary cinematographer had a significant influence on how Villeneuve approaches digital augmentation. “Someone I have learned a lot from about visual effects is Roger Deakins. I remember that at the beginning, when I was doing Blade Runner 2049, some artwork was not defined enough, and I was like, ‘I will correct that later.’ Roger said, ‘No. Don’t do that. You have to make sure right at the start.’ I’ve learned the hard way that you need to be as precise as you can, otherwise it goes in a lot of directions.”

    Motion capture is visually jarring because your eye is always drawn to the performer in the mocap suit, but it worked out well on Better Man because the same thing happens when he gets replaced by a CG monkey. Visual effects enabled the atmospherics on Wolfs to be art directed, which is not always possible with practical snow. One of the most complex musical numbers in Better Man is “Rock DJ,” which required LiDAR scans of Regent Street and doing full 3D motion capture with the dancers dancing down the whole length of the street to work out how best to shoot it. Cinematographer Dan Mindel favors on-set practical effects because the reactions from the cast come across as being more genuine, which was the case for Twisters. Storyboards are an essential part of the planning process. “When I finish a screenplay, the first thing I do is to storyboard, not just to define the visual element of the movie, but also to rewrite the movie through images,” Villeneuve explains. “Those storyboards inform my crew about the design, costumes, accessories and vehicles, and create a visual inner rhythm of the film. This is the first step towards visual effects where there will be a conversation that will start from the boards. That will be translated into previs to help the animators know where we are going because the movie has to be made in a certain timeframe and needs choreography to make sure everybody is moving in the same direction.” The approach towards filmmaking has not changed over the years. “You have a camera and a couple of actors in front of you, and it’s about finding the right angle; the rest is noise. I try to protect the intimacy around the camera as much as possible and focus on that because if you don’t believe the actor, then you won’t believe anything.”

    Before transforming singer Robbie Williams into a CG primate, Michael Gracey started as a visual effects artist. “I feel so fortunate to have come from a visual effects background early on in my career,” recalls Michael Gracey, director of Better Man. “I would sit down and do all the post myself because I didn’t trust anyone to care as much as I did. Fortunately, over the years I’ve met people who do. It’s a huge part of how I even scrapbook ideas together. Early on, I was constantly throwing stuff up in Flame, doing a video test and asking, ‘Is this going to work?’ Jumping into 3D was something I felt comfortable doing. I’ve been able to plan out or previs ideas. It’s an amazing tool to be armed with if you are a director and have big ideas and you’re trying to convey them to a lot of people.” Previs was pivotal in getting Better Man financed. “Off the page, people were like, ‘Is this monkey even going to work?’ Then they were worried that it wouldn’t work in a musical number. We showed them the previs for Feel, the first musical number, and My Way at the end of the film. I would say, ‘If you get any kind of emotion watching these musical numbers, just imagine what it’s going to be like when it’s filmed and is photoreal.’”

    Several shots had to be stitched together to create a ‘oner’ that features numerous costume changes and 500 dancers. “For Rock DJ, we were doing LiDAR scans of Regent Street and full 3D motion capture with the dancers dancing down the whole length of the street to work out all of the transition points and how best to shoot it,” Gracey states. “That process involved Erik Wilson, the Cinematographer; Luke Millar, the Visual Effects Supervisor; Ashley Wallen, the Choreographer; and Patrick Correll, Co-Producer. Patrick would sit on set and, in DaVinci Resolve, take the feed from the camera and check every take against the blueprint that we had already previs.” Motion capture is visually jarring to shoot. “Everything that is in-camera looks perfect, then a guy walks in wearing a mocap suit and your eye zooms onto him. But the truth is, your eye does that the moment you replace him with a monkey as well. It worked out quite well because that idea is true to what it is to be famous. A famous person walks into the room and your eye immediately goes to them.”
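    The take-checking step Gracey describes amounts to comparing each captured take against the timings locked in previs. Purely as an illustration – the shot names, timings and tolerance below are invented, not anything from the Better Man pipeline or the production's actual Resolve setup – that kind of check might be sketched in Python like this:

```python
# Toy version of checking a take against the previs blueprint: do the logged
# transition points land within tolerance of where previs says they should?
# All names, timings and the tolerance here are hypothetical.

PREVIS_TRANSITIONS = {            # transition name -> planned time in seconds
    "costume_change_1": 42.0,
    "camera_handoff": 78.5,
    "costume_change_2": 121.0,
}

def check_take(take_name, logged, tolerance_s=0.5):
    """Compare a take's logged transition times against the previs plan."""
    problems = []
    for name, planned in PREVIS_TRANSITIONS.items():
        actual = logged.get(name)
        if actual is None:
            problems.append(f"{name}: missing from this take")
        elif abs(actual - planned) > tolerance_s:
            problems.append(f"{name}: off by {actual - planned:+.2f}s")
    print(f"{take_name}: {'OK' if not problems else 'CHECK'}")
    for p in problems:
        print(f"  - {p}")
    return not problems

if __name__ == "__main__":
    check_take("rock_dj_take_07", {
        "costume_change_1": 42.3,
        "camera_handoff": 80.1,
        "costume_change_2": 121.1,
    })
```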

    Digital effects have had a significant impact on a particular area of filmmaking. “Physical effects were a much higher art form than it is now, or it was allowed to be then than it is now,” notes Dan Mindel, Cinematographer on Twisters. “People will decline a real pyrotechnic explosion and do a digital one. But you get a much bigger reaction when there’s actual noise and flash.” It is all about collaboration. Mindel explains, “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys, because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world. When we made Twister, it was an analog movie with digital effects, and it worked great. That’s because everyone on set doing the technical work understood both formats, and we were able to use them well.”

    Digital filmmaking has caused a generational gap. “The younger directors don’t think holistically,” Mindel notes. “It’s much more post-driven because they want to manipulate on the Avid or whatever platform it is going to be. What has happened is that the overreaching nature of these tools has left very little to the imagination. A movie that is heavy visual effects is mostly conceptualized on paper using computer-generated graphics and color; that insidiously sneaks into the look and feel of the movie before you know it. You see concept art blasted all over production offices. People could get used to looking at those images, and before you know it, that’s how the movie looks. That’s a very dangerous place to be, not to have the imagination to work around an issue that perhaps doesn’t manifest itself until you’re shooting.” There has to be a sense of purpose. Mindel remarks, “The ability to shoot in a way that doesn’t allow any manipulation in post is the only way to guarantee that there’s just one direction the look can go in. But that could be a little dangerous for some people. Generally, the crowd I’m working with is part of a team, and there’s little thought of taking the movie to a different place than what was shot. I work in the DI with the visual effects supervisor, and we look at our work together so we’re all in agreement that it fits into the movie.”

    “All of the advances in technology are a push for greater control,” notes Larkin Seiple, Cinematographer on Everything Everywhere All at Once. “There are still a lot of things that we do with visual effects that we could do practically, but a lot of times it’s more efficient, or we have more attempts at it later in post, than if we had tried to do it practically. I find today, there’s still a debate about what we do on set and what we do later digitally. Many directors have been trying to do more on set, and the best visual effects supervisors I work with push to do everything in-camera as much as possible to make it as realistic as possible.” Storytelling is about figuring out where to invest your time and effort. Seiple states, “I like the adventure of filmmaking. I prefer to go to a mountain top and shoot some of the scenes, get there and be inspired, as opposed to recreate it. Now, if it’s a five-second cutaway, I don’t want production to go to a mountain top and do that. For car work, we’ll shoot the real streets, figure out the time of day and even light the plates for it. Then, I’ll project those on LED walls with actors in a car on a stage. I love doing that because then I get to control how that looks.”

    Visual effects have freed Fallout Cinematographer Stuart Dryburgh to shoot quicker and in places that in the past would have been deemed imperfect because of power lines, out-of-period buildings or the sky. Visual effects assist in achieving the desired atmospherics. Seiple says, “On Wolfs, we tried to bring in our own snow for every scene. We would shoot one take, the snow would blow left, and the next take would blow right. Janek Sirrs is probably the best visual effects supervisor I’ve worked with, and he was like, ‘Please turn off the snow. It’ll be a nightmare trying to remove the snow from all these shots then add our own snow back for continuity because you can’t have the snow changing direction every other cut.’ Or we’d have to ‘snow’ a street, which would take ages. Janek would say, ‘Let’s put enough snow on the ground to see the lighting on it and where the actors walk. We’ll do the rest of the street later because we have a perfect reference of what it should look like.’” Certain photographic principles have to be carried over into post-production to make shots believable to the eye. Seiple explains, “When you make all these amazing details that should be out of focus sharper, then the image feels like a visual effect because it doesn’t work the way a lens would work.” Familiarity with the visual effects process is an asset in being able to achieve the best result. “I inadvertently come from a lot of visual effect-heavy shoots and shows, so I’m quick to have an opinion about it. Many directors love to reference the way David Fincher uses visual effects because there is such great behind-the-scenes imagery that showcases how they were able to do simple things. Also, I like to shoot tests even on an iPhone to see if this comp will work or if this idea is a good one.”

    Cinematographer Fabian Wagner and VFX Supervisor John Moffatt spent a lot of time in pre-production for Venom: The Last Dance discussing how to bring out the texture of the symbiote through lighting and camera angles. Game of Thrones Director of Photography Fabian Wagner had to make key decisions while prepping and breaking down the script so visual effects had enough time to meet deadline. Twisters was an analog movie with digital effects that worked well because everyone on set doing the technical work understood both formats. For Cinematographer Larkin Seiple, storytelling is about figuring out where to invest your time and effort. Scene from the Netflix series Beef. Cinematographer Larkin Seiple believes that all of the advances in technology are a push for greater control, which occurred on Everything Everywhere All at Once. Nothing beats reality when it comes to realism. “Every project I do I talk more about the real elements to bring into the shoot than the visual effect element because the more practical stuff that you can do on set, the more it will embed the visual effects into the image, and, therefore, they’re more real,” observes Fabian Wagner, Cinematographer on Venom: The Last Dance. “It also depends on the job you’re doing in terms of how real or unreal you want it to be. Game of Thrones was a good example because it was a visual effects-heavy show, but they were keen on pushing the reality of things as much as possible. We were doing interactive lighting and practical on-set things to embed the visual effects. It was successful.” Television has a significantly compressed schedule compared to feature films. “There are fewer times to iterate. You have to be much more precise. On Game of Thrones, we knew that certain decisions had to be made early on while we were still prepping and breaking down the script. Because of their due dates, to be ready in time, they had to start the visual effects process for certain dragon scenes months before we even started shooting.”

    “Like everything else, it’s always about communication,” Wagner notes. “I’ve been fortunate to work with extremely talented and collaborative visual effects supervisors, visual effects producers and directors. I have become friends with most of those visual effects departments throughout the shoot, so it’s easy to stay in touch. Even when Venom: The Last Dance was posting, I would be talking to John Moffatt, who was our talented visual effects supervisor. We would exchange emails, text messages or phone calls once a week, and he would send me updates, which we would talk about. If I gave any notes or thoughts, John would listen, and if it were possible to do anything about, he would. In the end, it’s about those personal relationships, and if you have those, that can go a long way.” Wagner has had to deal with dragons, superheroes and symbiotes. “They’re all the same to me! For the symbiote, we had two previous films to see what they had done, where they had succeeded and where we could improve it slightly. While prepping, John and I spent a lot of time talking about how to bring out the texture of the symbiote and help it with the lighting and camera angles. One of the earliest tests was to see what would happen if we backlit or side lit it as well as trying different textures for reflections. We came up with something we all were happy with, and that’s what we did on set. It was down to trying to speak the same language and aiming for the same thing, which in this case was, ‘How could we make the symbiote look the coolest?’”

    Visual effects has become a crucial department throughout the filmmaking process. “The relationship with the visual effects supervisor is new,” states Stuart Dryburgh, Cinematographer on Fallout. “We didn’t really have that. On The Piano, the extent of the visual effects was having somebody scribbling in a lightning strike over a stormy sky and a little flash of an animated puppet. Runaway Bride had a two-camera setup where one of the cameras pushed into the frame, and that was digitally removed, but we weren’t using it the way we’re using it now. For East of Eden, we’re recreating 19th and early 20th century Connecticut, Boston and Salinas, California in New Zealand. While we have some great sets built and historical buildings that we can use, there is a lot of set extension and modification, and some complete bluescreen scenes, which allow us to more realistically portray a historical environment than we could have done back in the day.” The presence of a visual effects supervisor simplified principal photography. Dryburgh adds, “In many ways, using visual effects frees you to shoot quicker and in places that might otherwise be deemed imperfect because of one little thing, whether it’s power lines or out-of-period buildings or sky. All of those can be easily fixed. Most of us have been doing it for long enough that we have a good idea of what can and can’t be done and how it’s done so that the visual effects supervisor isn’t the arbiter.”

    Lighting cannot be arbitrarily altered in post as it never looks right. “Whether you set the lighting on the set and the background artist has to match that, or you have an existing background and you, as a DP, have to match that – that is the lighting trick to the whole thing,” Dryburgh observes. “Everything has to be the same, a soft or hard light, the direction and color. Those things all need to line up in a composited shot; that is crucial.” Every director has his or her own approach to filmmaking. “Harold Ramis told me, ‘I’ll deal with the acting and the words. You just make it look nice, alright?’ That’s the conversation we had about shots, and it worked out well. Garth Davis, who I’m working with now, is a terrific photographer in his own right and has a great visual sense, so he’s much more involved in anything visual, whether it be the designs of the sets, creation of the visual effects, my lighting or choice of lenses. It becomes much more collaborative. And that applies to the visual effects department as well.” Recreating vintage lenses digitally is an important part of the visual aesthetic. “As digital photography has become crisper, better and sharper, people have chosen to use fewer perfect optics, such as lenses that are softer on the edges or give a flare characteristic. Before production, we have the camera department shoot all of these lens grids of different packages and ranges, and visual effects takes that information so they can model every lens. If they’re doing a fully CG background, they can apply that lens characteristic,” remarks Dryburgh.
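    The lens-grid workflow Dryburgh outlines boils down to fitting a per-lens profile – distortion, edge softness, light falloff – and then resampling clean CG frames through it. The sketch below is a minimal stand-in for the ‘apply’ half only, assuming a simple Brown-Conrady radial distortion term plus a radial vignette; the coefficients are placeholders, not values fitted from any real lens grid:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def apply_lens_character(img, k1=-0.12, k2=0.02, vignette=0.35):
    """Resample a clean (float, HxWxC) CG frame through a toy lens profile:
    Brown-Conrady radial distortion (k1, k2) plus a simple radial vignette.
    Coefficients are placeholders, not values solved from a real lens grid."""
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    y, x = np.mgrid[0:h, 0:w].astype(np.float64)
    # Normalised coordinates centred on the optical axis.
    xn, yn = (x - cx) / w, (y - cy) / w
    r2 = xn ** 2 + yn ** 2
    # Inverse mapping: where in the ideal image does each output pixel sample from?
    scale = 1.0 + k1 * r2 + k2 * r2 ** 2
    xs, ys = xn * scale * w + cx, yn * scale * w + cy
    warped = np.stack(
        [map_coordinates(img[..., c], [ys, xs], order=1, mode='nearest')
         for c in range(img.shape[2])], axis=-1)
    # Radial light falloff toward the frame edges.
    falloff = 1.0 - vignette * (r2 / r2.max())
    return warped * falloff[..., None]

if __name__ == "__main__":
    frame = np.random.rand(540, 960, 3)          # stand-in for a rendered CG frame
    print(apply_lens_character(frame).shape)     # (540, 960, 3)
```

    In a real pipeline the coefficients would be solved from the shot lens-grid charts rather than guessed, and the same profile typically drives both undistorting plates for tracking and re-applying the lens character to rendered elements.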

    Television schedules for productions like House of the Dragon do not allow a lot of time to iterate, so decisions have to be precise. Bluescreen and stunt doubles on Twisters. “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world.”
    —Dan Mindel, Cinematographer, Twisters

    Cinematographers like Greig Fraser have adopted Unreal Engine. “Greig has an incredible curiosity about new technology, and that helped us specifically with Dune: Part Two,” Villeneuve explains. “Greig was using Unreal Engine to capture natural environments. For example, if we decide to shoot in that specific rocky area, we’ll capture the whole area with drones to recreate the terrain in the computer. If I said, ‘I want to shoot in that valley on November 3rd and have the sun behind the actors. At what time is it? You have to be there at 9:45 am.’ We built the whole schedule like a puzzle to maximize the power of natural light, but that came through those studies, which were made with the software usually used for video games.” Technology is essentially a tool that keeps evolving. Villeneuve adds, “Sometimes, I don’t know if I feel like a dinosaur or if my last movie will be done in this house behind the computer alone. It would be much less tiring to do that, but seriously, the beauty of cinema is the idea of bringing many artists together to create poetry.”
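    The scheduling puzzle Villeneuve describes – be in that valley with the sun behind the actors, so at what time? – ultimately rests on plain solar geometry once the terrain has been captured. Purely as an illustration (a standard low-precision solar-position approximation with placeholder coordinates, not anything from the Dune: Part Two pipeline), the underlying lookup can be sketched like this:

```python
import math
from datetime import datetime, timedelta, timezone

def sun_position(lat_deg, lon_deg, when_utc):
    """Approximate solar azimuth/elevation in degrees for a site and UTC time
    (low-precision Astronomical Almanac-style formulas, fine for scheduling)."""
    # Days since the J2000.0 epoch (2000-01-01 12:00 UTC), fractional.
    n = (when_utc - datetime(2000, 1, 1, 12, tzinfo=timezone.utc)).total_seconds() / 86400.0
    L = math.radians((280.460 + 0.9856474 * n) % 360.0)        # mean longitude
    g = math.radians((357.528 + 0.9856003 * n) % 360.0)        # mean anomaly
    lam = L + math.radians(1.915 * math.sin(g) + 0.020 * math.sin(2.0 * g))  # ecliptic longitude
    eps = math.radians(23.439 - 4e-7 * n)                      # obliquity of the ecliptic
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))   # right ascension
    dec = math.asin(math.sin(eps) * math.sin(lam))                  # declination
    gmst = (18.697374558 + 24.06570982441908 * n) % 24.0            # sidereal time, hours
    hour_angle = math.radians(((gmst + lon_deg / 15.0) * 15.0 - math.degrees(ra)) % 360.0)
    lat = math.radians(lat_deg)
    elevation = math.asin(math.sin(lat) * math.sin(dec) +
                          math.cos(lat) * math.cos(dec) * math.cos(hour_angle))
    azimuth = math.atan2(-math.sin(hour_angle),
                         math.cos(lat) * math.tan(dec) - math.sin(lat) * math.cos(hour_angle))
    return math.degrees(azimuth) % 360.0, math.degrees(elevation)

if __name__ == "__main__":
    site_lat, site_lon = 29.58, 35.42            # placeholder desert location
    day = datetime(2025, 11, 3, tzinfo=timezone.utc)
    for minutes in range(5 * 60, 15 * 60, 30):   # sweep 05:00-15:00 UTC
        t = day + timedelta(minutes=minutes)
        az, el = sun_position(site_lat, site_lon, t)
        if el > 0:                               # daylight hours only
            print(f"{t:%H:%M} UTC  azimuth {az:6.1f}  elevation {el:5.1f}")
```

    Sweeping a function like this across a shoot day gives the azimuth and elevation table from which a natural-light schedule of the kind Villeneuve describes could be assembled; in practice the captured terrain would also be used to account for when the sun drops behind the surrounding rock.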
    #set #pixels #cinematic #artists #come
    FROM SET TO PIXELS: CINEMATIC ARTISTS COME TOGETHER TO CREATE POETRY
    By TREVOR HOGG Denis Villeneuvefinds the difficulty of working with visual effects are sometimes the intermediaries between him and the artists and therefore the need to be precise with directions to keep things on track.If post-production has any chance of going smoothly, there must be a solid on-set relationship between the director, cinematographer and visual effects supervisor. “It’s my job to have a vision and to bring it to the screen,” notes Denis Villeneuve, director of Dune: Part Two. “That’s why working with visual effects requires a lot of discipline. It’s not like you work with a keyboard and can change your mind all the time. When I work with a camera, I commit to a mise-en-scène. I’m trying to take the risk, move forward in one direction and enhance it with visual effects. I push it until it looks perfect. It takes a tremendous amount of time and preparation.Paul Lambert is a perfectionist, and I love that about him. We will never put a shot on the screen that we don’t feel has a certain level of quality. It needs to look as real as the face of my actor.” A legendary cinematographer had a significant influence on how Villeneuve approaches digital augmentation. “Someone I have learned a lot from about visual effects isRoger Deakins. I remember that at the beginning, when I was doing Blade Runner 2049, some artwork was not defined enough, and I was like, ‘I will correct that later.’ Roger said, ‘No. Don’t do that. You have to make sure right at the start.’ I’ve learned the hard way that you need to be as precise as you can, otherwise it goes in a lot of directions.” Motion capture is visually jarring because your eye is always drawn to the performer in the mocap suit, but it worked out well on Better Man because the same thing happens when he gets replaced by a CG monkey.Visual effects enabled the atmospherics on Wolfs to be art directed, which is not always possible with practical snow.One of the most complex musical numbers in Better Man is “Rock DJ,” which required LiDAR scans of Regent Street and doing full 3D motion capture with the dancers dancing down the whole length of the street to work out how best to shoot it.Cinematographer Dan Mindel favors on-set practical effects because the reactions from the cast come across as being more genuine, which was the case for Twisters.Storyboards are an essential part of the planning process. “When I finish a screenplay, the first thing I do is to storyboard, not just to define the visual element of the movie, but also to rewrite the movie through images,” Villeneuve explains. “Those storyboards inform my crew about the design, costumes, accessories and vehicles, andcreate a visual inner rhythm of the film. This is the first step towards visual effects where there will be a conversation that will start from the boards. That will be translated into previs to help the animators know where we are going because the movie has to be made in a certain timeframe and needs choreography to make sure everybody is moving in the same direction.” The approach towards filmmaking has not changed over the years. “You have a camera and a couple of actors in front of you, and it’s about finding the right angle; the rest is noise. I try to protect the intimacy around the camera as much as possible and focus on that because if you don’t believe the actor, then you won’t believe anything.” Before transforming singer Robbie Williams into a CG primate, Michael Gracey started as a visual effects artist. 
“I feel so fortu- nate to have come from a visual effects background early on in my career,” recalls Michael Gracey, director of Better Man. “I would sit down and do all the post myself because I didn’t trust anyone to care as much as I did. Fortunately, over the years I’ve met people who do. It’s a huge part of how I even scrapbook ideas together. Early on, I was constantly throwing stuff up in Flame, doing a video test and asking, ‘Is this going to work?’ Jumping into 3D was something I felt comfortable doing. I’ve been able to plan out or previs ideas. It’s an amazing tool to be armed with if you are a director and have big ideas and you’re trying to convey them to a lot of people.” Previs was pivotal in getting Better Man financed. “Off the page, people were like, ‘Is this monkey even going to work?’ Then they were worried that it wouldn’t work in a musical number. We showed them the previs for Feel, the first musical number, and My Way at the end of the film. I would say, ‘If you get any kind of emotion watching these musical numbers, just imagine what it’s going to be like when it’s filmed and is photoreal.” Several shots had to be stitched together to create a ‘oner’ that features numerous costume changes and 500 dancers. “For Rock DJ, we were doing LiDAR scans of Regent Street and full 3D motion capture with the dancers dancing down the whole length of the street to work out all of the transition points and how best to shoot it,” Gracey states. “That process involved Erik Wilson, the Cinematographer; Luke Millar, the Visual Effects Supervisor; Ashley Wallen, the Choreographer; and Patrick Correll, Co-Producer. Patrick would sit on set and, in DaVinci Resolve, take the feed from the camera and check every take against the blueprint that we had already previs.” Motion capture is visually jarring to shoot. “Everything that is in-camera looks perfect, then a guy walks in wearing a mocap suit and your eye zooms onto him. But the truth is, your eye does that the moment you replace him with a monkey as well. It worked out quite well because that idea is true to what it is to be famous. A famous person walks into the room and your eye immediately goes to them.” Digital effects have had a significant impact on a particular area of filmmaking. “Physical effects were a much higher art form than it is now, or it was allowed to be then than it is now,” notes Dan Mindel, Cinematographer on Twisters. “People will decline a real pyrotechnic explosion and do a digital one. But you get a much bigger reaction when there’s actual noise and flash.” It is all about collaboration. Mindel explains, “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys, because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world. When we made Twister, it was an analog movie with digital effects, and it worked great. That’s because everyone on set doing the technical work understood both formats, and we were able to use them well.” Digital filmmaking has caused a generational gap. “The younger directors don’t think holistically,” Mindel notes. 
“It’s much more post-driven because they want to manipulate on the Avid or whatever platform it is going to be. What has happened is that the overreaching nature of these tools has left very little to the imagination. A movie that is heavy visual effects is mostly conceptualized on paper using computer-generated graphics and color; that insidiously sneaks into the look and feel of the movie before you know it. You see concept art blasted all over production offices. People could get used to looking at those images, and before you know it, that’s how the movie looks. That’s a very dangerous place to be, not to have the imagination to work around an issue that perhaps doesn’t manifest itself until you’re shooting.” There has to be a sense of purpose. Mindel remarks, “The ability to shoot in a way that doesn’t allow any manipulation in post is the only way to guarantee that there’s just one direction the look can go in. But that could be a little dangerous for some people. Generally, the crowd I’m working with is part of a team, and there’s little thought of taking the movie to a different place than what was shot. I work in the DI with the visual effects supervisor, and we look at our work together so we’re all in agreement that it fits into the movie.” “All of the advances in technology are a push for greater control,” notes Larkin Seiple, Cinematographer on Everything Everywhere All at Once. “There are still a lot of things that we do with visual effects that we could do practically, but a lot of times it’s more efficient, or we have more attempts at it later in post, than if we had tried to do it practically. I find today, there’s still a debate about what we do on set and what we do later digitally. Many directors have been trying to do more on set, and the best visual effects supervisors I work with push to do everything in-camera as much as possible to make it as realistic as possible.” Storytelling is about figuring out where to invest your time and effort. Seiple states, “I like the adventure of filmmaking. I prefer to go to a mountain top and shoot some of the scenes, get there and be inspired, as opposed to recreate it. Now, if it’s a five-second cutaway, I don’t want production to go to a mountain top and do that. For car work, we’ll shoot the real streets, figure out the time of day and even light the plates for it. Then, I’ll project those on LED walls with actors in a car on a stage. I love doing that because then I get to control how that looks.” Visual effects have freed Fallout Cinematographer Stuart Dryburgh to shoot quicker and in places that in the past would have been deemed imperfect because of power lines, out-of-period buildings or the sky.Visual effects assist in achieving the desired atmospherics. Seiple says, “On Wolfs, we tried to bring in our own snow for every scene. We would shoot one take, the snow would blow left, and the next take would blow right. Janek Sirrs is probably the best visual effects supervisor I’ve worked with, and he was like, ‘Please turn off the snow. It’ll be a nightmare trying to remove the snow from all these shots then add our own snow back for continuity because you can’t have the snow changing direction every other cut.’ Or we’d have to ‘snow’ a street, which would take ages. Janek would say, ‘Let’s put enough snow on the ground to see the lighting on it and where the actors walk. 
We’ll do the rest of the street later because we have a perfect reference of what it should look like.” Certain photographic principles have to be carried over into post-production to make shots believable to the eye. Seiple explains, “When you make all these amazing details that should be out of focus sharper, then the image feels like a visual effect because it doesn’t work the way a lens would work.” Familiarity with the visual effects process is an asset in being able to achieve the best result. “I inadvertently come from a lot of visual effect-heavy shoots and shows, so I’m quick to have an opinion about it. Many directors love to reference the way David Fincher uses visual effects because there is such great behind-the-scenes imagery that showcases how they were able to do simple things. Also, I like to shoot tests even on an iPhone to see if this comp will work or if this idea is a good one.” Cinematographer Fabian Wagner and VFX Supervisor John Moffatt spent a lot of time in pre-production for Venom: The Last Dance discussing how to bring out the texture of the symbiote through lighting and camera angles.Game of Thrones Director of Photography Fabian Wagner had to make key decisions while prepping and breaking down the script so visual effects had enough time to meet deadline.Twisters was an analog movie with digital effects that worked well because everyone on set doing the technical work understood both formats.For Cinematographer Larkin Seiple, storytelling is about figuring out where to invest your time and effort. Scene from the Netflix series Beef.Cinematographer Larkin Seiple believes that all of the advances in technology are a push for greater control, which occurred on Everything Everywhere All at Once.Nothing beats reality when it comes to realism. “Every project I do I talk more about the real elements to bring into the shoot than the visual effect element because the more practical stuff that you can do on set, the more it will embed the visual effects into the image, and, therefore, they’re more real,” observes Fabian Wagner, Cinematographer on Venom: The Last Dance. “It also depends on the job you’re doing in terms of how real or unreal you want it to be. Game of Thrones was a good example because it was a visual effects-heavy show, but they were keen on pushing the reality of things as much as possible. We were doing interactive lighting and practical on-set things to embed the visual effects. It was successful.” Television has a significantly compressed schedule compared to feature films. “There are fewer times to iterate. You have to be much more precise. On Game of Thrones, we knew that certain decisions had to be made early on while we were still prepping and breaking down the script. Because of their due dates, to be ready in time, they had to start the visual effects process for certain dragon scenes months before we even started shooting.” “Like everything else, it’s always about communication,” Wagner notes. “I’ve been fortunate to work with extremely talented and collaborative visual effects supervisors, visual effects producers and directors. I have become friends with most of those visual effects departments throughout the shoot, so it’s easy to stay in touch. Even when Venom: The Last Dance was posting, I would be talking to John Moffatt, who was our talented visual effects supervisor. We would exchange emails, text messages or phone calls once a week, and he would send me updates, which we would talk about it. 
If I gave any notes or thoughts, John would listen, and if it were possible to do anything about it, he would. In the end, it’s about those personal relationships, and if you have those, that can go a long way.” Wagner has had to deal with dragons, superheroes and symbiotes. “They’re all the same to me! For the symbiote, we had two previous films to see what they had done, where they had succeeded and where we could improve it slightly. While prepping, John and I spent a lot of time talking about how to bring out the texture of the symbiote and help it with the lighting and camera angles. One of the earliest tests was to see what would happen if we backlit or side lit it as well as trying different textures for reflections. We came up with something we all were happy with, and that’s what we did on set. It was down to trying to speak the same language and aiming for the same thing, which in this case was, ‘How could we make the symbiote look the coolest?’” Visual effects has become a crucial department throughout the filmmaking process. “The relationship with the visual effects supervisor is new,” states Stuart Dryburgh, Cinematographer on Fallout. “We didn’t really have that. On The Piano, the extent of the visual effects was having somebody scribbling in a lightning strike over a stormy sky and a little flash of an animated puppet. Runaway Bride had a two-camera setup where one of the cameras pushed into the frame, and that was digitally removed, but we weren’t using it the way we’re using it now. For [the 2026 Netflix limited series] East of Eden, we’re recreating 19th and early 20th century Connecticut, Boston and Salinas, California in New Zealand. While we have some great sets built and historical buildings that we can use, there is a lot of set extension and modification, and some complete bluescreen scenes, which allow us to more realistically portray a historical environment than we could have done back in the day.” The presence of a visual effects supervisor simplified principal photography. Dryburgh adds, “In many ways, using visual effects frees you to shoot quicker and in places that might otherwise be deemed imperfect because of one little thing, whether it’s power lines or out-of-period buildings or sky. All of those can be easily fixed. Most of us have been doing it for long enough that we have a good idea of what can and can’t be done and how it’s done so that the visual effects supervisor isn’t the arbiter.” Lighting cannot be arbitrarily altered in post as it never looks right. “Whether you set the lighting on the set and the background artist has to match that, or you have an existing background and you, as a DP, have to match that – that is the lighting trick to the whole thing,” Dryburgh observes. “Everything has to be the same, a soft or hard light, the direction and color. Those things all need to line up in a composited shot; that is crucial.” Every director has his or her own approach to filmmaking. “Harold Ramis told me, ‘I’ll deal with the acting and the words. You just make it look nice, alright?’ That’s the conversation we had about shots, and it worked out well. [Director] Garth Davis, who I’m working with now, is a terrific photographer in his own right and has a great visual sense, so he’s much more involved in anything visual, whether it be the designs of the sets, creation of the visual effects, my lighting or choice of lenses. It becomes much more collaborative. And that applies to the visual effects department as well.” Recreating vintage lenses digitally is an important part of the visual aesthetic.
“As digital photography has become crisper, better and sharper, people have chosen to use less-perfect optics, such as lenses that are softer on the edges or give a flare characteristic. Before production, we have the camera department shoot all of these lens grids of different packages and ranges, and visual effects takes that information so they can model every lens. If they’re doing a fully CG background, they can apply that lens characteristic,” remarks Dryburgh. Television schedules for productions like House of the Dragon do not allow a lot of time to iterate, so decisions have to be precise. Bluescreen and stunt doubles on Twisters. “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world.” —Dan Mindel, Cinematographer, Twisters Cinematographers like Greig Fraser have adopted Unreal Engine. “Greig has an incredible curiosity about new technology, and that helped us specifically with Dune: Part Two,” Villeneuve explains. “Greig was using Unreal Engine to capture natural environments. For example, if we decide to shoot in that specific rocky area, we’ll capture the whole area with drones to recreate the terrain in the computer. If I said, ‘I want to shoot in that valley on November 3rd and have the sun behind the actors. At what time is it? You have to be there at 9:45 am.’ We built the whole schedule like a puzzle to maximize the power of natural light, but that came through those studies, which were made with the software usually used for video games.” Technology is essentially a tool that keeps evolving. Villeneuve adds, “Sometimes, I don’t know if I feel like a dinosaur or if my last movie will be done in this house behind the computer alone. It would be much less tiring to do that, but seriously, the beauty of cinema is the idea of bringing many artists together to create poetry.”
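The sun-position scheduling Villeneuve describes above boils down to ordinary solar geometry: given a latitude, a shoot date and the direction the camera will face, you can work out when the sun will sit low behind the actors. The sketch below is only a back-of-the-envelope illustration of that idea, not the Unreal Engine workflow the Dune: Part Two team actually used; it relies on a simplified solar-position formula (accurate to roughly a degree), reports local solar time rather than clock time, and every number in it (latitude, date, camera bearing, tolerance) is hypothetical.

import math

def sun_position(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation and azimuth in degrees (azimuth measured clockwise from north)."""
    lat = math.radians(lat_deg)
    # Textbook declination approximation, good to roughly a degree
    decl = math.radians(-23.44 * math.cos(2 * math.pi / 365 * (day_of_year + 10)))
    hour_angle = math.radians(15 * (solar_hour - 12))  # 15 degrees per hour away from solar noon
    sin_el = math.sin(lat) * math.sin(decl) + math.cos(lat) * math.cos(decl) * math.cos(hour_angle)
    elevation = math.asin(max(-1.0, min(1.0, sin_el)))
    cos_az = (math.sin(decl) - math.sin(elevation) * math.sin(lat)) / (math.cos(elevation) * math.cos(lat))
    azimuth = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if hour_angle > 0:  # afternoon: mirror into the western half of the sky
        azimuth = 360.0 - azimuth
    return math.degrees(elevation), azimuth

def backlit_windows(lat_deg, day_of_year, camera_bearing_deg, tolerance_deg=15.0):
    """Scan the day in 5-minute steps for times when the sun is up and roughly behind the actors."""
    target = camera_bearing_deg % 360  # backlight: the sun sits beyond the actors, where the camera points
    windows = []
    for minutes in range(0, 24 * 60, 5):
        elevation, azimuth = sun_position(lat_deg, day_of_year, minutes / 60)
        off_target = min(abs(azimuth - target), 360 - abs(azimuth - target))
        if elevation > 0 and off_target <= tolerance_deg:
            windows.append((minutes, elevation, azimuth))
    return windows

# Hypothetical setup: latitude 31 N, early November (day 307), camera facing west-southwest (240 degrees),
# so the low afternoon sun ends up behind the actors.
for minutes, elevation, azimuth in backlit_windows(31.0, 307, 240.0):
    h, m = divmod(minutes, 60)
    print(f"{h:02d}:{m:02d} solar time  elevation {elevation:4.1f} deg  azimuth {azimuth:5.1f} deg")

A scheduling team would then convert those solar times to clock time for the actual longitude and time zone, which is the step this sketch deliberately leaves out.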
  • How old are the Dead Sea Scrolls? An AI model can help

    Science & technology | Scrollytelling | Jun 5th 2025. Scientists are using an AI model to estimate the age of ancient handwriting. Sensitive subject. Photograph: Israel Antiquities Authority/Shai Halevi. Ever since the Dead Sea Scrolls were discovered by Bedouin shepherds in the 1940s, debate has raged over their exact age. The scrolls, which contain the earliest surviving copies of books from the Hebrew Bible and other religious texts, mostly written in Aramaic and Hebrew, are thought to have been compiled sometime between 300BC and 200AD. Dating each of the 1,000-odd individual scrolls would help historians understand how literacy spread among ancient Jewish populations and the first Christians, and offer a valuable window into the genesis of the sacred texts. But scholars hoping to do so have had little but their own intuition to rely on. This article appeared in the Science & technology section of The Economist's print edition under the headline “Scrollytelling”, in the June 7th 2025 edition.
  • The hidden time bomb in the tax code that's fueling mass tech layoffs: A decades-old tax rule helped build America's tech economy. A quiet change under Trump helped dismantle it

    For the past two years, it’s been a ghost in the machine of American tech. Between 2022 and today, a little-noticed tweak to the U.S. tax code has quietly rewired the financial logic of how American companies invest in research and development. Outside of CFO and accounting circles, almost no one knew it existed. “I work on these tax write-offs and still hadn’t heard about this,” a chief operating officer at a private-equity-backed tech company told Quartz. “It’s just been so weirdly silent.” Still, the delayed change to a decades-old tax provision — buried deep in the 2017 tax law — has contributed to the loss of hundreds of thousands of high-paying, white-collar jobs. That’s the picture that emerges from a review of corporate filings, public financial data, analysis of timelines, and interviews with industry insiders. One accountant, working in-house at a tech company, described it as a “niche issue with broad impact,” echoing sentiments from venture capital investors also interviewed for this article. Some spoke on condition of anonymity to discuss sensitive political matters. Since the start of 2023, more than half-a-million tech workers have been laid off, according to industry tallies. Headlines have blamed over-hiring during the pandemic and, more recently, AI. But beneath the surface was a hidden accelerant: a change to what’s known as Section 174 that helped gut in-house software and product development teams everywhere from tech giants such as Microsoft (MSFT) and Meta (META) to much smaller, private, direct-to-consumer and other internet-first companies. Now, as a bipartisan effort to repeal the Section 174 change moves through Congress, bigger questions are surfacing: How did a single line in the tax code help trigger a tsunami of mass layoffs? And why did no one see it coming? For almost 70 years, American companies could deduct 100% of qualified research and development spending in the year they incurred the costs. Salaries, software, contractor payments — if it contributed to creating or improving a product, it came off the top of a firm’s taxable income. The deduction was guaranteed by Section 174 of the Internal Revenue Code of 1954, and under the provision, R&D flourished in the U.S. Microsoft was founded in 1975. Apple (AAPL) launched its first computer in 1976. Google (GOOGL) incorporated in 1998. Facebook opened to the general public in 2006. All these companies, now among the most valuable in the world, developed their earliest products — programming tools, hardware, search engines — under a tax system that rewarded building now, not later. The subsequent rise of smartphones, cloud computing, and mobile apps also happened in an America where companies could immediately write off their investments in engineering, infrastructure, and experimentation. It was a baseline assumption — innovation and risk-taking subsidized by the tax code — that shaped how founders operated and how investors made decisions. In turn, tech companies largely built their products in the U.S. Microsoft’s operating systems were coded in Washington state. Apple’s early hardware and software teams were in California. Google’s search engine was born at Stanford and scaled from Mountain View. Facebook’s entire social architecture was developed in Menlo Park. The deduction directly incentivized keeping R&D close to home, rewarding companies for investing in American workers, engineers, and infrastructure. That’s what makes the politics of Section 174 so revealing.
    For all the rhetoric about bringing jobs back and making things in America, the first Trump administration’s major tax bill arguably helped accomplish the opposite.
    When Congress passed the Tax Cuts and Jobs Act (TCJA), the signature legislative achievement of President Donald Trump’s first term, it slashed the corporate tax rate from 35% to 21% — a massive revenue loss on paper for the federal government.
    To make the 2017 bill comply with Senate budget rules, lawmakers needed to offset the cost. So they added future tax hikes that wouldn’t kick in right away, wouldn’t provoke immediate backlash from businesses, and could, in theory, be quietly repealed later.
    The delayed change to Section 174 — from immediate expensing of R&D to mandatory amortization, meaning that companies must spread the deduction out in smaller chunks over five- or even 15-year periods — was that kind of provision. It didn’t start affecting the budget until 2022, but it helped the TCJA appear “deficit neutral” over the 10-year window used for legislative scoring.
    The delay wasn’t a technical necessity. It was a political tactic. Such moves are common in tax legislation. Phase-ins and delayed provisions let lawmakers game how the Congressional Budget Office (CBO) — Congress’ nonpartisan analyst of how bills impact budgets and deficits — scores legislation, pushing costs or revenue losses outside official forecasting windows.
    And so, on schedule in 2022, the change to Section 174 went into effect. Companies filed their 2022 tax returns under the new rules in early 2023. And suddenly, R&D wasn’t a full, immediate write-off anymore. The tax benefits of salaries for engineers, product and project managers, data scientists, and even some user experience and marketing staff — all of which had previously reduced taxable income in year one — now had to be spread out over five- or 15-year periods. To understand the impact, imagine that a personal tax deduction covering 100% of your biggest expense suddenly became a 20% deduction spread out over five years. For cash-strapped companies, especially those not yet profitable, the result was a painful tax bill just as venture funding dried up and interest rates soared.
    Salesforce office buildings in San Francisco. Photo: Jason Henry/Bloomberg (Getty Images)
    It’s no coincidence that Meta announced its “Year of Efficiency” immediately after the Section 174 change took effect. Ditto Microsoft laying off 10,000 employees in January 2023 despite strong earnings, or Google parent Alphabet cutting 12,000 jobs around the same time.
    Amazon (AMZN) also laid off almost 30,000 people, with cuts focused not just on logistics but on Alexa and internal cloud tools — precisely the kinds of projects that would have once qualified as immediately deductible R&D. Salesforce (CRM) eliminated 10% of its staff, or 8,000 people, including entire product teams.
    In public, companies blamed bloat and AI. But inside boardrooms, spreadsheets were telling a quieter story. And MD&A notes — management’s notes on the numbers — buried deep in 10-K filings recorded the change, too. R&D had become more expensive to carry. Headcount, the leading R&D expense across the tech industry, was the easiest thing to cut.
    In its 2023 annual report, Meta described salaries as its single biggest R&D expense. Between the first and second years that the Section 174 change began affecting tax returns, Meta cut its total workforce by almost 25%.
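    To make the amortization arithmetic described above concrete, here is a minimal sketch using invented figures and the article's simplified framing of five-year amortization as a flat 20% deduction per year. The function name and dollar amounts are hypothetical, not drawn from any company's filings.

        # Minimal sketch with invented numbers: how the switch from immediate
        # expensing to five-year amortization (simplified to a straight 20% per
        # year, per the article's framing) shrinks the year-one R&D deduction.

        def first_year_deduction(r_and_d_spend: float, amortize: bool, years: int = 5) -> float:
            """Deduction a company can claim in year one for a given R&D outlay."""
            return r_and_d_spend / years if amortize else r_and_d_spend

        spend = 5_000_000  # hypothetical annual engineering payroll treated as R&D

        old_rules = first_year_deduction(spend, amortize=False)  # full 5,000,000
        new_rules = first_year_deduction(spend, amortize=True)   # only 1,000,000

        print(f"Immediate expensing, year-one deduction: ${old_rules:,.0f}")
        print(f"Five-year amortization, year-one deduction: ${new_rules:,.0f}")
        print(f"Spending newly exposed to tax in year one: ${old_rules - new_rules:,.0f}")

    Every dollar of spending that can no longer be deducted in year one becomes taxable income that year, which is the squeeze the article describes.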
    Over the same period, Microsoft reduced its global headcount by about 7%, with cuts concentrated in product-facing, engineering-heavy roles.
    Smaller companies without the fortress-like balance sheets of Big Tech have arguably been hit even harder. Twilio (TWLO) slashed 22% of its workforce in 2023 alone. Shopify (SHOP) (headquartered in Canada but with much of its R&D team in the U.S.) cut almost 30% of staff in 2022 and 2023. Coinbase (COIN) reduced headcount by 36% across a pair of brutal restructuring waves.
    Since going into effect, the provision has hit at the very heart of America’s economic growth engine: the tech sector.
    By market cap, tech giants dominate the S&P 500, with the “Magnificent 7” alone accounting for more than a third of the index’s total value. Workforce numbers tell a similar story, with tech employing millions of Americans directly and supporting the employment of tens of millions more. As measured by GDP, capital-T tech contributes about 10% of national output.
    It’s not just that tech layoffs were large; it’s that they were massively disproportionate. Across the broader U.S. economy, job cuts hovered in the low single digits across most sectors. But in tech, entire divisions vanished, with a whopping 60% jump in layoffs between 2022 and 2023. Some cuts reflected real inefficiencies — a response to over-hiring during the zero-interest-rate boom. At the same time, many of the roles eliminated were in R&D, product, and engineering, precisely the kinds of functions that had once benefited from generous tax treatment under Section 174.
    Throughout the 2010s, a broad swath of startups, direct-to-consumer brands, and internet-first firms — basically every company you recognize from Instagram or Facebook ads — built their growth models around a kind of engineered break-even.
    The tax code allowed them to spend aggressively on product and engineering, then write it all off as R&D, keeping their taxable income close to zero by design. It worked because taxable income and actual cash flow were often not the same thing under GAAP accounting practices. Basically, as long as spending counted as R&D, companies could report losses to investors while owing almost nothing to the IRS.
    But the Section 174 change broke that model. Once those same expenses had to be spread out, or amortized, over multiple years, the tax shield vanished. Companies that were still burning cash suddenly looked profitable on paper, triggering real tax bills on imaginary gains.
    The logic that once fueled a generation of digital-first growth collapsed overnight.
    So it wasn’t just tech experiencing the effects. From 1954 until 2022, the U.S. tax code had encouraged businesses of all stripes to behave like tech companies. From retail to logistics, healthcare to media, if firms built internal tools, customized a software stack, or invested in business intelligence and data-driven product development, they could expense those costs. The write-off incentivized in-house builds and fast growth well outside the capital-T tech sector. This lines up with OECD research showing that immediate deductions foster innovation more than spread-out ones.
    And American companies ran with that logic. According to government data, U.S. businesses reported about $500 billion in R&D expenditures in 2019 alone, and almost half of that came from industries outside traditional tech.
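    The collapse of the engineered break-even model can be sketched the same way. Assuming a hypothetical company whose revenue roughly matches its R&D payroll, and applying the 21% corporate rate cited above, the same cash-burning business goes from owing almost nothing to owing a seven-figure tax bill purely because of the change in deduction timing. All figures below are invented for illustration.

        # Hypothetical break-even startup: revenue roughly equals R&D spend, so
        # under immediate expensing taxable income is small. Under five-year
        # amortization only 20% of the same spend is deductible in year one, so
        # the company shows taxable "profit" despite unchanged cash flow.

        CORPORATE_RATE = 0.21  # post-TCJA federal corporate rate cited in the article

        def year_one_tax(revenue: float, r_and_d: float, deductible_fraction: float) -> float:
            """Federal tax owed in year one, given how much of the R&D spend is deductible."""
            taxable_income = max(revenue - r_and_d * deductible_fraction, 0.0)
            return taxable_income * CORPORATE_RATE

        revenue, r_and_d = 10_000_000, 9_000_000

        # Old rules: the full 9,000,000 is deductible, so taxable income is 1,000,000.
        print(f"Tax under immediate expensing:    ${year_one_tax(revenue, r_and_d, 1.0):,.0f}")
        # New rules: only 1,800,000 is deductible, so taxable income is 8,200,000.
        print(f"Tax under five-year amortization: ${year_one_tax(revenue, r_and_d, 0.2):,.0f}")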
    The Bureau of Economic Analysis estimates that this sector, the broader digital economy, accounts for another 10% of GDP. Add that to core tech’s contribution, and the Section 174 shift has likely touched at least 20% of the U.S. economy.
    The result? A tax policy aimed at raising short-term revenue effectively hid a time bomb inside the growth engines of thousands of companies. And when it detonated, it kneecapped the incentive for hiring American engineers or investing in American-made tech and digital products.
    It made building tech companies in America look irrational on a spreadsheet.
    A bipartisan group of lawmakers is pushing to repeal the Section 174 change, with business groups, CFOs, crypto executives, and venture capitalists lobbying hard for retroactive relief. But the politics are messy. Fixing 174 would mean handing a tax break to the same companies many voters in both parties see as symbols of corporate excess. Any repeal would also come too late for the hundreds of thousands of workers already laid off.
    And of course, the losses don’t stop at Meta’s or Google’s campus gates. They ripple out. When high-paid tech workers disappear, so do the lunch orders. The house tours. The contract gigs. The spending habits that sustain entire urban economies and thousands of other jobs. Sandwich artists. Rideshare drivers. Realtors. Personal trainers. House cleaners. In tech-heavy cities, the fallout runs deep — and it’s still unfolding.
    Washington is now poised to pass a second Trump tax bill — one packed with more obscure provisions, more delayed impacts, more quiet redistribution. And it comes as analysts are only just beginning to understand the real-world effects of the last round.
    The Section 174 change “significantly increased the tax burden on companies investing in innovation, potentially stifling economic growth and reducing the United States’ competitiveness on the global stage,” according to the tax consulting firm KBKG. Whether the U.S. will reverse course — or simply adapt to a new normal — remains to be seen.
  • A housing design catalogue for the 21st century

    The housing catalogue includes 50 low-rise home designs, including for garden suites, duplexes, four-plexes and six-plexes. Each design was developed by local architecture and engineering teams with the intent of aligning with regional building codes, planning rules, climate zones, construction methods and materials.

    TEXT John Lorinc
    RENDERINGS Office In Search Of
    During the spring election, the Liberals leaned into messaging that evoked a historic moment from the late 1940s, when Ottawa succeeded in confronting a severe housing crisis. 
    “We used to build things in this country,” begins Prime Minister Mark Carney in a nostalgic ad filled with archival images of streets lined with brand new post-World War II “strawberry box” bungalows, built for returning Canadian soldiers and their young families. 

    The video also includes montages from the now-iconic design “catalogues,” published by Canada Mortgage and Housing Corporation (CMHC). These supplied floor plans and unlocked cheap mortgages for tens of thousands of simple suburban houses found in communities across the country. “The government built prefabricated homes that were easy to assemble and inexpensive,” Carney said in the voice-over. “And those homes are still here.” 
    Over the past year, CMHC has initiated a 21st century re-do of that design catalogue, and the first tranche of 50 plans—for garden suites, duplexes, four-plexes and six-plexes—went live in early March. A second tranche, with plans for small apartments, is under development. 
    Unlike the postwar versions, these focus on infill sites, not green fields. One of CMHC’s goals is to promote so-called gentle density to residential properties with easily constructed plans that reflect regional variations, local zoning and building-code regulations, accessibility features and low-carbon design. As with those postwar catalogues, CMHC’s other goal was to tamp down on soft costs for homeowners or small builders looking to develop these kinds of housing by providing no-cost designs that were effectively permit sets.
    The early reviews are generally positive. “I find the design really very compelling in a kind of understated way,” says SvN principal Sam Dufaux. By making available vetted plans that can be either pre-approved or approved as of right, CMHC will remove some of the friction that impedes this scale of housing. “One of the elements of the housing crisis has to do with how do we approve these kinds of projects,” Dufaux adds. “I’m hoping it is a bit of a new beginning.”
    Yet other observers offer cautions about the extent to which the CMHC program can blunt the housing crisis. “It’s a small piece and a positive one,” says missing middle advocate and economist Mike Moffatt, who is executive in residence at the Smart Prosperity Institute and an assistant professor at Western’s Ivey Business School. “But [it’s] one that probably captures a disproportionate amount of attention because it’s something people can visualize in a way that they can’t with an apartment tax credit.”
    This kind of new-build infill is unlikely to provide much in the way of affordable or deeply affordable housing, adds Carolyn Whitzman, housing and social policy researcher, and author of Home Truths: Fixing Canada’s Housing Crisis (UBC Press, 2024). She estimates Canada needs about three million new dwellings that can be rented for $1,000 per month or less. The policies that will enable new housing at that scale, she says, involve financing subsidies, publicly owned land, and construction innovation, e.g., prefabricated or factory-built components, as well as “consistent and permissive zoning and consistent and permissive building codes.” 
    Indeed, the make-or-break question hovering over CMHC’s design catalogue is whether municipalities will green-light these plans or simply find new ways to hold up approvals.
     
    An axonometric of a rowhouse development from the Housing Catalogue, designed for Alberta.
    A team effort
    Janna Levitt, partner at LGA Architectural Partners, says that when CMHC issued an RFP for the design catalogue, her firm decided to pitch a team of architects and peer reviewers from across Canada, with LGA serving as project manager. After they were selected, Levitt says they had to quickly clarify a key detail, which was the assumption that the program could deliver pre-approved, permit-ready plans absent a piece of property to build on. “Even in 1947,” she says, “it wasn’t a permit set until you had a site.”
    LGA’s team and CMHC agreed to expand the scope of the assignment so that the finished product wasn’t just a catalogue of plans but also included details about local regulations and typical lot sizes. Re-Housing co-founder Michael Piper, an associate professor at U of T’s John H. Daniels Faculty of Architecture, Landscape, and Design, came on board to carry out research on similar programs, and found initiatives in places like Georgia, Indiana and Texas. “I have not found any that moved forward,” he says. “Canada’s national design catalogue is pretty novel in that regard, which is exciting.” The noteworthy exceptions are California, which has made significant advances in recent years in pre-approving ADUs across the state, and British Columbia, which last fall released its own standardized design catalogue. 
    He also carried out a scan of land use and zoning rules in Ontario for 15 to 20 municipalities. “We looked to see [what] their zoning permitted and what the rules were, and as you might expect, they’re all over the place,” he says. “Hence the challenge with the standardized design.”
    At present, high-level overviews for the 50 designs are available, including basic floor plans, 3D axonometrics, and building dimensions. Full architectural design packages are expected to be released later this year.
    Levitt says the architects on the team set out to come up with designs that used wood frame construction, had no basements (to save on cost and reduce embodied carbon), and drew on vernacular architectural styles. They researched representative lot sizes in the various regions, and configured designs to suit small, medium and large properties. Some versions have accessibility features—CMHC’s remit included both accessible units and aging-in-place as objectives—or can be adapted later on. 
    As for climate and energy efficiency considerations, the recommended materials include low-carbon components and cladding. The designs do reflect geographical variations, but Levitt says there’s only so much her team could do in terms of energy modelling. “How do you do heat energy calculations when you don’t have a site? You don’t have north, south, east, west [orientations] and you don’t have what zone are you in. In B.C. and Ontario, there are seven climatic regions. There was a lot of working through those kinds of very practical requirements, which were very complicated and actually fed into the design work quite significantly.” As Levitt adds, “in 1947, there were no heat loss models because the world wasn’t like that.”
    LGA provided the architects on the team with templates for interior elements, such as bathrooms, as well as standards for features such as bedroom sizes, dining areas, storage sufficient to hold strollers, and access to outdoor space, either at grade or via a balcony. “We gathered together these ideas about the quality of life that we wanted baked into each of the designs, so that [they] expressed a really good quality of life—modest but good quality,” she says. “It’s not about the finishes. People had to be able to live there and live there well.”
    “This isn’t a boutique home solution,” Whitzman says. “This is a cheap and mass-produced solution. And compared to other cheap and mass-produced solutions, whether they be condos or suburban subdivisions, [the catalogue designs] look fine to my untrained eye.”
    A selection of Housing Catalogue designs for the Atlantic region.
    Will it succeed? 
    With the plans now public, the other important variables, besides their conformity with local bylaws, have to do with cost and visibility to potential users, including homeowners, contractors and developers specializing in smaller-scale projects. 
    On the costing side, N. Barry Lyons Consultants (NBLC) has been retained by CMHC to develop models to accompany the design catalogue, but those figures have yet to be released. While pricing is inevitably dynamic, the calculus behind the entire exercise turns on whether the savings on design outlays and the use of prefabricated components will make such small-scale projects pencil, particularly at a time when there are live concerns about tariffs, skilled labour shortages, and supply chain interruptions on building materials. 
    Finally, there’s the horse-to-water problem. While the design catalogue has received a reasonable amount of media attention since it launched, does CMHC need to find ways to market it more aggressively? “From my experience,” says Levitt, “they are extremely proactive, and have assembled a kind of dream team with a huge range of experience and expertise. They are doing very concerted and deep work with municipalities across the country.”
    Proper promotion, observes Moffatt, “is going to be important in particular, just for political reasons. The prime minister has made a lot of bold promises about [adding] 500,000 homes.” Carney’s pledge to get Canada back into building will take time to ramp up, he adds. “I do think the federal government needs to visibly show progress, and if they can’t point to a [new] building across the road, they could at least [say], ‘We’ve got this design catalogue. Here’s how it works. We’ve already got so many builders and developers looking at this.’” 
    While it’s far too soon to draw conclusions about the success of this ambitious program, Levitt is well aware of the long and rich legacy of the predecessor CMHC catalogues from the late 40s and the 1950s, all of which gave many young Canadian architects their earliest commissions and then left an enduring aesthetic on countless communities across Canada.  
    She hopes the updated 21st-century catalogue—fitted out as it is for 21st-century concerns about carbon, resilience and urban density—will acquire a similar cachet. 
    “These are architecturally designed houses for a group of people across the country who will have never lived in an architecturally designed house,” she muses. “I would love it if, 80 years from now, the consistent feedback [from occupants] was that they were able to live generously and well in those houses, and that everything was where it should be.”
    ARCHITECTURE FIRM COLLABORATORS Michael Green Architecture, Dub Architects, 5468796 Architecture Inc, Oxbow Architecture, LGA Architectural Partners, KANVA Architecture, Abbott Brown Architects, Taylor Architecture Group

     As appeared in the June 2025 issue of Canadian Architect magazine 

    The post A housing design catalogue for the 21st century appeared first on Canadian Architect.
  • The Butterfly takes flight: The Butterfly, Vancouver, BC

    The tower takes shape as two sets of overlapping cylinders, clad with prefabricated panels intended to evoke clouds.
    PROJECT The Butterfly + First Baptist Church Complex
    ARCHITECT Revery Architecture
    PHOTOS Ema Peter
    When you fly into Vancouver, the most prominent structure in the city’s forest of glass skyscrapers is now a 57-storey edifice known as the Butterfly. Designed by Revery Architecture, the luxury residential tower is the latest in a string of high-rises that pop out of the city’s backdrop of generic window-wall façades. 
    The Butterfly’s striking form evolved over many years, beginning with studies dating back to 2012. Revery principal Venelin Kokalov imagined several options, most of them suggesting a distinct pair of architectural forms in dialogue. Renderings and models of the early concepts relay a wealth of imagination that is sorely missing from much of the city’s contemporary architecture, as land economics, zoning issues, and the profit motive often compel a default into generic glass-and-steel towers. The earliest concepts look starkly different—some evoke the Ginger and Fred building in Prague (Frank Gehry with Vlado Milunić, 1996); others the Absolute Towers in Mississauga (MAD with Burka Varacalli Architects, 2009). But one consistent theme runs through the design evolution: a sense of two Rilkean solitudes, touching. 
    On each floor, semi-private sky gardens offer an outdoor place for residents to socialize.

    Client feedback, engineering studies, and simple pragmatics led to the final form: two sets of overlapping cylinders linked by a common breezeway and flanked by a rental apartment on one side and a restored church doubling as a community centre on the other. The contours of the floorplan are visually organic: evocative of human cells dividing. The roundness of the main massing is complemented by curvilinear balustrades that smoothly transform into the outer walls of each unit. It’s an eye-catching counterpoint to the orthogonality of the city’s built landscape. The two adjacent buildings—built, restored, and expanded as part of a density bonus arrangement with the city—help integrate this gargantuan structure with the lower-rise neighbourhood around it. 
    The Butterfly is a high-end, high-priced residential tower—one of the few typologies in which clients and communities are now willing to invest big money and resources in creative, visually astonishing architecture. That leads to a fundamental question: what is the public purpose of a luxury condo tower? 
    A public galleria joins the renovated First Baptist Church to the new building. Serving as a welcoming atrium, it allows for community access to the expanded church, including its daycare, full gymnasium, multi-purpose rooms, overnight emergency shelter, and community dining hall equipped with a commercial kitchen.
    Whatever one feels about the widening divide between the haves and have-nots in our big cities, this building—like its ilk—does serve several important public purposes. The most direct and quantifiable benefits are the two flanking buildings, also designed by Revery and part of the larger project. The seven-storey rental apartment provides a modest contribution to the city’s dearth of mid-priced housing. The superbly restored and seismically upgraded First Baptist Church has expanded into the area between the new tower and original church, and now offers the public a wider array of programming including a gymnasium, childcare facility, and areas for emergency shelter and counselling services for individuals in need. 
    The church’s Pinder Hall has been reimagined as a venue for church and community events including concerts, weddings, and cultural programming.
    The Butterfly’s character is largely defined by undulating precast concrete panels that wrap around the building. The architects describe the swooping lines as being inspired by clouds, but for this writer, the Butterfly evokes a 57-layer frosted cake towering above the city’s boxy skyline. Kokalov winces when he hears that impression, but it’s meant as a sincere compliment. Clouds are not universally welcome, but who doesn’t like cake? 
    Kokalov argues that its experiential quality is the building’s greatest distinction—most notably, the incorporation of an “outdoors”—not a balcony or deck, but an actual outdoor pathway—at all residential levels. For years the lead form-maker at Bing Thom Architects, Kokalov was responsible for much of the curvilinearity in the firm’s later works, including the 2019 Xiqu Centre opera house in Hong Kong. It’s easy to assume that his forte and focus would be pure aesthetic delight, but he avers that every sinuous curve has a practical rationale. 
    The breezeways provide residents with outdoor entries to their units—an unusual attribute for high-rise towers—and contribute to natural cooling, ventilation, and daylight in the suites.
    Defying the local tower-on-podium formula, the building’s façade falls almost straight to the ground. At street level, the building is indented with huge parabolic concavities. It’s an abrupt way to meet the street, but the fall is visually “broken” by a publicly accessible courtyard.  
    The tower’s layered, undulating volume is echoed in a soaring residential lobby, which includes developer Westbank’s signature—a bespoke Fazioli grand piano designed by the building’s architect.
    After passing through this courtyard, you enter the building via the usual indoor luxe foyer—complete with developer Westbank’s signature, an over-the-top hand-built grand piano designed by the architect. In this case, the piano’s baroquely sculpted legs are right in keeping with the architecture. But after taking the elevator up to the designated floor, you step out into what is technically “outdoors” and walk to your front door in a brief but bracing open-air transition. 
    The main entrance of every unit is accessed via a breezeway that runs from one side of the building to another. Unglazed and open to the outside, each breezeway is marked at one end with what the architects call a “sky garden,” in most cases consisting of a sapling that will grow into a leafy tree in due course, God and strata maintenance willing. This incorporation of nature and fresh air transforms the condominium units into something akin to townhouses, albeit stacked exceptionally high. 
    The suites feature a custom counter with a sculptural folded form.
    Inside each unit, the space can be expanded and contracted and reconfigured visually—not literally—by the fact that the interior wall of the secondary bedroom is completely transparent, floor to ceiling. It’s unusual, and slightly unnerving, but undeniably exciting for any occupants who wish to maximize their views to the mountains and sea. The curved glass wall transforms the room into a private enclave by means of a curtain, futuristically activated by remote control.
    The visual delight of swooping curves is only tempered when it’s wholly impractical—the offender here being a massive built-in counter that serves to both anchor and divide the living-kitchen areas. It reads as a long, pliable slab that is “folded” into the middle in such a way that the counter itself transforms into its own horseshoe-shaped base, creating a narrow crevice in the middle of the countertop. I marvel at its beauty and uniqueness; I weep for whoever is assigned to clean out the crumbs and other culinary flotsam that will fall into that crevice. 
    A structure made of high-performance modular precast concrete structural ribs arcs over a swimming pool that bridges between the building’s main amenity space and the podium roof.
    The building’s high-priced architecture may well bring more to the table than density-bonus amenities. On a broader scale, these luxe dwellings may be just what is needed to help lure the affluent from their mansions. As wealthy residents and investors continue to seek out land-hogging detached homes, the Butterfly offers an alternate concept that maintains the psychological benefit of a dedicated outside entrance and an outrageously flexible interior space. Further over-the-top amenities add to the appeal. Prominent among these is a supremely gorgeous residents-only swimming pool, housed within ribs of concrete columns that curve and dovetail into beams.  
    The ultimate public purpose for the architecturally spectacular condo tower: its role as public art in the city. The units in any of these buildings are the private side of architecture’s Janus face, but its presence in the skyline and on the street is highly public. By contributing a newly striking visual ballast, the Butterfly has served its purpose as one of the age-old Seven Arts: defining a location, a community, and an era.
    Adele Weder is a contributing editor to Canadian Architect.
    CLIENT Westbank Corporation, First Baptist Church | ARCHITECT TEAM Venelin Kokalov, Bing Thom, Amirali Javidan, Nicole Hu, Shinobu Homma MRAIC, Bibi Fehr, Culum Osborne, Dustin Yee, Cody Loeffen, Kailey O’Farrell, Mark Melnichuk, Andrea Flynn, Jennifer Zhang, Daniel Gasser, Zhuoli Yang, Lisa Potopsingh | STRUCTURAL Glotman Simpson | MECHANICAL Introba | ELECTRICAL Nemetz & Associates, Inc. | LANDSCAPE SWA Group w/ Cornelia Oberlander & G|ALA – Gauthier & Associates Landscape Architecture, Inc. | INTERIORS Revery Architecture | CONTRACTOR Icon West Construction; The Haebler Group | LIGHTING ARUP & Nemetz | SUSTAINABILITY & ENERGY MODELLING Introba | BUILDING ENVELOPE RDH Building Science, Inc. | HERITAGE CONSERVATION Donald Luxton & Associates, Inc. | ACOUSTICS BKL Consultants Ltd. | TRAFFIC Bunt & Associates, Inc. | POOL Rockingham Pool Consulting, Inc. | FOUNTAIN Vincent Helton & Associates | WIND Gradient Wind Engineering, Inc. | WASTE CONSULTANT Target Zero Waste Consulting, Inc. | AREA 56,206 M2 | BUDGET Withheld | COMPLETION Spring 2025
    ENERGY USE INTENSITY 106 kWh/m2/year | WATER USE INTENSITY 0.72 m3/m2/year

    As appeared in the June 2025 issue of Canadian Architect magazine

    The post The Butterfly takes flight: The Butterfly, Vancouver, BC appeared first on Canadian Architect.
    #butterfly #takes #flight #vancouver
    The Butterfly takes flight: The Butterfly, Vancouver, BC
    The tower takes shape as two sets of overlapping cylinders, clad with prefabricated panels intended to evoke clouds.

    PROJECT The Butterfly + First Baptist Church Complex | ARCHITECT Revery Architecture | PHOTOS Ema Peter

    When you fly into Vancouver, the most prominent structure in the city’s forest of glass skyscrapers is now a 57-storey edifice known as the Butterfly. Designed by Revery Architecture, the luxury residential tower is the latest in a string of high-rises that pop out of the city’s backdrop of generic window-wall façades.

    The Butterfly’s striking form evolved over many years, beginning with studies dating back to 2012. Revery principal Venelin Kokalov imagined several options, most of them suggesting a distinct pair of architectural forms in dialogue. Renderings and models of the early concepts relay a wealth of imagination that is sorely missing from much of the city’s contemporary architecture, as land economics, zoning issues, and the profit motive often compel a default into generic glass-and-steel towers. The earliest concepts look starkly different—some evoke the Ginger and Fred building in Prague (Frank Gehry with Vlado Milunić, 1996); others the Absolute Towers in Mississauga (MAD with Burka Varacalli Architects, 2009). But one consistent theme runs through the design evolution: a sense of two Rilkean solitudes, touching.

    On each floor, semi-private sky gardens offer an outdoor place for residents to socialize.

    Client feedback, engineering studies, and simple pragmatics led to the final form: two sets of overlapping cylinders linked by a common breezeway and flanked by a rental apartment on one side and a restored church doubling as a community centre on the other. The contours of the floorplan are visually organic: evocative of human cells dividing. The roundness of the main massing is complemented by curvilinear balustrades that smoothly transform into the outer walls of each unit. It’s an eye-catching counterpoint to the orthogonality of the city’s built landscape. The two adjacent buildings—built, restored, and expanded as part of a density bonus arrangement with the city—help integrate this gargantuan structure with the lower-rise neighbourhood around it.

    The Butterfly is a high-end, high-priced residential tower—one of the few typologies in which clients and communities are now willing to invest big money and resources in creative, visually astonishing architecture. That leads to a fundamental question: what is the public purpose of a luxury condo tower?

    A public galleria joins the renovated First Baptist Church to the new building. Serving as a welcoming atrium, it allows for community access to the expanded church, including its daycare, full gymnasium, multi-purpose rooms, overnight emergency shelter, and community dining hall equipped with a commercial kitchen.

    Whatever one feels about the widening divide between the haves and have-nots in our big cities, this building—like its ilk—does serve several important public purposes. The most direct and quantifiable benefits are the two flanking buildings, also designed by Revery and part of the larger project. The seven-storey rental apartment provides a modest contribution to the city’s dearth of mid-priced housing. The superbly restored and seismically upgraded First Baptist Church has expanded into the area between the new tower and original church, and now offers the public a wider array of programming including a gymnasium, childcare facility, and areas for emergency shelter and counselling services for individuals in need.

    The church’s Pinder Hall has been reimagined as a venue for church and community events including concerts, weddings, and cultural programming.

    The Butterfly’s character is largely defined by undulating precast concrete panels that wrap around the building. The architects describe the swooping lines as being inspired by clouds, but for this writer, the Butterfly evokes a 57-layer frosted cake towering above the city’s boxy skyline. Kokalov winces when he hears that impression, but it’s meant as a sincere compliment. Clouds are not universally welcome, but who doesn’t like cake?

    Kokalov argues that its experiential quality is the building’s greatest distinction—most notably, the incorporation of an “outdoors”—not a balcony or deck, but an actual outdoor pathway—at all residential levels. For years the lead form-maker at Bing Thom Architects, Kokalov was responsible for much of the curvilinearity in the firm’s later works, including the 2019 Xiqu Centre opera house in Hong Kong. It’s easy to assume that his forte and focus would be pure aesthetic delight, but he avers that every sinuous curve has a practical rationale.

    The breezeways provide residents with outdoor entries to their units—an unusual attribute for high-rise towers—and contribute to natural cooling, ventilation, and daylight in the suites.

    Defying the local tower-on-podium formula, the building’s façade falls almost straight to the ground. At street level, the building is indented with huge parabolic concavities. It’s an abrupt way to meet the street, but the fall is visually “broken” by a publicly accessible courtyard.

    The tower’s layered, undulating volume is echoed in a soaring residential lobby, which includes developer Westbank’s signature—a bespoke Fazioli grand piano designed by the building’s architect.

    After passing through this courtyard, you enter the building via the usual indoor luxe foyer—complete with developer Westbank’s signature, an over-the-top hand-built grand piano designed by the architect. In this case, the piano’s baroquely sculpted legs are right in keeping with the architecture. But after taking the elevator up to the designated floor, you step out into what is technically “outdoors” and walk to your front door in a brief but bracing open-air transition.

    The main entrance of every unit is accessed via a breezeway that runs from one side of the building to another.

    Unglazed and open to the outside, each breezeway is marked at one end with what the architects call (a little ambitiously) a “sky garden,” in most cases consisting of a sapling that will grow into a leafy tree in due course, God and strata maintenance willing. This incorporation of nature and fresh air transforms the condominium units into something akin to townhouses, albeit stacked exceptionally high.

    The suites feature a custom counter with a sculptural folded form.

    Inside each unit, the space can be expanded and contracted and reconfigured visually—not literally—by the fact that the interior wall of the secondary bedroom is completely transparent, floor to ceiling. It’s unusual, and slightly unnerving, but undeniably exciting for any occupants who wish to maximize their views to the mountains and sea. The curved glass wall transforms the room into a private enclave by means of a curtain, futuristically activated by remote control. The visual delight of swooping curves is only tempered when it’s wholly impractical—the offender here being a massive built-in counter that serves to both anchor and divide the living-kitchen areas. It reads as a long, pliable slab that is “folded” into the middle in such a way that the counter itself transforms into its own horseshoe-shaped base, creating a narrow crevice in the middle of the countertop. I marvel at its beauty and uniqueness; I weep for whoever is assigned to clean out the crumbs and other culinary flotsam that will fall into that crevice.

    A structure made of high-performance modular precast concrete structural ribs arcs over a swimming pool that bridges between the building’s main amenity space and the podium roof.

    The building’s high-priced architecture may well bring more to the table than density-bonus amenities. On a broader scale, these luxe dwellings may be just what is needed to help lure the affluent from their mansions. As wealthy residents and investors continue to seek out land-hogging detached homes, the Butterfly offers an alternate concept that maintains the psychological benefit of a dedicated outside entrance and an outrageously flexible interior space. Further over-the-top amenities add to the appeal. Prominent among these is a supremely gorgeous residents-only swimming pool, housed within ribs of concrete columns that curve and dovetail into beams.

    The ultimate public purpose for the architecturally spectacular condo tower: its role as public art in the city. The units in any of these buildings are the private side of architecture’s Janus face, but its presence in the skyline and on the street is highly public. By contributing a newly striking visual ballast, the Butterfly has served its purpose as one of the age-old Seven Arts: defining a location, a community, and an era.

    Adele Weder is a contributing editor to Canadian Architect.

    CLIENT Westbank Corporation, First Baptist Church | ARCHITECT TEAM Venelin Kokalov (MRAIC), Bing Thom (FRAIC, deceased 2016), Amirali Javidan, Nicole Hu, Shinobu Homma MRAIC, Bibi Fehr, Culum Osborne, Dustin Yee, Cody Loeffen, Kailey O’Farrell, Mark Melnichuk, Andrea Flynn, Jennifer Zhang, Daniel Gasser, Zhuoli Yang, Lisa Potopsingh | STRUCTURAL Glotman Simpson | MECHANICAL Introba | ELECTRICAL Nemetz & Associates, Inc. | LANDSCAPE SWA Group (Design) w/ Cornelia Oberlander & G|ALA – Gauthier & Associates Landscape Architecture, Inc. (Landscape Architect of Record) | INTERIORS Revery Architecture | CONTRACTOR Icon West Construction (new construction); The Haebler Group (heritage) | LIGHTING ARUP (Design) & Nemetz (Engineer of Record) | SUSTAINABILITY & ENERGY MODELLING Introba | BUILDING ENVELOPE RDH Building Science, Inc. | HERITAGE CONSERVATION Donald Luxton & Associates, Inc. | ACOUSTICS BKL Consultants Ltd. | TRAFFIC Bunt & Associates, Inc. | POOL Rockingham Pool Consulting, Inc. | FOUNTAIN Vincent Helton & Associates | WIND Gradient Wind Engineering, Inc. | WASTE CONSULTANT Target Zero Waste Consulting, Inc. | AREA 56,206 m² | BUDGET Withheld | COMPLETION Spring 2025 | ENERGY USE INTENSITY (PROJECTED) 106 kWh/m²/year | WATER USE INTENSITY (PROJECTED) 0.72 m³/m²/year

    As appeared in the June 2025 issue of Canadian Architect magazine
  • How white-tailed deer came back from the brink of extinction

    Given their abundance in American backyards, gardens and highway corridors these days, it may be surprising to learn that white-tailed deer were nearly extinct about a century ago. While they currently number somewhere in the range of 30 million to 35 million, at the turn of the 20th century, there were as few as 300,000 whitetails across the entire continent: just 1% of the current population.

    This near-disappearance of deer was much discussed at the time. In 1854, Henry David Thoreau had written that no deer had been hunted near Concord, Massachusetts, for a generation. In his famous “Walden,” he reported:

    “One man still preserves the horns of the last deer that was killed in this vicinity, and another has told me the particulars of the hunt in which his uncle was engaged. The hunters were formerly a numerous and merry crew here.”

    But what happened to white-tailed deer? What drove them nearly to extinction, and then what brought them back from the brink?

    As a historical ecologist and environmental archaeologist, I have made it my job to answer these questions. Over the past decade, I’ve studied white-tailed deer bones from archaeological sites across the eastern United States, as well as historical records and ecological data, to help piece together the story of this species.

    Precolonial rise of deer populations

    White-tailed deer have been hunted from the earliest migrations of people into North America, more than 15,000 years ago. The species was far from the most important food resource at that time, though.

    Archaeological evidence suggests that white-tailed deer abundance only began to increase after the extinction of megafauna species like mammoths and mastodons opened up ecological niches for deer to fill. Deer bones become very common in archaeological sites from about 6,000 years ago onward, reflecting the economic and cultural importance of the species for Indigenous peoples.

    Despite being so frequently hunted, deer populations do not seem to have appreciably declined due to Indigenous hunting prior to AD 1600. Unlike elk or sturgeon, whose numbers were reduced by Indigenous hunters and fishers, white-tailed deer seem to have been resilient to human predation. While archaeologists have found some evidence for human-caused declines in certain parts of North America, other cases are more ambiguous, and deer certainly remained abundant throughout the past several millennia.

    Human use of fire could partly explain why white-tailed deer may have been resilient to hunting. Indigenous peoples across North America have long used controlled burning to promote ecosystem health, disturbing old vegetation to promote new growth. Deer love this sort of successional vegetation for food and cover, and thus thrive in previously burned habitats. Indigenous people may have therefore facilitated deer population growth, counteracting any harmful hunting pressure.

    More research is needed, but even though some hunting pressure is evident, the general picture from the precolonial era is that deer seem to have been doing just fine for thousands of years. Ecologists estimate that there were roughly 30 million white-tailed deer in North America on the eve of European colonization—about the same number as today.

    A 16th-century engraving depicts Indigenous Floridians hunting deer while disguised in deerskins. [Photo: Theodor de Bry/DEA Picture Library/De Agostini/Getty Images]

    Colonial-era fall of deer numbers

    To better understand how deer populations changed in the colonial era, I recently analyzed deer bones from two archaeological sites in what is now Connecticut. My analysis suggests that hunting pressure on white-tailed deer increased almost as soon as European colonists arrived.

    At one site dated to the 11th to 14th centuries (before European colonization), I found that only about 7% to 10% of the deer killed were juveniles.

    Hunters generally don’t take juvenile deer if they’re frequently encountering adults, since adult deer tend to be larger, offering more meat and bigger hides. Additionally, hunting increases mortality on a deer herd but doesn’t directly affect fertility, so deer populations experiencing hunting pressure end up with juvenile-skewed age structures. For these reasons, this low percentage of juvenile deer prior to European colonization indicates minimal hunting pressure on local herds.
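
    This demographic logic can be made concrete with a small, purely illustrative model; it is not part of the study's methodology, and every number in it is hypothetical. Consider a two-stage population in which adults produce fawns each year and both stages carry an annual survival rate. Lowering adult survival, as commercial hunting would, reduces the population's growth rate and shifts the stable age structure toward juveniles. The sketch below computes that juvenile share for a light-hunting and a heavy-hunting scenario; the article's percentages describe harvested deer rather than the standing herd, so only the direction of the shift is meant to carry over.

```python
# Illustrative sketch only, not the author's analysis: a minimal two-stage
# (Leslie-type) deer population model showing why extra adult mortality,
# such as market hunting, skews the stable age structure toward juveniles.
# All parameter values are hypothetical.
import math

def juvenile_share(adult_survival: float,
                   juvenile_survival: float,
                   fawns_per_adult: float) -> float:
    """Juvenile fraction of the standing population at the stable age structure.

    Projection model:
        J[t+1] = fawns_per_adult * A[t]
        A[t+1] = juvenile_survival * J[t] + adult_survival * A[t]
    The long-run growth rate lam is the positive root of
        lam**2 - adult_survival*lam - juvenile_survival*fawns_per_adult = 0,
    and at the stable structure the juvenile-to-adult ratio is fawns_per_adult / lam.
    """
    lam = (adult_survival
           + math.sqrt(adult_survival ** 2
                       + 4.0 * juvenile_survival * fawns_per_adult)) / 2.0
    juveniles_per_adult = fawns_per_adult / lam
    return juveniles_per_adult / (1.0 + juveniles_per_adult)

# Hypothetical scenarios: the same herd biology, but heavier hunting of adults
# in the second case. The archaeological 7-10% vs. 22-31% figures refer to the
# composition of the kill, not the standing herd; this sketch only shows the
# direction of the demographic shift.
light = juvenile_share(adult_survival=0.85, juvenile_survival=0.50, fawns_per_adult=0.9)
heavy = juvenile_share(adult_survival=0.45, juvenile_survival=0.50, fawns_per_adult=0.9)

print(f"Juvenile share, light hunting pressure: {light:.0%}")  # lower share
print(f"Juvenile share, heavy hunting pressure: {heavy:.0%}")  # higher share
```

    Under these assumed numbers, the juvenile share climbs by several percentage points once adult survival drops, which is the qualitative pattern the bone assemblages reflect.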

    However, at a nearby site occupied during the 17th century—just after European colonization—between 22% and 31% of the deer hunted were juveniles, suggesting a substantial increase in hunting pressure.

    This elevated hunting pressure likely resulted from the transformation of deer into a commodity for the first time. Venison, antlers and deerskins may have long been exchanged within Indigenous trade networks, but things changed drastically in the 17th century. European colonists integrated North America into a trans-Atlantic mercantile capitalist economic system with no precedent in Indigenous society. This applied new pressures to the continent’s natural resources.

    Deer—particularly their skins—were commodified and sold in markets in the colonies initially and, by the 18th century, in Europe as well. Deer were now being exploited by traders, merchants and manufacturers desiring profit, not simply hunters desiring meat or leather. It was the resulting hunting pressure that drove the species toward its extinction.

    20th-century rebound of white-tailed deer

    Thanks to the rise of the conservation movement in the late 19th and early 20th centuries, white-tailed deer survived their brush with extinction.

    Concerned citizens and outdoorsmen feared for the fate of deer and other wildlife, and pushed for new legislative protections.

    The Lacey Act of 1900, for example, banned interstate transport of poached game and—in combination with state-level protections—helped end commercial deer hunting by effectively de-commodifying the species. Aided by conservation-oriented hunting practices and reintroductions of deer from surviving populations to areas where they had been extirpated, white-tailed deer rebounded.

    The story of white-tailed deer underscores an important fact: Humans are not inherently damaging to the environment. Hunting from the 17th through 19th centuries threatened the existence of white-tailed deer, but precolonial Indigenous hunting and environmental management appear to have been relatively sustainable, and modern regulatory governance in the 20th century forestalled and reversed their looming extinction.

    Elic Weitzel, Peter Buck Postdoctoral Research Fellow, Smithsonian Institution

    This article is republished from The Conversation under a Creative Commons license. Read the original article.