• Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects

    Solar air heating is among the most cost-effective applications of solar thermal energy. These systems provide space heating and preheat fresh air for ventilation, typically using glazed or unglazed perforated solar collectors. The collectors draw in outside air, heat it with solar energy, and distribute it through ductwork to meet building heating and fresh-air needs. In 2024, Canada again led the world in solar air heating adoption, for at least the seventh year in a row. The four key suppliers – Trigo Energies, Conserval Engineering, Matrix Energy, and Aéronergie – reported a combined 26,203 m² (282,046 ft²) of collector area sold last year. Several of these providers are optimistic about the growing demand. These findings come from the newly released Canadian Solar Thermal Market Survey 2024, commissioned by Natural Resources Canada.
    Canada is the global leader in solar air heating. The market is driven by a strong network of experienced system suppliers, optimized technologies, and a few modest but favorable funding programs – especially in the province of Quebec. Architects and developers are increasingly turning to these cost-effective, façade-integrated systems as a practical solution for reducing onsite natural gas consumption.
    Despite its cold climate, Canada benefits from strong solar potential, with solar irradiance in many areas rivaling or even exceeding that of parts of Europe. This makes solar air heating not only viable, but especially valuable in buildings with high fresh air requirements, including schools, hospitals, and offices. The projects highlighted in this article showcase the versatility and relevance of solar air heating across a range of building types, from new constructions to retrofits.
    Figure 1: Preheating air for industrial buildings: 2,750 m² (29,600 ft²) of Calento SL solar air collectors cover all south-west and south-east facing facades of the FAB3R factory in Trois-Rivières, Quebec. The hourly unitary flow rate is set at 41 m³/h/m² (2.23 cfm/ft²) of collector area – at the lower end of the range, because only a limited number of intake fans were close enough to the solar façade to avoid long ventilation ductwork. Photo: Trigo Energies
    Quebec’s solar air heating boom: the Trigo Energies story
    Trigo Energies makes almost 90 per cent of its sales in Quebec. “We profit from great subsidies, as solar air systems are supported by several organizations in our province – the electricity utility Hydro Quebec, the gas utility Energir and the Ministry of Natural Resources,” explained Christian Vachon, Vice President Technologies and R&D at Trigo Energies.
    Trigo Energies currently has nine employees directly involved in planning, engineering and installing solar air heating systems and teams up with several partner contractors to install mostly retrofit projects. “A high degree of engineering is required to fit a solar heating system into an existing factory,” emphasized Vachon. “Knowledge about HVAC engineering is as important as experience with solar thermal and architecture.”
    One recent Trigo installation is at the FAB3R factory in Trois-Rivières. FAB3R specializes in manufacturing, repairing, and refurbishing large industrial equipment. Its air heating and ventilation system needed urgent renovation because of leakages and discomfort for the workers. “Due to many positive references he had from industries in the area, the owner of FAB3R contacted us,” explained Vachon. “The existence of subsidies helped the client to go for a retrofitting project including solar façade at once instead of fixing the problems one bit at a time.” Approximately 50 per cent of the investment costs for both the solar air heating and the renovation of the indoor ventilation system were covered by grants and subsidies. FAB3R profited from an Energir grant targeted at solar preheating, plus an investment subsidy from the Government of Quebec’s EcoPerformance Programme.
     
    Blue or black, but always efficient: the advanced absorber coating
    In October 2024, the majority of the new 2,750 m² (29,600 ft²) solar façade at FAB3R began operation (see Figure 1). According to Vachon, the system is expected to cover approximately 13 per cent of the factory’s annual heating demand, which is otherwise met by natural gas. Trigo Energies equipped the façade with its high-performance Calento SL collectors, featuring a notable innovation: a selective, low-emissivity coating that withstands outdoor conditions. Introduced by Trigo in 2019 and manufactured by Almeco Group from Italy, this advanced coating is engineered to maximize solar absorption while minimizing heat loss via infrared emission, enhancing the overall efficiency of the system.
    The high-efficiency coating is now standard in Trigo’s air heating systems. According to the manufacturer, the improved collector design shows a 25 to 35 per cent increase in yield over the former generation of solar air collectors with black paint. Testing conducted at Queen’s University confirms this performance advantage. Researchers measured the performance of transpired solar air collectors both with and without a selective coating, mounted side-by-side on a south-facing vertical wall. The results showed that the collectors with the selective coating produced 1.3 to 1.5 times more energy than those without it. In 2024, the monitoring results were jointly published by Queen’s University and CanmetENERGY in a paper titled Performance Comparison of a Transpired Air Solar Collector with Low-E Surface Coating.
    Selective coating, also used on other solar thermal technologies including glazed flat plate or vacuum tube collectors, has a distinctive blue color. Trigo customers can, however, choose between blue and black finishes. “By going from the normal blue selective coating to black selective coating, which Almeco is specially producing for Trigo, we lose about 1 per cent in solar efficiency,” explained Vachon.
    Figure 2: Building-integrated solar air heating façade with MatrixAir collectors at the firehall building in Mont Saint Hilaire, south of Montreal. The 190 m² (2,045 ft²) south-facing wall preheats the fresh air, reducing natural gas consumption by 18 per cent compared to the conventional make-up system. Architect: Leclerc Architecture. Photo: Matrix Energy
    Matrix Energy: collaborating with architects and engineers in new builds
    Matrix Energy’s key target customers are public buildings – mainly new construction. “Since the pandemic, schools are more conscious about fresh air, and solar preheating of the incoming fresh air has a positive impact over the entire school year,” noted Brian Wilkinson, President of Matrix Energy.
    Matrix Energy supplies systems across Canada, working with local partners to source and process the metal sheets used in their MatrixAir collectors. These metal sheets are perforated and then formed into architectural cladding profiles. The company exclusively offers unglazed, single-stage collectors, citing fire safety concerns associated with polymeric covers.
    “We have strong relationships with many architects and engineers who appreciate the simplicity and cost-effectiveness of transpired solar air heating systems,” said Wilkinson, describing the company’s sales approach. “Matrix handles system design and supplies the necessary materials, while installation is carried out by specialized cladding and HVAC contractors overseen by on-site architects and engineers,” he added.
    Finding the right flow: the importance of unitary airflow rates
    One of the key design factors in solar air heating systems is the amount of air that passes through each square meter of the perforated metal absorber, known as the unitary airflow rate. The principle is straightforward: higher airflow rates deliver more total heat to the building, while lower flow rates result in higher outlet air temperatures. Striking the right balance between air volume and temperature gain is essential for efficient system performance.
    For unglazed collectors mounted on building façades, typical hourly flow rates range between 120 and 170 m³/h/m² (6.6 to 9.4 cfm/ft²). However, Wilkinson suggests that an hourly airflow rate of around 130 m³/h/m² (7.2 cfm/ft²) offers the best cost-benefit balance for building owners. If the airflow is lower, the system will deliver higher air temperatures, but it would then need a much larger collector area to move the same air volume at optimum performance, he explained.
    It’s also crucial for the flow rate to overcome external wind pressure. As wind passes over the absorber, air flow through the collector’s perforations is reduced, resulting in heat losses to the environment. This effect becomes even more pronounced in taller buildings, where wind exposure is greater. To ensure the system performs well even in these conditions, higher hourly airflow rates, typically between 150 and 170 m³/h/m² (8.3 to 9.4 cfm/ft²), are necessary.
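    The flow-rate trade-off described above can be sketched numerically. The snippet below converts unitary airflow rates between the metric and imperial units used in this article, and estimates the collector outlet air temperature rise from a simple energy balance (η·G = ρ·q·cp·ΔT). The efficiency and irradiance values are illustrative assumptions for a sunny winter day, not figures from the suppliers:

```python
RHO_AIR = 1.2    # kg/m³, approximate air density
CP_AIR = 1005.0  # J/(kg·K), specific heat of air

def m3h_per_m2_to_cfm_per_ft2(q: float) -> float:
    """Convert a unitary airflow rate from m³/h/m² to cfm/ft²."""
    M3H_TO_CFM = 35.3147 / 60.0   # 1 m³/h expressed in ft³/min
    M2_TO_FT2 = 10.7639           # 1 m² expressed in ft²
    return q * M3H_TO_CFM / M2_TO_FT2

def outlet_temp_rise(q_m3h_m2: float, irradiance: float = 600.0,
                     efficiency: float = 0.65) -> float:
    """Estimate outlet air temperature rise (K) from the energy
    balance eta*G = rho * q * cp * dT (assumed eta and G)."""
    q_m3s_m2 = q_m3h_m2 / 3600.0  # convert to m³/s per m²
    return efficiency * irradiance / (RHO_AIR * q_m3s_m2 * CP_AIR)

for q in (41, 130, 170):
    print(f"{q} m³/h/m² = {m3h_per_m2_to_cfm_per_ft2(q):.2f} cfm/ft², "
          f"ΔT ≈ {outlet_temp_rise(q):.1f} K")
```

    Under these assumed conditions, the FAB3R rate of 41 m³/h/m² yields roughly three times the temperature rise of 130 m³/h/m² – more warmth per unit of air, but less total air moved, which is exactly the balance Wilkinson describes.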
    Figure 3: One of three apartment blocks of the Maple House in Toronto’s Canary District. Around 160 m² (1,722 ft²) of SolarWall collectors clad the two-storey mechanical penthouse on the roof. The rental flats have been occupied since the beginning of 2024. Collaborators: architects-Alliance, Claude Cormier et Associés, Thornton Tomasetti, RWDI, Cole Engineering, DesignAgency, MVShore, BA Group, EllisDon. Photo: Conserval Engineering
    Solar air heating systems support LEED-certified building designs
    Solar air collectors are also well-suited for use in multi-unit residential buildings. A prime example is the Canary District in Toronto (see Figure 3), where single-stage SolarWall collectors from Conserval Engineering have been installed on several MURBs to clad the mechanical penthouses. “These penthouses are an ideal location for our air heating collectors, as they contain the make-up air units that supply corridor ventilation throughout the building,” explained Victoria Hollick, Vice President of Conserval Engineering. “The walls are typically finished with metal façades, which can be seamlessly replaced with a SolarWall system – maintaining the architectural language without disruption.” To date, nine solar air heating systems have been commissioned in the Canary District, covering a total collector area of over 1,000 m² (10,764 ft²).
    “Our customers have many motivations to integrate SolarWall technology into their new construction or retrofit projects, either carbon reduction, ESG, or green building certification targets,” explained Hollick.
    The use of solar air collectors in the Canary District was proposed by architects from the Danish firm Cobe. The black-colored SolarWall system preheats incoming air before it is distributed to the building’s corridors and common areas, reducing reliance on natural gas heating and supporting the pursuit of LEED Gold certification. Hollick estimates the gas savings at 10 to 20 per cent of the total heating load for the corridor ventilation of the multi-unit residential buildings. Additional energy-saving strategies include a 50/50 window-to-wall ratio with high-performance glazing, green roofs, high-efficiency mechanical systems, LED lighting, and Energy Star-certified appliances.
    The ideal orientation for a SolarWall system is due south. However, the systems can be built at any orientation up to 90° east or west, explained Hollick. A SolarWall at 90° would have approximately 60 per cent of the energy production of the same area facing south.
    Canada’s expertise in solar air heating continues to set a global benchmark, driven by supportive R&D, innovative technologies, strategic partnerships, and a growing portfolio of high-impact projects. With strong policy support and proven performance, solar air heating is poised to play a key role in the country’s energy-efficient building future.
    Figure 4: The Claude-Béchard Building in Quebec is a showcase project for sustainable architecture, with a 72 m² (775 ft²) Lubi solar air heating wall from Aéronergie. It serves as a regional administrative center. Architectural firm: Goulet et Lebel Architectes. Photo: Art Massif

    Bärbel Epp is the general manager of the German agency solrico, whose focus is on solar market research and international communication.
    The post Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects appeared first on Canadian Architect.
    #oped #canadas #leadership #solar #air
    Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects
    Solar air heating is among the most cost-effective applications of solar thermal energy. These systems are used for space heating and preheating fresh air for ventilation, typically using glazed or unglazed perforated solar collectors. The collectors draw in outside air, heat it using solar energy, and then distribute it through ductwork to meet building heating and fresh air needs. In 2024, Canada led again the world for the at least seventh year in a row in solar air heating adoption. The four key suppliers – Trigo Energies, Conserval Engineering, Matrix Energy, and Aéronergie – reported a combined 26,203 m2of collector area sold last year. Several of these providers are optimistic about the growing demand. These findings come from the newly released Canadian Solar Thermal Market Survey 2024, commissioned by Natural Resources Canada. Canada is the global leader in solar air heating. The market is driven by a strong network of experienced system suppliers, optimized technologies, and a few small favorable funding programs – especially in the province of Quebec. Architects and developers are increasingly turning to these cost-effective, façade-integrated systems as a practical solution for reducing onsite natural gas consumption. Despite its cold climate, Canada benefits from strong solar potential with solar irradiance in many areas rivaling or even exceeding that of parts of Europe. This makes solar air heating not only viable, but especially valuable in buildings with high fresh air requirements including schools, hospitals, and offices. The projects highlighted in this article showcase the versatility and relevance of solar air heating across a range of building types, from new constructions to retrofits. Figure 1: Preheating air for industrial buildings: 2,750 m2of Calento SL solar air collectors cover all south-west and south-east facing facades of the FAB3R factory in Trois-Rivières, Quebec. 
The hourly unitary flow rate is set at 41 m3/m2 or 2.23 cfm/ft2 of collector area, at the lower range because only a limited number of intake fans was close enough to the solar façade to avoid long ventilation ductwork. Photo: Trigo Energies Quebec’s solar air heating boom: the Trigo Energies story Trigo Energies makes almost 90 per cent of its sales in Quebec. “We profit from great subsidies, as solar air systems are supported by several organizations in our province – the electricity utility Hydro Quebec, the gas utility Energir and the Ministry of Natural Resources,” explained Christian Vachon, Vice President Technologies and R&D at Trigo Energies. Trigo Energies currently has nine employees directly involved in planning, engineering and installing solar air heating systems and teams up with several partner contractors to install mostly retrofit projects. “A high degree of engineering is required to fit a solar heating system into an existing factory,” emphasized Vachon. “Knowledge about HVAC engineering is as important as experience with solar thermal and architecture.” One recent Trigo installation is at the FAB3R factory in Trois-Rivières. FAB3R specializes in manufacturing, repairing, and refurbishing large industrial equipment. Its air heating and ventilation system needed urgent renovation because of leakages and discomfort for the workers. “Due to many positive references he had from industries in the area, the owner of FAB3R contacted us,” explained Vachon. “The existence of subsidies helped the client to go for a retrofitting project including solar façade at once instead of fixing the problems one bit at a time.” Approximately 50 per cent of the investment costs for both the solar air heating and the renovation of the indoor ventilation system were covered by grants and subsidies. FAB3R profited from an Energir grant targeted at solar preheating, plus an investment subsidy from the Government of Quebec’s EcoPerformance Programme.   
Blue or black, but always efficient: the advanced absorber coating In October 2024, the majority of the new 2,750 m²solar façade at FAB3R began operation. According to Vachon, the system is expected to cover approximately 13 per cent of the factory’s annual heating demand, which is otherwise met by natural gas. Trigo Energies equipped the façade with its high-performance Calento SL collectors, featuring a notable innovation: a selective, low-emissivity coating that withstands outdoor conditions. Introduced by Trigo in 2019 and manufactured by Almeco Group from Italy, this advanced coating is engineered to maximize solar absorption while minimizing heat loss via infrared emission, enhancing the overall efficiency of the system. The high efficiency coating is now standard in Trigo’s air heating systems. According to the manufacturer, the improved collector design shows a 25 to 35 per cent increase in yield over the former generation of solar air collectors with black paint. Testing conducted at Queen’s University confirms this performance advantage. Researchers measured the performance of transpired solar air collectors both with and without a selective coating, mounted side-by-side on a south-facing vertical wall. The results showed that the collectors with the selective coating produced 1.3 to 1.5 times more energy than those without it. In 2024, the monitoring results were jointly published by Queen’s University and Canmat Energy in a paper titled Performance Comparison of a Transpired Air Solar Collector with Low-E Surface Coating. Selective coating, also used on other solar thermal technologies including glazed flat plate or vacuum tube collectors, has a distinctive blue color. Trigo customers can, however, choose between blue and black finishes. “By going from the normal blue selective coating to black selective coating, which Almeco is specially producing for Trigo, we lose about 1 per cent in solar efficiency,” explained Vachon. 
Figure 2: Building-integrated solar air heating façade with MatrixAir collectors at the firehall building in Mont Saint Hilaire, south of Montreal. The 190 m2south-facing wall preheats the fresh air, reducing natural gas consumption by 18 per cent compared to the conventional make-up system. Architect: Leclerc Architecture. Photo: Matrix Energy Matrix Energy: collaborating with architects and engineers in new builds The key target customer group of Matrix Energy are public buildings – mainly new construction. “Since the pandemic, schools are more conscious about fresh air, and solar preheating of the incoming fresh air has a positive impact over the entire school year,” noted Brian Wilkinson, President of Matrix Energy. Matrix Energy supplies systems across Canada, working with local partners to source and process the metal sheets used in their MatrixAir collectors. These metal sheets are perforated and then formed into architectural cladding profiles. The company exclusively offers unglazed, single-stage collectors, citing fire safety concerns associated with polymeric covers. “We have strong relationships with many architects and engineers who appreciate the simplicity and cost-effectiveness of transpired solar air heating systems,” said President Brian Wilkinson, describing the company’s sales approach. “Matrix handles system design and supplies the necessary materials, while installation is carried out by specialized cladding and HVAC contractors overseen by on-site architects and engineers,” Wilkinson added. Finding the right flow: the importance of unitary airflow rates One of the key design factors in solar air heating systems is the amount of air that passes through each square meter of the perforated metal absorber,  known as the unitary airflow rate. The principle is straightforward: higher airflow rates deliver more total heat to the building, while lower flow rates result in higher outlet air temperatures. 
Striking the right balance between air volume and temperature gain is essential for efficient system performance. For unglazed collectors mounted on building façades, typical hourly flow rates should range between 120 and 170, or 6.6 to 9.4 cfm/ft2. However, Wilkinson suggests that an hourly airflow rate of around 130 m³/h/m²offers the best cost-benefit balance for building owners. If the airflow is lower, the system will deliver higher air temperatures, but it would then need a much larger collector area to achieve the same air volume and optimum performance, he explained. It’s also crucial for the flow rate to overcome external wind pressure. As wind passes over the absorber, air flow through the collector’s perforations is reduced, resulting in heat losses to the environment. This effect becomes even more pronounced in taller buildings, where wind exposure is greater. To ensure the system performs well even in these conditions, higher hourly airflow rates typically between 150 and 170 m³/m² are necessary. Figure 3: One of three apartment blocks of the Maple House in Toronto’s Canary District. Around 160 m2of SolarWall collectors clad the two-storey mechanical penthouse on the roof. The rental flats have been occupied since the beginning of 2024. Collaborators: architects-Alliance, Claude Cormier et Associés, Thornton Tomasetti, RWDI, Cole Engineering, DesignAgency, MVShore, BA Group, EllisDon. Photo: Conserval Engineering Solar air heating systems support LEED-certified building designs Solar air collectors are also well-suited for use in multi-unit residential buildings. A prime example is the Canary District in Toronto, where single-stage SolarWall collectors from Conserval Engineering have been installed on several MURBs to clad the mechanical penthouses. 
“These penthouses are an ideal location for our air heating collectors, as they contain the make-up air units that supply corridor ventilation throughout the building,” explained Victoria Hollick, Vice President of Conserval Engineering. “The walls are typically finished with metal façades, which can be seamlessly replaced with a SolarWall system – maintaining the architectural language without disruption.” To date, nine solar air heating systems have been commissioned in the Canary District, covering a total collector area of over 1,000 m². “Our customers have many motivations to integrate SolarWall technology into their new construction or retrofit projects, either carbon reduction, ESG, or green building certification targets,” explained Hollick. The use of solar air collectors in the Canary District was proposed by architects from the Danish firm Cobe. The black-colored SolarWall system preheats incoming air before it is distributed to the building’s corridors and common areas, reducing reliance on natural gas heating and supporting the pursuit of LEED Gold certification. Hollick estimates the amount of gas saved between 10 to 20 per cent of the total heating load for the corridor ventilation of the multi-unit residential buildings. Additional energy-saving strategies include a 50/50 window-to-wall ratio with high-performance glazing, green roofs, high-efficiency mechanical systems, LED lighting, and Energy Star-certified appliances. The ideal orientation for a SolarWall system is due south. However, the systems can be built at any orientation up to 90° east and west, explained Hollick. A SolarWall at 90° would have approximately 60 per cent of the energy production of the same area facing south.Canada’s expertise in solar air heating continues to set a global benchmark, driven by supporting R&D, by innovative technologies, strategic partnerships, and a growing portfolio of high-impact projects. 
With strong policy support and proven performance, solar air heating is poised to play a key role in the country’s energy-efficient building future. Figure 4: Claude-Bechard Building in Quebec is a showcase project for sustainable architecture with a 72 m2Lubi solar air heating wall from Aéronergie. It serves as a regional administrative center. Architectural firm: Goulet et Lebel Architectes. Photo: Art Massif Bärbel Epp is the general manager of the German Agency solrico, whose focus is on solar market research and international communication. The post Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects appeared first on Canadian Architect. #oped #canadas #leadership #solar #air
    WWW.CANADIANARCHITECT.COM
    Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects
    Solar air heating is among the most cost-effective applications of solar thermal energy. These systems are used for space heating and preheating fresh air for ventilation, typically using glazed or unglazed perforated solar collectors. The collectors draw in outside air, heat it using solar energy, and then distribute it through ductwork to meet building heating and fresh air needs. In 2024, Canada led again the world for the at least seventh year in a row in solar air heating adoption. The four key suppliers – Trigo Energies, Conserval Engineering, Matrix Energy, and Aéronergie – reported a combined 26,203 m2 (282,046 ft2) of collector area sold last year. Several of these providers are optimistic about the growing demand. These findings come from the newly released Canadian Solar Thermal Market Survey 2024, commissioned by Natural Resources Canada. Canada is the global leader in solar air heating. The market is driven by a strong network of experienced system suppliers, optimized technologies, and a few small favorable funding programs – especially in the province of Quebec. Architects and developers are increasingly turning to these cost-effective, façade-integrated systems as a practical solution for reducing onsite natural gas consumption. Despite its cold climate, Canada benefits from strong solar potential with solar irradiance in many areas rivaling or even exceeding that of parts of Europe. This makes solar air heating not only viable, but especially valuable in buildings with high fresh air requirements including schools, hospitals, and offices. The projects highlighted in this article showcase the versatility and relevance of solar air heating across a range of building types, from new constructions to retrofits. Figure 1: Preheating air for industrial buildings: 2,750 m2 (29,600 ft2) of Calento SL solar air collectors cover all south-west and south-east facing facades of the FAB3R factory in Trois-Rivières, Quebec. 
The hourly unitary flow rate is set at 41 m3/m2 or 2.23 cfm/ft2 of collector area, at the lower range because only a limited number of intake fans was close enough to the solar façade to avoid long ventilation ductwork. Photo: Trigo Energies Quebec’s solar air heating boom: the Trigo Energies story Trigo Energies makes almost 90 per cent of its sales in Quebec. “We profit from great subsidies, as solar air systems are supported by several organizations in our province – the electricity utility Hydro Quebec, the gas utility Energir and the Ministry of Natural Resources,” explained Christian Vachon, Vice President Technologies and R&D at Trigo Energies. Trigo Energies currently has nine employees directly involved in planning, engineering and installing solar air heating systems and teams up with several partner contractors to install mostly retrofit projects. “A high degree of engineering is required to fit a solar heating system into an existing factory,” emphasized Vachon. “Knowledge about HVAC engineering is as important as experience with solar thermal and architecture.” One recent Trigo installation is at the FAB3R factory in Trois-Rivières. FAB3R specializes in manufacturing, repairing, and refurbishing large industrial equipment. Its air heating and ventilation system needed urgent renovation because of leakages and discomfort for the workers. “Due to many positive references he had from industries in the area, the owner of FAB3R contacted us,” explained Vachon. “The existence of subsidies helped the client to go for a retrofitting project including solar façade at once instead of fixing the problems one bit at a time.” Approximately 50 per cent of the investment costs for both the solar air heating and the renovation of the indoor ventilation system were covered by grants and subsidies. FAB3R profited from an Energir grant targeted at solar preheating, plus an investment subsidy from the Government of Quebec’s EcoPerformance Programme.   
Blue or black, but always efficient: the advanced absorber coating In October 2024, the majority of the new 2,750 m² (29,600 ft2) solar façade at FAB3R began operation (see figure 1). According to Vachon, the system is expected to cover approximately 13 per cent of the factory’s annual heating demand, which is otherwise met by natural gas. Trigo Energies equipped the façade with its high-performance Calento SL collectors, featuring a notable innovation: a selective, low-emissivity coating that withstands outdoor conditions. Introduced by Trigo in 2019 and manufactured by Almeco Group from Italy, this advanced coating is engineered to maximize solar absorption while minimizing heat loss via infrared emission, enhancing the overall efficiency of the system. The high efficiency coating is now standard in Trigo’s air heating systems. According to the manufacturer, the improved collector design shows a 25 to 35 per cent increase in yield over the former generation of solar air collectors with black paint. Testing conducted at Queen’s University confirms this performance advantage. Researchers measured the performance of transpired solar air collectors both with and without a selective coating, mounted side-by-side on a south-facing vertical wall. The results showed that the collectors with the selective coating produced 1.3 to 1.5 times more energy than those without it. In 2024, the monitoring results were jointly published by Queen’s University and Canmat Energy in a paper titled Performance Comparison of a Transpired Air Solar Collector with Low-E Surface Coating. Selective coating, also used on other solar thermal technologies including glazed flat plate or vacuum tube collectors, has a distinctive blue color. Trigo customers can, however, choose between blue and black finishes. “By going from the normal blue selective coating to black selective coating, which Almeco is specially producing for Trigo, we lose about 1 per cent in solar efficiency,” explained Vachon. 
Figure 2: Building-integrated solar air heating façade with MatrixAir collectors at the firehall building in Mont Saint Hilaire, south of Montreal. The 190 m2 (2,045 ft2) south-facing wall preheats the fresh air, reducing natural gas consumption by 18 per cent compared to the conventional make-up system. Architect: Leclerc Architecture. Photo: Matrix Energy Matrix Energy: collaborating with architects and engineers in new builds The key target customer group of Matrix Energy are public buildings – mainly new construction. “Since the pandemic, schools are more conscious about fresh air, and solar preheating of the incoming fresh air has a positive impact over the entire school year,” noted Brian Wilkinson, President of Matrix Energy. Matrix Energy supplies systems across Canada, working with local partners to source and process the metal sheets used in their MatrixAir collectors. These metal sheets are perforated and then formed into architectural cladding profiles. The company exclusively offers unglazed, single-stage collectors, citing fire safety concerns associated with polymeric covers. “We have strong relationships with many architects and engineers who appreciate the simplicity and cost-effectiveness of transpired solar air heating systems,” said President Brian Wilkinson, describing the company’s sales approach. “Matrix handles system design and supplies the necessary materials, while installation is carried out by specialized cladding and HVAC contractors overseen by on-site architects and engineers,” Wilkinson added. Finding the right flow: the importance of unitary airflow rates One of the key design factors in solar air heating systems is the amount of air that passes through each square meter of the perforated metal absorber,  known as the unitary airflow rate. The principle is straightforward: higher airflow rates deliver more total heat to the building, while lower flow rates result in higher outlet air temperatures. 
Striking the right balance between air volume and temperature gain is essential for efficient system performance. For unglazed collectors mounted on building façades, typical hourly flow rates range between 120 and 170 m³/h/m² (6.6 to 9.4 cfm/ft²). However, Wilkinson suggests that an hourly airflow rate of around 130 m³/h/m² (7.2 cfm/ft²) offers the best cost-benefit balance for building owners. If the airflow is lower, the system will deliver higher air temperatures, but it would then need a much larger collector area to achieve the same air volume and optimum performance, he explained.

It is also crucial for the flow rate to overcome external wind pressure. As wind passes over the absorber, air flow through the collector’s perforations is reduced, resulting in heat losses to the environment. This effect becomes even more pronounced in taller buildings, where wind exposure is greater. To ensure the system performs well even in these conditions, higher hourly airflow rates, typically between 150 and 170 m³/h/m² (8.3 to 9.4 cfm/ft²), are necessary.

Figure 3: One of three apartment blocks of the Maple House in Toronto’s Canary District. Around 160 m² (1,722 ft²) of SolarWall collectors clad the two-storey mechanical penthouse on the roof. The rental flats have been occupied since the beginning of 2024. Collaborators: architects-Alliance, Claude Cormier et Associés, Thornton Tomasetti, RWDI, Cole Engineering, DesignAgency, MVShore, BA Group, EllisDon. Photo: Conserval Engineering

Solar air heating systems support LEED-certified building designs

Solar air collectors are also well suited for use in multi-unit residential buildings (MURBs). A prime example is the Canary District in Toronto (see Figure 3), where single-stage SolarWall collectors from Conserval Engineering have been installed on several MURBs to clad the mechanical penthouses.
“These penthouses are an ideal location for our air heating collectors, as they contain the make-up air units that supply corridor ventilation throughout the building,” explained Victoria Hollick, Vice President of Conserval Engineering. “The walls are typically finished with metal façades, which can be seamlessly replaced with a SolarWall system – maintaining the architectural language without disruption.” To date, nine solar air heating systems have been commissioned in the Canary District, covering a total collector area of over 1,000 m² (10,764 ft²). “Our customers have many motivations to integrate SolarWall technology into their new construction or retrofit projects, whether carbon reduction, ESG, or green building certification targets,” explained Hollick.

The use of solar air collectors in the Canary District was proposed by architects from the Danish firm Cobe. The black-colored SolarWall system preheats incoming air before it is distributed to the building’s corridors and common areas, reducing reliance on natural gas heating and supporting the pursuit of LEED Gold certification. Hollick estimates the gas savings at 10 to 20 per cent of the total heating load for the corridor ventilation of the multi-unit residential buildings. Additional energy-saving strategies include a 50/50 window-to-wall ratio with high-performance glazing, green roofs, high-efficiency mechanical systems, LED lighting, and Energy Star-certified appliances.

The ideal orientation for a SolarWall system is due south. However, the systems can be built at any orientation up to 90° east or west, explained Hollick. A SolarWall at 90° would have approximately 60 per cent of the energy production of the same area facing south.

Canada’s expertise in solar air heating continues to set a global benchmark, driven by supporting R&D, innovative technologies, strategic partnerships, and a growing portfolio of high-impact projects.
With strong policy support and proven performance, solar air heating is poised to play a key role in the country’s energy-efficient building future.

Figure 4: The Claude-Béchard Building in Quebec is a showcase project for sustainable architecture, with a 72 m² (775 ft²) Lubi solar air heating wall from Aéronergie. It serves as a regional administrative center. Architectural firm: Goulet et Lebel Architectes. Photo: Art Massif

Bärbel Epp is the general manager of the German agency solrico, whose focus is on solar market research and international communication.

The post Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects appeared first on Canadian Architect.
  • How a planetarium show discovered a spiral at the edge of our solar system

    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half mile long spiral of material drifting along the edge of our solar system.

    “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist.

    Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune, around one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years. 

    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?” 

    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data.

    “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says. 

    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars.

    “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.”

    She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’” 

    While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show.

    More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves. 

    In each simulation, the spiral persisted.

    “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’” 

    An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system.

    As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system.

    “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
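The length figure converts as the article states; a quick check (the AU-to-miles constant is the standard IAU value, everything else comes from the quoted text):

```python
AU_MILES = 92_955_807.3  # one astronomical unit in miles (IAU 149,597,870.7 km)

spiral_au = 15_000                   # spiral length quoted from the paper, in AU
spiral_miles = spiral_au * AU_MILES  # roughly 1.4e12 miles
print(f"{spiral_miles:.2e} miles")   # about 1.4 trillion miles end to end
```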

    “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.”

    It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”

    The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems.

    Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

    In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort Cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, our systems will swap icy comets, flinging some out on new paths.

    Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show.

    “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tip of the solar system to the Milky Way is about 60 degrees. And I said, what are we gonna do when we get outside the Milky Way?’

    “Then Neil deGrasse Tyson goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.'”

    “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds.

    The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way, in collaboration with the National Center for Supercomputing Applications. “That was it,” he says, “and that seemed fantastical.”

    By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies.

    To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought an unprecedented precision to our astronomical map, plotting the distances to 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX.

    The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” 

    The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

    3D visuals aren’t just powerful tools for communicating science, but increasingly crucial for science itself. Software like OpenSpace, an open source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data.

    “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical data in archives that can be reanalyzed in new ways, leading to new discoveries.”

    As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

    Our Oort cloud, a shell of icy bodies that surrounds the solar system and extends one-and-a-half light years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud.

    “New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent. 

    More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. 

    Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.” 

    The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud. 

    For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
    WWW.FASTCOMPANY.COM
    How a planetarium show discovered a spiral at the edge of our solar system
    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half mile long spiral of material drifting along the edge of our solar system. “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist. Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune, around one and a half light years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.  The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?”  To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data. “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. 
And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says.  The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars. “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.” She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’”  While fine-tuning a simulation of the Oort cloud, a vast expanse of ice material leftover from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show (curving, dusty S-shape behind the Sun) [Image: © AMNH] More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the Solar System’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves.  In each simulation, the spiral persisted. “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. 
We just needed the visuals.’”  An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system. [Image: NASA] As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system. “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.” “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.” It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.” The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems. 
Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, the two systems will swap icy comets, flinging some out on new paths. [Image: © AMNH]

Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show. “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tilt of the solar system to the Milky Way is about 60 degrees. And I said, ‘what are we gonna do when we get outside the Milky Way?’”

“Then [the planetarium’s director] Neil deGrasse Tyson goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.’”

“And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds. The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way, in collaboration with the National Center for Supercomputing Applications.
“That was it,” he says, “and that seemed fantastical.” By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies. To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought an unprecedented precision to our astronomical map, plotting the positions and distances of some 1.7 billion stars.

To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX. The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

3D visuals aren’t just powerful tools for communicating science; they are increasingly crucial for science itself. Software like OpenSpace, an open-source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data. “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory.
“There is also a wealth of astronomical data in archives that can be reanalyzed in new ways, leading to new discoveries.” As instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

Our Oort cloud (center), a shell of icy bodies that surrounds the solar system and extends one and a half light-years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud. [Image: © AMNH]

“New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.

More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.” The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud.

For us, here and now, the 1.4 trillion-mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
  • Thermasol’s New Saunas Meld Scandinavian Design With Wellness Innovation

    Wellness is no longer something we seek out. It’s something we’re building into our homes. From infrared panels and cold plunges to entire rooms designed for rest and recovery, the wellness-at-home movement is reshaping how we live and reset. Thermasol, a pioneer in steam and sauna innovation since 1958, embraces this shift with a new collection of Scandinavian-inspired saunas that bring spa-level luxury to both indoor and outdoor spaces in your home. With natural materials, European craftsmanship, and smart features like WiFi-enabled controls and ambient lighting, each sauna is designed to transform daily rituals into immersive moments of restoration.
    Aalto Indoor Sauna
    The new indoor models – Aalto, Astra, and Lumaria – each take a distinct approach to serenity. Aalto, named after the Finnish word for “wave,” features sculptural lines and a seamless blend of wood and glass, creating an organic sense of flow.
    Astra Indoor Sauna
    Astra, derived from the Latin word for “stars,” offers a luminous experience with a sleek, corner-friendly layout that maximizes space without compromising elegance.
    Lumaria Indoor Sauna
    Lumaria, a combination of the words “lumina” (light) and “aria” (melody), lives up to its name by blending gentle illumination and refined design in two flexible sizes. The Medium fits two to three people while the Large fits up to five – perfect for home wellness areas of all shapes and scales.
    Ombra Outdoor Sauna
    Designed by award-winning designer Bojan Črešnar, Thermasol’s five new outdoor saunas feel like stepping away for a short vacation without leaving home. Each model is a study in contrast – bold yet serene, architectural yet inviting. The Ombra uses tinted, reflective glass that offers privacy while maximizing the views. Wavy wood accents add an organic warmth to the otherwise streamlined facade.
    Vue Outdoor Sauna
    The Vue opens up to nature with a striking full-glass front, while the Fortis leans into warmth and durability with thermally modified wood and layered insulation.
    Fortis Outdoor Sauna
    Spectra Outdoor Sauna
    For those with smaller footprints, the Spectra delivers big on style in a compact form, and the Vera is tailored for balconies, rooftops, and garden corners, bringing wellness to even the coziest of spaces.
    With this new collection, Thermasol continues to evolve what wellness can look and feel like at home. These saunas merge design, technology, and tradition in ways that feel both elevated and deeply personal. Whether you’re carving out a moment of stillness indoors or soaking in the quiet of your backyard, these saunas invite you to pause, reset, and reconnect – no spa membership required.
    Vera Outdoor Sauna
    To learn more about Thermasol’s newest saunas and bring tranquility into your own home, visit thermasol.com.
    Imagery courtesy of Thermasol.
  • Astronomers detect most powerful explosions since Big Bang

    An artist's illustration of an unlucky massive star approaching a supermassive black hole. Credit: University of Hawai'i

    At any given time across the universe, massive cosmic bodies are releasing incomprehensible amounts of energy. Stars burn like celestial nuclear fusion reactors, quasars emit thousands of times the luminosity of the Milky Way galaxy, and asteroids slam into planets. But all of these pale in comparison to a new class of events discovered by researchers at the University of Hawai’i’s Institute for Astronomy. According to their findings published June 4 in the journal Science Advances, it’s time to classify the universe’s most energetic explosions as extreme nuclear transients, or ENTs.
    ENTs are as devastating as they are rare. They only occur when a massive star at least three times heavier than the sun drifts too close to a supermassive black hole. The colliding forces subsequently obliterate the star, sending out plumes of energy across huge swaths of space. Similar events known as tidal disruption events (TDEs) are known to occur on a comparatively smaller scale, and have been documented for over a decade. But ENTs are something else entirely.
    “ENTs are different beasts,” study lead author and astronomer Jason Hinkle explained in an accompanying statement. “Not only are ENTs far brighter than normal tidal disruption events, but they remain luminous for years, far surpassing the energy output of even the brightest known supernova explosions.”
    Hinkle was first tipped off to ENTs while looking into transients—long-lasting flares that spew energy from a galaxy’s center. Two particularly strange examples captured by the European Space Agency’s Gaia mission caught his eye. The pair of events brightened over a much longer timeframe than previously documented transients, but lacked some of their usual characteristics.
    “Gaia doesn’t tell you what a transient is, just that something changed in brightness,” Hinkle said. “But when I saw these smooth, long-lived flares from the centers of distant galaxies, I knew we were looking at something unusual.”
    Hinkle soon reached out to observatory teams around the world for what would become a multiyear project to understand these anomalies. In the process, a third suspect was detected by the Zwicky Transient Facility at the Palomar Observatory in San Diego. After months of analysis, Hinkle and collaborators realized they were witnessing something unprecedented.
    An infrared echo tells us that a dusty torus surrounds the central black hole and newly-formed accretion disk. Credit: University of Hawai’i
    The ENTs analyzed by astronomers displayed smoother, longer lasting flares that pointed towards something very particular—a supermassive black hole accreting a giant, wayward star.
    This contrasts with a more standard black hole that typically acquires its material and energy unpredictably, resulting in irregular brightness fluctuations.
    The energy and luminosity of an ENT boggle the mind. The most powerful ENT documented in Hinkle’s study, Gaia18cdj, generated 25 times more energy than the most powerful known supernovae. For reference, a standard supernova puts out as much energy in a single year as the sun does across its entire 10-billion-year lifespan. Gaia18cdj, meanwhile, manages to give off 100 suns’ worth of energy over just 12 months.
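That supernova-versus-Sun comparison can be checked with rough numbers; in the sketch below, the solar luminosity and the ~10^44 J figure for a supernova’s radiated energy are textbook order-of-magnitude values, not figures from Hinkle’s study:

```python
# Rough check: does one supernova's output ~ equal the Sun's lifetime energy output?
SECONDS_PER_YEAR = 3.156e7
SUN_LUMINOSITY_W = 3.828e26      # nominal solar luminosity, watts
SUN_LIFESPAN_YEARS = 1e10        # ~10 billion years

sun_lifetime_energy_j = SUN_LUMINOSITY_W * SUN_LIFESPAN_YEARS * SECONDS_PER_YEAR
supernova_energy_j = 1e44        # typical radiated energy, order of magnitude

print(f"Sun over its lifetime: {sun_lifetime_energy_j:.1e} J")  # ~1.2e44 J
print(f"Supernova / Sun-lifetime ratio: {supernova_energy_j / sun_lifetime_energy_j:.1f}")
```

The ratio lands near 1, which is the article’s point: a single supernova radiates in about a year what the Sun emits over its whole life.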
    The implications of ENTs and their massive energy surges go far beyond their impressive energy outputs. Astronomers believe they contribute to some of the most pivotal events in the cosmos.
    “These ENTs don’t just mark the dramatic end of a massive star’s life. They illuminate the processes responsible for growing the largest black holes in the universe,” said Hinkle.
    From here on Earth, ENTs can also help researchers as they continue studying massive, distant black holes.
    “Because they’re so bright, we can see them across vast cosmic distances—and in astronomy, looking far away means looking back in time,” explained study co-author and astronomer Benjamin Shappee. “By observing these prolonged flares, we gain insights into black hole growth when the universe was half its current age… forming stars and feeding their supermassive black holes 10 times more vigorously than they do today.”
    There’s a catch for astronomers, however. While supernovae are relatively well-documented, ENTs are estimated to occur at least 10 million times less often. This means that further study requires consistent monitoring of the cosmos backed by the support of international governments, astronomical associations, and the public.
  • The Orb Will See You Now

    Once again, Sam Altman wants to show you the future. The CEO of OpenAI is standing on a sparse stage in San Francisco, preparing to reveal his next move to an attentive crowd. “We needed some way for identifying, authenticating humans in the age of AGI,” Altman explains, referring to artificial general intelligence. “We wanted a way to make sure that humans stayed special and central.” The solution Altman came up with is looming behind him. It’s a white sphere about the size of a beach ball, with a camera at its center. The company that makes it, known as Tools for Humanity, calls this mysterious device the Orb. Stare into the heart of the plastic-and-silicon globe and it will map the unique furrows and ciliary zones of your iris. Seconds later, you’ll receive inviolable proof of your humanity: a 12,800-digit binary number, known as an iris code, sent to an app on your phone. At the same time, a packet of cryptocurrency called Worldcoin, worth approximately will be transferred to your digital wallet—your reward for becoming a “verified human.”

    Altman co-founded Tools for Humanity in 2019 as part of a suite of companies he believed would reshape the world. Once the tech he was developing at OpenAI passed a certain level of intelligence, he reasoned, it would mark the end of one era on the Internet and the beginning of another, in which AI became so advanced, so human-like, that you would no longer be able to tell whether what you read, saw, or heard online came from a real person. When that happened, Altman imagined, we would need a new kind of online infrastructure: a human-verification layer for the Internet, to distinguish real people from the proliferating number of bots and AI “agents.” And so Tools for Humanity set out to build a global “proof-of-humanity” network. It aims to verify 50 million people by the end of 2025; ultimately its goal is to sign up every single human being on the planet.
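Iris codes of this kind are conventionally compared by fractional Hamming distance, as in Daugman-style iris recognition: two scans of different eyes disagree on roughly half their bits, while two scans of the same eye disagree far less. The sketch below is a generic illustration using the article’s 12,800-bit figure; the noise model and threshold logic are assumptions, not Tools for Humanity’s actual matching protocol:

```python
import random

BITS = 12_800  # iris-code length cited in the article

def hamming_fraction(a: int, b: int, bits: int = BITS) -> float:
    """Fraction of bit positions at which two iris codes differ."""
    return bin(a ^ b).count("1") / bits

random.seed(42)
code_a = random.getrandbits(BITS)   # one person's iris code
code_b = random.getrandbits(BITS)   # an unrelated code: expect ~50% of bits to differ

# Re-scan of the same eye, modeled by flipping ~12.5% of bits at random
noise = random.getrandbits(BITS) & random.getrandbits(BITS) & random.getrandbits(BITS)
rescan_a = code_a ^ noise

print(round(hamming_fraction(code_a, code_b), 2))    # near 0.5: different people
print(round(hamming_fraction(code_a, rescan_a), 2))  # well below 0.5: same eye
```

The large bit count is what makes the comparison sharp: at 12,800 bits, the “different people” distribution clusters tightly around 0.5, so a low-distance match is strong evidence of the same iris.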
The free crypto serves both as an incentive for users to sign up and as an entry point into what the company hopes will become the world’s largest financial network, through which it believes “double-digit percentages of the global economy” will eventually flow. Even for Altman, these missions are audacious. “If this really works, it’s like a fundamental piece of infrastructure for the world,” Altman tells TIME in a video interview from the passenger seat of a car a few days before his April 30 keynote address.

Internal hardware of the Orb in mid-assembly in March. [Photo: Davide Monteleone for TIME]

The project’s goal is to solve a problem partly of Altman’s own making. In the near future, he and other tech leaders say, advanced AIs will be imbued with agency: the ability to not just respond to human prompting, but to take actions independently in the world. This will enable the creation of AI coworkers that can drop into your company and begin solving problems; AI tutors that can adapt their teaching style to students’ preferences; even AI doctors that can diagnose routine cases and handle scheduling or logistics. The arrival of these virtual agents, their venture capitalist backers predict, will turbocharge our productivity and unleash an age of material abundance.

But AI agents will also have cascading consequences for the human experience online. “As AI systems become harder to distinguish from people, websites may face difficult trade-offs,” says a recent paper by researchers from 25 different universities, nonprofits, and tech companies, including OpenAI. “There is a significant risk that digital institutions will be unprepared for a time when AI-powered agents, including those leveraged by malicious actors, overwhelm other activity online.” On social-media platforms like X and Facebook, bot-driven accounts are amassing billions of views on AI-generated content.
In April, the foundation that runs Wikipedia disclosed that AI bots scraping its site were making the encyclopedia too costly to sustainably run. Later the same month, researchers from the University of Zurich found that AI-generated comments on the subreddit /r/ChangeMyView were up to six times more successful than human-written ones at persuading unknowing users to change their minds.

The arrival of agents won’t only threaten our ability to distinguish between authentic and AI content online. It will also challenge the Internet’s core business model, online advertising, which relies on the assumption that ads are being viewed by humans. “The Internet will change very drastically sometime in the next 12 to 24 months,” says Tools for Humanity CEO Alex Blania. “So we have to succeed, or I’m not sure what else would happen.”

For four years, Blania’s team has been testing the Orb’s hardware abroad. Now the U.S. rollout has arrived. Over the next 12 months, 7,500 Orbs will be arriving in dozens of American cities, in locations like gas stations, bodegas, and flagship stores in Los Angeles, Austin, and Miami. The project’s founders and fans hope the Orb’s U.S. debut will kickstart a new phase of growth. The San Francisco keynote was titled “At Last.” It’s not clear the public appetite matches the exultant branding. Tools for Humanity has “verified” just 12 million humans since mid-2023, a pace Blania concedes is well behind schedule. Few online platforms currently support the so-called “World ID” that the Orb bestows upon its visitors, leaving little to entice users to give up their biometrics beyond the lure of free crypto. Even Altman isn’t sure whether the whole thing can work. “I can see this becomes a fairly mainstream thing in a few years,” he says.
“Or I can see that it’s still only used by a small subset of people who think about the world in a certain way.”

Blania and Altman debut the Orb at World’s U.S. launch in San Francisco on April 30, 2025. Jason Henry—The New York Times/Redux

Yet as the Internet becomes overrun with AI, the creators of this strange new piece of hardware are betting that everybody in the world will soon want—or need—to visit an Orb. The biometric code it creates, they predict, will become a new type of digital passport, without which you might be denied passage to the Internet of the future, from dating apps to government services. In a best-case scenario, World ID could be a privacy-preserving way to fortify the Internet against an AI-driven deluge of fake or deceptive content. It could also enable the distribution of universal basic income—a policy that Altman has previously touted—as AI automation transforms the global economy. To examine what this new technology might mean, I reported from three continents, interviewed 10 Tools for Humanity executives and investors, reviewed hundreds of pages of company documents, and “verified” my own humanity.

The Internet will inevitably need some kind of proof-of-humanity system in the near future, says Divya Siddarth, founder of the nonprofit Collective Intelligence Project. The real question, she argues, is whether such a system will be centralized—“a big security nightmare that enables a lot of surveillance”—or privacy-preserving, as the Orb claims to be. Questions remain about Tools for Humanity’s corporate structure, its yoking to an unstable cryptocurrency, and what power it would concentrate in the hands of its owners if successful. Yet it’s also one of the only attempts to solve what many see as an increasingly urgent problem. “There are some issues with it,” Siddarth says of World ID. “But you can’t preserve the Internet in amber.
Something in this direction is necessary.”

In March, I met Blania at Tools for Humanity’s San Francisco headquarters, where a large screen displays the number of weekly “Orb verifications” by country. A few days earlier, the CEO had attended a million-per-head dinner at Mar-a-Lago with President Donald Trump, whom he credits with clearing the way for the company’s U.S. launch by relaxing crypto regulations. “Given Sam is a very high profile target,” Blania says, “we just decided that we would let other companies fight that fight, and enter the U.S. once the air is clear.”

As a kid growing up in Germany, Blania was a little different from his peers. “Other kids were, like, drinking a lot, or doing a lot of parties, and I was just building a lot of things that could potentially blow up,” he recalls. At the California Institute of Technology, where he was pursuing research for a master’s degree, he spent many evenings reading the blogs of startup gurus like Paul Graham and Altman. Then, in 2019, Blania received an email from Max Novendstern, an entrepreneur who had been kicking around a concept with Altman to build a global cryptocurrency network. They were looking for technical minds to help with the project. Over cappuccinos, Altman told Blania he was certain about three things. First, smarter-than-human AI was not only possible, but inevitable—and it would soon mean you could no longer assume that anything you read, saw, or heard on the Internet was human-created. Second, cryptocurrency and other decentralized technologies would be a massive force for change in the world. And third, scale was essential to any crypto network’s value.

The Orb is tested on a calibration rig, surrounded by checkerboard targets to ensure precision in iris detection. Davide Monteleone for TIME

The goal of Worldcoin, as the project was initially called, was to combine those three insights. Altman took a lesson from PayPal, the company co-founded by his mentor Peter Thiel.
Of its initial funding, PayPal spent only a small share actually building its app—and pumped millions more into a referral program, whereby new users and the person who invited them would each receive a bonus in credit. The referral program helped make PayPal a leading payment platform. Altman thought a version of that strategy would propel Worldcoin to similar heights. He wanted to create a new cryptocurrency and give it to users as a reward for signing up. The more people who joined the system, the higher the token’s value would theoretically rise.

Since 2019, the project has raised hundreds of millions of dollars from investors like Coinbase and the venture capital firm Andreessen Horowitz. That money paid for the cost of designing the Orb, plus maintaining the software it runs on. The total market value of all Worldcoins in existence, however, is far higher—in the billions of dollars. That number is a bit misleading: most of those coins are not in circulation, and Worldcoin’s price has fluctuated wildly. Still, it allows the company to reward users for signing up at no cost to itself. The main lure for investors is the crypto upside. Some 75% of all Worldcoins are set aside for humans to claim when they sign up, or as referral bonuses. The remaining 25% are split between Tools for Humanity’s backers and staff, including Blania and Altman. “I’m really excited to make a lot of money,” Blania says.

From the beginning, Altman was thinking about the consequences of the AI revolution he intended to unleash. A future in which advanced AI could perform most tasks more effectively than humans would bring a wave of unemployment and economic dislocation, he reasoned. Some kind of wealth redistribution might be necessary. In 2016, he partially funded a study of basic income, which gave monthly cash handouts to low-income individuals in Illinois and Texas. But there was no single financial system that would allow money to be sent to everybody in the world.
Nor was there a way to stop an individual human from claiming their share twice—or to identify a sophisticated AI pretending to be human and pocketing some cash of its own. In 2023, Tools for Humanity raised the possibility of using the network to redistribute the profits of AI labs that were able to automate human labor. “As AI advances,” it said, “fairly distributing access and some of the created value through UBI will play an increasingly vital role in counteracting the concentration of economic power.”

Blania was taken by the pitch, and agreed to join the project as a co-founder. “Most people told us we were very stupid or crazy or insane, including Silicon Valley investors,” Blania says. At least until ChatGPT came out in 2022, transforming OpenAI into one of the world’s most famous tech companies and kickstarting a market bull run. “Things suddenly started to make more and more sense to the external world,” Blania says of the vision to develop a global “proof-of-humanity” network. “You have to imagine a world in which you will have very smart and competent systems somehow flying through the Internet with different goals and ideas of what they want to do, and us having no idea anymore what we’re dealing with.”

After our interview, Blania’s head of communications ushers me over to a circular wooden structure where eight Orbs face one another. The scene feels like a cross between an Apple Store and a ceremonial altar. “Do you want to get verified?” she asks. Putting aside my reservations for the purposes of research, I download the World App and follow its prompts. I flash a QR code at the Orb, then gaze into it. A minute or so later, my phone buzzes with confirmation: I’ve been issued my own personal World ID and some Worldcoin.

The first thing the Orb does is check if you’re human, using a neural network that takes input from various sensors, including an infrared camera and a thermometer.
Davide Monteleone for TIME

While I stared into the Orb, several complex procedures had taken place at once. A neural network took inputs from multiple sensors—an infrared camera, a thermometer—to confirm I was a living human. Simultaneously, a telephoto lens zoomed in on my iris, capturing the physical traits within that distinguish me from every other human on Earth. It then converted that image into an iris code: a numerical abstraction of my unique biometric data. Then the Orb checked whether my iris code matched any it had seen before, using a technique that allows encrypted data to be compared without revealing the underlying information. Before the Orb deleted my data, it turned my iris code into several derivative codes—none of which on its own can be linked back to the original—encrypted them, deleted the only copies of the decryption keys, and sent each one to a different secure server, so that future users’ iris codes can be checked for uniqueness against mine. If I were to use my World ID to access a website, that site would learn nothing about me except that I’m human. The Orb is open-source, so outside experts can examine its code and verify the company’s privacy claims. “I did a colonoscopy on this company and these technologies before I agreed to join,” says Trevor Traina, a Trump donor and former U.S. ambassador to Austria who now serves as Tools for Humanity’s chief business officer. “It is the most privacy-preserving technology on the planet.”

Only weeks later, when researching what would happen if I wanted to delete my data, do I discover that Tools for Humanity’s privacy claims rest on what feels like a sleight of hand. The company argues that in modifying your iris code, it has “effectively anonymized” your biometric data. If you ask Tools for Humanity to delete your iris codes, it will delete the one stored on your phone, but not the derivatives. Those, it argues, are no longer your personal data at all.
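The splitting step described here—several derivative codes, each sent to a different server, none individually linkable to the original—resembles cryptographic secret sharing. A toy sketch of that idea in Python using XOR shares; the real system's cryptography is far more involved, and the code sizes and share count below are purely illustrative:

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def split_into_shares(code: bytes, n: int = 3) -> list[bytes]:
    """Split a code into n XOR shares: each share alone is uniformly
    random noise, but XOR-ing all n together recovers the original."""
    shares = [secrets.token_bytes(len(code)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, code))  # final share completes the XOR
    return shares

iris_code = secrets.token_bytes(1_600)   # 12,800 bits: stand-in for an iris code
shares = split_into_shares(iris_code)

# No single server's share reveals anything about the code...
print(shares[0] != iris_code)                    # True
# ...but all shares combined reconstruct it exactly.
print(reduce(xor_bytes, shares) == iris_code)    # True
```

The point of schemes in this family is that a breach of any one server yields only random-looking bytes; comparing codes without reassembling them requires the servers to run a joint computation over their shares.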
But if I were to return to an Orb after deleting my data, it would still recognize those codes as uniquely mine. Once you look into the Orb, a piece of your identity remains in the system forever. If users could truly delete that data, the premise of one ID per human would collapse, Tools for Humanity’s chief privacy officer Damien Kieran tells me when I call seeking an explanation. People could delete and sign up for new World IDs after being suspended from a platform. Or claim their Worldcoin tokens, sell them, delete their data, and cash in again. This argument fell flat with European Union regulators in Germany, who recently declared that the Orb posed “fundamental data protection issues” and ordered the company to allow European users to fully delete even their anonymized data.

“Just like any other technology service, users cannot delete data that is not personal data,” Kieran said in a statement. “If a person could delete anonymized data that can’t be linked to them by World or any third party, it would allow bad actors to circumvent the security and safety that World ID is working to bring to every human.”

On a balmy afternoon this spring, I climb a flight of stairs up to a room above a restaurant in an outer suburb of Seoul. Five elderly South Koreans tap on their phones as they wait to be “verified” by the two Orbs in the center of the room. “We don’t really know how to distinguish between AI and humans anymore,” an attendant in a company t-shirt explains in Korean, gesturing toward the spheres. “We need a way to verify that we’re human and not AI. So how do we do that? Well, humans have irises, but AI doesn’t.”

The attendant ushers an elderly woman over to an Orb. It bleeps. “Open your eyes,” a disembodied voice says in English. The woman stares into the camera. Seconds later, she checks her phone and sees that a packet of Worldcoin worth 75,000 Korean won has landed in her digital wallet. Congratulations, the app tells her.
You are now a verified human.

A visitor views the Orbs in Seoul on April 14, 2025. Taemin Ha for TIME

Tools for Humanity aims to “verify” 1 million Koreans over the next year. Taemin Ha for TIME

A couple dozen Orbs have been available in South Korea since 2023, verifying roughly 55,000 people. Now Tools for Humanity is redoubling its efforts there. At an event in a traditional wooden hanok house in central Seoul, an executive announces that 250 Orbs will soon be dispersed around the country—with the aim of verifying 1 million Koreans in the next 12 months. South Korea has high levels of smartphone usage, crypto and AI adoption, and Internet access, while average wages are modest enough for the free Worldcoin on offer to still be an enticing draw—all of which makes it fertile testing ground for the company’s ambitious global expansion.

Yet things seem off to a slow start. In a retail space I visited in central Seoul, Tools for Humanity had constructed a wooden structure with eight Orbs facing each other. Locals and tourists wander past looking bemused; few volunteer themselves up. Most who do tell me they are crypto enthusiasts who came intentionally, driven more by the spirit of early adoption than the free coins. The next day, I visit a coffee shop in central Seoul where a chrome Orb sits unassumingly in one corner. Wu Ruijun, a 20-year-old student from China, strikes up a conversation with the barista, who doubles as the Orb’s operator. Wu was invited here by a friend who said both could claim free cryptocurrency if he signed up. The barista speeds him through the process. Wu accepts the privacy disclosure without reading it, and widens his eyes for the Orb. Soon he’s verified. “I wasn’t told anything about the privacy policy,” he says on his way out. “I just came for the money.”

As Altman’s car winds through San Francisco, I ask about the vision he laid out in 2019: that AI would make it harder for us to trust each other online.
To my surprise, he rejects the framing. “I’m much more like: what is the good we can create, rather than the bad we can stop?” he says. “It’s not like, ‘Oh, we’ve got to avoid the bot overrun’ or whatever. It’s just that we can do a lot of special things for humans.” It’s an answer that may reflect how his role has changed over the years. Altman is now the chief public cheerleader of a multibillion-dollar company that’s touting the transformative utility of AI agents. The rise of agents, he and others say, will be a boon for our quality of life—like having an assistant on hand who can answer your most pressing questions, carry out mundane tasks, and help you develop new skills. It’s an optimistic vision that may well pan out. But it doesn’t quite fit with the prophecies of AI-enabled infopocalypse that Tools for Humanity was founded upon.

Altman waves away a question about the influence he and other investors stand to gain if their vision is realized. Most holders, he assumes, will have already started selling their tokens—too early, he adds. “What I think would be bad is if an early crew had a lot of control over the protocol,” he says, “and that’s where I think the commitment to decentralization is so cool.” Altman is referring to the World Protocol, the underlying technology upon which the Orb, Worldcoin, and World ID all rely. Tools for Humanity is developing it, but has committed to giving control to its users over time—a process they say will prevent power from being concentrated in the hands of a few executives or investors. Tools for Humanity would remain a for-profit company, and could levy fees on platforms that use World ID, but other companies would be able to compete for customers by building alternative apps—or even alternative Orbs.
The plan draws on ideas that animated the crypto ecosystem in the late 2010s and early 2020s, when evangelists for emerging blockchain technologies argued that the centralization of power—especially in large so-called “Web 2.0” tech companies—was responsible for many of the problems plaguing the modern Internet. Just as decentralized cryptocurrencies could reform a financial system controlled by economic elites, so too would it be possible to create decentralized organizations, run by their members instead of CEOs. How such a system might work in practice remains unclear. “Building a community-based governance system,” Tools for Humanity says in a 2023 white paper, “represents perhaps the most formidable challenge of the entire project.”

Altman has a pattern of making idealistic promises that shift over time. He founded OpenAI as a nonprofit in 2015, with a mission to develop AGI safely and for the benefit of all humanity. To raise money, OpenAI restructured itself as a for-profit company in 2019, but with overall control still in the hands of its nonprofit board. Last year, Altman proposed yet another restructure—one which would dilute the board’s control and allow more profits to flow to shareholders. Why, I ask, should the public trust Tools for Humanity’s commitment to freely surrender influence and power? “I think you will just see the continued decentralization via the protocol,” he says. “The value here is going to live in the network, and the network will be owned and governed by a lot of people.”

Altman talks less about universal basic income these days. He recently mused about an alternative, which he called “universal basic compute.” Instead of AI companies redistributing their profits, he seemed to suggest, they could give everyone in the world fair access to super-powerful AI. Blania tells me he recently “made the decision to stop talking” about UBI at Tools for Humanity. “UBI is one potential answer,” he says.
“Just giving access to the latest models and having them learn faster and better is another.” Says Altman: “I still don’t know what the right answer is. I believe we should do a better job of distribution of resources than we currently do.”

When I probe the question of why people should trust him, Altman gets irritated. “I understand that you hate AI, and that’s fine,” he says. “If you want to frame it as the downside of AI is that there’s going to be a proliferation of very convincing AI systems that are pretending to be human, and we need ways to know what is really human-authorized versus not, then yeah, I think you can call that a downside of AI. It’s not how I would naturally frame it.”

The phrase human-authorized hints at a tension between World ID and OpenAI’s plans for AI agents. An Internet where a World ID is required to access most services might impede the usefulness of the agents that OpenAI and others are developing. So Tools for Humanity is building a system that would allow users to delegate their World ID to an agent, allowing the bot to take actions online on their behalf, according to Tiago Sada, the company’s chief product officer. “We’ve built everything in a way that can be very easily delegatable to an agent,” Sada says. It’s a measure that would allow humans to be held accountable for the actions of their AIs. But it suggests that Tools for Humanity’s mission may be shifting beyond simply proving humanity, and toward becoming the infrastructure that enables AI agents to proliferate with human authorization. World ID doesn’t tell you whether a piece of content is AI-generated or human-generated; all it tells you is whether the account that posted it is a human or a bot. Even in a world where everybody had a World ID, our online spaces might still be filled with AI-generated text, images, and videos.

As I say goodbye to Altman, I’m left feeling conflicted about his project.
If the Internet is going to be transformed by AI agents, then some kind of proof-of-humanity system will almost certainly be necessary. Yet if the Orb becomes a piece of Internet infrastructure, it could give Altman—a beneficiary of the proliferation of AI content—significant influence over a leading defense mechanism against it. People might have no choice but to participate in the network in order to access social media or online services.

I thought of an encounter I witnessed in Seoul. In the room above the restaurant, Cho Jeong-yeon, 75, watched her friend get verified by an Orb. Cho had been invited to do the same, but demurred. The reward wasn’t enough for her to surrender a part of her identity. “Your iris is uniquely yours, and we don’t really know how it might be used,” she says. “Seeing the machine made me think: are we becoming machines instead of humans now? Everything is changing, and we don’t know how it’ll all turn out.”

—With reporting by Stephen Kim/Seoul. This story was supported by Tarbell Grants.

Correction, May 30: The original version of this story misstated the market capitalization of Worldcoin if all coins were in circulation.
    The Orb Will See You Now
But if I were to return to an Orb after deleting my data, it would still recognize those codes as uniquely mine. Once you look into the Orb, a piece of your identity remains in the system forever. If users could truly delete that data, the premise of one ID per human would collapse, Tools for Humanity’s chief privacy officer Damien Kieran tells me when I call seeking an explanation. People could delete and sign up for new World IDs after being suspended from a platform. Or claim their Worldcoin tokens, sell them, delete their data, and cash in again. This argument fell flat with European Union regulators in Germany, who recently declared that the Orb posed “fundamental data protection issues” and ordered the company to allow European users to fully delete even their anonymized data.“Just like any other technology service, users cannot delete data that is not personal data,” Kieran said in a statement. “If a person could delete anonymized data that can’t be linked to them by World or any third party, it would allow bad actors to circumvent the security and safety that World ID is working to bring to every human.”On a balmy afternoon this spring, I climb a flight of stairs up to a room above a restaurant in an outer suburb of Seoul. Five elderly South Koreans tap on their phones as they wait to be “verified” by the two Orbs in the center of the room. “We don’t really know how to distinguish between AI and humans anymore,” an attendant in a company t-shirt explains in Korean, gesturing toward the spheres. “We need a way to verify that we’re human and not AI. So how do we do that? Well, humans have irises, but AI doesn’t.”The attendant ushers an elderly woman over to an Orb. It bleeps. “Open your eyes,” a disembodied voice says in English. The woman stares into the camera. Seconds later, she checks her phone and sees that a packet of Worldcoin worth 75,000 Korean wonhas landed in her digital wallet. Congratulations, the app tells her. 
You are now a verified human.A visitor views the Orbs in Seoul on April 14, 2025. Taemin Ha for TIMETools for Humanity aims to “verify” 1 million Koreans over the next year. Taemin Ha for TIMEA couple dozen Orbs have been available in South Korea since 2023, verifying roughly 55,000 people. Now Tools for Humanity is redoubling its efforts there. At an event in a traditional wooden hanok house in central Seoul, an executive announces that 250 Orbs will soon be dispersed around the country—with the aim of verifying 1 million Koreans in the next 12 months. South Korea has high levels of smartphone usage, crypto and AI adoption, and Internet access, while average wages are modest enough for the free Worldcoin on offer to still be an enticing draw—all of which makes it fertile testing ground for the company’s ambitious global expansion. Yet things seem off to a slow start. In a retail space I visited in central Seoul, Tools for Humanity had constructed a wooden structure with eight Orbs facing each other. Locals and tourists wander past looking bemused; few volunteer themselves up. Most who do tell me they are crypto enthusiasts who came intentionally, driven more by the spirit of early adoption than the free coins. The next day, I visit a coffee shop in central Seoul where a chrome Orb sits unassumingly in one corner. Wu Ruijun, a 20-year-old student from China, strikes up a conversation with the barista, who doubles as the Orb’s operator. Wu was invited here by a friend who said both could claim free cryptocurrency if he signed up. The barista speeds him through the process. Wu accepts the privacy disclosure without reading it, and widens his eyes for the Orb. Soon he’s verified. “I wasn’t told anything about the privacy policy,” he says on his way out. “I just came for the money.”As Altman’s car winds through San Francisco, I ask about the vision he laid out in 2019: that AI would make it harder for us to trust each other online. 
To my surprise, he rejects the framing. “I’m much morelike: what is the good we can create, rather than the bad we can stop?” he says. “It’s not like, ‘Oh, we’ve got to avoid the bot overrun’ or whatever. It’s just that we can do a lot of special things for humans.” It’s an answer that may reflect how his role has changed over the years. Altman is now the chief public cheerleader of a billion company that’s touting the transformative utility of AI agents. The rise of agents, he and others say, will be a boon for our quality of life—like having an assistant on hand who can answer your most pressing questions, carry out mundane tasks, and help you develop new skills. It’s an optimistic vision that may well pan out. But it doesn’t quite fit with the prophecies of AI-enabled infopocalypse that Tools for Humanity was founded upon.Altman waves away a question about the influence he and other investors stand to gain if their vision is realized. Most holders, he assumes, will have already started selling their tokens—too early, he adds. “What I think would be bad is if an early crew had a lot of control over the protocol,” he says, “and that’s where I think the commitment to decentralization is so cool.” Altman is referring to the World Protocol, the underlying technology upon which the Orb, Worldcoin, and World ID all rely. Tools for Humanity is developing it, but has committed to giving control to its users over time—a process they say will prevent power from being concentrated in the hands of a few executives or investors. Tools for Humanity would remain a for-profit company, and could levy fees on platforms that use World ID, but other companies would be able to compete for customers by building alternative apps—or even alternative Orbs. 
The plan draws on ideas that animated the crypto ecosystem in the late 2010s and early 2020s, when evangelists for emerging blockchain technologies argued that the centralization of power—especially in large so-called “Web 2.0” tech companies—was responsible for many of the problems plaguing the modern Internet. Just as decentralized cryptocurrencies could reform a financial system controlled by economic elites, so too would it be possible to create decentralized organizations, run by their members instead of CEOs. How such a system might work in practice remains unclear. “Building a community-based governance system,” Tools for Humanity says in a 2023 white paper, “represents perhaps the most formidable challenge of the entire project.”

Altman has a pattern of making idealistic promises that shift over time. He founded OpenAI as a nonprofit in 2015, with a mission to develop AGI safely and for the benefit of all humanity. To raise money, OpenAI restructured itself as a for-profit company in 2019, but with overall control still in the hands of its nonprofit board. Last year, Altman proposed yet another restructure—one which would dilute the board’s control and allow more profits to flow to shareholders. Why, I ask, should the public trust Tools for Humanity’s commitment to freely surrender influence and power? “I think you will just see the continued decentralization via the protocol,” he says. “The value here is going to live in the network, and the network will be owned and governed by a lot of people.” Altman talks less about universal basic income these days. He recently mused about an alternative, which he called “universal basic compute.” Instead of AI companies redistributing their profits, he seemed to suggest, they could instead give everyone in the world fair access to super-powerful AI. Blania tells me he recently “made the decision to stop talking” about UBI at Tools for Humanity. “UBI is one potential answer,” he says.
“Just giving access to the latest models and having them learn faster and better is another.” Says Altman: “I still don’t know what the right answer is. I believe we should do a better job of distribution of resources than we currently do.” When I probe the question of why people should trust him, Altman gets irritated. “I understand that you hate AI, and that’s fine,” he says. “If you want to frame it as the downside of AI is that there’s going to be a proliferation of very convincing AI systems that are pretending to be human, and we need ways to know what is really human-authorized versus not, then yeah, I think you can call that a downside of AI. It’s not how I would naturally frame it.” The phrase human-authorized hints at a tension between World ID and OpenAI’s plans for AI agents. An Internet where a World ID is required to access most services might impede the usefulness of the agents that OpenAI and others are developing. So Tools for Humanity is building a system that would allow users to delegate their World ID to an agent, allowing the bot to take actions online on their behalf, according to Tiago Sada, the company’s chief product officer. “We’ve built everything in a way that can be very easily delegatable to an agent,” Sada says. It’s a measure that would allow humans to be held accountable for the actions of their AIs. But it suggests that Tools for Humanity’s mission may be shifting beyond simply proving humanity, and toward becoming the infrastructure that enables AI agents to proliferate with human authorization. World ID doesn’t tell you whether a piece of content is AI-generated or human-generated; all it tells you is whether the account that posted it is a human or a bot. Even in a world where everybody had a World ID, our online spaces might still be filled with AI-generated text, images, and videos.

As I say goodbye to Altman, I’m left feeling conflicted about his project.
If the Internet is going to be transformed by AI agents, then some kind of proof-of-humanity system will almost certainly be necessary. Yet if the Orb becomes a piece of Internet infrastructure, it could give Altman—a beneficiary of the proliferation of AI content—significant influence over a leading defense mechanism against it. People might have no choice but to participate in the network in order to access social media or online services.

I thought of an encounter I witnessed in Seoul. In the room above the restaurant, Cho Jeong-yeon, 75, watched her friend get verified by an Orb. Cho had been invited to do the same, but demurred. The reward wasn’t enough for her to surrender a part of her identity. “Your iris is uniquely yours, and we don’t really know how it might be used,” she says. “Seeing the machine made me think: are we becoming machines instead of humans now? Everything is changing, and we don’t know how it’ll all turn out.”

—With reporting by Stephen Kim/Seoul. This story was supported by Tarbell Grants.

Correction, May 30: The original version of this story misstated the market capitalization of Worldcoin if all coins were in circulation. It is $12 billion.
    TIME.COM
    The Orb Will See You Now
Once again, Sam Altman wants to show you the future. The CEO of OpenAI is standing on a sparse stage in San Francisco, preparing to reveal his next move to an attentive crowd. “We needed some way for identifying, authenticating humans in the age of AGI,” Altman explains, referring to artificial general intelligence. “We wanted a way to make sure that humans stayed special and central.” The solution Altman came up with is looming behind him. It’s a white sphere about the size of a beach ball, with a camera at its center. The company that makes it, known as Tools for Humanity, calls this mysterious device the Orb. Stare into the heart of the plastic-and-silicon globe and it will map the unique furrows and ciliary zones of your iris. Seconds later, you’ll receive inviolable proof of your humanity: a 12,800-digit binary number, known as an iris code, sent to an app on your phone. At the same time, a packet of cryptocurrency called Worldcoin, worth approximately $42, will be transferred to your digital wallet—your reward for becoming a “verified human.” Altman co-founded Tools for Humanity in 2019 as part of a suite of companies he believed would reshape the world. Once the tech he was developing at OpenAI passed a certain level of intelligence, he reasoned, it would mark the end of one era on the Internet and the beginning of another, in which AI became so advanced, so human-like, that you would no longer be able to tell whether what you read, saw, or heard online came from a real person. When that happened, Altman imagined, we would need a new kind of online infrastructure: a human-verification layer for the Internet, to distinguish real people from the proliferating number of bots and AI “agents.”

And so Tools for Humanity set out to build a global “proof-of-humanity” network. It aims to verify 50 million people by the end of 2025; ultimately its goal is to sign up every single human being on the planet.
The free crypto serves as both an incentive for users to sign up and an entry point into what the company hopes will become the world’s largest financial network, through which it believes “double-digit percentages of the global economy” will eventually flow. Even for Altman, these missions are audacious. “If this really works, it’s like a fundamental piece of infrastructure for the world,” Altman tells TIME in a video interview from the passenger seat of a car a few days before his April 30 keynote address.

Internal hardware of the Orb in mid-assembly in March. Davide Monteleone for TIME

The project’s goal is to solve a problem partly of Altman’s own making. In the near future, he and other tech leaders say, advanced AIs will be imbued with agency: the ability to not just respond to human prompting, but to take actions independently in the world. This will enable the creation of AI coworkers that can drop into your company and begin solving problems; AI tutors that can adapt their teaching style to students’ preferences; even AI doctors that can diagnose routine cases and handle scheduling or logistics. The arrival of these virtual agents, their venture capitalist backers predict, will turbocharge our productivity and unleash an age of material abundance.

But AI agents will also have cascading consequences for the human experience online. “As AI systems become harder to distinguish from people, websites may face difficult trade-offs,” says a recent paper by researchers from 25 different universities, nonprofits, and tech companies, including OpenAI. “There is a significant risk that digital institutions will be unprepared for a time when AI-powered agents, including those leveraged by malicious actors, overwhelm other activity online.” On social-media platforms like X and Facebook, bot-driven accounts are amassing billions of views on AI-generated content.
In April, the foundation that runs Wikipedia disclosed that AI bots scraping its site were making the encyclopedia too costly to sustainably run. Later the same month, researchers from the University of Zurich found that AI-generated comments on the subreddit /r/ChangeMyView were up to six times more successful than human-written ones at persuading unknowing users to change their minds.

Photograph by Davide Monteleone for TIME

The arrival of agents won’t only threaten our ability to distinguish between authentic and AI content online. It will also challenge the Internet’s core business model, online advertising, which relies on the assumption that ads are being viewed by humans. “The Internet will change very drastically sometime in the next 12 to 24 months,” says Tools for Humanity CEO Alex Blania. “So we have to succeed, or I’m not sure what else would happen.”

For four years, Blania’s team has been testing the Orb’s hardware abroad. Now the U.S. rollout has arrived. Over the next 12 months, 7,500 Orbs will be arriving in dozens of American cities, in locations like gas stations, bodegas, and flagship stores in Los Angeles, Austin, and Miami. The project’s founders and fans hope the Orb’s U.S. debut will kickstart a new phase of growth. The San Francisco keynote was titled: “At Last.” It’s not clear the public appetite matches the exultant branding. Tools for Humanity has “verified” just 12 million humans since mid-2023, a pace Blania concedes is well behind schedule. Few online platforms currently support the so-called “World ID” that the Orb bestows upon its visitors, leaving little to entice users to give up their biometrics beyond the lure of free crypto. Even Altman isn’t sure whether the whole thing can work. “I can see [how] this becomes a fairly mainstream thing in a few years,” he says.
“Or I can see that it’s still only used by a small subset of people who think about the world in a certain way.”

Blania (left) and Altman debut the Orb at World’s U.S. launch in San Francisco on April 30, 2025. Jason Henry—The New York Times/Redux

Yet as the Internet becomes overrun with AI, the creators of this strange new piece of hardware are betting that everybody in the world will soon want—or need—to visit an Orb. The biometric code it creates, they predict, will become a new type of digital passport, without which you might be denied passage to the Internet of the future, from dating apps to government services. In a best-case scenario, World ID could be a privacy-preserving way to fortify the Internet against an AI-driven deluge of fake or deceptive content. It could also enable the distribution of universal basic income (UBI)—a policy that Altman has previously touted—as AI automation transforms the global economy. To examine what this new technology might mean, I reported from three continents, interviewed 10 Tools for Humanity executives and investors, reviewed hundreds of pages of company documents, and “verified” my own humanity. The Internet will inevitably need some kind of proof-of-humanity system in the near future, says Divya Siddarth, founder of the nonprofit Collective Intelligence Project. The real question, she argues, is whether such a system will be centralized—“a big security nightmare that enables a lot of surveillance”—or privacy-preserving, as the Orb claims to be. Questions remain about Tools for Humanity’s corporate structure, its yoking to an unstable cryptocurrency, and what power it would concentrate in the hands of its owners if successful. Yet it’s also one of the only attempts to solve what many see as an increasingly urgent problem. “There are some issues with it,” Siddarth says of World ID. “But you can’t preserve the Internet in amber.
Something in this direction is necessary.”

In March, I met Blania at Tools for Humanity’s San Francisco headquarters, where a large screen displays the number of weekly “Orb verifications” by country. A few days earlier, the CEO had attended a $1 million-per-head dinner at Mar-a-Lago with President Donald Trump, whom he credits with clearing the way for the company’s U.S. launch by relaxing crypto regulations. “Given Sam is a very high profile target,” Blania says, “we just decided that we would let other companies fight that fight, and enter the U.S. once the air is clear.” As a kid growing up in Germany, Blania was a little different from his peers. “Other kids were, like, drinking a lot, or doing a lot of parties, and I was just building a lot of things that could potentially blow up,” he recalls. At the California Institute of Technology, where he was pursuing research for a master’s degree, he spent many evenings reading the blogs of startup gurus like Paul Graham and Altman. Then, in 2019, Blania received an email from Max Novendstern, an entrepreneur who had been kicking around a concept with Altman to build a global cryptocurrency network. They were looking for technical minds to help with the project. Over cappuccinos, Altman told Blania he was certain about three things. First, smarter-than-human AI was not only possible, but inevitable—and it would soon mean you could no longer assume that anything you read, saw, or heard on the Internet was human-created. Second, cryptocurrency and other decentralized technologies would be a massive force for change in the world. And third, scale was essential to any crypto network’s value.

The Orb is tested on a calibration rig, surrounded by checkerboard targets to ensure precision in iris detection. Davide Monteleone for TIME

The goal of Worldcoin, as the project was initially called, was to combine those three insights. Altman took a lesson from PayPal, the company co-founded by his mentor Peter Thiel.
Of its initial funding, PayPal spent less than $10 million actually building its app—but pumped an additional $70 million or so into a referral program, whereby new users and the person who invited them would each receive $10 in credit. The referral program helped make PayPal a leading payment platform. Altman thought a version of that strategy would propel Worldcoin to similar heights. He wanted to create a new cryptocurrency and give it to users as a reward for signing up. The more people who joined the system, the higher the token’s value would theoretically rise. Since 2019, the project has raised $244 million from investors like Coinbase and the venture capital firm Andreessen Horowitz. That money paid for the $50 million cost of designing the Orb, plus maintaining the software it runs on. The total market value of all Worldcoins in existence, however, is far higher—around $12 billion. That number is a bit misleading: most of those coins are not in circulation and Worldcoin’s price has fluctuated wildly. Still, it allows the company to reward users for signing up at no cost to itself. The main lure for investors is the crypto upside. Some 75% of all Worldcoins are set aside for humans to claim when they sign up, or as referral bonuses. The remaining 25% are split between Tools for Humanity’s backers and staff, including Blania and Altman. “I’m really excited to make a lot of money,” Blania says.

From the beginning, Altman was thinking about the consequences of the AI revolution he intended to unleash. (On May 21, he announced plans to team up with famed former Apple designer Jony Ive on a new AI personal device.) A future in which advanced AI could perform most tasks more effectively than humans would bring a wave of unemployment and economic dislocation, he reasoned. Some kind of wealth redistribution might be necessary. In 2016, he partially funded a study of basic income, which gave $1,000-per-month handouts to low-income individuals in Illinois and Texas.
But there was no single financial system that would allow money to be sent to everybody in the world. Nor was there a way to stop an individual human from claiming their share twice—or to identify a sophisticated AI pretending to be human and pocketing some cash of its own. In 2023, Tools for Humanity raised the possibility of using the network to redistribute the profits of AI labs that were able to automate human labor. “As AI advances,” it said, “fairly distributing access and some of the created value through UBI will play an increasingly vital role in counteracting the concentration of economic power.”

Blania was taken by the pitch, and agreed to join the project as a co-founder. “Most people told us we were very stupid or crazy or insane, including Silicon Valley investors,” Blania says. At least until ChatGPT came out in 2022, transforming OpenAI into one of the world’s most famous tech companies and kickstarting a market bull-run. “Things suddenly started to make more and more sense to the external world,” Blania says of the vision to develop a global “proof-of-humanity” network. “You have to imagine a world in which you will have very smart and competent systems somehow flying through the Internet with different goals and ideas of what they want to do, and us having no idea anymore what we’re dealing with.”

After our interview, Blania’s head of communications ushers me over to a circular wooden structure where eight Orbs face one another. The scene feels like a cross between an Apple Store and a ceremonial altar. “Do you want to get verified?” she asks. Putting aside my reservations for the purposes of research, I download the World App and follow its prompts. I flash a QR code at the Orb, then gaze into it.
A minute or so later, my phone buzzes with confirmation: I’ve been issued my own personal World ID and some Worldcoin.

The first thing the Orb does is check if you’re human, using a neural network that takes input from various sensors, including an infrared camera and a thermometer. Davide Monteleone for TIME

While I stared into the Orb, several complex procedures had taken place at once. A neural network took inputs from multiple sensors—an infrared camera, a thermometer—to confirm I was a living human. Simultaneously, a telephoto lens zoomed in on my iris, capturing the physical traits within that distinguish me from every other human on Earth. It then converted that image into an iris code: a numerical abstraction of my unique biometric data. Then the Orb checked to see if my iris code matched any it had seen before, using a technique allowing encrypted data to be compared without revealing the underlying information. Before the Orb deleted my data, it turned my iris code into several derivative codes—none of which on its own can be linked back to the original—encrypted them, deleted the only copies of the decryption keys, and sent each one to a different secure server, so that future users’ iris codes can be checked for uniqueness against mine. If I were to use my World ID to access a website, that site would learn nothing about me except that I’m human. The Orb is open-source, so outside experts can examine its code and verify the company’s privacy claims. “I did a colonoscopy on this company and these technologies before I agreed to join,” says Trevor Traina, a Trump donor and former U.S. ambassador to Austria who now serves as Tools for Humanity’s chief business officer. “It is the most privacy-preserving technology on the planet.”

Only weeks later, when researching what would happen if I wanted to delete my data, do I discover that Tools for Humanity’s privacy claims rest on what feels like a sleight of hand.
The company argues that in modifying your iris code, it has “effectively anonymized” your biometric data. If you ask Tools for Humanity to delete your iris codes, they will delete the one stored on your phone, but not the derivatives. Those, they argue, are no longer your personal data at all. But if I were to return to an Orb after deleting my data, it would still recognize those codes as uniquely mine. Once you look into the Orb, a piece of your identity remains in the system forever. If users could truly delete that data, the premise of one ID per human would collapse, Tools for Humanity’s chief privacy officer Damien Kieran tells me when I call seeking an explanation. People could delete and sign up for new World IDs after being suspended from a platform. Or claim their Worldcoin tokens, sell them, delete their data, and cash in again. This argument fell flat with European Union regulators in Germany, who recently declared that the Orb posed “fundamental data protection issues” and ordered the company to allow European users to fully delete even their anonymized data. (Tools for Humanity has appealed; the regulator is now reassessing the decision.) “Just like any other technology service, users cannot delete data that is not personal data,” Kieran said in a statement. “If a person could delete anonymized data that can’t be linked to them by World or any third party, it would allow bad actors to circumvent the security and safety that World ID is working to bring to every human.”

On a balmy afternoon this spring, I climb a flight of stairs up to a room above a restaurant in an outer suburb of Seoul. Five elderly South Koreans tap on their phones as they wait to be “verified” by the two Orbs in the center of the room. “We don’t really know how to distinguish between AI and humans anymore,” an attendant in a company t-shirt explains in Korean, gesturing toward the spheres. “We need a way to verify that we’re human and not AI. So how do we do that?
Well, humans have irises, but AI doesn’t.”

The attendant ushers an elderly woman over to an Orb. It bleeps. “Open your eyes,” a disembodied voice says in English. The woman stares into the camera. Seconds later, she checks her phone and sees that a packet of Worldcoin worth 75,000 Korean won (about $54) has landed in her digital wallet. Congratulations, the app tells her. You are now a verified human.

A visitor views the Orbs in Seoul on April 14, 2025. Photo: Taemin Ha for TIME

Tools for Humanity aims to “verify” 1 million Koreans over the next year. Photo: Taemin Ha for TIME

A couple dozen Orbs have been available in South Korea since 2023, verifying roughly 55,000 people. Now Tools for Humanity is redoubling its efforts there. At an event in a traditional wooden hanok house in central Seoul, an executive announces that 250 Orbs will soon be dispersed around the country—with the aim of verifying 1 million Koreans in the next 12 months. South Korea has high levels of smartphone usage, crypto and AI adoption, and Internet access, while average wages are modest enough for the free Worldcoin on offer to still be an enticing draw—all of which makes it fertile testing ground for the company’s ambitious global expansion.

Yet things seem off to a slow start. In a retail space I visited in central Seoul, Tools for Humanity had constructed a wooden structure with eight Orbs facing each other. Locals and tourists wander past looking bemused; few volunteer themselves up. Most who do tell me they are crypto enthusiasts who came intentionally, driven more by the spirit of early adoption than the free coins.

The next day, I visit a coffee shop in central Seoul where a chrome Orb sits unassumingly in one corner. Wu Ruijun, a 20-year-old student from China, strikes up a conversation with the barista, who doubles as the Orb’s operator. Wu was invited here by a friend who said both could claim free cryptocurrency if he signed up. The barista speeds him through the process.
Wu accepts the privacy disclosure without reading it, and widens his eyes for the Orb. Soon he’s verified. “I wasn’t told anything about the privacy policy,” he says on his way out. “I just came for the money.”

As Altman’s car winds through San Francisco, I ask about the vision he laid out in 2019: that AI would make it harder for us to trust each other online. To my surprise, he rejects the framing. “I’m much more [about] like: what is the good we can create, rather than the bad we can stop?” he says. “It’s not like, ‘Oh, we’ve got to avoid the bot overrun’ or whatever. It’s just that we can do a lot of special things for humans.” It’s an answer that may reflect how his role has changed over the years. Altman is now the chief public cheerleader of a $300 billion company that’s touting the transformative utility of AI agents. The rise of agents, he and others say, will be a boon for our quality of life—like having an assistant on hand who can answer your most pressing questions, carry out mundane tasks, and help you develop new skills. It’s an optimistic vision that may well pan out. But it doesn’t quite fit with the prophecies of AI-enabled infopocalypse that Tools for Humanity was founded upon.

Altman waves away a question about the influence he and other investors stand to gain if their vision is realized. Most holders, he assumes, will have already started selling their tokens—too early, he adds. “What I think would be bad is if an early crew had a lot of control over the protocol,” he says, “and that’s where I think the commitment to decentralization is so cool.” Altman is referring to the World Protocol, the underlying technology upon which the Orb, Worldcoin, and World ID all rely. Tools for Humanity is developing it, but has committed to giving control to its users over time—a process they say will prevent power from being concentrated in the hands of a few executives or investors.
Tools for Humanity would remain a for-profit company, and could levy fees on platforms that use World ID, but other companies would be able to compete for customers by building alternative apps—or even alternative Orbs. The plan draws on ideas that animated the crypto ecosystem in the late 2010s and early 2020s, when evangelists for emerging blockchain technologies argued that the centralization of power—especially in large so-called “Web 2.0” tech companies—was responsible for many of the problems plaguing the modern Internet. Just as decentralized cryptocurrencies could reform a financial system controlled by economic elites, so too would it be possible to create decentralized organizations, run by their members instead of CEOs. How such a system might work in practice remains unclear. “Building a community-based governance system,” Tools for Humanity says in a 2023 white paper, “represents perhaps the most formidable challenge of the entire project.”

Altman has a pattern of making idealistic promises that shift over time. He founded OpenAI as a nonprofit in 2015, with a mission to develop AGI safely and for the benefit of all humanity. To raise money, OpenAI restructured itself as a for-profit company in 2019, but with overall control still in the hands of its nonprofit board. Last year, Altman proposed yet another restructure—one which would dilute the board’s control and allow more profits to flow to shareholders. Why, I ask, should the public trust Tools for Humanity’s commitment to freely surrender influence and power? “I think you will just see the continued decentralization via the protocol,” he says. “The value here is going to live in the network, and the network will be owned and governed by a lot of people.” Altman talks less about universal basic income these days.
He recently mused about an alternative, which he called “universal basic compute.” Instead of AI companies redistributing their profits, he seemed to suggest, they could instead give everyone in the world fair access to super-powerful AI. Blania tells me he recently “made the decision to stop talking” about UBI at Tools for Humanity. “UBI is one potential answer,” he says. “Just giving [people] access to the latest [AI] models and having them learn faster and better is another.” Says Altman: “I still don’t know what the right answer is. I believe we should do a better job of distribution of resources than we currently do.”

When I probe the question of why people should trust him, Altman gets irritated. “I understand that you hate AI, and that’s fine,” he says. “If you want to frame it as the downside of AI is that there’s going to be a proliferation of very convincing AI systems that are pretending to be human, and we need ways to know what is really human-authorized versus not, then yeah, I think you can call that a downside of AI. It’s not how I would naturally frame it.”

The phrase human-authorized hints at a tension between World ID and OpenAI’s plans for AI agents. An Internet where a World ID is required to access most services might impede the usefulness of the agents that OpenAI and others are developing. So Tools for Humanity is building a system that would allow users to delegate their World ID to an agent, allowing the bot to take actions online on their behalf, according to Tiago Sada, the company’s chief product officer. “We’ve built everything in a way that can be very easily delegatable to an agent,” Sada says. It’s a measure that would allow humans to be held accountable for the actions of their AIs. But it suggests that Tools for Humanity’s mission may be shifting beyond simply proving humanity, and toward becoming the infrastructure that enables AI agents to proliferate with human authorization.
World ID doesn’t tell you whether a piece of content is AI-generated or human-generated; all it tells you is whether the account that posted it is a human or a bot. Even in a world where everybody had a World ID, our online spaces might still be filled with AI-generated text, images, and videos.

As I say goodbye to Altman, I’m left feeling conflicted about his project. If the Internet is going to be transformed by AI agents, then some kind of proof-of-humanity system will almost certainly be necessary. Yet if the Orb becomes a piece of Internet infrastructure, it could give Altman—a beneficiary of the proliferation of AI content—significant influence over a leading defense mechanism against it. People might have no choice but to participate in the network in order to access social media or online services.

I thought of an encounter I witnessed in Seoul. In the room above the restaurant, Cho Jeong-yeon, 75, watched her friend get verified by an Orb. Cho had been invited to do the same, but demurred. The reward wasn’t enough for her to surrender a part of her identity. “Your iris is uniquely yours, and we don’t really know how it might be used,” she says. “Seeing the machine made me think: are we becoming machines instead of humans now? Everything is changing, and we don’t know how it’ll all turn out.”

—With reporting by Stephen Kim/Seoul. This story was supported by Tarbell Grants.

Correction, May 30: The original version of this story misstated the market capitalization of Worldcoin if all coins were in circulation. It is $12 billion, not $1.2 billion.
  • HOLLYWOOD VFX TOOLS FOR SPACE EXPLORATION

    By CHRIS McGOWAN

    This image of Jupiter from NASA’s James Webb Space Telescope’s NIRCam (Near-Infrared Camera) shows stunning details of the majestic planet in infrared light. (Image courtesy of NASA, ESA and CSA)

    Special effects have been used for decades to depict space exploration, from visits to planets and moons to zero gravity and spaceships – one need only think of the landmark 2001: A Space Odyssey (1968). Since that era, visual effects have increasingly grown in realism and importance. VFX have been used for entertainment and for scientific purposes, outreach to the public and astronaut training in virtual reality. Compelling images and videos can bring data to life. NASA’s Scientific Visualization Studio (SVS) produces visualizations, animations and images to help scientists tell stories of their research and make science more approachable and engaging.
    A.J. Christensen is a senior visualization designer for the NASA Scientific Visualization Studio (SVS) at the Goddard Space Flight Center in Greenbelt, Maryland. There, he develops data visualization techniques and designs data-driven imagery for scientific analysis and public outreach using Hollywood visual effects tools, according to NASA. SVS visualizations feature datasets from Earth- and space-based instrumentation, scientific supercomputer models and physical statistical distributions that have been analyzed and processed by computational scientists. Christensen’s specialties include working with 3D volumetric data, using the procedural cinematic software Houdini and science topics in Heliophysics, Geophysics and Astrophysics. He previously worked at the National Center for Supercomputing Applications’ Advanced Visualization Lab where he worked on more than a dozen science documentary full-dome films as well as the IMAX films Hubble 3D and A Beautiful Planet – and he worked at DNEG on the movie Interstellar, which won the 2015 Best Visual Effects Academy Award.

    This global map of CO2 was created by NASA’s Scientific Visualization Studio using a model called GEOS, short for the Goddard Earth Observing System. GEOS is a high-resolution weather reanalysis model, powered by supercomputers, that is used to represent what was happening in the atmosphere.

    “The NASA Scientific Visualization Studio operates like a small VFX studio that creates animations of scientific data that has been collected or analyzed at NASA. We are one of several groups at NASA that create imagery for public consumption, but we are also a part of the scientific research process, helping scientists understand and share their data through pictures and video.”
    —A.J. Christensen, Senior Visualization Designer, NASA Scientific Visualization Studio

    About his work at NASA SVS, Christensen comments, “The NASA Scientific Visualization Studio operates like a small VFX studio that creates animations of scientific data that has been collected or analyzed at NASA. We are one of several groups at NASA that create imagery for public consumption, but we are also a part of the scientific research process, helping scientists understand and share their data through pictures and video. This past year we were part of NASA’s total eclipse outreach efforts, we participated in all the major earth science and astronomy conferences, we launched a public exhibition at the Smithsonian Museum of Natural History called the Earth Information Center, and we posted hundreds of new visualizations to our publicly accessible website: svs.gsfc.nasa.gov.”

    This is the ‘beauty shot version’ of Perpetual Ocean 2: Western Boundary Currents. The visualization starts with a rotating globe showing ocean currents. The colors used to color the flow in this version were chosen to provide a pleasing look.

    The Gulf Stream and connected currents.

    Venus, our nearby “sister” planet, beckons today as a compelling target for exploration that may connect the objects in our own solar system to those discovered around nearby stars.

    WORKING WITH DATA
    While Christensen is interpreting the data from active spacecraft and making it usable in different forms, such as for science and outreach, he notes, “It’s not just spacecraft that collect data. NASA maintains or monitors instruments on Earth too – on land, in the oceans and in the air. And to be precise, there are robots wandering around Mars that are collecting data, too.”
    He continues, “Sometimes the data comes to our team as raw telescope imagery, sometimes we get it as a data product that a scientist has already analyzed and extracted meaning from, and sometimes various sensor data is used to drive computational models and we work with the models’ resulting output.”

    Jupiter’s moon Europa may have life in a vast ocean beneath its icy surface.

    HOUDINI AND OTHER TOOLS
    “Data visualization means a lot of different things to different people, but many people on our team interpret it as a form of filmmaking,” Christensen says. “We are very inspired by the approach to visual storytelling that Hollywood uses, and we use tools that are standard for Hollywood VFX. Many professionals in our area – the visualization of 3D scientific data – were previously using other animation tools but have discovered that Houdini is the most capable of understanding and manipulating unusual data, so there has been major movement toward Houdini over the past decade.”

    Satellite imagery from NASA’s Solar Dynamics Observatory shows the Sun in ultraviolet light colorized in light brown. Seen in ultraviolet light, the dark patches on the Sun are known as coronal holes and are regions where fast solar wind gushes out into space.

    Christensen explains, “We have always worked with scientific software as well – sometimes there’s only one software tool in existence to interpret a particular kind of scientific data. More often than not, scientific software does not have a GUI, so we’ve had to become proficient at learning new coding environments very quickly. IDL and Python are the generic data manipulation environments we use when something is too complicated or oversized for Houdini, but there are lots of alternatives out there. Typically, we use these tools to get the data into a format that Houdini can interpret, and then we use Houdini to do our shading, lighting and camera design, and seamlessly blend different datasets together.”
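The last step Christensen describes, getting data into a format Houdini can interpret, can be illustrated with a minimal sketch: flattening a voxel grid into a CSV point cloud that a table-import workflow can ingest. The threshold, column names, and file path here are assumptions for the example, not the SVS pipeline.

```python
import csv

import numpy as np

def volume_to_points(density: np.ndarray, threshold: float = 0.5,
                     path: str = "points.csv") -> str:
    """Write voxels above `threshold` as a CSV point cloud, one row per
    point, with columns a downstream tool can map to position plus a
    density attribute."""
    xs, ys, zs = np.nonzero(density > threshold)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["P.x", "P.y", "P.z", "density"])
        for x, y, z in zip(xs, ys, zs):
            writer.writerow([int(x), int(y), int(z), float(density[x, y, z])])
    return path

# Toy 8x8x8 volume with a single bright voxel above the threshold.
vol = np.zeros((8, 8, 8))
vol[2, 3, 4] = 0.9
volume_to_points(vol)  # writes points.csv: a header row plus one point row
```

Real pipelines would more likely target VDB or bgeo, but the shape of the job is the same: scientific array in, geometry-friendly format out.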

    While cruising around Saturn in early October 2004, Cassini captured a series of images that have been composed into this large global natural color view of Saturn and its rings. This grand mosaic consists of 126 images acquired in a tile-like fashion, covering one end of Saturn’s rings to the other and the entire planet in between.

    The black hole Gargantua and the surrounding accretion disc from the 2014 movie Interstellar.

    Another visualization of the black hole Gargantua.

    INTERSTELLAR & GARGANTUA
    Christensen recalls working for DNEG on Interstellar. “When I first started at DNEG, they asked me to work on the giant waves on Miller’s ocean planet. About a week in, my manager took me into the hall and said, ‘I was looking at your reel and saw all this astronomy stuff. We’re working on another sequence with an accretion disk around a black hole that I’m wondering if we should put you on.’ And I said, ‘Oh yeah, I’ve done lots of accretion disks.’ So, for the rest of my time on the show, I was working on the black hole team.”
    He adds, “There are a lot of people in my community that would be hesitant to label any big-budget movie sequence as a scientific visualization. The typical assumption is that for a Hollywood movie, no one cares about accuracy as long as it looks good. Guardians of the Galaxy makes it seem like space is positively littered with nebulae, and Star Wars makes it seem like asteroids travel in herds. But the black hole Gargantua in Interstellar is a good case for being called a visualization. The imagery you see in the movie is the direct result of a collaboration with an expert scientist, Dr. Kip Thorne, working with the DNEG research team using the actual Einstein equations that describe the gravity around a black hole.”

    Thorne is a Nobel Prize-winning theoretical physicist who taught at Caltech for many years. He has reached wide audiences with his books and presentations on black holes, time travel and wormholes on PBS and BBC shows. Christensen comments, “You can make the argument that some of the complexity around what a black hole actually looks like was discarded for the film, and they admit as much in the research paper that was published after the movie came out. But our team at NASA does that same thing. There is no such thing as an objectively ‘true’ scientific image – you always have to make aesthetic decisions around whether the image tells the science story, and often it makes more sense to omit information to clarify what’s important. Ultimately, Gargantua taught a whole lot of people something new about science, and that’s what a good scientific visualization aims to do.”

    The SVS produces an annual visualization of the Moon’s phase and libration comprising 8,760 hourly renderings of its precise size, orientation and illumination.

    FURTHER CHALLENGES
    The sheer size of the data often encountered by Christensen and his peers is a challenge. “I’m currently working with a dataset that is 400GB per timestep. It’s so big that I don’t even want to move it from one file server to another. So, then I have to make decisions about which data attributes to keep and which to discard, whether there’s a region of the data that I can cull or downsample, and I have to experiment with data compression schemes that might require me to entirely re-design the pipeline I’m using for Houdini. Of course, if I get rid of too much information, it becomes very resource-intensive to recompute everything, but if I don’t get rid of enough, then my design process becomes agonizingly slow.”
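The triage Christensen describes, keeping some attributes and downsampling the rest before the data ever reaches Houdini, can be sketched in a few lines. The attribute names, stride, and toy 64-cubed grid below are invented for illustration, not his actual 400GB dataset.

```python
import numpy as np

def reduce_timestep(volume: dict, keep_attrs: set, stride: int = 4) -> dict:
    """Keep only the named attributes and downsample each along every
    axis by `stride`, cutting memory by roughly stride**3 per attribute."""
    return {name: arr[::stride, ::stride, ::stride]
            for name, arr in volume.items() if name in keep_attrs}

# Toy timestep: two attributes on a 64^3 grid.
step = {"density": np.random.rand(64, 64, 64),
        "temperature": np.random.rand(64, 64, 64)}
small = reduce_timestep(step, keep_attrs={"density"})
# The 64^3 density grid becomes 16^3; temperature is discarded entirely.
np.savez_compressed("timestep_small.npz", **small)
```

The trade-off he mentions is visible even here: a stride of 4 discards most of the signal and cannot be undone without recomputing from the source data.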
    SVS also works closely with its NASA partner groups Conceptual Image Lab and Goddard Media Studios to publish a diverse array of content. Conceptual Image Lab focuses more on the artistic side of things – producing high-fidelity renders using film animation and visual design techniques, according to NASA. Where the SVS primarily focuses on making data-based visualizations, CIL puts more emphasis on conceptual visualizations – producing animations featuring NASA spacecraft, planetary observations and simulations, according to NASA. Goddard Media Studios, on the other hand, is more focused towards public outreach – producing interviews, TV programs and documentaries. GMS continues to be the main producers behind NASA TV, and as such, much of their content is aimed towards the general public.

    An impact crater on the moon.

    Image of Mars showing a partly shadowed Olympus Mons toward the upper left of the image.

    Mars. Hellas Basin can be seen in the lower right portion of the image.

    Mars slightly tilted to show the Martian North Pole.

    Christensen notes, “One of the more unique challenges in this field is one of bringing people from very different backgrounds to agree on a common outcome. I work on teams with scientists, communicators and technologists, and we all have different communities we’re trying to satisfy. For instance, communicators are generally trying to simplify animations so their learning goal is clear, but scientists will insist that we add text and annotations on top of the video to eliminate ambiguity and avoid misinterpretations. Often, the technologist will have to say we can’t zoom in or look at the data in a certain way because it will show the data boundaries or data resolution limits. Every shot is a negotiation, but in trying to compromise, we often push the boundaries of what has been done before, which is exciting.”
    WWW.VFXVOICE.COM
    HOLLYWOOD VFX TOOLS FOR SPACE EXPLORATION
    By CHRIS McGOWAN This image of Jupiter from NASA’s James Webb Space Telescope’s NIRCam (Near-Infrared Camera) shows stunning details of the majestic planet in infrared light. (Image courtesy of NASA, ESA and CSA) Special effects have been used for decades to depict space exploration, from visits to planets and moons to zero gravity and spaceships – one need only think of the landmark 2001: A Space Odyssey (1968). Since that era, visual effects have increasingly grown in realism and importance. VFX have been used for entertainment and for scientific purposes, outreach to the public and astronaut training in virtual reality. Compelling images and videos can bring data to life. NASA’s Scientific Visualization Studio (SVS) produces visualizations, animations and images to help scientists tell stories of their research and make science more approachable and engaging. A.J. Christensen is a senior visualization designer for the SVS at the Goddard Space Flight Center in Greenbelt, Maryland. There, he develops data visualization techniques and designs data-driven imagery for scientific analysis and public outreach using Hollywood visual effects tools, according to NASA. SVS visualizations feature datasets from Earth- and space-based instrumentation, scientific supercomputer models and physical statistical distributions that have been analyzed and processed by computational scientists. Christensen’s specialties include working with 3D volumetric data, using the procedural cinematic software Houdini and science topics in Heliophysics, Geophysics and Astrophysics. He previously worked at the National Center for Supercomputing Applications’ Advanced Visualization Lab, where he worked on more than a dozen science documentary full-dome films as well as the IMAX films Hubble 3D and A Beautiful Planet – and he worked at DNEG on the movie Interstellar, which won the 2015 Best Visual Effects Academy Award. 
This global map of CO2 was created by NASA’s Scientific Visualization Studio using a model called GEOS, short for the Goddard Earth Observing System. GEOS is a high-resolution weather reanalysis model, powered by supercomputers, that is used to represent what was happening in the atmosphere. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) “The NASA Scientific Visualization Studio operates like a small VFX studio that creates animations of scientific data that has been collected or analyzed at NASA. We are one of several groups at NASA that create imagery for public consumption, but we are also a part of the scientific research process, helping scientists understand and share their data through pictures and video.” —A.J. Christensen, Senior Visualization Designer, NASA Scientific Visualization Studio (SVS) About his work at NASA SVS, Christensen comments, “The NASA Scientific Visualization Studio operates like a small VFX studio that creates animations of scientific data that has been collected or analyzed at NASA. We are one of several groups at NASA that create imagery for public consumption, but we are also a part of the scientific research process, helping scientists understand and share their data through pictures and video. This past year we were part of NASA’s total eclipse outreach efforts, we participated in all the major earth science and astronomy conferences, we launched a public exhibition at the Smithsonian Museum of Natural History called the Earth Information Center, and we posted hundreds of new visualizations to our publicly accessible website: svs.gsfc.nasa.gov.” This is the ‘beauty shot version’ of Perpetual Ocean 2: Western Boundary Currents. The visualization starts with a rotating globe showing ocean currents. The colors used to color the flow in this version were chosen to provide a pleasing look. 
(Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) The Gulf Stream and connected currents. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) Venus, our nearby “sister” planet, beckons today as a compelling target for exploration that may connect the objects in our own solar system to those discovered around nearby stars. (Image courtesy of NASA’s Goddard Space Flight Center) WORKING WITH DATA While Christensen is interpreting the data from active spacecraft and making it usable in different forms, such as for science and outreach, he notes, “It’s not just spacecraft that collect data. NASA maintains or monitors instruments on Earth too – on land, in the oceans and in the air. And to be precise, there are robots wandering around Mars that are collecting data, too.” He continues, “Sometimes the data comes to our team as raw telescope imagery, sometimes we get it as a data product that a scientist has already analyzed and extracted meaning from, and sometimes various sensor data is used to drive computational models and we work with the models’ resulting output.” Jupiter’s moon Europa may have life in a vast ocean beneath its icy surface. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) HOUDINI AND OTHER TOOLS “Data visualization means a lot of different things to different people, but many people on our team interpret it as a form of filmmaking,” Christensen says. “We are very inspired by the approach to visual storytelling that Hollywood uses, and we use tools that are standard for Hollywood VFX. 
Many professionals in our area – the visualization of 3D scientific data – were previously using other animation tools but have discovered that Houdini is the most capable of understanding and manipulating unusual data, so there has been major movement toward Houdini over the past decade.” Satellite imagery from NASA’s Solar Dynamics Observatory (SDO) shows the Sun in ultraviolet light colorized in light brown. Seen in ultraviolet light, the dark patches on the Sun are known as coronal holes and are regions where fast solar wind gushes out into space. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) Christensen explains, “We have always worked with scientific software as well – sometimes there’s only one software tool in existence to interpret a particular kind of scientific data. More often than not, scientific software does not have a GUI, so we’ve had to become proficient at learning new coding environments very quickly. IDL and Python are the generic data manipulation environments we use when something is too complicated or oversized for Houdini, but there are lots of alternatives out there. Typically, we use these tools to get the data into a format that Houdini can interpret, and then we use Houdini to do our shading, lighting and camera design, and seamlessly blend different datasets together.” While cruising around Saturn in early October 2004, Cassini captured a series of images that have been composed into this large global natural color view of Saturn and its rings. This grand mosaic consists of 126 images acquired in a tile-like fashion, covering one end of Saturn’s rings to the other and the entire planet in between. (Image courtesy of NASA/JPL/Space Science Institute) The black hole Gargantua and the surrounding accretion disc from the 2014 movie Interstellar. (Image courtesy of DNEG and Paramount Pictures) Another visualization of the black hole Gargantua. 
(Image courtesy of DNEG and Paramount Pictures) INTERSTELLAR & GARGANTUA Christensen recalls working for DNEG on Interstellar (2014). “When I first started at DNEG, they asked me to work on the giant waves on Miller’s ocean planet [in the film]. About a week in, my manager took me into the hall and said, ‘I was looking at your reel and saw all this astronomy stuff. We’re working on another sequence with an accretion disk around a black hole that I’m wondering if we should put you on.’ And I said, ‘Oh yeah, I’ve done lots of accretion disks.’ So, for the rest of my time on the show, I was working on the black hole team.” He adds, “There are a lot of people in my community that would be hesitant to label any big-budget movie sequence as a scientific visualization. The typical assumption is that for a Hollywood movie, no one cares about accuracy as long as it looks good. Guardians of the Galaxy makes it seem like space is positively littered with nebulae, and Star Wars makes it seem like asteroids travel in herds. But the black hole Gargantua in Interstellar is a good case for being called a visualization. The imagery you see in the movie is the direct result of a collaboration with an expert scientist, Dr. Kip Thorne, working with the DNEG research team using the actual Einstein equations that describe the gravity around a black hole.” Thorne is a Nobel Prize-winning theoretical physicist who taught at Caltech for many years. He has reached wide audiences with his books and presentations on black holes, time travel and wormholes on PBS and BBC shows. Christensen comments, “You can make the argument that some of the complexity around what a black hole actually looks like was discarded for the film, and they admit as much in the research paper that was published after the movie came out. But our team at NASA does that same thing. 
There is no such thing as an objectively ‘true’ scientific image – you always have to make aesthetic decisions around whether the image tells the science story, and often it makes more sense to omit information to clarify what’s important. Ultimately, Gargantua taught a whole lot of people something new about science, and that’s what a good scientific visualization aims to do.” The SVS produces an annual visualization of the Moon’s phase and libration comprising 8,760 hourly renderings of its precise size, orientation and illumination. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) FURTHER CHALLENGES The sheer size of the data often encountered by Christensen and his peers is a challenge. “I’m currently working with a dataset that is 400GB per timestep. It’s so big that I don’t even want to move it from one file server to another. So, then I have to make decisions about which data attributes to keep and which to discard, whether there’s a region of the data that I can cull or downsample, and I have to experiment with data compression schemes that might require me to entirely re-design the pipeline I’m using for Houdini. Of course, if I get rid of too much information, it becomes very resource-intensive to recompute everything, but if I don’t get rid of enough, then my design process becomes agonizingly slow.” SVS also works closely with its NASA partner groups Conceptual Image Lab (CIL) and Goddard Media Studios (GMS) to publish a diverse array of content. Conceptual Image Lab focuses more on the artistic side of things – producing high-fidelity renders using film animation and visual design techniques, according to NASA. Where the SVS primarily focuses on making data-based visualizations, CIL puts more emphasis on conceptual visualizations – producing animations featuring NASA spacecraft, planetary observations and simulations, according to NASA. 
Goddard Media Studios, on the other hand, is more focused towards public outreach – producing interviews, TV programs and documentaries. GMS continues to be the main producers behind NASA TV, and as such, much of their content is aimed towards the general public. An impact crater on the moon. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) Image of Mars showing a partly shadowed Olympus Mons toward the upper left of the image. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) Mars. Hellas Basin can be seen in the lower right portion of the image. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) Mars slightly tilted to show the Martian North Pole. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio) Christensen notes, “One of the more unique challenges in this field is one of bringing people from very different backgrounds to agree on a common outcome. I work on teams with scientists, communicators and technologists, and we all have different communities we’re trying to satisfy. For instance, communicators are generally trying to simplify animations so their learning goal is clear, but scientists will insist that we add text and annotations on top of the video to eliminate ambiguity and avoid misinterpretations. Often, the technologist will have to say we can’t zoom in or look at the data in a certain way because it will show the data boundaries or data resolution limits. Every shot is a negotiation, but in trying to compromise, we often push the boundaries of what has been done before, which is exciting.”
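Christensen's pipeline description – raw dumps reshaped in IDL or Python, then culled or downsampled before Houdini does the shading, lighting and camera work – can be sketched in a few lines of plain Python. The headerless float32 file layout, the dimensions and the stride below are illustrative assumptions for this sketch, not the SVS's actual formats or tools.

```python
import struct

def load_raw_volume(path, nx, ny, nz):
    # Read a headerless little-endian float32 dump (a common layout for
    # simulation output) into a [z][y][x] nested list that a downstream
    # tool, e.g. a Houdini Python SOP, could iterate over.
    n = nx * ny * nz
    with open(path, "rb") as f:
        flat = struct.unpack(f"<{n}f", f.read(4 * n))
    return [[[flat[(z * ny + y) * nx + x] for x in range(nx)]
             for y in range(ny)] for z in range(nz)]

def downsample(volume, step):
    # Keep every `step`-th voxel along each axis: step=2 shrinks the
    # volume ~8x, step=4 ~64x. This is the cull/keep decision quoted
    # above: discard too much and everything must be recomputed later,
    # discard too little and the design iteration loop crawls.
    return [[list(row[::step]) for row in plane[::step]]
            for plane in volume[::step]]
```

In production this would be NumPy- or VDB-scale tooling rather than nested lists, but the tradeoff the code makes explicit is the one Christensen describes: the stride chosen up front decides whether a 400 GB timestep is workable or agonizing.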
  • Your Smart Home Got a New CEO and It’s Called the SwitchBot Hub 3

    SwitchBot has a knack for crafting ingenious IoT devices, those little problem-solvers like robotic curtain openers and automated button pressers that add a touch of futuristic convenience. Yet, the true linchpin, the secret sauce that elevates their entire ecosystem, is undoubtedly their Hub. It’s the central nervous system that takes individual smart products and weaves them into a cohesive, intelligent tapestry, turning the abstract concept of a ‘smart home’ into a tangible, daily experience.
    This unification through the Hub is what brings us closer to that almost mythical dream: a home where technology works in concert, where devices understand each other’s capabilities and, critically, anticipate your needs. It’s about creating an environment that doesn’t just react to commands, but proactively adapts, making your living space more intuitive, responsive, and, ultimately, more attuned to you. The new Hub 3 aims to refine this very connection.
    Designer: SwitchBot
    Click Here to Buy Now: $119.99
    The predecessor, the Hub 2, already laid a strong foundation. It brought Matter support into the SwitchBot ecosystem, along with reliable infrared controls, making it a versatile little box. It understood the assignment: bridge the old with the new. The Hub 3 takes that solid base and builds upon it, addressing not just functionality but also the nuanced interactions that make a device truly intuitive and, dare I say, enjoyable to use daily.

    Matter support, the industry’s push for interoperability, remains a cornerstone. The Hub 3 acts as a Matter bridge, capable of bringing up to 30 SwitchBot devices into the Matter fold, allowing them to play nice with platforms like Apple Home. Furthermore, it can send up to 30 distinct commands to other Matter-certified products already integrated into your Apple Home setup, with Home Assistant support on the horizon. This makes it a powerful orchestrator.

    One of the most striking additions is the new rotary dial, something SwitchBot calls its “Dial Master” technology. Giving users an intuitive tactile control that feels very familiar, it makes the Hub 3 even more user-friendly. Imagine adjusting your thermostat not by tapping an arrow repeatedly, but by smoothly turning a dial for that exact ±1°C change. The same applies to volume control or any other granular adjustment. This tactile feedback offers a level of hyper-controlled interaction that screen taps often lack, feeling more connected and satisfying.

    Beyond physical interaction, the Hub 3 gets smarter senses. While the trusty thermo-hygro sensor (cleverly integrated into its cable) makes a return for indoor temperature and humidity, it’s now joined by a built-in light sensor. This seemingly small addition unlocks a new layer of intuitive automation. Your home can now react to ambient brightness, perhaps cueing your SwitchBot Curtain 3 to draw open gently as the sun rises, or dimming lights as natural light fades.
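    A brightness-driven routine like the curtain example boils down to threshold logic, and the detail that matters is edge triggering: fire the action only when the light level crosses the threshold, not on every poll while it sits above it. A minimal sketch – the lux thresholds and function name are illustrative assumptions, not SwitchBot's actual automation engine:

```python
def curtain_trigger(prev_lux, lux, open_above=200, close_below=50):
    # Edge-triggered: act only on the poll where the reading crosses a
    # threshold, so the curtain command doesn't re-fire every cycle.
    if prev_lux < open_above <= lux:
        return "open"
    if prev_lux > close_below >= lux:
        return "close"
    return None  # no crossing, no action
```

    The wide gap between the open and close thresholds doubles as a dead band, so a cloud passing at dawn doesn't bounce the curtain back and forth.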

    Aesthetically, SwitchBot made a subtle but impactful shift from the Hub 2’s white casing to a sleek black for the Hub 3. This change makes the integrated display stand out significantly, improving readability at a glance. And that display now does more heavy lifting. It still shows essential indoor temperature and humidity, but can also pull in local outdoor weather data, giving you a quick forecast without reaching for your phone. Pair it with a SwitchBot Meter Pro, and it’ll even show CO2 levels.

    The Hub 2 featured two handy customizable buttons. The Hub 3 doubles down, offering four such buttons. This means more of your favorite automation scenes, like “Movie Night,” “Good Morning,” and “Away Mode,” are just a single press away. This reduces friction, making your smart home react faster to your needs without diving into an app for every little thing. It’s these quality-of-life improvements that often make the biggest difference in daily use.

    Crucially, the Hub 3 retains everything that made its predecessor a strong contender. The infrared control capabilities are still robust, supporting over 100,000 IR codes for your legacy AV gear and air conditioners, now with a signal that’s reportedly 150% stronger than the Hub Mini. Its deep integration with the existing SwitchBot ecosystem means your Bots, Curtain movers, and vacuums will feel right at home, working in concert.

    Of course, you still have your choice of control methods. Beyond the new dial and physical buttons, there’s comprehensive app control for setting up complex automations and remote access. Voice control via the usual assistants like Alexa and Google Assistant is present and accounted for, ensuring hands-free operation whenever you need it. This flexibility means the Hub 3 adapts to your preferences, not the other way around.

    The true power, as always, lies in the DIY automation scenes. Imagine your AC, humidifier, and dehumidifier working together, orchestrated by the Hub 3 to maintain your perfect 23°C and 58% humidity. Or picture an energy-saving scene where the built-in motion sensor, coupled with geofencing, detects an empty house and powers down non-essential appliances. It’s these intelligent, personalized routines that transform a collection of smart devices into a truly smart home.
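    The climate scene described above is, at its core, setpoint control with a dead band: each appliance switches only when a reading strays far enough from target that toggling it back is worthwhile. A sketch of that logic, using the 23°C / 58% targets from the example – the band widths, device names and return shape are assumptions for illustration, not SwitchBot's scene format:

```python
def climate_actions(temp_c, rh_pct, target_t=23.0, target_rh=58.0,
                    band_t=0.5, band_rh=2.0):
    # Decide which appliances a hub-style scene should switch, with a
    # small dead band around each setpoint so devices don't chatter
    # on and off as readings hover near the target.
    actions = {}
    if temp_c > target_t + band_t:
        actions["ac"] = "on"
    elif temp_c < target_t - band_t:
        actions["ac"] = "off"
    if rh_pct > target_rh + band_rh:
        actions["dehumidifier"], actions["humidifier"] = "on", "off"
    elif rh_pct < target_rh - band_rh:
        actions["humidifier"], actions["dehumidifier"] = "on", "off"
    return actions  # empty dict: everything is already in range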

    The SwitchBot Hub 3 feels like the most potent iteration of that “secret sauce” yet. It takes the individual brilliance of SwitchBot’s gadgets and, through enhanced sensory input and more tactile controls, truly deepens that crucial understanding between device, environment, and user. The best part? It plugs right into your smart home’s existing setup, communicating with your slew of IoT devices – even more efficiently if you’ve got a Hub 2 or Hub Mini and you’re looking to upgrade.
    Click Here to Buy Now: $119.99
    The post Your Smart Home Got a New CEO and It’s Called the SwitchBot Hub 3 first appeared on Yanko Design.
  • A two-pack of Blink Mini 2 security cameras is only $38 for Memorial Day

    Blink deals are back for Memorial Day, making it a good time to stock up on security cameras if you have an existing system or try out one of our favorites for the first time. A particularly great deal is on a two-pack of Blink Mini 2 wired security cams, which you can grab for just $38. The Blink Mini 2 made our list of the best security cameras, and we gave it kudos for being a great option for the budget-conscious. These are small cameras that are easy to place just about anywhere. You can adjust the head to accommodate a wide variety of angles and we found the setup to be painless and quick.

    The image quality is decent, though not astounding. The bright spots can get a bit blown out, but the camera gets the job done. We found that it was particularly useful at night, thanks to the infrared view and the spotlight. This model also offers the ability to set privacy zones and it can double as a doorbell chime if you happen to have a Blink Video Doorbell installed.
    The camera is weather resistant, so it works outside. It also integrates with Alexa, which makes sense as Amazon owns Blink. Alexa can be used to adjust settings, arm the camera and more.
    There's no location detection here, so it can't automatically arm the device when you leave the house. This has to be done manually. Also, it doesn't offer any cloud storage unless you pony up for a Blink subscription. A basic subscription for one camera costs $3 per month.
    Memorial Day sales also include other Blink bundles. If you're setting up a Blink system for the first time, you can grab a one-camera Blink Outdoor 4 system for half off, or only $50. That 50-percent discount is also available on multi-camera systems, including an expansive five-camera bundle, which is down to $200. Follow @EngadgetDeals on X for the latest tech deals and buying advice. This article originally appeared on Engadget at https://www.engadget.com/deals/a-two-pack-of-blink-mini-2-security-cameras-is-only-38-for-memorial-day-152339657.html?src=rss