Blender is the free and open source 3D creation suite. Free to use for any purpose, forever.
Recent Updates
NPR Project
May 23rd, 2025
Code Design, General Development
Clément Foucault
Wing it! Early NPR project by Blender Studio.
In July 2024 the NPR (Non-Photorealistic Rendering) project officially started, with a workshop with Dillo Goo Studio and Blender developers.
While the use cases were clear, the architecture and overall design were not. To help with this, the team started working on a prototype containing many shading features essential to the NPR workflow (such as filter support, custom shading, and AOV access).
This prototype received a lot of attention, with users contributing many nice examples of what is possible with such a system. The feedback showed that there is strong interest from the community in a wide range of effects.
However, the amount of flexibility made possible by the prototype came at a cost: it locked NPR features within EEVEE, alienating Cycles from part of the NPR pipeline. It also deviated from the EEVEE architecture, which could limit future feature development.
After much consideration, the design was modified to address these core issues. The outcome can be summarized as:
Move filters and color modification to a multi-stage compositing workflow.
Keep shading features inside the renderer’s material system.
Multi-stage compositing
One of the core features needed for NPR is the ability to access and modify the shaded pixels.
Doing it inside a render engine has been notoriously difficult. The current way of doing it inside EEVEE is to use the ShaderToRGB node, which comes with a lot of limitations. In Cycles, limited effects can be achieved using custom OSL nodes.
As a result, in production pipelines, this is often done through very cumbersome and time-consuming scene-wide compositing. The major downside is that all asset-specific compositing needs to be manually merged and managed inside the scene compositor.
Instead, the parts of the compositing pipeline that are specific to a certain asset should be defined at the asset level. The reasoning is that these compositing nodes define the appearance of the asset and should be shared between scenes.
Multi-stage compositing is just that! A part of the compositing pipeline is linked to a specific object or material. This part receives the rendered color as well as its AOVs and render passes as input, and outputs the modified rendered color.
The object-level compositor at the bottom right defines the final appearance of the object.
In this example the appearance of the Suzanne object is defined at the object level inside its asset file. When linked into a scene with other elements, it is automatically combined with other assets.
From left to right: Smooth Toon shading with alpha over specular, Pixelate, Half-Tone with Outline
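The flow described above can be sketched in plain Python on a tiny one-row grayscale "image". This is a minimal illustration of the idea only; the stage names (`toon_stage`, `scene_combine`) and the AOV/matte inputs are hypothetical, not Blender API.

```python
def toon_stage(color, aovs):
    """Object-level compositor stage: quantize shading into hard toon
    bands using the object's diffuse light AOV."""
    light = aovs["diffuse_light"]
    return [1.0 if l > 0.5 else 0.3 if l > 0.1 else 0.0 for l in light]

def scene_combine(base, styled, matte):
    """Scene-level compositor: alpha-over the stylized asset onto the
    rest of the frame using its object matte."""
    return [s * m + b * (1.0 - m) for b, s, m in zip(base, styled, matte)]

# One row of pixels: a flat background, with an object (lit from the
# left) covering the first three pixels.
background = [0.2, 0.2, 0.2, 0.2]
aovs = {"diffuse_light": [0.9, 0.6, 0.05, 0.0]}
matte = [1.0, 1.0, 1.0, 0.0]

styled = toon_stage(background, aovs)           # per-asset stylization
frame = scene_combine(background, styled, aovs and matte)
```

The key point is that `toon_stage` travels with the asset, while `scene_combine` stays in the scene file: linking the asset into any scene automatically applies its look.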
This new multi-stage compositing will reuse the compositor nodes, with a different subset of nodes available at the object and material levels. This is an opportunity to streamline the workflow between material node editing and compositor node editing.
Grease Pencil Effects can eventually be replaced by this solution.
Final render showing 3 objects with different stylizations seamlessly integrated.
There is a lot more to be said about this feature. For more details, see the associated development task.
Anti-Aliased output
A major issue when working with a compositing workflow is Anti-Aliasing (AA). When compositing anti-aliased input, results often include hard-to-resolve fringes.
Left: Render Pass, Middle: Object Matte, Right: Extracted Object Render Pass
The common workaround to this issue is to render at a higher resolution without AA and downscale after compositing. This method is very memory-intensive and only allows for 4x or 9x AA, usually with less than ideal filtering. Another option is to use post-process AA filters, but that often results in flickering animations.
Left: Anti-Aliasing done before compositor-based shading. Right: Anti-Aliasing done after the compositor.
The solution to this problem is to run the compositor for each AA step and filter the composited pixels like a renderer would do. This will produce the best image quality with only the added memory usage of the final frame.
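A tiny numeric sketch of why the ordering matters, assuming a hypothetical sharp compositor effect (a hard threshold) applied at an object edge. The sample values are made up for illustration.

```python
def threshold(x):
    """A sharp, non-linear compositor effect (e.g. a hard toon cut)."""
    return 1.0 if x > 0.5 else 0.0

# Four sub-pixel AA samples at an object edge: three hit the object
# (value 1.0), one hits the background (value 0.0).
samples = [1.0, 1.0, 1.0, 0.0]

# Compositing an already anti-aliased input: the filtered edge value
# (0.75) is pushed through the threshold, destroying the AA gradient.
after_aa = threshold(sum(samples) / len(samples))          # hard edge

# Running the compositor per AA sample, then filtering like a renderer:
# the edge keeps its proper 0.75 coverage.
per_sample = sum(threshold(s) for s in samples) / len(samples)
```

In this sketch `after_aa` collapses to a fully covered pixel while `per_sample` preserves the filtered edge, which is exactly the fringe-free behavior the per-sample approach aims for.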
Converged input
One of the main issues with modern renderers is that their output is noisy. This doesn’t play well with NPR workflows as many effects require applying sharp transformations of the rendered image or light buffers.
For instance, this is what happens when applying a constant-interpolated color ramp to the ambient occlusion node: the averaging operation runs on the ramp's noisy output, instead of the ramp running on an already averaged (converged) input.
Left: Original AO, Middle: Constant Ramp in material, Right: Ramp applied in compositor (desired)
Doing these effects at compositing time gives us the final converged image as input. However, as explained above, the compositor needs to run before the AA filtering.
So the multi-stage compositors need to be able to run on converged or denoised inputs while remaining AA-free. In other words, the render samples will be distributed between render pass convergence and final compositor AA.
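The noisy-input problem can be shown with a few numbers. Assuming a hard-cut ("constant") ramp and some made-up per-sample AO estimates, averaging after the transform does not equal transforming the converged average:

```python
def constant_ramp(ao):
    """A constant-interpolated color ramp: a hard cut on the AO value."""
    return 0.0 if ao < 0.5 else 1.0

# Hypothetical noisy per-sample AO estimates for one pixel whose
# converged AO value is 0.6.
noisy_ao = [0.2, 0.9, 0.4, 0.9]

# Material-time: the ramp runs per sample on noisy input, and the
# results are averaged. The hard edge is smeared by the noise.
in_material = sum(constant_ramp(a) for a in noisy_ao) / len(noisy_ao)

# Compositor-time: the ramp runs once on the converged average,
# producing the intended hard cut.
in_compositor = constant_ramp(sum(noisy_ao) / len(noisy_ao))
```

Because the ramp is non-linear, the two orderings disagree, which is why the multi-stage compositor wants converged or denoised passes as input.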
Engine Features
While improving the compositing workflow is important for stylization flexibility, some features are better suited to the inside of the render engine. This allows built-in interaction with light transport and other renderer features. These features are not exclusive to NPR workflows and fit well inside the engine architecture.
As such, the following features are planned to be directly implemented inside the render engines:
Ray Queries
Portal BSDF
Custom Shading
Depth Offset
The development will start after the Blender 5.0 release, planned for November 2025.
Meanwhile, to follow the project, subscribe to the development task. For more details about the project, join the announcement thread.
Support the Future of Blender
Donate to Blender by joining the Development Fund to support the Blender Foundation’s work on core development, maintenance, and new releases.
♥ Donate to Blender
Projects Update – Q2/2025
At the beginning of 2025, several projects were announced as the initial targets for the year. Now that we’re in the middle of the second quarter, let’s take a look at where each project stands.
Complete
Vulkan
Vulkan is now officially supported in the upcoming 4.5 LTS release, offering feature parity and comparable performance to the OpenGL backend.
The next step is to monitor bug reports and eventually make it the default backend for non-macOS systems.
Almost Complete
UV Sync
All issues from the UV Sync design task have been addressed. Development is ongoing in the pr-uv-sync-select branch.
The remaining work involves finalizing the port of certain selection operators and resolving minor issues. Follow the project at #136817.
Better integration across node trees
The compositor is moving closer to feature parity with shading and geometry nodes, thanks to the addition of new nodes such as Vector Math, Vector Rotate, Vector Mix, Value Mix, Clamp, Float Curve, and Blackbody.
Compositor Assets Mockup.
The next step is to expose most of the existing node options as socket inputs (#137223). This will enable compositor node assets to be bundled with Blender.
After that, the focus will remain on simplifying onboarding for new compositor users by making node trees reusable (#135223).
In Progress
Project Setup
The first milestone (Blender variables) was recently merged. The next step is to handle Path Template errors in a more robust way.
Project Setup Mockup.
After that, work will begin on the Project Definition phase. Follow the project at #133001.
Shape Keys Improvements
What was originally framed as a performance problem has shifted focus toward usability and management of shape keys.
As part of this, new operators for duplicating and updating shape keys have already been merged, with their mirrored counterparts to follow.
Additionally, work on multi-select and editing of shape keys is gaining momentum. Follow the project at #136838.
Remote Asset Libraries
The project has broadened in scope to address usability improvements, including:
Preview generation for all selected assets.
A more compact view with a horizontal list and two-line names.
Snapping for dragged collection assets.
Remote Asset Library mockup.
Meanwhile, the code for handling downloads has been submitted for review but encountered a setback.
Development is taking place in the remote-asset-library-monolithic branch. Follow the project at #134495.
Hair Dynamics
The Hair Dynamics project consists of multiple deliverables:
Embedded Linked Data (#133801) — still under review
Bundles and Closures — merged as experimental
Declarative Systems — published as a design proposal
Hair Solver — see below
For the hair (physics) solver, the plan is to use the upcoming Blender Studio project—currently unnamed but focused on a facial rig—as a use case, at least to develop it as an experimental feature.
This will also involve addressing existing issues with animated hair and integrating animation and simulation for the same character—initially using separate hair objects.
Design/Prototype
Texture cache and mipmaps
Initially unplanned due to limited resources, this project was eventually added to the agenda. A rudimentary prototype is already available in the cycles-tx branch. In the Attic and Bistro benchmark scenes, memory usage is already significantly reduced.
These scenes were chosen because they include texture cache (.tx) files. To learn how to test it, follow the project at #68917.
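A quick back-of-the-envelope sketch of why mipmapped texture caching saves memory: a full mip chain adds only about one third on top of the base level, and a tiled cache can load just the coarse levels a distant object actually samples. The sizes below are illustrative, not taken from the benchmark scenes.

```python
def mip_chain_pixels(size):
    """Total pixels in a full mip chain for a square size x size texture
    (each level halves the resolution, down to 1x1)."""
    total = 0
    while size >= 1:
        total += size * size
        size //= 2
    return total

base = 4096 * 4096              # pixels in the base level alone
full = mip_chain_pixels(4096)   # full chain: roughly 4/3 of the base

# A far-away asset might only ever sample the 256px level and below,
# so a tiled cache needs only a tiny fraction of the full chain:
needed = mip_chain_pixels(256)
```

The geometric series 1 + 1/4 + 1/16 + … converges to 4/3, which is where the "mipmaps cost ~33% extra" rule of thumb comes from; the real savings come from streaming only the required tiles and levels.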
NPR
The NPR prototype received extensive feedback, helping to map out all planned and unsupported use cases for the project.
More details, including the final design and development plans, will be shared soon.
In brief:
Some features will be implemented as EEVEE nodes.
Others will be enabled via per-material/object compositing nodes.
EEVEE features will be prioritized first, while the per-material/object compositing nodes require further design.
Story Tools
So far, the focus has been on prototyping and finalizing the design to pull VSE strips out of the scene and create a dedicated sequence data-block.
Story Tools Mockup.
The next step is to finalize the design by either:
Exploring once more the idea of keeping the sequence as part of the scene; or
Settling on a per-camera and file settings design.
After that, a technical breakdown will follow, then development. Follow the project at #131329.
Not Started
Layered Sculpting
Layered sculpting hasn’t started yet. The original plan was to first address multi-resolution undo and rebuild issues, followed by fixing propagation spikes.
However, in recent months the focus shifted to tackling sculpting performance issues present since the 4.3 release, mainly:
Performance problems with smaller brush strokes.
Local brush management.
The performance patches are currently under review and expected in time for the upcoming 4.5 LTS release. Once completed, work on undo and multi-resolution will resume.
Dynamic Overrides
The team is currently focused on the 5.0 breaking-change targets and other tasks, so it has not yet been able to start the initial changes aimed at simplifying the overrides process.
And more…
Beyond these projects, daily activity continues across various development modules. For a more frequent, day-to-day view of progress, check out the Weekly Updates and Module Meetings.
All this progress is made possible thanks to donations and ongoing community involvement and contributions.
Support the Future of Blender
Donate to Blender by joining the Development Fund to support the Blender Foundation’s work on core development, maintenance, and new releases.
♥ Donate to Blender
All this progress is made possible thanks to donations and ongoing community involvement and contributions. Support the Future of Blender Donate to Blender by joining the Development Fund to support the Blender Foundation’s work on core development, maintenance, and new releases. ♥ Donate to Blender0 Comments 0 Shares -
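To see why the texture cache and mipmaps project can cut memory so much, here is a back-of-the-envelope sketch (not the cycles-tx implementation; the helper name is illustrative): a full mip pyramid only adds about a third over the base level, so a cache that streams in just the coarser levels a render actually samples can skip the huge base levels entirely.

```python
def mip_chain_bytes(width, height, channels=4, bytes_per_channel=1):
    """Total bytes for a full mip pyramid of a 2D texture.

    Each level halves both dimensions until 1x1 is reached.
    Illustrative helper only, not part of Blender or Cycles.
    """
    total = 0
    w, h = width, height
    while True:
        total += w * h * channels * bytes_per_channel
        if w == 1 and h == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
    return total

# A 4K RGBA8 texture: base level is 64 MiB, the whole pyramid ~85 MiB,
# but sampling only from the 512x512 level downward needs ~1.3 MiB.
full = mip_chain_bytes(4096, 4096)
coarse_only = mip_chain_bytes(512, 512)
```

The ratio `coarse_only / full` is roughly 1.5%, which is the kind of saving a mip-aware texture cache can reach when a texture only appears small on screen.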
Frame Node Improvements
May 19th, 2025
General Development
Jacques Lucke
Larger node setups have a tendency to become hard to understand. One of the main tools at our disposal to organize them is frames. A frame allows grouping nodes together visually and giving them a label.
Unfortunately, using frames extensively has been rather annoying for various reasons, including hard-to-reach shortcuts (Ctrl+J, Alt+P, F2), bad readability of nested frames, and various smaller bugs.
In an effort to encourage and help users to build more understandable node groups, we designed and implemented various workflow improvements for dealing with frames. These will be available in Blender 4.5 LTS.
The F Key
The most important functionality related to frames has been moved to the F key, which is easy to reach and remember. It has two main functions:
Create a new Frame around the selected nodes, or a new empty frame. This will also open the rename popup where users can start typing right away to set a label. This makes creating labeled frames as easy as it can be, which is good, as labeled frames help much more with readability than unlabeled ones.
Detach and attach selected nodes while transforming them. This makes moving nodes between frames a breeze, unlike before where frames felt like they got in the way.
The old functionality has been moved to J.
Visualization Improvements
Frame Highlighting: It used to be hard to tell which frame nodes would be attached to while moving them. This has been solved by highlighting the border of the frame under the cursor.
Better Labels: Frame labels now sit a bit higher so they’re easier to read. The frame size also adjusts better by considering the full area/bounding box of the nodes inside.
Alternating Frame Colors: Frames are quite dark in the default theme, which made nested frames very hard to see (essentially black on black). Now, nested frames use alternating shading depending on the nesting depth. The frame color from the theme is used, just slightly darker/lighter. This keeps all frames visible regardless of the nesting level.
Frame visualization improvements.
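The alternating shading rule above can be modeled with a toy function (this is an illustration of the idea only, not Blender's actual drawing code; `frame_shade` and the single brightness channel are assumptions):

```python
def frame_shade(base: float, depth: int, delta: float = 0.05) -> float:
    """Shade for a frame at a given nesting depth.

    Alternates slightly lighter/darker around the theme brightness
    so adjacent nesting levels always contrast with each other.
    Result is clamped to the valid [0, 1] range.
    """
    offset = delta if depth % 2 == 0 else -delta
    return min(1.0, max(0.0, base + offset))

# With a dark theme color of 0.2, depths 0..3 alternate between two shades.
shades = [frame_shade(0.2, d) for d in range(4)]
```

The key property is that a frame and its direct parent never get the same shade, so a frame is visible no matter how deeply it is nested.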
“Frame First” Workflow
With all of these improvements, working with frames has become much more enjoyable than before. Our hope is that this lowers the barrier to using frames so much that they are used more frequently, and not just added as an afterthought during later cleanup.
In fact, I’d even be curious to see how well a “frame first” workflow would work for people. The thought came up when thinking about how when writing code, one usually first writes a function or variable name before writing any of the actual logic. Maybe something similar could work for node systems too, where a labeled frame is created before the first node is added inside. However, it’s not clear yet how well that works in practice right now or if more features would be needed to make that feasible. Let me know!
Try It!
Download the latest Blender 4.5 LTS Alpha build and test out the new workflow!
Support the Future of Blender
Donate to Blender by joining the Development Fund to support the Blender Foundation’s work on core development, maintenance, and new releases.
♥ Donate to Blender
#frame #node #improvements
-
Blender at Annecy 2025
May 13th, 2025
Press Releases
Fiona Cohen
As is now tradition, a small delegation of the Blender team will attend the Annecy festival, June 9-13.
This year, though, there will be no booth, but a bigger gathering for people who want to connect.
Blender for Breakfast – new formula
Following the success of the last two years, Blender is going to host a “studios” breakfast session (Birds of a Feather-style), this time as part of the official program and in a bigger venue within the Festival-MIFA.
Focused on connecting producers, developers, TDs and artists working in studios where Blender is part of the pipeline, the goal is to share ongoing Blender development, Blender Studio pipeline insights, and give visibility to teams who wish to share their experience using Blender in production.
All while being treated to delicious croissants and coffee!
The event will take place on Tuesday, June 10 at 09:00 at Le Campus Mifa – Salle 1.
Limited spots are available and you need an accreditation to enter the Campus – make sure you register if you are interested, and we’ll send confirmations closer to the event.
See you there!
Source: https://www.blender.org/press/blender-at-annecy-2025/
#blender #annecy
-
Declarative Systems in Geometry Nodes
May 2nd, 2025
General Development
Jacques Lucke
Currently, one of our primary goals for Geometry Nodes is to get to a point where we can build high-level and easy-to-use node group assets for physics simulations. Our general approach was described briefly last year. The goal of this document is to go into a bit more detail. For the examples below I’ll use a particle simulation because it’s the easiest, but the same applies to many other simulation types like hair, cloth, and fluids.
Two Different Views
There are two different ways to look at a particle simulation:
Declarative view: Describe the intended behavior of the particles by combining various existing behaviors like emitters, forces, colliders, surface attachment, and more.
Imperative view: Describe the computation steps to simulate the particles. Such steps are, for example, “integrate forces”, “update positions”, “find collision points”, etc.
Both are totally valid views of a particle simulation, but depending on what you’re trying to achieve, one view is more practical than the other. Most particle simulations can be described using a combination of various behaviors, in which case the declarative view is more practical. But when you’re building a particle system from scratch or have custom needs, the imperative approach works better, because it has more flexibility at the cost of dealing with low-level details.
Current State
Geometry Nodes is already quite decent at building particle systems using the imperative approach. Obviously, there are various missing built-in features to really be able to tackle any kind of simulation, but the overall structure is there.
Building particle systems using the declarative approach by combining a set of behaviors is much harder, because it requires creating node groups that describe a certain behavior, which are then passed to a “solver” that performs the actual computations in the right order.
Goal
We are working towards various core features (bundles, closures, lists) that together will allow building declarative systems. The image below shows how individual behaviors could be encapsulated by node groups. All behaviors have the same type, which allows easily combining them to form higher-level behaviors. Also note that the order in which they are passed to the particle solver is largely irrelevant.
This approach to building particle solvers allows someone to focus on building the actual solver that exposes a specific interface, while anyone else can just build and combine behaviors for that solver without worrying too much about how it works internally.
Since all behavior node groups have a very similar structure (various inputs, one behavior output), we can also build a higher-level UI for managing them. For example, if the Particle Solver node group is used as a modifier, the UI could look like the following mockup. Note that each panel corresponds to one behavior and that an arbitrary number of behaviors can be added. The individual behaviors would be node group assets coming from an asset library.
Of course one always has more flexibility in the node editor, but this kind of UI should be capable of solving the most common needs without requiring the node editor at all.
Lastly, this approach of using declarative systems also simplifies creating add-ons that have a custom high-level UI for some node tree. That’s because it’s way easier to write code that merges a few behaviors together automatically than it is to edit potentially deeply nested node groups.
Beyond Simulations
As mentioned, this behavior-based approach works well for all kinds of physics simulations.
However, using declarative systems also makes sense in other cases where the user thinks more about the what and not the how. One example of such a system is Line Art. A potential Line Art node is supposed to generate curves based on various features (e.g. silhouette, non-occluded edges, and intersections). Here again, the user does not really care about how the curves are computed exactly, but only what kind of curves are expected. One could build node groups describing the different kinds of curves and pass these to the Line Art node, which then outputs all the requested curves. Something similar could be done for landscape generation, where behaviors encapsulate terrain features or asset scattering rules.
There is always a bit of a trade-off when choosing between the imperative and declarative approach, but for high-level tools used by potentially millions of people, the declarative approach is often better, because it significantly reduces the cognitive load at the cost of a little less flexibility.
How do Bundles and Closures fit in?
Bundles and closures are currently an experimental feature. They have many interesting uses on their own, but the primary reason we are working on them is to allow building these declarative systems. You can check out the pull request for more details, but in short: a Bundle allows passing multiple values along a single link, and a Closure allows passing around functions that can be evaluated anywhere.
Each individual behavior will generally be encapsulated as a bundle. The exact structure of that bundle is determined by the solver that uses the behavior. For example, a very basic mesh particle emitter could look like in the image below. Note that the Type in the bundle will be used by the solver to figure out what kind of behavior this is. The solver will call the Emit callback to actually create the new particles. The inputs and outputs of the callback are also determined by the particle solver.
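As an analogy only (plain Python, not Blender's node API; the names `mesh_emitter`, `gravity`, and `solve` are illustrative), the bundle-plus-closure pattern can be sketched like this: each behavior is a small bundle (a dict) carrying a type tag and closures, and the solver dispatches on the tag, so behavior order never matters.

```python
# Conceptual sketch of the proposed bundle/closure design in plain Python.
# A "bundle" is modeled as a dict; a "closure" as a Python callable.

def mesh_emitter(count):
    """Emitter behavior: the solver calls the 'emit' closure each step.

    The closure's inputs and outputs are dictated by the solver."""
    return {
        "type": "emitter",
        "emit": lambda step: [
            {"position": (0.0, 0.0, 0.0), "velocity": (0.0, 0.0, 0.0)}
            for _ in range(count)
        ],
    }

def gravity(strength):
    """Force behavior: maps a particle to an acceleration vector."""
    return {"type": "force", "accel": lambda particle: (0.0, 0.0, -strength)}

def solve(behaviors, steps, dt=1.0):
    """Minimal solver: only the type tags matter, not the behavior order."""
    particles = []
    for step in range(steps):
        # Run all emitters, then integrate all forces (explicit Euler).
        for b in behaviors:
            if b["type"] == "emitter":
                particles.extend(b["emit"](step))
        for p in particles:
            ax = ay = az = 0.0
            for b in behaviors:
                if b["type"] == "force":
                    fx, fy, fz = b["accel"](p)
                    ax, ay, az = ax + fx, ay + fy, az + fz
            vx, vy, vz = p["velocity"]
            p["velocity"] = (vx + ax * dt, vy + ay * dt, vz + az * dt)
            px, py, pz = p["position"]
            p["position"] = (px + p["velocity"][0] * dt,
                             py + p["velocity"][1] * dt,
                             pz + p["velocity"][2] * dt)
    return particles

particles = solve([gravity(9.8), mesh_emitter(4)], steps=2)
```

Because every behavior has the same shape, `solve([mesh_emitter(4), gravity(9.8)], steps=2)` gives the same result; this order-independence is exactly what would let behaviors be freely combined, or managed from a higher-level panel UI, without knowing how the solver works internally.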
Closures are a fundamental feature to make this whole system work, while at the same time being a fairly niche feature that most people will not have to know about. Most people will just use systems that heavily rely on closures under the hood to provide a good high-level user interface.
Feedback can be posted in the corresponding forum thread.
-
Reinforcing the Blender Trademark Guidelines
April 18th, 2025
News
Francesco Siddi
As Blender grows in popularity, the number of products and projects using “Blender” as part of their name is growing as well. While there are some short-term benefits to incorporating the “Blender” name in a product, there are often unintended consequences when it reaches a wider audience. For example:
An extension becomes popular and starts charging a fee to download. Using Blender as part of the brand could imply a relationship with, or endorsement from, the Blender project.
A community website or service starts being considered official or affiliated with the project because it contains the Blender name, when in reality the project is fully independent.
With the introduction of the Blender Extensions platform, branding issues become more obvious, as add-ons that have “Blender” in their name could appear right inside the software, creating confusion.
As a general branding guideline for products or services, we recommend that third parties avoid using the name “Blender”, especially at the start of the name. Blender Foundation wishes to distinguish official products or services that way. For example: Blender Conference, Blender Extensions, Blender ID. This is explained in detail on the Trademark Policy page.
While we understand that using the name “Blender” in a product gives quick recognition and short-term benefits, in the mid and long term, developing a unique brand of your own will always be more beneficial. It allows you to trademark it, promote it, and expand it beyond Blender itself. Besides that, brand diversity in the Blender ecosystem is highly desirable, as it reflects the amount of independent business happening around Blender.
This is what happened with Blender Market. At the time (15 years ago), the idea of creating a commercial Blender-focused marketplace was truly pioneering, and that initiative was welcomed and supported by Blender Foundation. Initially, it was helpful for the market to bear Blender’s name, as it helped with discoverability. Today, not so much, as it often gets mistaken for Blender’s actual marketplace. Blender does not have plans to create an add-on marketplace, focusing instead on the free extensions platform, so it becomes important to set a distinction between the two.
Last year, Autotroph (the organization behind Blender Market) announced the decision to rebrand the marketplace to “Superhive”, in an effort to align with the Blender trademark policies. The rebrand is now in place, and this is something that Blender Foundation highly appreciates. Existing and new products featuring the Blender name are encouraged to follow Blender Market’s steps.
We understand that websites and products that have already used the name Blender for a long time will not find it practical or desirable to rename. Blender Foundation does not intend to seek legal or public action in such cases of trademark violation. However, Blender Foundation does reserve the right to not advertise confusingly named Blender products on the extensions platform.
For more information, see blender.org/about/trademark-policy.
-
Blender 4.4 Release
March 18th, 2025
Press Releases
Pablo Vazquez
Blender Foundation and the online developer community proudly present Blender 4.4!
Splash artwork: Flow, by Dream Well Studio, Sacrebleu Productions, Take Five. Image licensed under CC-BY-SA, https://flow.movie/
Featuring artwork from Flow, the Oscar-winning film entirely made in Blender. Read the user story with director Gints Zilbalodis.
Showcase Reel
What’s New
Blender 4.4 focuses on stability, with over 700 issues tackled through the Winter of Quality. Read more on the Code blog.
Major improvements include a new way to pack multiple data-block animations into one Action using Slots, a full CPU rewrite for the Compositor, Grease Pencil stabilization and feature parity, Video Sequencer text editing improvements, and, as always, better performance and a more refined user interface to enhance the user experience.
Watch the video summary on Blender’s YouTube channel.
Explore the release notes for an in-depth look at what’s new!
Thank you!
This work is made possible thanks to the outstanding contributions of the Blender community, and the support of the over 7,300 individuals and 35 organizations contributing to the Blender Development Fund.
Happy Blending!
The Blender Team
March 18th, 2025
Support the Future of Blender
Donate to Blender by joining the Development Fund to support the Blender Foundation’s work on core development, maintenance, and new releases.
♥ Donate to Blender
-
Blender at GDC 2025
March 11th, 2025
Events
Francesco Siddi
A small delegation of the Blender team (Dalai Felinto and Francesco Siddi) is going to attend the Game Developers Conference (GDC) in San Francisco on 17-21 March, with the main goal of connecting with individuals and teams using Blender to create interactive content.
This year we are also organizing a Blender Meetup event (Birds of a Feather-style), focused on connecting producers, developers, TDs and artists working in studios where Blender is part of the pipeline. The goal is to share ongoing Blender development, Blender Studio pipeline insights, and give visibility to teams who wish to share their experience using Blender in production.
This time we go for an informal setting on Tuesday, 18 March at 10:00 AM at the Yerba Buena Gardens (in front of the Moscone conference center). Look for the Blender flag, bring your own coffee, and see you there!
If you want to connect outside the event, reach out to francesco at blender.org.
-
CODE.BLENDER.ORGWinter of Quality 2025html PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN" "http://www.w3.org/TR/REC-html40/loose.dtd"During the 20242025 winter (northern hemisphere), Blender developers focused on quality and stability. This blog post offers an overview of the work accomplished.BugfixesBetween December 1 and January 31, 2025, more than 500 reported issues were fixed.High severity bugs since January 1st 2025Here is a per-module breakdown of the fixed reports:Animation & Rigging: 24Asset System: 6Core: 10Grease Pencil: 82Modeling: 33Nodes & Physics: 71Pipeline & IO: 9Python API: 3Render & Cycles: 37Sculpt, Paint & Texture: 31User Interface: 82VFX & Video: 40Viewport & EEVEE: 76Additionally, many old reports have been double-checked and either fixed or closed (for example, if the issue could not be reproduced in a recent Blender version). Issues that were not reported by users were also addressed.Module Work OverviewIn addition to fixing bugs, developers also spent time tackling technical debt, updating documentation, and stabilizing certain areas of the code. Lets take a look at the reports from each module.Animation & RiggingMultiple bugs have been closed (either fixed or re-investigated and closed).Pose Library:Added support for slotted actions, including multi-object poses.Added support for pushing to the pose library.New tools for slotted actions have been implemented.Switching assigned actions is now smoother.Library Overrides for slotted actions have improved.Code documentation is much better, APIs have been polished, and the new animation system is generally easier to work with for future development.CompositorThe CPU compositor rewrite has been completed, aiming to improve maintainability by unifying the CPU and GPU compositors. In the process: Tens of thousands of lines of code were removed. Performance of many operations was improved. 
- The behavior of CPU and GPU devices was unified.

The main benefit of this rewrite is that future development of the compositor will be faster and easier.

Grease Pencil

The focus was on stabilization following the significant Grease Pencil v3 changes introduced in Blender 4.3. A total of 91 bug fixes were committed, 86 of which were for high-severity bugs, many addressing regressions.

Modeling

The focus was on fixing crashes and older unresolved bugs. 17 issues were addressed, including several long-standing bugs and 8 crashes. In total, 74 issues were closed.

Nodes & Physics

The team kicked off the quality project by developing a tool to categorize all module issues and split them into batches. This tool made the project feel far less daunting, as they were only tackling 20 issues per day instead of 700.

The focus was on addressing reports for active projects (rather than end-of-life features like the particle system or inactive areas such as the cloth modifier). In total, they closed about 150 issues and resolved several long-standing, frustrating problems.

They also made time for some important refactors, including:
- Reducing the number of changes required to add a new node.
- Starting a replacement for the CustomData system.
- Implementing various smaller improvements.

Render & Cycles

The team improved test coverage, fixed around 40 bugs, and cleaned up the code to better align with modern C++ standards.

Sculpt, Texture & Paint

The team focused on bug fixes, as well as improving testing and documentation.

Issues:
- Closed 25 high-severity reports.
- Followed up on issues tagged with "Need Information from Developers".
- Closed 57 additional issues (Design, To Do, and Bugs).
- Created a tracking issue for Sculpt/Paint undo problems.

Testing & Documentation:
- Added a BVH building unit test.
- Work is ongoing for a sculpt stroke + render test.
- Added a technical documentation overview for mesh painting.

USD (Universal Scene Description)

The team focused on a variety of tasks, including adding additional test coverage, completing partially implemented features, and upgrading the user manual.

They also investigated, fixed, and prototyped performance improvements for USD import, which were also useful in identifying other areas of Blender that should be improved.

User Interface

The team focused on reducing the number of UI issues to a more manageable state. Out of more than 1000 open issues, 275 were closed.

Viewport & EEVEE

The team migrated the overlay, selection, and image engines to the new drawing manager API. This migration fixes known, long-standing issues with overlays, and it paves the way for future optimizations, including the removal of a global lock/freeze.

The team also focused on improving test cases, fixing high-severity issues, and improving platform support.

Video Sequence Editor

The team focused on bug reports, code cleanups, and refactors:
- The Sequence -> Strip rename in the codebase (and to some extent the Python API) makes the code less confusing.
- All movie read/write-related C++ code is now in imbuf/movie, with clearer naming and improved structure.
- The code for VSE effects and modifiers was cleaned up and modernized to align more with C++ standards, resulting in a slight performance boost.
- Retiming code received fixes, cleanups, and improved developer documentation.
- Several UI interactions were polished for better usability.

Future Quality Projects

After a year of exciting new features like EEVEE-Next and Grease Pencil v3, the Blender Winter of Quality was the perfect opportunity to tackle loose ends and enhance stability.

While quality is always a core focus of new development and part of the daily work of module teams, having a dedicated period to focus solely on it was great.
As a result, future quality projects are being considered, and feedback is currently being gathered to learn about the experience for developers and to identify areas for improvement.

Support the Future of Blender

Donate to Blender by joining the Development Fund to support the Blender Foundation's work on core development, maintenance, and new releases.
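The triage tool mentioned in the Nodes & Physics section is not described in detail, but the core idea of splitting a 700-issue backlog into fixed-size daily batches can be sketched in a few lines of Python (the function name and the batch size of 20 are taken from the post; everything else is a hypothetical illustration):

```python
def split_into_batches(issues, batch_size=20):
    """Split a backlog of issues into fixed-size batches,
    one batch per day of triage work."""
    return [issues[i:i + batch_size] for i in range(0, len(issues), batch_size)]

# With roughly 700 open issues and 20 issues per day,
# the backlog fits into 35 daily batches.
backlog = [f"issue-{n}" for n in range(700)]
batches = split_into_batches(backlog)
print(len(batches))  # 35
```

A similar helper exists in the standard library as itertools.batched (Python 3.12+), which yields tuples instead of lists but follows the same chunking behavior.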