Per-Object Motion Blur #9924
Merged
Changes from all commits (46 commits)
f06f766  Motion blur implementation (aevyrie)
7a4a86a  Updated example readme (aevyrie)
94425b6  remove unused imports from shader (aevyrie)
d44d366  CI nit (aevyrie)
82aed62  fix for web and improve noise (aevyrie)
0b04699  improve example (aevyrie)
81d5e32  fix system ordering ambiguity (aevyrie)
6e42f15  More example improvements (aevyrie)
2a4f346  motion based sample filtering (aevyrie)
1efce9d  Improve naming (aevyrie)
f20bccf  Tweak example and defaults (aevyrie)
e318c86  fix offset error (aevyrie)
cd59ff5  Add more context for motion filtering (aevyrie)
7e484ce  add depth and vel checks (aevyrie)
c1de46b  Merge remote-tracking branch 'origin/main' into motion-blur (aevyrie)
fe2b514  Fix merge errors (aevyrie)
6b221b6  reduce artifacts (aevyrie)
6f342c0  allow single sample setting (aevyrie)
e8d79b7  shader docs (aevyrie)
5b6c845  simplify shader and fix artifacts from discontinuity (aevyrie)
e49f71d  Remove dependency on depth prepass (aevyrie)
7540805  Fix warning in release builds (aevyrie)
4268f88  Fix artifacts caused by missing samples (aevyrie)
f27d0be  improve docs and defaults (aevyrie)
f243e8b  Blur bg samples to fix under blur (aevyrie)
23e4f9e  Documentation and review feedback (aevyrie)
783f368  review feedback (aevyrie)
1e0fdc4  Remove use of bevy_internal (aevyrie)
975ca29  Improve demo appearance (aevyrie)
6c4ee36  Merge remote-tracking branch 'origin/main' into motion-blur (aevyrie)
d2cdb06  Fix for renaming ViewQuery to ViewData (aevyrie)
51fb2d3  Cleanup (aevyrie)
65586be  Review feedback (aevyrie)
ed2c1c7  Merge branch 'main' into motion-blur (aevyrie)
9d5aa8b  review feedback (aevyrie)
8be1d96  Avoid running nodes with no blur (aevyrie)
3067923  improve example (aevyrie)
79ec4c2  Merge branch 'main' into motion-blur (aevyrie)
5c117ed  Merge remote-tracking branch 'upstream/main' into motion-blur (aevyrie)
e917032  Add moving camera toggle to example (aevyrie)
cb7c5ca  Shorten example descriptions (aevyrie)
46a2b55  Merge branch 'main' into motion-blur (aevyrie)
3aa4799  update example (aevyrie)
8a642ad  Use generated example readme (aevyrie)
d335565  Review feedback (aevyrie)
673e90d  Merge branch 'main' into motion-blur (aevyrie)
@@ -26,6 +26,7 @@ pub mod graph {
         MainTransparentPass,
         EndMainPass,
         Taa,
+        MotionBlur,
         Bloom,
         Tonemapping,
         Fxaa,
@@ -0,0 +1,168 @@
//! Per-object, per-pixel motion blur.
//!
//! Add the [`MotionBlurBundle`] to a camera to enable motion blur. See [`MotionBlur`] for more
//! documentation.

use crate::{
    core_3d::graph::{Core3d, Node3d},
    prepass::{DepthPrepass, MotionVectorPrepass},
};
use bevy_app::{App, Plugin};
use bevy_asset::{load_internal_asset, Handle};
use bevy_ecs::{
    bundle::Bundle, component::Component, query::With, reflect::ReflectComponent,
    schedule::IntoSystemConfigs,
};
use bevy_reflect::{std_traits::ReflectDefault, Reflect};
use bevy_render::{
    camera::Camera,
    extract_component::{ExtractComponent, ExtractComponentPlugin, UniformComponentPlugin},
    render_graph::{RenderGraphApp, ViewNodeRunner},
    render_resource::{Shader, ShaderType, SpecializedRenderPipelines},
    Render, RenderApp, RenderSet,
};

pub mod node;
pub mod pipeline;

/// Adds [`MotionBlur`] and the required depth and motion vector prepasses to a camera entity.
#[derive(Bundle, Default)]
pub struct MotionBlurBundle {
    pub motion_blur: MotionBlur,
    pub depth_prepass: DepthPrepass,
    pub motion_vector_prepass: MotionVectorPrepass,
}

/// A component that enables and configures motion blur when added to a camera.
///
/// Motion blur is an effect that simulates how moving objects blur as they change position during
/// the exposure of film, a sensor, or an eyeball.
///
/// Because rendering simulates discrete steps in time, we use per-pixel motion vectors to estimate
/// the path of objects between frames. This kind of implementation has some artifacts:
/// - Fast moving objects in front of a stationary object, or in front of empty space, will not
///   have their edges blurred.
/// - Transparent objects do not write to depth or motion vectors, so they cannot be blurred.
///
/// Other approaches, such as *A Reconstruction Filter for Plausible Motion Blur*, produce more
/// correct results but are more expensive and complex, and have other kinds of artifacts. This
/// implementation is relatively inexpensive and effective.
///
/// # Usage
///
/// Add the [`MotionBlur`] component to a camera to enable and configure motion blur for that
/// camera. Motion blur also requires the depth and motion vector prepasses, which can be added
/// more easily to the camera with the [`MotionBlurBundle`].
///
/// ```
/// # use bevy_core_pipeline::{core_3d::Camera3dBundle, motion_blur::MotionBlurBundle};
/// # use bevy_ecs::prelude::*;
/// # fn test(mut commands: Commands) {
/// commands.spawn((
///     Camera3dBundle::default(),
///     MotionBlurBundle::default(),
/// ));
/// # }
/// ```
#[derive(Reflect, Component, Clone, ExtractComponent, ShaderType)]
#[reflect(Component, Default)]
#[extract_component_filter(With<Camera>)]
pub struct MotionBlur {
    /// The strength of motion blur from `0.0` to `1.0`.
    ///
    /// The shutter angle describes the fraction of a frame that a camera's shutter is open and
    /// exposing the film/sensor. For 24fps cinematic film, a shutter angle of 0.5 (180 degrees) is
    /// common. This means that the shutter was open for half of the frame, or 1/48th of a second.
    /// The lower the shutter angle, the less exposure time and thus less blur.
    ///
    /// A value greater than one is non-physical and results in an object's blur stretching further
    /// than it traveled in that frame. This might be a desirable effect for artistic reasons, but
    /// consider allowing users to opt out of it.
    ///
    /// This value is intentionally tied to framerate to avoid the aforementioned non-physical
    /// over-blurring. If you want to emulate a cinematic look, your options are:
    /// - Framelimit your app to 24fps, and set the shutter angle to 0.5 (180 deg). Note that,
    ///   depending on artistic intent or the action of a scene, it is common to set the shutter
    ///   angle between 0.125 (45 deg) and 0.5 (180 deg). This is the most faithful way to
    ///   reproduce the look of film.
    /// - Set the shutter angle greater than one. For example, to emulate the blur strength of
    ///   film while rendering at 60fps, you would set the shutter angle to `60/24 * 0.5 = 1.25`.
    ///   Note that this will result in artifacts where the motion of objects stretches further
    ///   than they moved between frames; users may find this distracting.
    pub shutter_angle: f32,
    /// The quality of motion blur, corresponding to the number of per-pixel samples taken in each
    /// direction during blur.
    ///
    /// Setting this to `1` results in each pixel being sampled once in the leading direction, once
    /// in the trailing direction, and once in the middle, for a total of 3 samples (`1 * 2 + 1`).
    /// Setting this to `3` will result in `3 * 2 + 1 = 7` samples. Setting this to `0` is
    /// equivalent to disabling motion blur.
    pub samples: u32,
    #[cfg(all(feature = "webgl", target_arch = "wasm32"))]
    // WebGL2 structs must be 16 byte aligned.
    pub _webgl2_padding: bevy_math::Vec3,
}

impl Default for MotionBlur {
    fn default() -> Self {
        Self {
            shutter_angle: 0.5,
            samples: 1,
            #[cfg(all(feature = "webgl", target_arch = "wasm32"))]
            _webgl2_padding: bevy_math::Vec3::default(),
        }
    }
}

pub const MOTION_BLUR_SHADER_HANDLE: Handle<Shader> =
    Handle::weak_from_u128(987457899187986082347921);

/// Adds support for per-object motion blur to the app. See [`MotionBlur`] for details.
pub struct MotionBlurPlugin;
impl Plugin for MotionBlurPlugin {
    fn build(&self, app: &mut App) {
        load_internal_asset!(
            app,
            MOTION_BLUR_SHADER_HANDLE,
            "motion_blur.wgsl",
            Shader::from_wgsl
        );
        app.add_plugins((
            ExtractComponentPlugin::<MotionBlur>::default(),
            UniformComponentPlugin::<MotionBlur>::default(),
        ));

        let Some(render_app) = app.get_sub_app_mut(RenderApp) else {
            return;
        };

        render_app
            .init_resource::<SpecializedRenderPipelines<pipeline::MotionBlurPipeline>>()
            .add_systems(
                Render,
                pipeline::prepare_motion_blur_pipelines.in_set(RenderSet::Prepare),
            );

        render_app
            .add_render_graph_node::<ViewNodeRunner<node::MotionBlurNode>>(
                Core3d,
                Node3d::MotionBlur,
            )
            .add_render_graph_edges(
                Core3d,
                (
                    Node3d::EndMainPass,
                    Node3d::MotionBlur,
                    Node3d::Bloom, // we want blurred areas to bloom and tonemap properly.
                ),
            );
    }

    fn finish(&self, app: &mut App) {
        let Some(render_app) = app.get_sub_app_mut(RenderApp) else {
            return;
        };

        render_app.init_resource::<pipeline::MotionBlurPipeline>();
    }
}
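A minimal usage sketch (not part of this diff) based on the doc comments above: to approximate a 24fps cinematic look while rendering at 60fps, set the shutter angle to `60/24 * 0.5 = 1.25` and raise the sample count. The `setup` system name and its wiring into an app are illustrative assumptions; the types come from the module shown above.

```rust
use bevy_core_pipeline::{
    core_3d::Camera3dBundle,
    motion_blur::{MotionBlur, MotionBlurBundle},
};
use bevy_ecs::prelude::*;

// Hypothetical startup system: spawn a camera with motion blur tuned per the
// `shutter_angle` documentation above.
fn setup(mut commands: Commands) {
    commands.spawn((
        Camera3dBundle::default(),
        MotionBlurBundle {
            motion_blur: MotionBlur {
                // Emulates a 180-degree shutter at 24fps while rendering at 60fps.
                // Values above 1.0 are non-physical and may visibly over-stretch motion.
                shutter_angle: 1.25,
                // 2 samples in each direction plus the center: 2 * 2 + 1 = 5 samples.
                samples: 2,
                ..Default::default()
            },
            ..Default::default()
        },
    ));
}
```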
crates/bevy_core_pipeline/src/motion_blur/motion_blur.wgsl (149 additions, 0 deletions)
@@ -0,0 +1,149 @@
#import bevy_pbr::prepass_utils
#import bevy_pbr::utils
#import bevy_core_pipeline::fullscreen_vertex_shader::FullscreenVertexOutput
#import bevy_render::globals::Globals

#ifdef MULTISAMPLED
@group(0) @binding(0) var screen_texture: texture_2d<f32>;
@group(0) @binding(1) var motion_vectors: texture_multisampled_2d<f32>;
@group(0) @binding(2) var depth: texture_depth_multisampled_2d;
#else
@group(0) @binding(0) var screen_texture: texture_2d<f32>;
@group(0) @binding(1) var motion_vectors: texture_2d<f32>;
@group(0) @binding(2) var depth: texture_depth_2d;
#endif
@group(0) @binding(3) var texture_sampler: sampler;
struct MotionBlur {
    shutter_angle: f32,
    samples: u32,
#ifdef SIXTEEN_BYTE_ALIGNMENT
    // WebGL2 structs must be 16 byte aligned.
    _webgl2_padding: vec3<f32>
#endif
}
@group(0) @binding(4) var<uniform> settings: MotionBlur;
@group(0) @binding(5) var<uniform> globals: Globals;

@fragment
fn fragment(
    #ifdef MULTISAMPLED
    @builtin(sample_index) sample_index: u32,
    #endif
    in: FullscreenVertexOutput
) -> @location(0) vec4<f32> {
    let texture_size = vec2<f32>(textureDimensions(screen_texture));
    let frag_coords = vec2<i32>(in.uv * texture_size);

#ifdef MULTISAMPLED
    let base_color = textureLoad(screen_texture, frag_coords, i32(sample_index));
#else
    let base_color = textureSample(screen_texture, texture_sampler, in.uv);
#endif

    let shutter_angle = settings.shutter_angle;

#ifdef MULTISAMPLED
    let this_motion_vector = textureLoad(motion_vectors, frag_coords, i32(sample_index)).rg;
#else
    let this_motion_vector = textureSample(motion_vectors, texture_sampler, in.uv).rg;
#endif

#ifdef NO_DEPTH_TEXTURE_SUPPORT
    let this_depth = 0.0;
    let depth_supported = false;
#else
    let depth_supported = true;
#ifdef MULTISAMPLED
    let this_depth = textureLoad(depth, frag_coords, i32(sample_index));
#else
    let this_depth = textureSample(depth, texture_sampler, in.uv);
#endif
#endif

    // The exposure vector is the distance that this fragment moved while the camera shutter was
    // open. This is the motion vector (total distance traveled) multiplied by the shutter angle (a
    // fraction). In film, the shutter angle is commonly 0.5 or "180 degrees" (out of 360 total).
    // This means that for a frame time of 20ms, the shutter is only open for 10ms.
    //
    // Using a shutter angle larger than 1.0 is non-physical: objects would need to move further
    // than they physically travelled during a frame, which is not possible. Note: we allow values
    // larger than 1.0 because it may be desired for artistic reasons.
    let exposure_vector = shutter_angle * this_motion_vector;

    var accumulator: vec4<f32>;
    var weight_total = 0.0;
    let n_samples = i32(settings.samples);
    let noise = utils::interleaved_gradient_noise(vec2<f32>(frag_coords), globals.frame_count); // 0 to 1

    for (var i = -n_samples; i < n_samples; i++) {
        // The current sample step vector, from in.uv
        let step_vector = 0.5 * exposure_vector * (f32(i) + noise) / f32(n_samples);
        var sample_uv = in.uv + step_vector;
        let sample_coords = vec2<i32>(sample_uv * texture_size);

#ifdef MULTISAMPLED
        let sample_color = textureLoad(screen_texture, sample_coords, i32(sample_index));
#else
        let sample_color = textureSample(screen_texture, texture_sampler, sample_uv);
#endif
#ifdef MULTISAMPLED
        let sample_motion = textureLoad(motion_vectors, sample_coords, i32(sample_index)).rg;
#else
        let sample_motion = textureSample(motion_vectors, texture_sampler, sample_uv).rg;
#endif
#ifdef NO_DEPTH_TEXTURE_SUPPORT
        let sample_depth = 0.0;
#else
#ifdef MULTISAMPLED
        let sample_depth = textureLoad(depth, sample_coords, i32(sample_index));
#else
        let sample_depth = textureSample(depth, texture_sampler, sample_uv);
#endif
#endif

        var weight = 1.0;
        let is_sample_in_fg = !(depth_supported && sample_depth < this_depth && sample_depth > 0.0);
        if is_sample_in_fg {
            // The following weight calculation is used to eliminate ghosting artifacts that are
            // common in motion-vector-based motion blur implementations. While some resources
            // recommend using depth, I've found that sampling the velocity results in significantly
            // better results. Unlike a depth heuristic, this is not scale dependent.
            //
            // The most distracting artifacts occur when a stationary foreground object is
            // incorrectly sampled while blurring a moving background object, causing the stationary
            // object to blur when it should be sharp ("background bleeding"). This is most obvious
            // when the camera is tracking a fast moving object. The tracked object should be sharp,
            // and should not bleed into the motion blurred background.
            //
            // To attenuate these incorrect samples, we compare the motion of the fragment being
            // blurred to the UV being sampled, to answer the question "is it possible that this
            // sample was occluding the fragment?"
            //
            // Note to future maintainers: proceed with caution when making any changes here, and
            // ensure you check all occlusion/disocclusion scenarios and fullscreen camera rotation
            // blur for regressions.
            let frag_speed = length(step_vector);
            let sample_speed = length(sample_motion) / 2.0; // Halved because the sample is centered
            let cos_angle = dot(step_vector, sample_motion) / (frag_speed * sample_speed * 2.0);
            let motion_similarity = clamp(abs(cos_angle), 0.0, 1.0);
            if sample_speed * motion_similarity < frag_speed {
                // Project the sample's motion onto the frag's motion vector. If the sample did not
                // cover enough distance to reach the original frag, there is no way it could have
                // influenced this frag at all, and should be discarded.
                weight = 0.0;
            }
        }
        weight_total += weight;
        accumulator += weight * sample_color;
    }

    let has_moved_less_than_a_pixel =
        dot(this_motion_vector * texture_size, this_motion_vector * texture_size) < 1.0;
    // In case no samples were accepted, fall back to base color.
    // We also fall back if motion is small, to not break antialiasing.
    if weight_total <= 0.0 || has_moved_less_than_a_pixel {
        accumulator = base_color;
        weight_total = 1.0;
    }
    return accumulator / weight_total;
}
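For readers skimming the weighting heuristic above, here is a minimal Rust sketch (not part of this PR) of the same math using `bevy_math::Vec2`. The function name `sample_weight` is illustrative; it mirrors the WGSL, including the assumption that both vectors are non-zero.

```rust
use bevy_math::Vec2;

// A sample is rejected (weight 0.0) when its own motion, projected onto the
// fragment's step vector, is too short to have reached the fragment being blurred.
fn sample_weight(step_vector: Vec2, sample_motion: Vec2) -> f32 {
    let frag_speed = step_vector.length();
    // Halved because the sampled motion vector is centered on the sample.
    let sample_speed = sample_motion.length() / 2.0;
    let cos_angle = step_vector.dot(sample_motion) / (frag_speed * sample_speed * 2.0);
    let motion_similarity = cos_angle.abs().clamp(0.0, 1.0);
    if sample_speed * motion_similarity < frag_speed {
        0.0 // The sample could not have reached this fragment; discard it.
    } else {
        1.0
    }
}
```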