
Commit ade70b3

Per-Object Motion Blur (#9924)
https://github.com/bevyengine/bevy/assets/2632925/e046205e-3317-47c3-9959-fc94c529f7e0

# Objective

- Adds per-object motion blur to the core 3d pipeline. This is a common effect used in games and other simulations.
- Partially resolves #4710

## Solution

- This is a post-process effect that uses the depth and motion vector buffers to estimate per-object motion blur. The implementation combines knowledge from multiple papers and articles. The approach itself, and the shader, are quite simple. Most of the effort was in wiring up the Bevy rendering plumbing and properly specializing for HDR and MSAA.
- To work with MSAA, the MULTISAMPLED_SHADING wgpu capability is required. I've extracted this code from #9000. This is because the prepass buffers are multisampled, and require accessing with `textureLoad` as opposed to the widely compatible `textureSample`.
- Added an example to demonstrate the effect of motion blur parameters.

## Future Improvements

- While this approach does have limitations, it's one of the most commonly used, and is much better than camera motion blur, which does not consider object velocity. For example, this implementation allows a dolly to track an object, and that object will remain unblurred while the background is blurred. The biggest issue with this implementation is that blur is constrained to the boundaries of objects, which results in hard edges. There are solutions to this, either by dilating the object or the motion vector buffer, or by taking a different approach such as https://casual-effects.com/research/McGuire2012Blur/index.html
- I'm using a noise PRNG function to jitter samples. This could be replaced with a blue noise texture lookup or similar; however, after playing with the parameters, it gives quite nice results with 4 samples, and is significantly better than the artifacts generated when not jittering.

---

## Changelog

- Added: per-object motion blur. This can be enabled and configured by adding the `MotionBlurBundle` to a camera entity.

---------

Co-authored-by: Torstein Grindvik <[email protected]>
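
As a quick orientation for the changelog entry above, a minimal sketch of enabling the effect (this mirrors the doc example added in `motion_blur/mod.rs`; the `setup_camera` system name is illustrative):

```rust
use bevy_core_pipeline::{core_3d::Camera3dBundle, motion_blur::MotionBlurBundle};
use bevy_ecs::prelude::*;

// Spawn a 3D camera with default motion blur settings. `MotionBlurBundle`
// also inserts the depth and motion vector prepasses the effect requires.
fn setup_camera(mut commands: Commands) {
    commands.spawn((Camera3dBundle::default(), MotionBlurBundle::default()));
}
```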
1 parent 9592a40 commit ade70b3

File tree

11 files changed (+1046, -8 lines)


Cargo.toml

Lines changed: 11 additions & 0 deletions
@@ -766,6 +766,17 @@ description = "Loads and renders a glTF file as a scene"
 category = "3D Rendering"
 wasm = true

+[[example]]
+name = "motion_blur"
+path = "examples/3d/motion_blur.rs"
+doc-scrape-examples = true
+
+[package.metadata.example.motion_blur]
+name = "Motion Blur"
+description = "Demonstrates per-pixel motion blur"
+category = "3D Rendering"
+wasm = false
+
 [[example]]
 name = "tonemapping"
 path = "examples/3d/tonemapping.rs"

crates/bevy_core_pipeline/src/core_3d/mod.rs

Lines changed: 1 addition & 0 deletions
@@ -26,6 +26,7 @@ pub mod graph {
         MainTransparentPass,
         EndMainPass,
         Taa,
+        MotionBlur,
         Bloom,
         Tonemapping,
         Fxaa,

crates/bevy_core_pipeline/src/lib.rs

Lines changed: 3 additions & 0 deletions
@@ -15,6 +15,7 @@ pub mod core_3d;
 pub mod deferred;
 pub mod fullscreen_vertex_shader;
 pub mod fxaa;
+pub mod motion_blur;
 pub mod msaa_writeback;
 pub mod prepass;
 mod skybox;
@@ -53,6 +54,7 @@ use crate::{
     deferred::copy_lighting_id::CopyDeferredLightingIdPlugin,
     fullscreen_vertex_shader::FULLSCREEN_SHADER_HANDLE,
     fxaa::FxaaPlugin,
+    motion_blur::MotionBlurPlugin,
     msaa_writeback::MsaaWritebackPlugin,
     prepass::{DeferredPrepass, DepthPrepass, MotionVectorPrepass, NormalPrepass},
     tonemapping::TonemappingPlugin,
@@ -89,6 +91,7 @@ impl Plugin for CorePipelinePlugin {
             BloomPlugin,
             FxaaPlugin,
             CASPlugin,
+            MotionBlurPlugin,
         ));
     }
 }
crates/bevy_core_pipeline/src/motion_blur/mod.rs

Lines changed: 168 additions & 0 deletions
@@ -0,0 +1,168 @@
//! Per-object, per-pixel motion blur.
//!
//! Add the [`MotionBlurBundle`] to a camera to enable motion blur. See [`MotionBlur`] for more
//! documentation.

use crate::{
    core_3d::graph::{Core3d, Node3d},
    prepass::{DepthPrepass, MotionVectorPrepass},
};
use bevy_app::{App, Plugin};
use bevy_asset::{load_internal_asset, Handle};
use bevy_ecs::{
    bundle::Bundle, component::Component, query::With, reflect::ReflectComponent,
    schedule::IntoSystemConfigs,
};
use bevy_reflect::{std_traits::ReflectDefault, Reflect};
use bevy_render::{
    camera::Camera,
    extract_component::{ExtractComponent, ExtractComponentPlugin, UniformComponentPlugin},
    render_graph::{RenderGraphApp, ViewNodeRunner},
    render_resource::{Shader, ShaderType, SpecializedRenderPipelines},
    Render, RenderApp, RenderSet,
};

pub mod node;
pub mod pipeline;

/// Adds [`MotionBlur`] and the required depth and motion vector prepasses to a camera entity.
#[derive(Bundle, Default)]
pub struct MotionBlurBundle {
    pub motion_blur: MotionBlur,
    pub depth_prepass: DepthPrepass,
    pub motion_vector_prepass: MotionVectorPrepass,
}

/// A component that enables and configures motion blur when added to a camera.
///
/// Motion blur is an effect that simulates how moving objects blur as they change position during
/// the exposure of film, a sensor, or an eyeball.
///
/// Because rendering simulates discrete steps in time, we use per-pixel motion vectors to estimate
/// the path of objects between frames. This kind of implementation has some artifacts:
/// - Fast moving objects in front of a stationary object, or in front of empty space, will not
///   have their edges blurred.
/// - Transparent objects do not write to depth or motion vectors, so they cannot be blurred.
///
/// Other approaches, such as *A Reconstruction Filter for Plausible Motion Blur*, produce more
/// correct results, but are more expensive and complex, and have other kinds of artifacts. This
/// implementation is relatively inexpensive and effective.
///
/// # Usage
///
/// Add the [`MotionBlur`] component to a camera to enable and configure motion blur for that
/// camera. Motion blur also requires the depth and motion vector prepasses, which can be added
/// more easily to the camera with the [`MotionBlurBundle`].
///
/// ```
/// # use bevy_core_pipeline::{core_3d::Camera3dBundle, motion_blur::MotionBlurBundle};
/// # use bevy_ecs::prelude::*;
/// # fn test(mut commands: Commands) {
/// commands.spawn((
///     Camera3dBundle::default(),
///     MotionBlurBundle::default(),
/// ));
/// # }
/// ```
#[derive(Reflect, Component, Clone, ExtractComponent, ShaderType)]
#[reflect(Component, Default)]
#[extract_component_filter(With<Camera>)]
pub struct MotionBlur {
    /// The strength of motion blur from `0.0` to `1.0`.
    ///
    /// The shutter angle describes the fraction of a frame that a camera's shutter is open and
    /// exposing the film/sensor. For 24fps cinematic film, a shutter angle of 0.5 (180 degrees) is
    /// common. This means that the shutter was open for half of the frame, or 1/48th of a second.
    /// The lower the shutter angle, the less exposure time and thus less blur.
    ///
    /// A value greater than one is non-physical and results in an object's blur stretching further
    /// than it traveled in that frame. This might be a desirable effect for artistic reasons, but
    /// consider allowing users to opt out of this.
    ///
    /// This value is intentionally tied to framerate to avoid the aforementioned non-physical
    /// over-blurring. If you want to emulate a cinematic look, your options are:
    /// - Framelimit your app to 24fps, and set the shutter angle to 0.5 (180 deg). Note that
    ///   depending on artistic intent or the action of a scene, it is common to set the shutter
    ///   angle between 0.125 (45 deg) and 0.5 (180 deg). This is the most faithful way to
    ///   reproduce the look of film.
    /// - Set the shutter angle greater than one. For example, to emulate the blur strength of
    ///   film while rendering at 60fps, you would set the shutter angle to `60/24 * 0.5 = 1.25`.
    ///   Note that this will result in artifacts where the motion of objects will stretch further
    ///   than they moved between frames; users may find this distracting.
    pub shutter_angle: f32,
    /// The quality of motion blur, corresponding to the number of per-pixel samples taken in each
    /// direction during blur.
    ///
    /// Setting this to `1` results in each pixel being sampled once in the leading direction, once
    /// in the trailing direction, and once in the middle, for a total of 3 samples (`1 * 2 + 1`).
    /// Setting this to `3` will result in `3 * 2 + 1 = 7` samples. Setting this to `0` is
    /// equivalent to disabling motion blur.
    pub samples: u32,
    #[cfg(all(feature = "webgl", target_arch = "wasm32"))]
    // WebGL2 structs must be 16 byte aligned.
    pub _webgl2_padding: bevy_math::Vec3,
}

impl Default for MotionBlur {
    fn default() -> Self {
        Self {
            shutter_angle: 0.5,
            samples: 1,
            #[cfg(all(feature = "webgl", target_arch = "wasm32"))]
            _webgl2_padding: bevy_math::Vec3::default(),
        }
    }
}

pub const MOTION_BLUR_SHADER_HANDLE: Handle<Shader> =
    Handle::weak_from_u128(987457899187986082347921);

/// Adds support for per-object motion blur to the app. See [`MotionBlur`] for details.
pub struct MotionBlurPlugin;
impl Plugin for MotionBlurPlugin {
    fn build(&self, app: &mut App) {
        load_internal_asset!(
            app,
            MOTION_BLUR_SHADER_HANDLE,
            "motion_blur.wgsl",
            Shader::from_wgsl
        );
        app.add_plugins((
            ExtractComponentPlugin::<MotionBlur>::default(),
            UniformComponentPlugin::<MotionBlur>::default(),
        ));

        let Some(render_app) = app.get_sub_app_mut(RenderApp) else {
            return;
        };

        render_app
            .init_resource::<SpecializedRenderPipelines<pipeline::MotionBlurPipeline>>()
            .add_systems(
                Render,
                pipeline::prepare_motion_blur_pipelines.in_set(RenderSet::Prepare),
            );

        render_app
            .add_render_graph_node::<ViewNodeRunner<node::MotionBlurNode>>(
                Core3d,
                Node3d::MotionBlur,
            )
            .add_render_graph_edges(
                Core3d,
                (
                    Node3d::EndMainPass,
                    Node3d::MotionBlur,
                    Node3d::Bloom, // we want blurred areas to bloom and tonemap properly.
                ),
            );
    }

    fn finish(&self, app: &mut App) {
        let Some(render_app) = app.get_sub_app_mut(RenderApp) else {
            return;
        };

        render_app.init_resource::<pipeline::MotionBlurPipeline>();
    }
}
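
To make the shutter-angle guidance above concrete, here is a hedged sketch of overriding the defaults, using the doc comment's own arithmetic for a 180° shutter at 60 fps (`60/24 * 0.5 = 1.25`); the system name and the chosen `samples` value are illustrative, not defaults:

```rust
use bevy_core_pipeline::{
    core_3d::Camera3dBundle,
    motion_blur::{MotionBlur, MotionBlurBundle},
};
use bevy_ecs::prelude::*;

// Emulate the blur of 24fps film with a 0.5 (180 degree) shutter while
// rendering at 60fps: 60 / 24 * 0.5 = 1.25. `samples: 2` takes
// 2 * 2 + 1 = 5 samples per pixel.
fn setup_cinematic_camera(mut commands: Commands) {
    commands.spawn((
        Camera3dBundle::default(),
        MotionBlurBundle {
            motion_blur: MotionBlur {
                shutter_angle: 1.25,
                samples: 2,
                ..Default::default()
            },
            ..Default::default()
        },
    ));
}
```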
crates/bevy_core_pipeline/src/motion_blur/motion_blur.wgsl

Lines changed: 149 additions & 0 deletions
@@ -0,0 +1,149 @@
#import bevy_pbr::prepass_utils
#import bevy_pbr::utils
#import bevy_core_pipeline::fullscreen_vertex_shader::FullscreenVertexOutput
#import bevy_render::globals::Globals

#ifdef MULTISAMPLED
@group(0) @binding(0) var screen_texture: texture_2d<f32>;
@group(0) @binding(1) var motion_vectors: texture_multisampled_2d<f32>;
@group(0) @binding(2) var depth: texture_depth_multisampled_2d;
#else
@group(0) @binding(0) var screen_texture: texture_2d<f32>;
@group(0) @binding(1) var motion_vectors: texture_2d<f32>;
@group(0) @binding(2) var depth: texture_depth_2d;
#endif
@group(0) @binding(3) var texture_sampler: sampler;
struct MotionBlur {
    shutter_angle: f32,
    samples: u32,
#ifdef SIXTEEN_BYTE_ALIGNMENT
    // WebGL2 structs must be 16 byte aligned.
    _webgl2_padding: vec3<f32>
#endif
}
@group(0) @binding(4) var<uniform> settings: MotionBlur;
@group(0) @binding(5) var<uniform> globals: Globals;

@fragment
fn fragment(
#ifdef MULTISAMPLED
    @builtin(sample_index) sample_index: u32,
#endif
    in: FullscreenVertexOutput
) -> @location(0) vec4<f32> {
    let texture_size = vec2<f32>(textureDimensions(screen_texture));
    let frag_coords = vec2<i32>(in.uv * texture_size);

#ifdef MULTISAMPLED
    let base_color = textureLoad(screen_texture, frag_coords, i32(sample_index));
#else
    let base_color = textureSample(screen_texture, texture_sampler, in.uv);
#endif

    let shutter_angle = settings.shutter_angle;

#ifdef MULTISAMPLED
    let this_motion_vector = textureLoad(motion_vectors, frag_coords, i32(sample_index)).rg;
#else
    let this_motion_vector = textureSample(motion_vectors, texture_sampler, in.uv).rg;
#endif

#ifdef NO_DEPTH_TEXTURE_SUPPORT
    let this_depth = 0.0;
    let depth_supported = false;
#else
    let depth_supported = true;
#ifdef MULTISAMPLED
    let this_depth = textureLoad(depth, frag_coords, i32(sample_index));
#else
    let this_depth = textureSample(depth, texture_sampler, in.uv);
#endif
#endif

    // The exposure vector is the distance that this fragment moved while the camera shutter was
    // open. This is the motion vector (total distance traveled) multiplied by the shutter angle (a
    // fraction). In film, the shutter angle is commonly 0.5 or "180 degrees" (out of 360 total).
    // This means that for a frame time of 20ms, the shutter is only open for 10ms.
    //
    // Using a shutter angle larger than 1.0 is non-physical: objects would need to move further
    // than they physically travelled during a frame, which is not possible. Note: we allow values
    // larger than 1.0 because it may be desired for artistic reasons.
    let exposure_vector = shutter_angle * this_motion_vector;

    var accumulator: vec4<f32>;
    var weight_total = 0.0;
    let n_samples = i32(settings.samples);
    let noise = utils::interleaved_gradient_noise(vec2<f32>(frag_coords), globals.frame_count); // 0 to 1

    for (var i = -n_samples; i < n_samples; i++) {
        // The current sample step vector, from in.uv
        let step_vector = 0.5 * exposure_vector * (f32(i) + noise) / f32(n_samples);
        var sample_uv = in.uv + step_vector;
        let sample_coords = vec2<i32>(sample_uv * texture_size);

#ifdef MULTISAMPLED
        let sample_color = textureLoad(screen_texture, sample_coords, i32(sample_index));
#else
        let sample_color = textureSample(screen_texture, texture_sampler, sample_uv);
#endif
#ifdef MULTISAMPLED
        let sample_motion = textureLoad(motion_vectors, sample_coords, i32(sample_index)).rg;
#else
        let sample_motion = textureSample(motion_vectors, texture_sampler, sample_uv).rg;
#endif
#ifdef NO_DEPTH_TEXTURE_SUPPORT
        let sample_depth = 0.0;
#else
#ifdef MULTISAMPLED
        let sample_depth = textureLoad(depth, sample_coords, i32(sample_index));
#else
        let sample_depth = textureSample(depth, texture_sampler, sample_uv);
#endif
#endif

        var weight = 1.0;
        let is_sample_in_fg = !(depth_supported && sample_depth < this_depth && sample_depth > 0.0);
        if is_sample_in_fg {
            // The following weight calculation is used to eliminate ghosting artifacts that are
            // common in motion-vector-based motion blur implementations. While some resources
            // recommend using depth, I've found that sampling the velocity results in significantly
            // better results. Unlike a depth heuristic, this is not scale dependent.
            //
            // The most distracting artifacts occur when a stationary foreground object is
            // incorrectly sampled while blurring a moving background object, causing the stationary
            // object to blur when it should be sharp ("background bleeding"). This is most obvious
            // when the camera is tracking a fast moving object. The tracked object should be sharp,
            // and should not bleed into the motion blurred background.
            //
            // To attenuate these incorrect samples, we compare the motion of the fragment being
            // blurred to the UV being sampled, to answer the question "is it possible that this
            // sample was occluding the fragment?"
            //
            // Note to future maintainers: proceed with caution when making any changes here, and
            // ensure you check all occlusion/disocclusion scenarios and fullscreen camera rotation
            // blur for regressions.
            let frag_speed = length(step_vector);
            let sample_speed = length(sample_motion) / 2.0; // Halved because the sample is centered
            let cos_angle = dot(step_vector, sample_motion) / (frag_speed * sample_speed * 2.0);
            let motion_similarity = clamp(abs(cos_angle), 0.0, 1.0);
            if sample_speed * motion_similarity < frag_speed {
                // Project the sample's motion onto the frag's motion vector. If the sample did not
                // cover enough distance to reach the original frag, there is no way it could have
                // influenced this frag at all, and should be discarded.
                weight = 0.0;
            }
        }
        weight_total += weight;
        accumulator += weight * sample_color;
    }

    let has_moved_less_than_a_pixel =
        dot(this_motion_vector * texture_size, this_motion_vector * texture_size) < 1.0;
    // In case no samples were accepted, fall back to base color.
    // We also fall back if motion is small, to not break antialiasing.
    if weight_total <= 0.0 || has_moved_less_than_a_pixel {
        accumulator = base_color;
        weight_total = 1.0;
    }
    return accumulator / weight_total;
}
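
Restating the rejection heuristic in the loop above: with $d$ the `step_vector` from the fragment to the sample, and $v_s$ the sample's `sample_motion`, a sample that is not behind the fragment is kept only if half of its motion, projected onto the direction of $d$, reaches at least as far as the fragment:

$$
\frac{\lVert v_s \rVert}{2} \, \lvert \cos\theta \rvert \;\ge\; \lVert d \rVert,
\qquad
\cos\theta = \frac{d \cdot v_s}{\lVert d \rVert \, \lVert v_s \rVert},
$$

which matches the shader, since `cos_angle = dot(step_vector, sample_motion) / (frag_speed * sample_speed * 2.0)` simplifies to $\cos\theta$ once `sample_speed * 2.0` is recognized as $\lVert v_s \rVert$.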
