An OPENRNDR template named "Flow" to jump start a music visualization with the most useful tools prepared.
Run a template demo.
Adapt it to your liking or start from the empty template.
Discover useful APIs for working with music, visual content, visual effects and much more.
The result should be stunning visuals - audio reactive and in sync.
I am Lukas Henke, a VJ and creative coder from Germany.
This template is my "expansion" of OPENRNDR, suited to the needs of live music visualization. I also wanted to share it, providing the tools and a runnable demo for everyone.
Feel free to clone and experiment with it, or fork it and make your own version.
- IntelliJ IDEA Community or Professional installed. (Other IDEs are discouraged.)
- Git configured in IntelliJ IDEA.
- A physical or virtual microphone is configured as your system's default microphone. See Audio Routing.
Open IntelliJ IDEA. Navigate to File → New → "Project from Version Control...".
Enter this repository's URL and clone it.
Navigate to the demo at src/main/kotlin/FlowTemplateDemo.kt and run "Current File".
(Similar to OPENRNDR Template)
Press `f1` to toggle the display of available commands. The other keys are used for the demo itself.
👉 Watch the Full Version on YouTube
You might notice that this is a GitHub template, not a library. This way, you can change everything as you please, even internal workings.
It is recommended to start with the FlowTemplate_Demo1.kt.
To experiment with the demo, some starting points might be:
- Setting the bpm to your favorite song's bpm (See: songbpm.com)
- Changing the color palette
- Tweaking content values
- Writing your own visual groups (See: `audioGroup`, `diamondGroup` and `circleGroup`)
- Writing your own bpm-based envelopes (See: `kick` and `flash`)
Even better, start from scratch and build something of your own. Try things out, follow the fun and enjoy!
Whether you are a computer graphics beginner or veteran,
a VJ or video creator,
a creative coder, motion designer, multimedia artist, or just interested in cool-looking visuals to enhance your music experience -
this might be for you.
This template aims to build on top of OPENRNDR to quickly build live music visuals programs, with well integrated APIs that can be used right away.
- Based on OPENRNDR
- Useful APIs for live music visualization
- Emphasis on single-file config
- Demo program to run for yourself
It is not a full-blown VJ software, but a starting point for your own project.
Start with `val beatClock = extend(BeatClock(bpm: Int))` to get a beat-tracking extension.
Use its `phase`, which counts beats (with decimals), for beat-based effects.
Bind envelopes via `val cubicInOut by bpm.bindEnvelope { … }` to get a cyclic `Envelope`, and
use its value to animate things like size, position and color.
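The underlying idea can be sketched without OPENRNDR. The helpers below are illustrative stand-ins, not the template's actual API: a decimal beat count derived from elapsed time and bpm, and a cyclic cubic-in-out envelope sampled from that phase.

```kotlin
import kotlin.math.pow

// Decimal beat count: at 120 bpm, 1.5 s elapsed = 3.0 beats.
fun beatPhase(elapsedSeconds: Double, bpm: Int): Double =
    elapsedSeconds * bpm / 60.0

// Cyclic cubic-in-out envelope over one beat: 0.0 at the beat start,
// 1.0 at the beat end, repeating every beat.
fun cubicInOut(phase: Double): Double {
    val t = phase % 1.0
    return if (t < 0.5) 4.0 * t.pow(3) else 1.0 - (-2.0 * t + 2.0).pow(3) / 2.0
}
```

Sampling `cubicInOut(beatPhase(seconds, bpm))` each frame then yields a value that eases from 0 to 1 on every beat, ready to drive size, position or color.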
Start with `val audio = Audio()` and create audio processors like `VolumeProcessor` or `ConstantQProcessor` via its `audio.create<...>Processor` methods.
Then call `audio.start()` to run the audio analysis in a background thread.
Fetch the latest audio data from each processor.
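Conceptually, a volume processor reduces the most recent sample buffer to a single loudness value. A common choice is the root mean square (RMS); the function below is a minimal sketch of that idea, not the template's implementation.

```kotlin
import kotlin.math.sqrt

// Root mean square of a sample buffer: a simple loudness estimate
// a volume processor might report for the latest audio frame.
fun rmsVolume(buffer: FloatArray): Double {
    if (buffer.isEmpty()) return 0.0
    val sumOfSquares = buffer.sumOf { (it * it).toDouble() }
    return sqrt(sumOfSquares / buffer.size)
}
```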
Provides common values for working with audio, like the frequency range a human can hear, also called the "Acoustic Range": `LOWEST_FQ`/`HIGHEST_FQ`.
Or the typical frequency analysis ranges `BASS`, `MID` and `TREBLE`.
Start with `val colorRepo = colorRepo { … }` and set up your color repository.
Define `palette = listOf( … )` using your favorite color model,
from `ColorRGBa` to `ColorXSVa` or even `ColorLABa`.
Then use `colorRepo[colorIndex: Int]` to get a color from your palette.
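The indexed lookup can be sketched as follows. This is a hypothetical stand-in, with strings in place of OPENRNDR color types; wrapping the index means any `Int` is a valid lookup.

```kotlin
// Minimal palette repository: returns colors by index, wrapping
// around the palette size. Strings stand in for color types
// like ColorRGBa.
class PaletteRepo(private val palette: List<String>) {
    operator fun get(colorIndex: Int): String =
        palette[Math.floorMod(colorIndex, palette.size)]
}
```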
Start with `val myVisualGroup = object: VisualGroup { … }`, or write your own class inheriting from `VisualGroup`.
This is the main API for creating your own ... well, visual groups. It lets you organize your components and define an isolated draw procedure for each group.
Implement the Drawer.draw() method to draw your visuals.
Then, call myVisualGroup.draw(), as well as the other groups' draw function, inside your draw loop.
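The pattern can be sketched in plain Kotlin. `Drawer` below is a hypothetical stand-in for OPENRNDR's drawer, used only to show how each group encapsulates its own draw procedure:

```kotlin
// Stand-in for OPENRNDR's Drawer; records what was drawn.
class Drawer { val calls = mutableListOf<String>() }

// Each group implements an isolated Drawer.draw() procedure.
abstract class VisualGroup {
    abstract fun Drawer.draw()
    fun draw(drawer: Drawer) = drawer.draw()
}

val circleGroup = object : VisualGroup() {
    override fun Drawer.draw() { calls += "circles" }
}

val diamondGroup = object : VisualGroup() {
    override fun Drawer.draw() { calls += "diamonds" }
}
```

In the draw loop you would then call `circleGroup.draw(drawer)`, `diamondGroup.draw(drawer)`, and so on for each group.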
Start with val inputScheme = inputScheme(myInputDevice: KeyEvents) { … } and set up your input scheme.
The template uses the default keyboard input device, but you can provide your own device interface.
For the keyboard, keys can always be used by their name, like 'escape' or 'k',
or by their layout-independent key code, if they have one, like KEY_ESCAPE ('k' doesn't have one).
Use inputScheme.keyDown { … } and specify String/Int.bind(myDescription) { myAction() }
to bind keys to actions.
You can also track keys by different tracking styles: PIANO and TOGGLE.
Piano tracking treats a key as active as long as it is pressed.
Toggle tracking treats a key as inactive at first, but toggles its active state on every press.
Use their active state anywhere with inputScheme.isKeyActive(myKey: String/Int).
In general, you can unbind/untrack keys and dynamically change key bindings during runtime.
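The two tracking styles can be sketched as follows. This is a hypothetical stand-in for the template's key tracker, showing only the PIANO/TOGGLE state logic:

```kotlin
// PIANO: active while the key is held down.
// TOGGLE: flips its active state on every press.
enum class TrackingStyle { PIANO, TOGGLE }

class KeyTracker {
    private val styles = mutableMapOf<String, TrackingStyle>()
    private val active = mutableMapOf<String, Boolean>()

    fun track(key: String, style: TrackingStyle) {
        styles[key] = style
        active[key] = false
    }

    fun keyDown(key: String) {
        when (styles[key]) {
            TrackingStyle.PIANO -> active[key] = true
            TrackingStyle.TOGGLE -> active[key] = !(active[key] ?: false)
            null -> {} // untracked key: ignore
        }
    }

    fun keyUp(key: String) {
        if (styles[key] == TrackingStyle.PIANO) active[key] = false
    }

    fun isKeyActive(key: String): Boolean = active[key] ?: false
}
```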
-- Under Reconstruction --
I am happy with the current state of the template, but there is more to come.
Several APIs are planned as listed below. Most current APIs are subject to change, either by refactoring or by adding new features. So just expect some basic integration.
In the progress lists below, tasks marked as done are stable and used in the template. All other tasks require more work.
- FlowProgram
- Executable Template
- Demo 1
- Demo 2
- Demo 3
- BeatClock
- Audio
- ColorRepo
- VisualGroup
- InputScheme
- FxRepo (Under Reconstruction)
- UiDisplay
- Realtime filters, currently just OneEuroFilter
- BeatClock
- Beat tracking
- Reintegration: Use orx-delegate-magic to simplify `Envelope`
- Audio API
- Audio dispatch logic
- Setup based on system settings / used processors
- Audio device selection
- General Volume Processor
- Range-Specific Volume processor (ConstantQ)
- Source → Cache/Filter middleware → Value provider (Refactoring)
- Color API
- Color Repo
- Color Palette
- Color Picker
- Sampling, like linear blend or along a color path
- Reintegration: Use orx-palette
- Content API
- Visual group for organizing content
- Inline-object property/function definition and access
- Input Scheme API
- Input Scheme definition
- Common Devices
- Keyboard, by name or key code
- Mouse (use default `program.mouse`)
- MIDI (use orx-midi + orx-osc)
- Key tracking
- PIANO style
- TOGGLE style
- track/untrack
- Key binding
- bind/unbind
- Hard-bound keys (like `f1` for hiding the UI, `escape` for exiting the app)
- Input recording
- Binary, Analog 1D, Analog 2D
- Mouse recording (with pre-calibration)
- Linear sampling, 2D sampling
- As Envelope, binding to animation
- RenderPipeline API
- Complete Reconstruction
- Scenes API
- Scenes
- Transitions
- Scene Navigator
- Key Rebinding
- UiDisplay API
- Basic UI
- Can be hidden
- Tracking values and displaying them
- Reintegration: Use standard GUI (with orx-gui/orx-panel)
- FFMPEG Video API (based on OPENRNDR `VideoPlayer`)
- VisualGroup-based Video player
- Start, Stop, Move to time, etc.
- Param Picker
- Like orx-gui, but in a different application / interface
- Live program or similar to change values on the fly (like OliveProgram, but only for hand-picked variables)
- Spotify API integration
Visit the official website and run your first program with the help of the Guide.
You can also check out their GitHub, where the Original Template also resides.
If you have questions on a topic, want to see what others have done or want to share your own creations, you can visit the Discourse forum.
You can learn Kotlin or refresh your memory in the Reference documentation. In particular, you might find working with Collections useful.
If you want to quickly test something in Kotlin, you can go to the official Kotlin Playground.
If you want to route your system's audio output to a virtual audio input device,
then you should use a program for that.
(For example, the Windows feature "Use output as input" will mess up your system.)
The easiest way is usually to ...
- Windows: Install JackAudio2.
- MacOS: Install BlackHole. The 2 channel version is sufficient.
- Linux: Install JackAudio2 or PulseAudio.
Website to search for your song's name and artist and get its bpm. I found this one to be the most reliable.
Article on acoustics and musical frequency ranges
Comprehensive audio analysis (and synthesis) library for the JVM. Provides most of the Audio API functionality.
Update filter for noisy live data. Useful for input filtering, like audio or mouse movement.
Highly recommended beginner online-book to learn about shaders, GPU-CPU relationship, pros and cons of shaders, when and how to use them and much more. Covers the basics and presents concise examples.
Archive of articles about shaders and computer graphics in general. Easy to understand, highly informative.
Forum for GLSL-based fragment shaders (and even full render passes). Treasure trove of visual content.
Music visualizer, now continued as ProjectM
- original: https://www.geisswerks.com/milkdrop/
- ProjectM: https://github.com/projectM-visualizer/projectm
Fractal images that change based on genetic algorithms. Contains a long list of "artistic math functions".
- S. Draves, E. Reckase. "The Fractal Flame Algorithm", Nov 2008. URL: https://flam3.com/flame_draves.pdf




