Just some Processing sketches. Source code for visuals we use at Soul Ex Machina.
The project is divided into multiple modules.
The `:core` module contains the core functionality: audio processing, tools, remote control handlers, extensions, etc.
The `:playground` module serves as, well... a playground. It's used to quickly create a new sketch and play around. I'm using the Koin DI framework, so you can inject here whatever is defined in the `CoreModule`. Have a look around.
The `:visuals` module is meant to be used in a live environment at the parties. There is an abstraction layer in the form of a `Mixer` and `Layer`s, which allows me to blend multiple scenes together. Also have a look around, but proceed at your own risk and ignore the `legacy` package 😅 (I like to change things; the API is generally unstable).
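To make the blending idea concrete, here is a hypothetical sketch of what a `Mixer`/`Layer` pair could look like. Only the `Mixer` and `Layer` names come from the module; the method names and the string-based "rendering" below are illustrative stand-ins (so the example runs without Processing), not the project's actual API:

```kotlin
// Hypothetical sketch of the Mixer/Layer idea: each Layer renders a scene,
// and the Mixer composes layers weighted by a blend amount.

// A Layer produces one frame of visuals; reduced to a string here
// so the example is self-contained.
interface Layer {
    fun render(frame: Int): String
}

class SolidLayer(private val name: String) : Layer {
    override fun render(frame: Int) = "$name@$frame"
}

// The Mixer keeps an ordered list of layers with blend amounts (0..1)
// and composes them into one output, analogous to compositing each
// layer's graphics buffer with an alpha in Processing.
class Mixer {
    private val slots = mutableListOf<Pair<Layer, Float>>()

    fun addLayer(layer: Layer, amount: Float) {
        slots += layer to amount.coerceIn(0f, 1f)
    }

    fun mix(frame: Int): String =
        slots.filter { it.second > 0f }
            .joinToString(" + ") { (layer, amount) ->
                "${(amount * 100).toInt()}% ${layer.render(frame)}"
            }
}

fun main() {
    val mixer = Mixer()
    mixer.addLayer(SolidLayer("stars"), 1.0f)
    mixer.addLayer(SolidLayer("tunnel"), 0.25f)
    println(mixer.mix(frame = 1)) // 100% stars@1 + 25% tunnel@1
}
```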
The `:raspberrypi` module contains a standalone RPi application that can be distributed using the Application Gradle plugin.
This project depends on a local Processing 4 installation, so go ahead and install it if you haven't already. Then create a `local.properties` file in the project's root directory and configure the core library and contributed libraries' paths:
```properties
processing.core.jars=/path/to/core/processing/libraries
processing.core.natives=/path/to/core/processing/libraries/<os-architecture>
processing.core.natives.rpi=/path/to/core/processing/libraries/<rpi-os-architecture>
processing.libs.jars=/path/to/contributed/processing/libraries
```
On macOS it might look like this:
```properties
processing.core.jars=/Applications/Processing.app/Contents/Java/core/library
processing.core.natives=/Applications/Processing.app/Contents/Java/core/library/macos-x86_64
processing.core.natives.rpi=/Applications/Processing.app/Contents/Java/core/library/linux-aarch64
processing.libs.jars=/Users/matsem/Documents/Processing/libraries
```
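For illustration, a buildscript would typically read these values with `java.util.Properties`. The snippet below is a self-contained sketch of that idea (parsing an in-memory string rather than the actual file), not the project's real Gradle code:

```kotlin
import java.io.StringReader
import java.util.Properties

// Illustrative only: load Processing paths the way a buildscript might.
// Parses from a string so the example runs without a local.properties file.
fun loadProcessingPaths(text: String): Map<String, String> {
    val props = Properties().apply { load(StringReader(text)) }
    return props.stringPropertyNames().associateWith { props.getProperty(it) }
}

fun main() {
    val example = """
        processing.core.jars=/Applications/Processing.app/Contents/Java/core/library
        processing.core.natives=/Applications/Processing.app/Contents/Java/core/library/macos-x86_64
    """.trimIndent()

    val paths = loadProcessingPaths(example)
    // A buildscript would fail fast if a required key is missing.
    require("processing.core.jars" in paths) { "processing.core.jars must be set in local.properties" }
    println(paths["processing.core.natives"])
}
```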
Note the difference between `processing.core.natives` and `processing.core.natives.rpi`. The Raspberry Pi libs have to be configured if you wish to use the `:raspberrypi` module.
The Gradle buildscript will look for Processing dependencies at these two paths. Dependencies are defined in the `CommonDependencies` Gradle plugin. Open it up and you'll notice that this project depends on some third-party libraries, which need to be installed at the `processing.libs.jars` path. Open the Processing library manager (Sketch > Import Library > Add library) and install whatever libraries are specified in the `build.gradle` file.
The current list of library dependencies is:
```kotlin
val processingLibs = listOf(
    "minim",            // audio input everything (input source, fft analysis, etc.)
    "themidibus",       // MIDI control protocol implementation
    "VideoExport",      // I use this to export video teasers synced with external audio file
    "box2d_processing", // for physics (look for Gravity sketch in playground module)
    "video",            // video playback
    "extruder",         // 2d shape -> 3d shape extrusion
    "geomerative",      // text -> shape, svg -> shape conversion
    "peasycam",         // adds camera handling to the sketches, nice to have when prototyping
    "PostFX",           // can apply post-processing shaders to video output
    "oscP5",            // OSC control protocol implementation
    "blobDetection"     // library to find "blobs" on image
)
```
The Raspberry Pi app can be installed using the `./gradlew raspberrypi:installDist` task and zipped using the `./gradlew raspberrypi:distZip` task. See the Application Plugin docs for more info.
The project must be built using JDK 17. You can use SDKMAN! with the provided `.sdkmanrc` file to select the correct JDK version.
You can run the project with the Gradle `run` task. Be sure to include the `--sketch-path` argument so sketches can properly resolve the data folder containing resources needed by some sketches.
```shell
./gradlew playground:run --args='--sketch-path=/path/to/project/'
./gradlew visuals:run --args='--sketch-path=/path/to/project/'
```
There are also IntelliJ run configurations in the `.run` folder which you can use to run the app from the IDE. Just be sure to edit them to match your setup.
Currently, the project supports 3 remote control options:

- If you own a Traktor Kontrol F1, the `KontrolF1` class is for you; I use it for quick prototyping. It handles most of the Kontrol F1's hardware features, like pad buttons (with color support), the encoder, knobs, and faders.
- If you'd like to try the `:visuals` module, go ahead and get yourself the TouchOSC app and load it with the `Astral.touchosc` layout that can be found in the `touchosc` folder. This layout uses the MIDI and OSC protocols, and there is a `Galaxy` class that handles most of the TouchOSC MIDI controls. In the future, I plan to get rid of `Galaxy` and migrate everything to the OSC protocol, which leads us to the last option.
- The most convenient way, though, is to use OSC (Open Sound Control) with delegated properties.
First, make your sketch/class implement the `OscHandler` interface, which requires you to provide an `OscManager` instance.
```kotlin
class OscHandlerExample : PApplet(), OscHandler {
    override val oscManager: OscManager by lazy {
        OscManager(
            sketch = this,
            inputPort = 7001,          // port this computer is listening on
            outputIp = "192.168.1.11", // IP of the phone running TouchOSC
            outputPort = 7001          // port the TouchOSC app is listening on
        )
    }
}
```
Then you can create all sorts of properties tied to various OSC controls, like buttons, faders, labels, LED indicators, etc. Check out the `dev.matsem.astral.core.tools.osc.delegates` package for the full list. Example:

```kotlin
private var fader1: Float by oscFaderDelegate("/1/fader1", defaultValue = 0.5f)
```
Most of the delegated properties support value assignment, so if, for example, you create a fader variable and at some point assign a value to it, the corresponding control in the TouchOSC app will reflect that change.
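To make the delegate mechanics concrete, here is a simplified, hypothetical version of how such a fader delegate could work. The real `oscFaderDelegate` lives in the core module and talks to an actual `OscManager`; here the manager is a fake stub so the example runs standalone:

```kotlin
import kotlin.properties.ReadWriteProperty
import kotlin.reflect.KProperty

// Stub standing in for the project's OscManager: stores the last value
// received per OSC address and pretends to send updates back to TouchOSC.
class FakeOscManager {
    private val values = mutableMapOf<String, Float>()
    fun lastValue(address: String): Float? = values[address]
    fun send(address: String, value: Float) {
        values[address] = value // a real manager would also emit an OSC message
    }
}

// Simplified delegate: reading returns the latest value received on the
// address (or a default); assigning pushes the value back out so the
// TouchOSC control moves to reflect it.
class OscFader(
    private val manager: FakeOscManager,
    private val address: String,
    private val defaultValue: Float,
) : ReadWriteProperty<Any?, Float> {
    override fun getValue(thisRef: Any?, property: KProperty<*>): Float =
        manager.lastValue(address) ?: defaultValue

    override fun setValue(thisRef: Any?, property: KProperty<*>, value: Float) =
        manager.send(address, value)
}

class SketchExample {
    val manager = FakeOscManager()
    var fader1: Float by OscFader(manager, "/1/fader1", defaultValue = 0.5f)
}

fun main() {
    val sketch = SketchExample()
    println(sketch.fader1) // no message received yet -> default 0.5
    sketch.fader1 = 0.9f   // assignment would update the TouchOSC control
    println(sketch.fader1) // 0.9
}
```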
The Movie library stopped working after the migration to Processing 4. Related issue: #56
The app crashes with `Instance creation error : could not create instance for [Single:'dev.matsem.astral.core.tools.audio.AudioProcessor']`
It's probably one of these things:
- Your system has no audio input devices. Make sure you have at least one audio input device, as it's required by the audio processing toolchain.
- Your system has no audio input devices compatible with the `minim` library (this might happen on newer macOS versions; it happens on my M2 Pro Mac). You can use a virtual audio input device to mitigate this.
- If you're on macOS, make sure to grant microphone permissions to your Processing installation. The simplest way to do this is to run some Processing sample from the `sound` library; the system will then ask you to grant microphone permission.