Fragment Synthesizer desktop capture, processing.js and Google Images

Fragment is an online collaborative additive / spectral synthesizer. Visuals are created by a live GPU script which accepts many sorts of input (webcam, images, etc.), and the visuals are then fed into an additive synthesis engine in real-time.

This video shows what can be done with the camera input. ffmpeg is used to record part of my desktop (a second screen); the desktop capture is then fed to a virtual video device (v4l2loopback), which can be used like a webcam.
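A minimal sketch of that capture pipeline, assuming the second screen is a 1920x1080 region starting at X offset 1920 on display :0 and that /dev/video10 is free (the device number and geometry are illustrative, not the exact settings used in the video):

    # create a virtual video device at /dev/video10 (requires the v4l2loopback module)
    sudo modprobe v4l2loopback video_nr=10 card_label="DesktopCapture" exclusive_caps=1

    # grab the second screen and stream it to the loopback device
    # in a webcam-friendly pixel format
    ffmpeg -f x11grab -framerate 30 -video_size 1920x1080 -i :0.0+1920,0 \
           -f v4l2 -pix_fmt yuv420p /dev/video10

A browser pointed at Fragment can then select "DesktopCapture" as its webcam.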

This method allows ANY program to feed the synthesizer in real-time.
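To check that the loopback device is actually receiving frames before pointing Fragment at it, preview it like any other webcam (same assumed device path as above):

    # preview the virtual webcam stream
    ffplay -f v4l2 /dev/video10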

The first part of the video shows processing.js sketches from https://www.openprocessing.org

The second part shows images from the Google Images website; this part has capture problems due to wrong capture settings.

Fragment Audio Server is enabled and the audio output is fed into a DAW (Renoise) running in the background, which applies some effects (mostly reverb and delay).
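If both Fragment Audio Server and Renoise run as JACK clients, that routing can be set up from the command line; the port names below are hypothetical, use jack_lsp to find the real ones:

    # list the actual JACK port names exposed by both applications
    jack_lsp

    # connect the synth output to the DAW input (port names are hypothetical)
    jack_connect fas:output_1 renoise:input_1
    jack_connect fas:output_2 renoise:input_2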

You can try it now at: https://www.fsynth.com
Documentation: https://www.fsynth.com/documentation.html
Forum: https://quiet.fsynth.com

Renoise: https://www.renoise.com/

v4l2loopback (Linux): https://github.com/umlaeute/v4l2loopback
Open Processing : https://www.openprocessing.org/