Beam for Live features

Hiya, I’m just digging into using Beam. I want to use it to create an audio-reactive light show fed by my Ableton audio tracks.

I’m now wondering if it’s possible to use video files as a source for lights controlled by Beam? Kind of like I’ve seen people use MadMapper, where each light behaves as a pixel?

Also, for people using Beam in a live context, do you have any real-time control over the show or is it all preset, perhaps using different Ableton scenes to launch different controller/plugin settings?

For example, I would like the lighting operator of my show to be able to control the intensity of the audio-reactive mappings using MIDI controllers. Each show venue will vary slightly and fixtures may vary, so I feel it is important that the lighting operator has hands-on adjustment of things like beam width, shutter speed and brightness mid-show.

I’m also wondering if anyone is using M4L plugins to feed parameters to Beam. For example, I would like to use harmonic data to affect things like colour and movement. I would also like to map, say, a 3-colour gradient (derived from an audio analyser readout, e.g. a spectrogram) spread across 20 or so lights.
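
To make that concrete, here’s a rough sketch of the kind of mapping I’m imagining (hypothetical Python, nothing Beam-specific; the bands, colours and fixture count are made up):

```python
# Hypothetical sketch of the mapping described above (not Beam code):
# take the level of three frequency bands and blend a 3-colour gradient
# across 20 fixtures, with each colour stop dimmed by its band's energy.
import numpy as np

N_LIGHTS = 20
COLOURS = np.array([[255, 0, 64],    # low band  -> magenta-ish
                    [0, 255, 128],   # mid band  -> green-ish
                    [32, 64, 255]])  # high band -> blue-ish

def gradient_colours(band_levels):
    """band_levels: low/mid/high levels in 0..1 -> one RGB row per light."""
    levels = np.clip(np.asarray(band_levels, dtype=float), 0.0, 1.0)
    stops = COLOURS * levels[:, None]             # dim each stop by its band
    positions = np.linspace(0.0, 1.0, N_LIGHTS)   # fixture position across the rig
    stop_pos = np.array([0.0, 0.5, 1.0])
    rgb = np.stack([np.interp(positions, stop_pos, stops[:, c]) for c in range(3)],
                   axis=1)
    return rgb.astype(int)

print(gradient_colours([0.9, 0.3, 0.6])[:3])  # colours for the first 3 lights
```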

In other words…Please come at me with your crazy ways to control lights using Beam reacting to audio in real time!!

Thanks!

Hello, I am currently working on using Beam for Live to do the same thing: having reactive lights based on live audio. Have you figured out any tricks/tools to help with this, and if so would you mind sharing? Thank you!

Hiya, unfortunately I’ve not got very far using Beam, although I’ve put quite a lot of time into it, partly because my system (Mac Studio) started crashing with Ableton, Beam and Capture all running at the same time, whenever I tried to rotate the camera in Capture.

Then Beam updated, and the change in the way it outputs meant half the lights in my visualisation stopped working. With help from the forum this got solved.

I was able to get the lights to react to the audio, but it was not in a way that I found attractive or anything close to what I’m seeing in my head which is quite intricate, generative and crystalline.

The main challenge for me is that I want to control 20-30 moving-head lights, each of which has 24-30 DMX channels, so each channel would be a controller line in Ableton, and I want the parameters changing uniquely and in waves. If I use lighting software, the software knows what each DMX channel does, so the output I am programming is correctly labelled, as opposed to drawing controller lines in Ableton where all you see is a number from 0 to 127.

In addition to that, I’ll probably have 60 or so LED tubes, Pyro effects, and then whatever is in the flown rig in the venue that I’m performing in.

So for me to program one chase with each moving head behaving uniquely will probably take about an hour.

I also did a bit of research and found that most fully featured lighting software packages come with effects engines that do all this stuff for you. I don’t want to spend my life drawing controller lines on automation tracks in Ableton. I’m also really keen to have a team working on this with me so it’s not just me doing the programming, and I think if I stay within Beam I will end up being the key person after all.

Beam say that you can unpack their apps, which are built in Max for Live, and reprogram them, but I’m not a programmer and I’m not at the stage in my life where I want to develop that expertise.

What kind of show do you have in mind?

The show I had in mind is a LOT simpler. This is for my senior project: I wanted to use Beam and Live together to create/program a way for lighting effects to be produced on the spot with whatever audio is coming through Ableton. However, in our rehearsal room we only have 5 Chauvet 4Bars and a couple of slim pars, about as simple as simple gets.

Hi @Petrichor and @Hungry_Man,

You can use Live’s Envelope Follower to map amplitude values from an audio signal to any lighting parameter.

Beam’s Generic Instrument allows you to control groups of fixtures within a tag by assigning the same modulation parameter to multiple Param dials. You can use this feature to map different frequency ranges to different subsets of lights, by sending your audio signal to multiple Envelope Followers, with each one preceded by a band-pass EQ/filter for the specific frequency range you want to target. Then, map the values from each Envelope Follower to one of the Generic parameters (making sure to select the same modulation parameter for each parameter slot).
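
If it helps to picture that signal flow outside of Live, here’s a rough Python sketch of the same idea (purely illustrative and nothing to do with Beam’s internals; the band edges, fixture count and smoothing time are arbitrary):

```python
# Conceptual sketch (not Beam's API): split an audio buffer into frequency
# bands, follow each band's envelope, and map each envelope to one subgroup
# of fixtures -- the same flow as band-pass filter -> Envelope Follower ->
# Generic param described above. Assumes numpy and scipy are available.
import numpy as np
from scipy.signal import butter, sosfilt

SR = 44_100
BANDS = [(20, 120), (120, 500), (500, 2_000), (2_000, 8_000)]  # Hz, arbitrary
N_FIXTURES = 20  # e.g. 20 moving heads split into 4 subgroups of 5

def band_envelope(audio, lo, hi, sr=SR, smooth_ms=50.0):
    """Band-pass the signal, then smooth its absolute value (a crude
    envelope follower with equal rise and fall)."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=sr, output="sos")
    band = sosfilt(sos, audio)
    alpha = 1.0 - np.exp(-1.0 / (smooth_ms / 1000.0 * sr))
    env, out = 0.0, np.empty_like(band)
    for i, x in enumerate(np.abs(band)):
        env += alpha * (x - env)
        out[i] = env
    return out

def fixture_intensities(audio):
    """Return one 0..1 intensity curve per fixture, grouped by band."""
    per_group = N_FIXTURES // len(BANDS)
    curves = []
    for lo, hi in BANDS:
        env = band_envelope(audio, lo, hi)
        env = env / (env.max() + 1e-9)     # normalise to 0..1
        curves.extend([env] * per_group)   # same curve for the whole subgroup
    return np.stack(curves)

# Example: one second of gated noise bursts just to exercise the chain.
t = np.linspace(0, 1, SR, endpoint=False)
test_audio = np.random.randn(SR) * (np.sin(2 * np.pi * 2 * t) > 0)
print(fixture_intensities(test_audio).shape)  # (20, 44100)
```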

Directly visualizing amplitude values of the entire audio mix can sometimes appear chaotic due to the complexity of the audio signal. Often it is more effective to only visualize a range of frequencies, or even only certain tracks/layers. Smoothing (Envelope Follower’s Rise and Fall parameters) can also help.
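
To give a feel for what Rise and Fall do, here’s a tiny, purely illustrative sketch of a follower with separate rise and fall time constants (not how the device is actually implemented):

```python
# Illustrative only: a one-pole smoother with separate attack (rise) and
# release (fall) time constants, so the envelope can jump quickly on
# transients but fade out slowly, or vice versa.
import numpy as np

def follow(signal, sr=44_100, rise_ms=5.0, fall_ms=300.0):
    a_rise = 1.0 - np.exp(-1.0 / (rise_ms / 1000.0 * sr))
    a_fall = 1.0 - np.exp(-1.0 / (fall_ms / 1000.0 * sr))
    env, out = 0.0, np.empty(len(signal))
    for i, x in enumerate(np.abs(signal)):
        a = a_rise if x > env else a_fall  # faster when rising, slower when falling
        env += a * (x - env)
        out[i] = env
    return out
```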

Here is a simple example with 128 moving heads, where:

  • amplitude values of a synth’s audio signal are mapped to the Depth parameter of 3 Beam LFO devices modulating the intensity, pan and tilt of the moving heads
  • amplitude values of 4 frequency ranges of the drums’ audio signal, filtered using Live’s Auto Filter devices, are mapped to the intensity of 4 sub-groups of moving heads

Here is another clip where @TarikBarri demonstrates the use of multiple Envelope Followers & different EQ shapes in the context of Videosync.

Beam lets you merge any number of lighting signals as if you are mixing layers of audio, so you can combine multiple control sources for a single fixture or group of fixtures, as shown here.
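
Purely to illustrate the “mixing layers” idea, here’s what merging two intensity streams for the same fixtures could look like if you did it by hand; a simple highest-takes-precedence rule is assumed here, which is just one possible merge behaviour:

```python
# Conceptual illustration of merging two lighting control signals for the
# same fixtures. A highest-takes-precedence (HTP) merge is assumed purely
# for illustration; the actual merge behaviour is configured in Beam/Live.
import numpy as np

def htp_merge(*layers):
    """Merge any number of 0..1 intensity layers: the brightest source wins."""
    return np.maximum.reduce([np.asarray(l, dtype=float) for l in layers])

bass_chase = np.array([0.0, 0.8, 0.1, 0.9])  # e.g. from an envelope follower
slow_lfo   = np.array([0.3, 0.3, 0.4, 0.4])  # e.g. from a Beam LFO
print(htp_merge(bass_chase, slow_lfo))        # [0.3 0.8 0.4 0.9]
```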

Hope this helps - let me know if you have any further questions.

I was able to get the lights to react to the audio, but it was not in a way that I found attractive or anything close to what I’m seeing in my head which is quite intricate, generative and crystalline.

I suggest sharing some concrete video examples of what you have in mind, as well as any examples of failed attempts; maybe we can help.

The main challenge for me is that I want to control 20-30 moving-head lights, each of which has 24-30 DMX channels, so each channel would be a controller line in Ableton, and I want the parameters changing uniquely and in waves.

Drawing a separate automation lane for every DMX channel is definitely not the way we intend Beam for Live to be used; that would be very painful :slight_smile:
The number of DMX channels your fixtures have is not reflected in Live: you work with named modulations (e.g. pan, tilt, stroberate), and the tag-based approach means you don’t need an individual automation lane for every fixture in your patch unless you specifically want that. As mentioned in my previous post, you can use Generic’s ability to spread its params across groups of lights within a tag (e.g. if you have 128 lights and 4 params using the same modulation, each param will control 32 lights), Beam’s LFO Spread parameter, or MIDI notes.
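
To make the spreading example concrete, the grouping works out roughly like this (illustrative arithmetic only; the exact assignment inside Beam may differ):

```python
# Sketch of the "spread across a tag" idea: with 128 tagged fixtures and
# 4 Param dials on the same modulation, each dial effectively drives a
# contiguous block of 32 fixtures. (Illustrative grouping only.)
def param_for_fixture(fixture_index, n_fixtures=128, n_params=4):
    block = n_fixtures // n_params   # 32 fixtures per param
    return fixture_index // block    # which of the 4 params drives this fixture

print([param_for_fixture(i) for i in (0, 31, 32, 127)])  # [0, 0, 1, 3]
```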

The Beam for Live 2 Demo Project, which you can access via Beam > Open Demo Project… and are probably already familiar with, demonstrates some of that. Another good example is the Beam for Live 1 Demo Project, which you can run by going to Beam > Open Playground… and then opening User Library/Beam/2.0 Playground Project/Beam Playground Example.als.

I also did a bit of research and found that most fully featured lighting software packages come with effects engines that do all this stuff for you. I don’t want to spend my life drawing controller lines on automation tracks in Ableton. I’m also really keen to have a team working on this with me so it’s not just me doing the programming, and I think if I stay within Beam I will end up being the key person after all.

Beam say that you can unpack their apps, which are built in Max for Live, and reprogram them, but I’m not a programmer and I’m not at the stage in my life where I want to develop that expertise.

There are several simple ways to create rich lighting effects that don’t require any programming knowledge. Beam for Live’s sequencing and effect engine is Ableton Live: think of all the ways you can generate musical material and apply that mindset to lighting.
Besides controlling modulation parameters with automation (which you definitely don’t have to draw for every individual DMX channel or modulation, just as you probably don’t individually automate every parameter of every voice of a polyphonic synth), you can also use Modulators (besides Envelope Follower, also e.g. Expression Control, Shaper, or a number of third-party M4L devices), or Beam’s own LFO device (there’s a rough sketch of the phase-spread idea below).
You can trigger envelopes for lights using MIDI notes, which you can enter manually, play and record with an external MIDI instrument, or generate with Live’s MIDI Effects (e.g. Arpeggiator, Random, Chord), MIDI Generators (built-in ones and third-party ones such as MIDI Tools) and M4L sequencers.
You can mix and merge lighting signals generated using the different approaches using Racks and Group Tracks, and apply further lighting processing on that (e.g. by a lighting operator that can control group effects via MIDI or OSC).
You can even combine approaches and bring in lighting signals from a lighting console or other lighting software and process them with Beam and Live. As with any tool, there are of course also things it cannot do, and creative situations where you are better off using another tool.
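
As a rough illustration of why a single spread LFO already gives you “waves” across many heads without per-fixture automation, here’s a sketch (illustrative Python, not Beam code; the head count and rate are arbitrary):

```python
# Illustrative sketch (not Beam code): one LFO whose phase is spread across
# 30 moving heads produces a wave travelling through the rig -- each head
# behaves "uniquely" even though only one LFO is being controlled.
import math

N_HEADS = 30

def pan_values(time_s, rate_hz=0.25, spread=1.0):
    """Return one pan value (0..1) per head at a given time."""
    out = []
    for i in range(N_HEADS):
        phase = 2 * math.pi * (rate_hz * time_s + spread * i / N_HEADS)
        out.append(0.5 + 0.5 * math.sin(phase))
    return out

# One wave sweeps across the rig; increase `spread` to squeeze in more waves.
print([round(v, 2) for v in pan_values(time_s=0.0)][:6])
```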

That being said, a lot of the above is probably not fully obvious to a lot of users; we definitely recognize the need for more examples and tutorials. Also, I can see how having some lighting presets (a.k.a. “chases”) you can just drag and drop as a starting point could be helpful, just as a folder of classic drum breaks or even a sacrilegious MIDI chord-progression pack can sometimes be. Added to the to-do list, and thanks for the feedback!

Hi Luka,

Thanks so much for your detailed response and for all the ideas.

I certainly hadn’t thought of MIDI generators and MIDI effects as other ways to control Beam tag groups, as you suggested. Also, using filters to restrict the audio response to specific frequency ranges makes sense.

I will dive in and have another go using some of these techniques you suggest.

Drag-and-drop presets or clips that can then be edited would be a great way for someone who is new to the software to get going quickly and see Beam’s possibilities on their own equipment or stage plot.

I recommend you check out a tablet app called ‘lightrider’. I used it recently to control my lights at a party, and it allows you to instantly create chases with no programming at all. Being able to drop some of these movements and colour-change behaviours straight onto a track would be helpful and save a lot of time.

Good ideas, thanks for sharing!

You’re very welcome. I’d be happy to help test this or give feedback.