Live experience: s/v live at SP3
This is an article about live VJing, or live demoing. At Epidemic, we had always wanted an experience like this, so we decided to create a 20-minute live demo for Synthesis Party 3 (a party we were organizing ourselves). This article is on the one hand a modest reflection on live demos, and on the other an explanation of the one we did.
What is a live production?
For now there is no officially accepted definition. Live productions might also be called live acts, but that risks confusion with a music live act, which is very different from a demo live act.
Here is our definition. It's a show, with music and demo-engine-powered visuals, that fit together in real time. It obliges you to:
Have a technology that permits playing live (read: no basic 3ds player).
Prepare the live: the tracklist and a set of visuals.
Have a great place to play, good for sound and visuals.
Get an attentive audience.
Be ready to drive your live.
As this definition is a little lax about what is done during the show and what is done beforehand, it's up to everyone to say what they're able to do. See "S/V Making of" below for details on what was live and what wasn't. Don't confuse live with improvisation; they're quite different.
S/V live at Synthesis Party 3
live at Synthesis Party 3
23:15, 14 March 2003
by Epidemic collective
in a Breakbeat style
S/V: part1 on scene.org (77mb)
S/V: part2 on scene.org (248mb)
(videos by gol / wipe)
After Run Away and Run Away (Responsible Remix) (the demos Epidemic created in 2002), Francis made Smode more and more live-capable. In fact, even before releasing Run Away (Responsible Remix), we had decided to do the live for Synthesis Party 3.
Tresh / Epidemic (also a member of Duplex) was willing to try creating a breakbeat set (usually he's more into house and its variants). We asked him for a set of about 20 minutes, and he made a perfect one a month before the live. Francis, Jb, and I were working on the visuals. We worked on the concept of a mini-invit to give a boost to our creativity, and the result was good; see the Synthesis Party 3 website.
At the party, we merged the graphics. Jaia, my fellow dude, came with a great opentro and an awesome scene for the live. Xos, not really constant in his production flow, created some really good 2D layers for the scenes we already had. We finished assigning scenes to each part of the music tracklist and did the bindings for the MIDI keyboard. We hope people enjoyed it as much as we did making it. Additional thanks to Gruiiik and Blaine for their help.
S/V Making of
For the music, Tresh (call him DJ Tresh), the guy on the left of the screen in the video, mixed the breakbeat playlist using two turntables, a laptop, and the Final Scratch system.
1 - Ken Ishii - Extra
2 - Bt - SmartBomb
3 - Way Out West - Domination
4 - Hybrid - How Soon Is Now
5 - Slam - Positive Education (Josh Wink remix) (start @ 1'01)
6 - Groove Doctor - 100% Natural
7 - Josh Wink - Verano Azul (start @ 1'23)
8 - Technasia - Future Mix
9 - Technasia - Hydra (start @ 0'58)
10 - Radix & LLuvia - Tournesol
He managed to keep a constant BPM throughout the live, which was really helpful because it meant we could avoid any resyncing. Three weeks before the live, Tresh gave us an MP3 of the recorded mix; it let us feel the atmosphere of the live and prepare scenes that would fit even better.
I listened to this playlist about 20 times in order to be able to drive the live. We did not rehearse the live much; only once, just before the show, to make sure that nothing was missing. We wanted to keep the live feeling intact, not make it just another demo experience. The experience was intense.
Making the visuals was a really big part. I did not want to do another Run Away remix, but it was hard to find a new visual style. Working with Francis and Jb brought good new ideas, which was very helpful.
We had about 3 weeks of serious work to prepare the live.
What was off
The list of things that were done during the preparation phase.
The scenes: the setup of each scene.
The effect sets and state sets: we created different states to be able to make each scene vary during the show (changing cameras, colors, enabling/disabling effects).
The export of bindings: with Smode you bind a MIDI key or a controller/keyboard/mouse event to actions. Smode provides a learning module that lets you bind actions like changing states, making variation controllers, or playing timelines/scripts. We exposed the control events for each scene to the MIDI input.
Some little syncro loops, like: set camera1, decrease light, set camera2, increase light. The most useful loops were those using 2D alpha layers and changing cameras.
Some automatic live changes: some effects are completely live and change on the measures of the music, like camera_mesh, which gives a different camera on each measure based on two random vertices of a mesh, and camera_particle, which does the same with two random particles of a particle system.
The roadmap: we chose the order of the scenes to follow the music. We made our decisions based on the key moments of the syncro, which were the intense moments of the music.
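The camera_mesh idea described above can be sketched in a few lines. This is an illustrative approximation, not Smode's actual API: the function names and the notion of a (position, target) camera pair are assumptions for the sake of the example.

```python
import random

def camera_from_mesh(vertices, rng=random):
    """Pick two distinct random vertices of a mesh: one as the camera
    position, one as the point it looks at."""
    position, target = rng.sample(vertices, 2)
    return position, target

# Example vertex pool: the eight corners of a unit cube.
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]

def on_beat(beat_index, beats_per_measure=4):
    """Return a new camera at the start of each measure, None otherwise."""
    if beat_index % beats_per_measure == 0:
        return camera_from_mesh(cube)
    return None
```

A camera_particle effect would work the same way, just sampling from the particles of a particle system instead of the vertices of a mesh.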
What was live
The list of things that were done during the show.
The realtime rendering:)
The macro syncro: changing scenes.
The precise syncro: effect state changes, and yeah, each flash ;)
Some camera changes.
Most effect variations: each scene had 7 controllers exported to the keyboard, permitting precise control of colors, particle size, and object scale.
You may think that's not much, but in fact it's a lot for one person, so we needed two ;)
How to drive the live
Driving the live means activating effects and states, changing controls, changing cameras, changing scenes, and so on.
We applied a simple method for deciding what to do during the live. We needed:
1 person doing all the precise syncro, on the advice of the others. He has to know the playlist very well because he must not miss the changes in the music. During the live everyone did some, but I did the most because I knew the playlist and the scenes best.
1 person doing all the macro syncro; he has the roadmap in his hands and says when to change the main scene. For that, he prepares the others for the transition. It was Francis' job :)
We split the keyboard into areas for scenes, cameras, states, and precise syncro (like flashes or 2D layers), plus a [panic!] key for restoring a known scene when things did not go as they should. As you may have noticed, it was used a few times.
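A minimal sketch of this keyboard split, assuming a plain key-to-area mapping. The key names, areas, and the panic-key convention are illustrative; they are not the actual bindings we used at the party.

```python
# Hypothetical keyboard layout: each area of keys maps to one kind of action.
KEY_AREAS = {
    "scenes":  ["F1", "F2", "F3", "F4"],  # macro syncro: change the main scene
    "cameras": ["Q", "W", "E"],           # camera changes
    "states":  ["A", "S", "D"],           # effect state changes
    "precise": ["Z", "X"],                # precise syncro: flashes, 2D layers
}
PANIC_KEY = "Esc"  # restore a known-good scene when things go wrong

def area_of(key):
    """Return the area a key belongs to, 'panic' for the panic key,
    or None for an unbound key."""
    if key == PANIC_KEY:
        return "panic"
    for area, keys in KEY_AREAS.items():
        if key in keys:
            return area
    return None
```

Grouping keys by area this way is what lets two people share one controller without stepping on each other's bindings.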
Here is a list of little extras that did a lot for the quality of the live:
The place: the party was really nice, the room was good, and the people were really supportive. Thanks to all the orgas and attendees for that.
The sound and video quality: Sly did a great job; the sound quality was awesome. You can feel all that in the video of the live, but it was even better than you can imagine ;) The video projector quality was really impressive.
The dual screen: the second screen helped us limit the impact of the bugs that leaked into the live. On it we had the full panel of available bindings.
The light spots and the laser: Sly (him again ;) managed spots, smoke and laser on the scene during the live. You can see it on the video, it was a big plus.
Smode is a completely interactive authoring tool, and we always did live tests; during the creation of our demos, this was almost inevitable ;) The big live-act features we have implemented in the tool since the creation of Run Away (Responsible Remix) are:
Live Time/Beats/Speed: the time reference is now relative to the beat of the music; to be precise, to 60 BPM. So if the music is playing at 60 BPM, an effect told to play for 1 second will play for 1 second. If the music is playing at 120 BPM, however, the effect will only play for 0.5 seconds.
More complex states: states now also include controllers. The new behavior was quite different from what I was accustomed to, but I can't deny the new system is very useful.
Live-oriented effects: more effects that can vary on their own, like cameras or texture lists.
FFT input: the input of your sound card, or the MP3 you're working on, can generate events (changes or variations).
Multi-monitor support with OpenGL: dual screen was a key but painful feature to implement.
Input learning system: when you want to bind effects and more to a key, you simply press the key and perform the action, and the binding is recorded.
Timeline: a wrapper over controls and scripts, supporting loops.
GUI improvements: a drag & drop system, plus right-click on controllers for constructing timelines, and icons for monitoring effect activity. Still improving.
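The Live Time/Beats/Speed feature above boils down to one scaling rule: durations are authored against a 60 BPM reference, so an effect's wall-clock duration shrinks as the real tempo rises. A sketch of that arithmetic (the function name is ours, not Smode's):

```python
REFERENCE_BPM = 60.0  # durations are authored as if the music ran at 60 BPM

def wall_clock_duration(authored_seconds, bpm):
    """Seconds an effect actually plays when the music runs at `bpm`.

    At 60 BPM the authored duration is unchanged; at 120 BPM everything
    plays twice as fast, so a 1-second effect lasts 0.5 seconds.
    """
    return authored_seconds * REFERENCE_BPM / bpm
```

This is why a constant BPM throughout the set mattered so much: with a fixed tempo, every prepared loop and state change keeps the same wall-clock feel from the first track to the last.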
An impressive thing was that there was no crash during the live act; in fact, there were no crashes at all on the day the live demo was performed.
smode: new concepts
Smode does not stop here; it has already been improved since, with great new features:
Scene composition: removes the channel concept.
Efficient rendering to texture, using extensions:)
Transparent faces sorting.
Plus much more... give it a try at http://smode.smousse.net.
More about VJing
The following sceners are involved in VJing:
Blasphemy and Alien prophets
Zden / Satori
Gridflow for jMax: http://www.artengine.ca/jmax/gridflow/
This is not an exhaustive list.
Greetings for this article go to Francis, Gruiiik, Jaia, Jb, Tresh, Cymagine, Flod, and Knos for the complementary information about the VJing scene, and to Adok/Hugi.
Sanx for Epidemic