Archive for the ‘Processing’ Category

Signal Culture, May 7-21, 2015

May 21, 2015

I’m a resident toolmaker at Signal Culture, Thursday May 7 through Thursday May 21.

“Signal Culture was founded in 2012 by Jason Bernagozzi, Debora Bernagozzi & Hank Rudolph with the idea to create an environment where innovative artists, toolmakers, curators, critics, and art historians who are contributing to the field of media art will have time and space to make new work and to interact with one another. Signal Culture encourages creation of new work, building of community, and connection to history in the field of experimental media art by providing artists, researchers, and innovators with residencies, resources, and exhibition opportunities.”


Signal Culture is located in Owego NY. It’s not the Experimental Television Center. It’s in a different location, several blocks north, with its own studio, workshop, library and space for artists to stay. Their residency program includes artists, toolmakers, and researchers. You can apply here.

A few days before my residency, I download the current Arduino IDE, check out the Ardcore module. It works! I download music files to an external drive, then upload recent video files from my camera, new input for the AVSynth.

Here’s a pre Signal Culture patch, my fifth (or sixth). Interconnected VCOs, controlled through the CV2 inputs, produce just the right amount of audio chaos.



I’m not taking the bus. Instead, late Wednesday afternoon, Mary Ann and I drive from Lowell MA to Hudson NY. We stay overnight at our friend Kitty’s. Early the next morning we cross the Hudson River on the Rip Van Winkle Bridge, drive northwest through the Catskills passing through Preston Hollow, then southwest on Interstate 88 through Binghamton, and on to Owego NY. We hit the ATM, then Original Italian Pizza for brunch. I load in at noon; Mary Ann steels herself for a long, boring drive home and takes off.

Total travel time, 6 hours out and, for Mary Ann, 5 hours back.

Day 1: Signal Culture is reconfigured. The office and toolmakers’ studio swapped places. I’m at the front of the building, on the second floor, directly below the artists’ studio. There’s an attached bedroom and bathroom, my own suite, sweet!

Toolmakers’ Studio

My workspace

My workbench

I set up my computer, AVSynth, video monitor, speakers. Kristin Reeves, artist-in-residence, arrives late afternoon. Debora, Jason, Kristin, Dave and I go for a beer at the Barleycorn.

Day 2: Work all morning on the Ardcore module, check out the CLK interrupt, review the template sketch, and make notes on the Arduino/Ardcore IDE. Then I set up audio. I need a mono-to-stereo mini adapter; there’s one in the parts cabinet. There are a lot of parts!

I go over to Dave’s; from 2 to 10pm I hang out while he designs, builds and tests a Sync Separator module for the AVSynth. Dave uses the same circuit-building techniques that he used 40+ years ago, discrete components soldered together on perf board. The Sync Separator module takes video in, buffers it, loops it back out, then strips the sync and outputs Vertical Drive, VD. VD out patched to the Ardcore CLK in gives me a frame count, actually a field count, to use as I develop Ardcore sketches or programs.

Jones AVSynth

Dave at work.

Day 3: Install and test the new Jones Sync Separator, no problems. I play around with video on the MVIP for the rest of the morning.

Jones Sync Separator, ready to install

Jones Sync Separator, up and running!

Look closely and you can see that the Sync Separator has replaced the Multiple on the upper rack. The Multiple is temporarily on the lower rack. I plan to update the LFOs with smaller modules and add 2 new buffered Multiples, all in a single rack.

Grocery shopping, fruit and veggies, salad for lunch 8^)

I work on software in the afternoon, modifying the Arduino Ardcore template. I want to add the frame counter and test the controls with specific MVIP functions. I’m not ready to test the program yet, still decoding the Arduino ports. So I switch to regular Processing to work on a fractal number generator.

Dave invites Kristin and me for dinner. Brian Murphy joins us. We go back to Dave’s for a demo of the Jones AVSynth, as shown above, that includes a Core module, 4 Video rate VCOs, Keyer, Sequencer, and more.

Jones AVSynth, screen capture.

Brian, Kristin and Dave.

Day 4: I upload my version of the template program to the Ardcore, it works. I make modifications, they work. So far so good, I continue to tweak the program.

My workspace.

Vertical Drive from the Jones Sync Separator generates an interrupt on the Ardcore. I detect the interrupt and trigger a pulse out on D0. The template program divides down and triggers a pulse out on D1. It reads A0, A1, A2, A3 and outputs the values to the computer. They appear in a window on my Mac. The template program is in a second window. I can write code, upload it, and see the results immediately.
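The counting logic is simple enough to model. Here’s a sketch of it in Python for clarity; the real thing is Arduino C running on the Ardcore, and the class name and the divide-by-30 divisor are my own illustrative assumptions, not the actual code.

```python
# Model of the Ardcore template's counting logic, in Python for clarity.
# The real sketch is Arduino C; the class name and the divide-by-30
# divisor here are illustrative assumptions, not the actual code.

FIELDS_PER_FRAME = 2  # NTSC Vertical Drive pulses once per field

class ArdcoreModel:
    def __init__(self, divide_by=30):
        self.divide_by = divide_by
        self.field_count = 0
        self.d0_pulses = 0  # one pulse out per VD interrupt
        self.d1_pulses = 0  # divided-down pulses

    def on_vd_interrupt(self):
        """Called once per Vertical Drive pulse (one per video field)."""
        self.field_count += 1
        self.d0_pulses += 1                      # trigger pulse on D0
        if self.field_count % self.divide_by == 0:
            self.d1_pulses += 1                  # divided-down pulse on D1

    @property
    def frame_count(self):
        # VD gives a field count; two interlaced fields make one frame
        return self.field_count // FIELDS_PER_FRAME

# One second of NTSC video is roughly 60 fields
m = ArdcoreModel()
for _ in range(60):
    m.on_vd_interrupt()
print(m.field_count, m.frame_count, m.d1_pulses)  # 60 30 2
```

Change `divide_by` and the D1 pulse rate changes with it, which is exactly the knob you want when the pulse is feeding an audio mixer.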

Xmas penguins & bears, Richmond VA, 2014

I try various patches. I patch the DAC to the Bitswap and Mode CV inputs on the MVIP. Eventually I want to select effects using the Ardcore. I also discover that I can read audio signals on A2, A3. This might prove useful.

I play until exhausted then work on random number generators. I have a white noise, a brown noise, and two fractal generators running in Processing. Thinking about the frame counter, I realize that Vertical Drive actually gives me a field count, 2x the frame count. This might also prove useful!

Day 5: In the morning I continue to work on random number generators in Processing – white noise, brownian motion and fractal. The fractal is the most interesting as it feels organic, but it’s hard to program; keeping the values within bounds is the problem.
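The white and brown generators reduce to a few lines each. Here’s a generic sketch in Python, not my actual Processing code; the value range and step size are assumptions, and the fractal generator, where the bounds handling gets hard, is omitted.

```python
# Generic sketches of two of the generators; not the actual Processing
# code, and parameter values (range 0-255, step 8) are assumptions.
import random

def white_noise(n, lo=0, hi=255, seed=None):
    """Independent uniform samples: no memory, pure static."""
    rng = random.Random(seed)
    return [rng.uniform(lo, hi) for _ in range(n)]

def brown_noise(n, lo=0, hi=255, step=8, seed=None):
    """Random walk: each value is the last plus a small random step,
    clamped at the edges, one simple answer to the bounds problem."""
    rng = random.Random(seed)
    v = (lo + hi) / 2
    out = []
    for _ in range(n):
        v += rng.uniform(-step, step)
        v = max(lo, min(hi, v))  # clamp to stay in bounds
        out.append(v)
    return out

samples = brown_noise(100, seed=42)
print(min(samples) >= 0 and max(samples) <= 255)  # True
```

Clamping is the crude fix; reflecting off the bounds, or folding the step back in, keeps the walk from sticking at the edges.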

I break for lunch then switch to implementing the frame counter. The frame counter is a snap, up and running in no time. I’m getting used to the Arduino programming environment, which is based on Processing. I rework the Ardcore template, hopefully speeding it up. I switch to Processing and resuscitate my movie projector. It reads 24 frames then ‘projects’ them randomly, at random frame rates. Better than the video of Xmas penguins and bears, but still a way to go. There’s no direct way to create a .mov file in Processing.

Time to play, the result, starting with the Ardcore and Jones video modules –

Computer signals shown in cyan, video in yellow, sync in green.

Starting at the left end of the rack, the computer uploads a program to the Ardcore. The right end of the rack has two Jones video modules. Video passes through the Jones Sync Separator to the MVIP; as it comes through, Vertical Drive is stripped and sent back to the Ardcore. The program clock syncs with the video, so the programmer has a frame count and can use it to generate control voltages and trigger pulses. Video out goes from the MVIP to my monitor.

Control voltages shown in red.

Between the Ardcore on the left and the Jones video modules on the right are several Doepfer audio synth modules. From left to right: 2 LFOs, 2 VCOs, Ring Mod, VCF, and a Mixer. The Ardcore puts out a trigger pulse on D1 and a control voltage through the DAC. The trigger pulse goes directly to the audio mixer, percussion. The DAC output controls bitswap on the MVIP. The first LFO controls brightness on the MVIP, the second selects modes on the MVIP. The 2 VCOs are patched ‘west coast style,’ the sine output of one controlling the frequency of the other. The triangle outputs go to the Ring Modulator. The square wave outputs go to 2 Multiples. The output of the Ring Modulator and the 2 square wave outputs, through the Multiples, go to the Mixer. The 2 square wave outputs also control hue1 and hue2 on the MVIP.

Audio shown in blue.

The Ardcore D1 digital output goes directly into the mixer. Who knew! It divides down Vertical Drive, input from the Jones SS, and puts out a trigger pulse. The Ardcore is a rhythm machine! There’s still D0 and a whole lot more to explore using the DAC. As above, the 2 VCOs are patched ‘west coast style,’ and the output of the Ring Modulator and the 2 square wave outputs, through the Multiples, go to the Mixer. Audio out goes directly to my powered speakers.

All together now –


I take pictures and record video on my smart phone. Debora, Dave and I go for dinner at Las Chica’s Taqueria.

Day 6: In the morning, I update my patch diagrams, breaking out signal types and highlighting the various modules used – computer and video in and out, control signals, audio out and, finally, a composite showing the complete patch. I update my blog.

After lunch I put the case on the workbench and reinstall modules. Eurorack modules are small and when using a lot of patch cords, the controls can be hard to access, no room for ‘knob twiddling.’ I group them by function and leave plenty of space for patch cords. I bring the case back to my workspace, patch the basics and fire it up! Better than before, plenty of breathing room for the modules, cables and me. The video is super clean!

‘The Reconfiguration’

A nice, clean output signal.

I spend the evening making notes, working on a schedule, and watching my basic patch cycle through MVIP effects.

Day 7: In the morning I fine-tune the new layout and update my patch template.

My basic patch

Input modules move to the left, outputs to the right. Control modules on top, processing modules on the bottom. There’s plenty of room. The output modules are on the far right, one on top of the other, all the output controls in one easy-to-get-at location! My basic patch is just that, basic. The Jones Sync Separator sends Vertical Drive to the Ardcore. The Ardcore counts it down and outputs a trigger pulse. The pulse is patched directly into the audio mixer on the far right. Continuing left to right across the top rack, the 2 LFOs are patched to the frequency controls of the 2 VCOs directly below. The sine wave output of the first VCO is patched to the Bitswap control on the MVIP and the triangle wave output goes directly into the audio mixer. Likewise, the sine wave output of the second VCO is patched to the Mode control on the MVIP and the triangle wave output goes directly into the audio mixer. And that’s it!

I spend the balance of the morning programming random number sketches in Processing. I’m making progress.
I go grocery shopping around noon.

In the afternoon I reformat my ‘Leaves’ video for the AVSynth using Premiere. A simple process, I ‘downsize’ the video from 1024×768 to 640×480. That done, time to play. As noted, the basic patch is minimal, yet it produces some very complex images. The VCOs are set to the same low frequency, the control voltage inputs set to minimal gain, and the LFOs cycle as slowly as possible. The visual result is that the MVIP cycles slowly through only a few effects, controlled by the VCOs. Because the VCOs change frequency slowly and out of sync, you see every possible combination of Bitswap and Mode effects. The video loops every 2 minutes. This means that the effect combinations have a new set of images to work with each time around.

I’m able to read the effects easily. I see the VCOs as they go in and out of phase, horizontal bars move up and down within the frame, solidifying as they slow down and breaking apart as they speed up. Because there are 2 VCOs sliding up and down in frequency at slightly different rates, the image divides into layers. As the effects change, the layers combine to produce new, unexpected results. Finally, the sound from the same 2 VCOs matches the movement on the screen, and underneath is a constant pulse derived from the video that matches the ‘flicker rate’ of the changing images.

Break for dinner at the Calaboose with Debora and Kristin. Kristin returns to Indiana tomorrow, her residency over. We come back to Signal Culture for ‘show & tell.’

Day 8: Up early, we all meet upstairs in the kitchen, say good bye to Kristin. I return to my studio with my cup of coffee to do some video editing. I transfer all my recent camcorder movies to the computer, edit, and convert them to standard NTSC video 640×480.

I update my random number routines, one last time.

I create an Ardcore page on my blog and spend the rest of the day researching programming for the Ardcore module. I clean up the template and post the code. I decide that random numbers are nice, but I also need something like a simple waveform generator, running off VD, in order to figure out the MVIP Bitswap and Mode controls. I sketch out a simple triangle wave generator. That done, I start thinking about Peter B’s Swoop module, based on triangle waves. I can generate a triangle wave and control the rise and fall using analog inputs A0 and A1, or externally using A2 and A3. Even better, I can generate two triangle waves, using A0 and A1, or A2 and A3, to define the shapes, and output the difference.
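The triangle idea is easy to sketch. Here’s an illustrative model in Python; the real version would be an Arduino sketch on the Ardcore, and the function names and shapes are my assumptions, not Peter B’s actual Swoop design.

```python
# Illustrative model of the triangle wave idea, in Python (the real
# version would be an Arduino sketch; names and units are assumptions).

def triangle(t, rise, fall):
    """Asymmetric triangle wave, period = rise + fall steps.
    Ramps 0 -> 1 over 'rise' steps, then 1 -> 0 over 'fall' steps.
    Driven once per Vertical Drive pulse, t is the field count."""
    p = t % (rise + fall)
    if p < rise:
        return p / rise
    return 1.0 - (p - rise) / fall

def swoop(t, rise_a, fall_a, rise_b, fall_b):
    """Difference of two triangle waves; the two shapes stand in for
    the two pairs of analog inputs (A0/A1 and A2/A3)."""
    return triangle(t, rise_a, fall_a) - triangle(t, rise_b, fall_b)

print(triangle(0, 4, 4), triangle(4, 4, 4))  # 0.0 1.0
```

Skewing rise against fall bends the ramp, and two waves with slightly different periods drift in and out of phase, much like the VCOs in the basic patch.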

Late afternoon I take a break, walk downtown, past the old ETC studio on Front St, along the Riverwalk, then back to Signal Culture.

I upload yesterday’s video to Youtube. The quality is less than desirable. In fact, it sucks! I delete the video and check out Vimeo.

Day 9: I have the place to myself, sort of. Hank is upstairs in the artists’ studio integrating a new raster manipulator into the system. I update the blog, create a Jones MVIP page, upload a short video to Vimeo.

I finish up by noon, turn on the AVSynth, open the Arduino interface, time to program a random control voltage out. Easier than I’d thought! I spend the rest of the afternoon and evening playing with the system.

Random control voltage patch.

Starting on the left side of the patch, I load the Arduino sketch to the Ardcore. I build on my basic patch: the video is running, the LFOs control the VCOs that, in turn, control effects on the MVIP. The Ardcore and the VCOs send audio to the mixer. I patch the output of the Ardcore DAC to the frequency control input on the first VCO. You can hear the results immediately, bleeps and bloops. I patch the Ardcore DAC output to the MVIP; the results can be seen immediately. The image moves with the sound.

Like the D1 output pulse patched to the audio mixer, the random control voltage is triggered by dividing down the CLK input or VD. I try varying the timing in relation to the sound pulse. I try randomizing the duration. In the end I return to a simple, straightforward relationship between the pulses.


Of course these are screen shots, stills. Kinda misses the point!

Day 10: Breakfast at Parkview, I walk around for a bit. The weather is sullen, typical Owego. I update my notes, edit images and, after lunch create an AVSynth page.

The image is a challenge, what to do with the blank spaces in the case? Before we left for Owego Mary Ann gave me an old circuit board she thought might make an interesting panel. So I search online for pictures of circuit boards. I find one with a fish, perfect! I add the latest version of my basic patch, still work to be done.

It’s Saturday night. Debora, Jason, Hank, Dave, Brian, and I meet for dinner at Tioga Trails. After dinner Jason, Brian and I visit Dave in his studio. Dave has his Video Synthesizer patched into a simple color encoder, amazing! Jason uploads smartphone videos. Dave breaks out the Laphroaig, a long-standing tradition going back to the early days at the Experimental Television Center.


I return to the studio all warm and fuzzy.

Day 11: Sunday, Hank is back, working in the artists’ studio upstairs, Debora is dressed for summer.
Yesterday, Brian donated a book he’d purchased for 80 cents at a garage sale, ‘Gary Hill: Art + Performance’. I find a quote from Gary about ‘the surfing mind’ and message it to John Craig Freeman. Almost immediately John Craig replies asking for the book title and ISBN. My work, for today, is done!

Not quite, I think about updating the Ardcore random number sketch, adding controls for bounds and options for brown noise and fractal noise; about generating rhythm patterns using digital outputs D0 and D1; and about a possible Video Shredder module for the Jones Video Synthesizer.
David and I talk about the shredder over dinner at Tioga Trails.

After dinner I record a few short videos directly off the monitor, using my camcorder.

Day 12: My goal for this toolmakers’ residency is to configure my AVSynth, specifically –

1) to sync audio and video,
2) to integrate the Ardcore into the system,
3) to write a simple program to control the Jones MVIP,
4) to see, hear and feel the link between sound and image, and
5) to have good clean video as a result.

Thanks to Dave’s Jones Sync Separator and the Ardcore, I can sync MVIP effects to video. The Jones SS Vertical Drive out is patched to the Ardcore CLK in. The Ardcore’s Arduino microprocessor senses the VD pulse; my program flashes the green LED on the front panel, increments a counter, divides down the pulse, and when appropriate flashes the red LED, sets D1 high, and sends a new random control voltage to the DAC out.

In my basic patch the DAC out is patched to the first VCO and to an MVIP effects input. The sound changes frequency in sync with VD. The MVIP switches between effects during vertical blanking. The result is a simple audio signal and a good clean video output signal.

You can see, hear and feel the relationship between the sound and the image.

Ok, I’ve met my goals, but my attempts to upload video don’t go well. Vimeo is better, not as good as I would like, but better. I have a basic account. Dave suggests that I upgrade from a basic to a pro account. When I can afford it.


Day 13: I record mixes of Saturday’s videos, an ongoing experiment in audiovisual composition.

While the mixes render, I read over my notes on Shnth programming, listen to some online tutorials, and do a little maintenance on the workbench. I plug in the Shnth, and lo and behold, it lives!

Dave and I drive out to Ralph and Sherry Hocking’s for lunch.

Back at Signal Culture, I continue to record mixes, will post a couple later.

Day 14: I pack up, all but the AVSynth. Hank vacuums, organizes, checks out the system in the studio upstairs. Jason opens up a small color monitor, soon to be a ‘color wobbulator’.

Today is ‘dog & pony day’.

Dave Jones arrives at noon and joins Jason; they talk about ways to modify or ‘wobble’ the monitor. Jonas Bers arrives for a visit. We sit down together for lunch, all but Hank. He’s playing with slow-moving fish and turtles on the Hornbacher Raster Scan Modulator.

After lunch, we tour the artists’ studio. A curator from Hunter College NY and the director of the ETC Archives at Cornell arrive to check out the ‘wobbulator’. The curator would like one for an exhibition at Hunter this coming Fall. The director would like one for the collection at Cornell. They talk with Jason and David about building a simple b&w version for display.

Jonas’ head is spinning and he hasn’t yet seen the Jones Video Synthesizer. Dave, Jonas and I walk over to Dave’s studio. Dave walks Jonas through the capabilities of the Video Oscillator. His mind is ‘blown’.

Back at Signal Culture I pack up the AVSynth. I’m hitching a ride with Jonas back to Kitty’s in Hudson NY.

Day 15: Mary Ann arrives at 9am, we say goodbye to Kitty, return home to Lowell MA.

Signal Culture: a new video

January 24, 2015


Recorded during my residency at Signal Culture, October 2014.

Input – Quicktime movie loop; audio and control – Serge and Doepfer modules; processing – Jones MVIP. Playback from a Mac G5, recorded on a Mac Pro.

Final edit in Premiere, published January 24, 2015.

Signal Culture, a new video

January 15, 2015


Recorded during my residency at Signal Culture, October 2014.

Input – Quicktime movie loop; audio and control – Serge and Doepfer modules; processing – Jones MVIP. Playback from a Mac G5, recorded on a Mac Pro.

Final edit in Premiere, published January 13, 2015.

Introduction to Processing

December 6, 2012

A couple of simple sketches from the book, Processing: A Programming Handbook for Visual Designers and Artists by Casey Reas and Ben Fry, MIT Press.

In this first sketch setup() initializes or sets some important variables. size(480, 120) sets the width and height of the output window. smooth() turns on anti-aliasing, which removes the “jaggies” from curves and edges.

draw() draws to the output window. It loops frame by frame. Each frame it checks the mouse button. If mousePressed is true, the fill color is set to black, fill(0). Else mousePressed is false and the fill color is set to white, fill(255). Having set the fill color, black or white, the sketch draws an ellipse centered on the mouse position, ellipse(mouseX, mouseY, 80, 80). This is a circle because width = height.

// Processing demo, sketch 1

void setup() {
  size(480, 120);
  smooth();
}

void draw() {
  if (mousePressed) {
    fill(0);
  } else {
    fill(255);
  }
  ellipse(mouseX, mouseY, 80, 80);
}
The second sketch draws a series of 5 lines to the output window. There’s no setup() or draw(), simply a list of commands. This is the “brute force method” of programming. It assumes a number of defaults, for example the size of the output window 100 x 100. Lines are drawn with a “pen.” stroke(255) sets the pen color, in this case white. strokeWeight(5) sets the pen width in pixels. We know what smooth() does.

The output window is measured in pixels (picture elements), 100 pixels wide and 100 pixels high. The upper left corner is 0, 0 and the lower right corner is 99, 99. In the first sketch (above) the mouse position is available as mouseX and mouseY. mouseX being the horizontal position measured from left to right and mouseY being the vertical position measured from top to bottom of the output window. Lines are drawn from start point to end point, line(xs, ys, xe, ye). Each point has an x or horizontal location and a y or vertical location.

// Processing demo, sketch 2

smooth();
stroke(255);
strokeWeight(5);

line(10, 80, 30, 40);
line(20, 80, 40, 40);
line(30, 80, 50, 40);
line(40, 80, 60, 40);
line(50, 80, 70, 40);

The third sketch is the same as the last one, except this time the lines are specified in relation to a single x, y point. I start by defining 2 integer variables, x and y, and setting them equal to 5 and 60. This is the starting point for the group of 5 lines.

Line 1 from  5, 60 to 25, 20
Line 2 from 15, 60 to 35, 20
Line 3 from 25, 60 to 45, 20
Line 4 from 35, 60 to 55, 20
Line 5 from 45, 60 to 65, 20

So what can we do with this? The first thing to try is changing the values assigned to x and y. Changing x moves the whole group left or right. Changing y moves the group up and down. What if we “programmed” x to change frame by frame? Then the whole group of lines would move automatically across the screen …

// Processing demo, sketch 3

int x = 5;
int y = 60;

smooth();
stroke(255);
strokeWeight(5);

line(x,      y, x + 20, y - 40);
line(x + 10, y, x + 30, y - 40);
line(x + 20, y, x + 40, y - 40);
line(x + 30, y, x + 50, y - 40);
line(x + 40, y, x + 60, y - 40);



November 16, 2011

What is Processing? Here’s a brief explanation from the book –

“The Processing language is a text programming language specifically designed to generate and modify images. Processing strives to achieve a balance between clarity and advanced features. Beginners can write their own programs after only a few minutes of instruction, but more advanced users can employ and write libraries with additional functions. The system facilitates teaching many computer graphics and interaction techniques including vector/raster drawing, image processing, color models, mouse and keyboard events, network communications, and object-oriented programming. Libraries easily extend Processing’s ability to generate sound, send/receive data in diverse formats and to import and export 2D and 3D file formats.”
~ Processing: A Programming Handbook for Visual Designers and Artists, Casey Reas & Ben Fry

Processing can be downloaded from processing.org.

I bought the book last year and worked through the examples. Then the Art Department at UMass offered me a section of Digital Foundation. Why not teach art students programming?

“The ability to ‘read’ a medium means you can access materials and tools created by others. The ability to ‘write’ in a medium means that you can generate materials and tools for others. You must have both to be literate. In print writing, the tools you generate are rhetorical; they demonstrate and convince. In computer writing, the tools you generate are processes; they simulate and decide.”
~ Alan Kay, Xerox PARC and Apple Corp.

The Processing Development Environment [PDE] consists of a simple text editor for writing code, a message area, a text console, tabs for managing files, a toolbar with buttons for common actions, and a series of menus. When programs are run, they open a display window.

Processing Development Environment (PDE)

Sketches are written using the text editor. The message area gives feedback when saving and loading. The console displays text output from the sketch. It also displays error messages (in red). Toolbar buttons allow you to run and stop the sketch, and to create a new sketch, open, save and export it.