Pedals for Ableton Live

For a concert later this year, a friend is going to use Ableton Live to play some sounds while also playing other instruments, so he wanted to be able to control Live with his feet. Ableton Live is very nice, control-wise, in that it is easy to map keys on the computer keyboard to almost anything, and you can also easily switch over to a mode where two rows of keys map to a little more than one octave of a musical keyboard.

The switches for each pedal

Having some pedals from old organs lying around, we decided they would be ideal for the physical part of the interface. Since, at the lowest level, they just consist of one full octave of switches, the easiest thing was to make them work like a keyboard, mapping the “white” pedals to “asdfghjk” and the “black” ones to “wetyu”.
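The mapping is small enough to write down as a table. Here is a sketch of it in plain Java (the note names and the low-C-to-high-C ordering are my assumptions about how the octave was laid out):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PedalMap {
    // One octave of pedal notes, low C to high C, and the computer-keyboard
    // keys they are mapped to in Ableton Live's keyboard mode.
    static final String[] NOTES = {"C", "C#", "D", "D#", "E", "F", "F#",
                                   "G", "G#", "A", "A#", "B", "C2"};
    static final char[] KEYS =    {'a', 'w', 's', 'e', 'd', 'f', 't',
                                   'g', 'y', 'h', 'u', 'j', 'k'};

    static final Map<String, Character> PEDAL_TO_KEY = new LinkedHashMap<>();
    static {
        for (int i = 0; i < NOTES.length; i++) {
            PEDAL_TO_KEY.put(NOTES[i], KEYS[i]);
        }
    }

    static char keyFor(String note) {
        return PEDAL_TO_KEY.get(note);
    }

    public static void main(String[] args) {
        // The "white" pedals land on asdfghjk, the "black" ones on wetyu.
        for (String note : NOTES) {
            System.out.println(note + " -> " + keyFor(note));
        }
    }
}
```

Interleaving the chromatic scale like this puts the naturals on the home row and the sharps on the row above, which is exactly how Live's keyboard mode lays out an octave.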

I ripped the controller board out of an old USB keyboard, and used a multimeter in continuity mode to find out which pins matched the keys we wanted. Then it was just a matter of soldering two wires from each of the pedal switches to the right pins on the controller card. Since some keys shared pins, we matched those up and connected the matching pins to reduce the clutter.

The pedals all connected to the keyboard controller

Now it is just a matter of planning and rehearsing: finding out which sound to map to which pedal. The ability in Ableton Live to set different trigger modes on the sounds makes it a very nice program for live performances, but I guess that was the whole idea of the software, so it’s not surprising.

The Viscount c.100 volume pedal

Viscount c.100 volume pedal

Just a small update on the organ I have been gutting.

I have found out a lot, and will write more about it soon. I just wanted to share this recent find, as I think it is a bit cute.

Earlier today I took the whole organ apart to save some room in my studio, and was trying to figure out the volume pedal’s resistance, for use with a guitar or something, maybe. Measuring the leads from the pedal with my multimeter didn’t give me any clues, so I opened it up.

Instead of the potentiometer I expected to find, the makers have used a small light bulb and a light-dependent resistor (LDR) in a small chamber under the pedal. When you move the pedal, a plate divides the chamber more or less, letting more or less light fall on the LDR. Not very advanced, and not a new concept (to me), but I thought it funny that such a solution was used here.
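I haven’t traced the actual circuit, but the usual way such a lamp/LDR pair controls volume is as one leg of a voltage divider: more light means lower LDR resistance, which means more signal gets through. A rough sketch of the idea, with made-up component values:

```java
public class LdrVolume {
    // Output of a simple voltage divider: Vout = Vin * rFixed / (rLdr + rFixed).
    // As more light reaches the LDR, its resistance drops and Vout rises.
    static double dividerOut(double vIn, double rLdr, double rFixed) {
        return vIn * rFixed / (rLdr + rFixed);
    }

    public static void main(String[] args) {
        double vIn = 1.0;     // normalized signal level
        double rFixed = 10e3; // assumed 10 kOhm series resistor

        // Pedal closed: little light, high LDR resistance -> quiet.
        System.out.println(dividerOut(vIn, 1e6, rFixed));

        // Pedal open: lots of light, low LDR resistance -> loud.
        System.out.println(dividerOut(vIn, 1e3, rFixed));
    }
}
```

A nice side effect of the lamp/LDR trick is that the audio signal never passes through a scratchy mechanical wiper, which may be why it was used here.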

Lamp and LDR in the Viscount c.100 volume pedal

I have also figured out, in theory, how to hook up the spring reverb, and hopefully that will be the subject of my next post, as I am going to build a small effect box for it. The plan is to make it usable both as a guitar effect and as an insert effect on a mixer.

Contour tracing in Processing

I recently read the book Aaron’s Code, and with it came the thought of writing some contour tracing functions. When, a few days later, I also got an idea for a project, I got to it. Some searches on Google pointed me in the direction of the blob extraction algorithm that OpenCV uses, and so I found “A linear-time component labeling algorithm using contour tracing technique”.

Lately I have also started looking into using NetBeans as an IDE for programming with the Processing library, so I decided to implement the algorithm from the paper in Java, using Processing.

The code is provided below, as is. It can probably be optimized and/or rewritten to be better, but for now it works for me.

If you have any suggestions to changes, or you use it, please leave a comment so I can check it out.

/*
 * An implementation of
 * «A linear-time component labeling algorithm using contour tracing technique»
 * by Fu Chang, Chun-Jen Chen, and Chi-Jen Lu
 *
 * Written in Java for use with the Processing (www.processing.org) library.
 *
 * Code written by Greger Stolt Nilsen
 * (gregersn@gmail.com / http://gregerstoltnilsen.net)
 * Copyright 2010, Greger Stolt Nilsen
 *
 * Code is provided «as is», and no warranty is given. Use at your own risk.
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with this program.  If not, see <http://www.gnu.org/licenses/>.
 */
/*
 * Create a new instance with:
 * ComponentFinder finder = new ComponentFinder(this, image);
 * where "image" is the PImage (or PGraphics) to be traced.
 * use the find()-function, to get all the contours in the contours arraylist.
 * Each contour is by itself an ArrayList of PVectors.
 *
 * render_blobs() will return a PImage object that is a rendering of all the
 * blobs in the traced image.
 *
 * The image to be traced needs to be black (0x000000) objects on a white
 * background, so use Processing's threshold filter to get to that.
 *
 * Also, unless the outer border of pixels is white, the finder will hang.
 */

package ComponentFinder;

import processing.core.*;
import java.util.ArrayList;

public class ComponentFinder
{
    PApplet parent;
    public ArrayList contours;
    int[] inout;
    int[] L;
    PImage I;
    int C;

    // An array with the 8 direction coordinates (x1, y1, x2, y2... x8, y8)
    int cd[] = {1, 0, 1, 1, 0, 1, -1, 1, -1, 0, -1, -1, 0, -1, 1, -1 };

    public ComponentFinder(PApplet p, PImage img)
    {
	parent = p;
	I = img;
	C = 1;
	L = new int[I.width*I.height];

    }

    int getNumContours()
    {
	if(contours != null)
	    return contours.size();
	else return 0;
    }

    public int find()
    {
		// Initialize the blob map to zero.
		for(int i = 0; i < I.width*I.height; i++)
		{
		    L[i] = 0;
		}

		// Make a new array list to store each contour.
		contours = new ArrayList();

		C = 1;
		
		int pos = I.width; // Start analyzing the image at the second line

		I.loadPixels();

		// Label the first line as blank pixels.
		for(int x = 0; x < I.width; x++)
		    L[x] = -1;

		for(int y = 1; y < I.height; y++)
		{
		    for(int x = 0; x < I.width; x++)
		    {
			if(L[pos] == 0) // Only process unlabeled pixels
			{
			    if(((I.pixels[pos]&0xffffff) != 0x000000))
			    {
				// White pixel
				L[pos] = -1;
			    }
			    else if(((I.pixels[pos]&0xffffff) == 0x000000))
			    {
				if((I.pixels[pos-I.width]&0xffffff) != 0x000000)
				{
				    // If the pixel above is a white pixel,
				    // then this is the start of an outer contour
				    L[pos] = C;
				    // Start tracing an outer contour
				    contours.add(contourTrace(new PVector(x, y), 0, C));
				    C++;
				}
				if(((I.pixels[pos+I.width]&0xffffff) != 0x000000) && (L[pos+I.width] == 0))
				{
				    // If the pixel below is a white pixel, then it is the start of an
				    // inner contour, and we get the id for the outer contour
				    // then start tracing the inner contour
				    // Use this pixel's label if it already has one,
				    // otherwise take the label of the pixel to the left.
				    int l;
				    if(L[pos] != 0) l = L[pos];
				    else l = L[pos-1];

				    if(L[pos] == 0) L[pos] = l;
				    contours.add(contourTrace(new PVector(x, y), 1, l));
				}
				if(L[pos] == 0 && L[pos-1] > 0) L[pos] = L[pos-1];
			    }
			}
			pos++;
		}
	}
	I.updatePixels();
	return getNumContours();
    }

    // Traces a contour at the given pixel S,
    // r is set to 0 for outer contours, and 1 for inner contours.
    // c is the label to give the contour
    ArrayList contourTrace(PVector S, int r, int c)
    {
	int next = 8;
	ArrayList cont = new ArrayList(32);
	PVector current = new PVector();
	cont.add(S.get());
	if(r == 0) // Outer contour
	    next = Trace((PVector)cont.get(cont.size()-1), 7, c);
	else if(r == 1) // Inner contour
	    next = Trace((PVector)cont.get(cont.size()-1), 3, c);

	if(next == 8) return cont; // The pixel is isolated.

	PVector T = new PVector(S.x + cd[next<<1], S.y + cd[(next<<1)+1]);
	cont.add(T.get());

	next = Trace((PVector)cont.get(cont.size()-1), (next+6)%8, c);

	if(next == 8) return cont;

	while(true)
	{

	    current.x = (float)((PVector)cont.get(cont.size()-1)).x + cd[(next<<1)%16];
	    current.y = (float)((PVector)cont.get(cont.size()-1)).y + cd[((next<<1)+1)%16];
	    next = Trace(current, (next+6)%8, c);

		// Check if we are back at start position
	    if((current.x == S.x) && (current.y == S.y) && (current.x + cd[next<<1] == T.x) && (current.y + cd[(next<<1)+1] == T.y))
	    {
		return cont;
	    }
	    else
	    {
		cont.add(current.get());
	    }
	}
    }

    int Trace(PVector P, int n, int c)
    {
		for(int i = 0; i < 8; i++)
		{
		    // if pixel at position n+i is black, return n+i
		    PVector ni = new PVector(P.x+cd[((i+n)%8)*2],
					     P.y+cd[((i+n)%8)*2+1]);

		    if((I.pixels[(int)(ni.y)*I.width+(int)(ni.x)]&0xffffff) == 0x000000)
		    {
				L[(int)(ni.y)*I.width+(int)(ni.x)] = c;
				return (i+n)%8;
		    }
		    // if not, mark with a negative number
		    else
		    {
				if(L[(int)(ni.y)*I.width+(int)(ni.x)] == 0)
				    L[(int)(ni.y)*I.width+(int)(ni.x)] = -1;
		    }
		}
		return 8; // Should only happen for isolated pixels.
    }

    public PImage render_blobs()
    {
		PImage blobs = parent.createImage(I.width, I.height, PApplet.RGB);
		blobs.loadPixels();
		parent.colorMode(PApplet.HSB);
		for(int i = 0; i < L.length; i++)
		{
		    if(L[i] > 0)
				blobs.pixels[i] = parent.color(PApplet.map(L[i], 1, C, 0, 255), 255, 255);//|0xff000000;
		    if(L[i] == -1)
				blobs.pixels[i] = parent.color(0, 0, 255);
		    if(L[i] == -2)
				blobs.pixels[i] = parent.color(0, 0, 0);
		    if(L[i] == 0)
				blobs.pixels[i] = parent.color(0, 0, 128);
		}
		blobs.updatePixels();
		return blobs;

    }
}
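For anyone wanting to play with the neighbour-scanning idea outside of Processing, here is a stripped-down, self-contained version of the Trace() step on a plain int array. The grid and values are made up for the example:

```java
public class TraceDemo {
    // The same 8-direction table as in ComponentFinder: (dx, dy) pairs,
    // starting east and going clockwise.
    static final int[] CD = {1, 0, 1, 1, 0, 1, -1, 1, -1, 0, -1, -1, 0, -1, 1, -1};

    // Scan the 8 neighbours of (x, y) clockwise, starting at direction n,
    // and return the direction of the first black pixel, or 8 if none.
    static int trace(int[] img, int w, int x, int y, int n) {
        for (int i = 0; i < 8; i++) {
            int d = (n + i) % 8;
            int nx = x + CD[d * 2];
            int ny = y + CD[d * 2 + 1];
            if (img[ny * w + nx] == 1) return d;
        }
        return 8; // isolated pixel
    }

    public static void main(String[] args) {
        int w = 4, h = 3;
        // 0 = white, 1 = black; a two-pixel blob at (1,1) and (2,1).
        int[] img = new int[w * h];
        img[1 * w + 1] = 1;
        img[1 * w + 2] = 1;
        // Starting the scan at direction 7 (north-east), the first black
        // neighbour of (1,1) is found at direction 0 (east).
        System.out.println(trace(img, w, 1, 1, 7));
    }
}
```

Interleaving each (dx, dy) pair in one flat array is why the class indexes the table with `cd[next<<1]` and `cd[(next<<1)+1]`: the shift doubles the direction number to reach the x offset, and the +1 picks the y offset.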



Gutting an old transistor organ

I’ve had an old Viscount c.100 transistor organ standing around in my
studio for quite a while, and I’ve finally gotten around to gutting it.
The goal is to learn a bit more about electronics, as well as to try to
use the spring reverb, the rhythm generator and the pedals for
something else.


What I am thinking with the reverb is to put it in its own box, so that
it can be used as an insert effect on mixers, or as a pedal effect for
guitar.

The rhythm section can be used for what it was made for, but I want to
build it into a smaller box, so it is portable and can be used in
different situations. Also, since I have figured out that it gates the
bass pedals, I will try to add connections so it can be used as a
trigger/gate for other things.

The pedals I will build into a box together with the oscillators, so that
the unit can be used on its own as a foot-driven bass keyboard for live
performances.

I am self-taught when it comes to electronics, and I have not studied
it very thoroughly, but I do know some basics. So this is a challenge,
but a good one.

For lack of a proper oscilloscope, I am using xoscope with a sound card probe.

More pictures, and some sketches will come when I figure out something
useful, and have organized my notes.

Picasa face detection to Lightroom

I have written a Perl script to move data gathered by Picasa’s face detection function into the XMP metadata of my NEF files, for use in Lightroom.
