

Developers: Digging into the code


written by: geert

Hi barnone,

Given your current drive to dig into the code, we've created an open-source prototype OSC EigenD agent with extensive documentation.

More information here:

We hope that this will facilitate your journey into our code and bootstrap your development projects.

Take care,


written by: GoneCaving

Tue, 10 May 2011 19:40:13 +0100 BST

Hey all,
Hey all,
I've spent the last few train journeys of my daily commute digging into the code, mainly reading the Python code in plg_simple and trying to make sense of the Agents. Have others downloaded the codebase and started poking around to find out how it all hangs together?

I wonder if there's a good way for us to pool our findings?


written by: 0beron

Wed, 11 May 2011 11:18:17 +0100 BST

There's a developer section on the wiki, which seems like a good place to start mapping things out.
Once we figure out how sections of it work, we can submit patches with comments.

written by: dhjdhj

Sat, 1 Oct 2011 16:44:30 +0100 BST

I just downloaded the entire tree --- however, all I really want is the bottom-level USB interface to the hardware and the code that exercises that interface. Is such a thing part of the build, and if so, is there a way to build JUST that?

(I know, I haven't looked at the code yet, but if I'm going to have a go at this, I'd like to make it as painless as possible :-)

written by: NothanUmber

Sat, 1 Oct 2011 17:27:34 +0100 BST

Perhaps implementing a subclass of alpha1/2::active_t::delegate_t (e.g. in \lib_alpha2\alpha2_active.h) that forwards the data on into Max could be a good start. (There are examples, e.g. in EigenD\lib_alpha2\src\a2dump.cpp for the keys, or in EigenD\lib_alpha2\src\audio_test.cpp for the headphone etc.)
Or you could go even further down, e.g. by directly using the usbenumerator in EigenD\picross\pic_usb.h. Or you could speak to the USB drivers directly, as is done in EigenD\picross\src\pic_usb_win32/macosx/generic.cpp.

As far as I know the sources for the drivers themselves are not released.
I've never played with this stuff myself, I just came across it when trying to get a first overview, so take everything with several sacks of salt...


written by: barnone

Wed, 5 Oct 2011 21:48:21 +0100 BST

Yes, lib_alpha2's alpha2::active_t has the functions to get raw key presses and interact with the Alpha.

I might try just capturing some debug output as a simple exercise.

written by: barnone

Thu, 6 Oct 2011 04:49:56 +0100 BST

Ok, yeah, a walkthrough of all the different layers here would save a lot of time. I haven't worked with binding Python to native code, but I'm assuming at this point that "piw" stands for "Python Interface Wrapper"? I'm starting to understand the organization with the .pip files and how SConscript works.

The challenge is that this is a very dense codebase: native code, wrapper code bridging to Python, and an interpreted language, Belcanto, that seems to assemble and wire a lot of components together, so it's difficult to get your bearings. There are almost no comments. I can read code, but I could use some orientation.

Even defining some of the basic naming could help a lot.

I want to understand the bootstrap process and how everything gets wired together, even if it's just a high-level narrative.

I'll post back as I find stuff. thx.

written by: john

Thu, 6 Oct 2011 08:45:08 +0100 BST

If you want the easiest, cleanest way to get OSC data out of an Eigenharp, probably the best thing to do is simply to write an Agent that is an OSC server for the schema you want. An EigenD setup with just that Agent plumbed directly to the instrument input Agent would be very light (it would probably start up in a second or so) and use little memory, and EigenD would take care of the very large pile of difficult and error-prone USB hassles for you. It seems an awful lot of work to try to remake a low-level driver that already exists. It's worth remembering that the weight and complexity of any given Setup scales pretty much with the number of Agents loaded, so tiny setups are actually quite light - the desire to not run the main EigenD process is probably more philosophical than useful here.

To do just that, it's probably best to start with an existing Agent and modify it - something like the MIDI out Agent perhaps. That also contains all the logic you need for data decimation (which you will likely need). The real challenge (and what is holding us up with this) is getting the OSC schema right. The event model of MIDI is basically broken and we have spent a lot of time thinking about this recently. However, if what you want is just MIDI-style data, it's all made a lot easier and you could probably knock something up in a week or so...
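To give a concrete (if toy) idea of what I mean by decimation: you take a high-rate continuous stream and emit at most one value per interval, keeping the latest unsent value around so nothing is silently lost. This is just a sketch in Python with made-up names, nothing like the actual Agent code:

```python
import time

class Decimator:
    """Rate-limit a continuous control stream (pressure, x/y) to at
    most one value per interval. Intermediate values are dropped; the
    most recent unsent value is kept in `pending` so a caller can
    flush it later if the stream goes quiet."""

    def __init__(self, interval_s=0.01, clock=time.monotonic):
        self.interval = interval_s
        self.clock = clock        # injectable, which makes testing easy
        self.last_sent = None     # timestamp of the last emitted value
        self.pending = None       # newest value we haven't emitted yet

    def feed(self, value):
        """Offer a new value; return it if it should be emitted now,
        otherwise remember it and return None."""
        now = self.clock()
        if self.last_sent is None or now - self.last_sent >= self.interval:
            self.last_sent = now
            self.pending = None
            return value
        self.pending = value
        return None
```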


written by: jim

Thu, 6 Oct 2011 10:02:22 +0100 BST

In the hope of kicking off something exciting, I'm going to write an agent today which
will dump raw keyboard information out as OSC.

Geert will be building a simple setup which will just load this agent; it'll be so small
it should load in <2s.

It's our goal to move all the inter-agent communications in EigenD to OSC. I'd like
to use this agent together with a Wiki page to start a reference implementation for
an OSC schema.

I will try and provide copious commentary in the various bits.
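In the meantime, for anyone who wants to experiment: an OSC message carrying nothing but int32 arguments is easy to build with just the Python standard library. The "/key" address and the key/x/y/z layout below are placeholders made up for illustration, not the schema we'll end up with:

```python
import struct

def osc_string(s):
    """Encode an OSC string: NUL-terminated, padded to a 4-byte boundary."""
    b = s.encode() + b'\x00'
    return b + b'\x00' * (-len(b) % 4)

def osc_message(address, *ints):
    """Build an OSC message whose arguments are all big-endian int32."""
    typetags = ',' + 'i' * len(ints)
    return (osc_string(address)
            + osc_string(typetags)
            + struct.pack('>%di' % len(ints), *ints))

# Hypothetical raw key event: key number plus raw x/y/z values.
msg = osc_message('/key', 17, 2048, 1900, 310)
```

Fire the resulting bytes at a UDP socket and anything that speaks OSC (Max included) should pick them up.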


written by: geert

Thu, 6 Oct 2011 11:43:45 +0100 BST

I've posted a bare-bones MIDI only setup for the Alpha together with the Belcanto for creating it from scratch. This should be helpful if you're only using EigenD to send MIDI data to external applications. It loads very quickly.

written by: dhjdhj

Thu, 6 Oct 2011 13:23:07 +0100 BST

ok --- this is starting to get interesting (grin)

If the data comes out as MIDI through EigenD processing, won't it still have the limitation where if one tries to use pitchbend or aftertouch (say) on one key while holding another one down and they're both on the same channel, then the effect will be heard on both notes (because of the way MIDI works)?

This is one of the reasons I proposed to get raw information and process it with Max before sending it out as MIDI. For example, if I'm playing with two hands, a dynamic algorithm could say that if two keys are far enough apart (where 'far' could be defined in different ways, such as the note interval or, more interestingly, the number of physical rows between them), then the two notes should be generated on different channels. This kind of thing is trivial in Max.

I don't think that OSC is necessary or even right for this --- it seems to me that just getting the raw data (key number pressed, X,Y,Z values, etc) as integer lists into Max is sufficient (and both easier and more efficient as well).

At the bottom level, I don't think the drivers should be producing any semantic information beyond what is happening physically on the eigenharp.

An intermediate step that produces OSC from that raw data could be available for systems that need it.

written by: geert

Thu, 6 Oct 2011 13:51:05 +0100 BST

If you set the MIDI converter to poly, it will automatically put new notes on other channels; this also applies to the CC messages that are then sent out for the same key, and to the pitch bend.

This can either be done through the Stage tab, through the GUI windows of the MIDI converter in the global settings or through Belcanto.

written by: dhjdhj

Thu, 6 Oct 2011 14:32:59 +0100 BST

I know that --- but I've been bitten by that issue many times... I don't necessarily want the channel number to be driven by the number of keys being held down.

I can so easily imagine wanting the flexibility to control the MIDI channel by how hard I hit the key (to get a completely different sound no matter where I am physically on the keyboard), or by the relative location of two rows, and/or many other variations.

Floating keygroup splits become trivial to implement as well.

This is stuff that is trivial to do in Max given just the raw key information as input. There are a lot of people (potential customers!) who know how to build such things in Max.
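To illustrate how small the core of this is: round-robin allocation (which is roughly what I understand poly mode to do, though I haven't read EigenD's converter) is a few lines of Python, and every policy I listed above just changes which channel note_on picks. A toy sketch that ignores channel stealing when all sixteen channels are held:

```python
class ChannelAllocator:
    """Give each held note its own MIDI channel so per-channel
    pitch bend and aftertouch effectively behave per-note."""

    def __init__(self, channels=range(1, 17)):
        self.free = list(channels)  # channels available for new notes
        self.held = {}              # note number -> channel

    def note_on(self, note):
        ch = self.free.pop(0)       # least recently used channel first
        self.held[note] = ch
        return ch

    def note_off(self, note):
        ch = self.held.pop(note)
        self.free.append(ch)        # recycle at the back of the queue
        return ch
```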

written by: barnone

Thu, 6 Oct 2011 14:42:30 +0100 BST

Thanks for the responses, guys.

"the desire to not run the main EigenD process is probably more philosophical than useful here. "

Yes I agree completely.

@jim @geert
"Geert will be building a simple setup which will just load this agent; It'll be so small it should load in <2s."

Thank you. The hello world of a simple agent setup that bypasses the MIDI converter: this is what we need.

"At the bottom level, I don't think the drivers should be producing any semantic information beyond what is happening physically on the eigenharp. "

Yes this is exactly my opinion on the matter as well. MIDI has note semantics and it's not clear which exact physical key on the harp was pressed.

I'm also interested in communication via OSC the other way, i.e. addressing the LEDs on the harp. Obviously something has to be listening on a port to receive this inbound OSC. What object in the EigenD universe would be appropriate to house this implementation? I don't want to push my luck though. I appreciate that you guys will give us an agent example.

If you have OSC out, then it's trivial to get the information you want into MAX, right? IMO john is right that leveraging what's already in eigenD is the way to go. The large setups are what are heavy. A small setup would be very lightweight and would run headless with no UI easily.

Anyway, understanding how to load a bare bones setup is extremely useful. Geert, the link you posted is illuminating as well. Thx.

written by: dhjdhj

Thu, 6 Oct 2011 15:00:37 +0100 BST

However, Max knows how to deal with OSC but I don't know what is going to be encoded in the OSC packets.

If the packets will just contain raw data (physical key number, x,y,z data and so forth) then that's fine and my only caveat is that OSC is an expensive way to provide this.
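To be fair to OSC here: if the packets really do contain only int32 arguments, decoding is a handful of fixed-offset reads and one unpack, with no heavy string handling. A rough sketch (it assumes well-formed, int-only messages; error handling omitted):

```python
import struct

def parse_int_message(packet):
    """Parse an OSC message whose arguments are all int32.
    Returns (address, list_of_ints)."""
    def read_string(buf, offset):
        end = buf.index(b'\x00', offset)
        # OSC strings are NUL-padded out to a 4-byte boundary
        return buf[offset:end].decode(), (end + 4) & ~3

    address, pos = read_string(packet, 0)
    typetags, pos = read_string(packet, pos)
    n = len(typetags) - 1  # drop the leading ','
    assert typetags == ',' + 'i' * n, "only int32 arguments handled here"
    return address, list(struct.unpack('>%di' % n, packet[pos:pos + 4 * n]))
```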

However, as I recall, OSC (which I haven't really looked at in years) is also designed to have deeper semantic information and I don't want any of that imposed. I particularly don't want to spend precious CPU cycles decoding OSC messages that have strings in them (e.g. look at the stuff here) as CPU cycles need to be reserved as much as possible both to process the raw data quickly (latency counts) and to allow VSTs/AUs to run.
Once you start dealing with strings, then one has to deal with extra memory management (even if Max is doing it) and that's expensive.

I just think this problem is being made harder than it needs to be. OSC seems like a hammer when the nail could be pushed in with a fingertip.

But I'm excited that at least this kind of thing is being considered.


written by: barnone

Thu, 6 Oct 2011 15:23:53 +0100 BST

I think given the OSC example, you can extrapolate to any communication method you wish.

written by: dhjdhj

Thu, 6 Oct 2011 15:26:26 +0100 BST

There is of course one huge benefit of going forward with OSC in the short term: since Max already understands OSC, there is no need to build a Max External before one can start experimenting.

written by: dhjdhj

Thu, 6 Oct 2011 15:53:43 +0100 BST

If it is the case that EigenD can just act as glue between the hardware and some other processing system (such as Max) without introducing significant timing delays/latency and without imposing any semantics other than reporting what has physically happened (key was pressed, breath was blown), then I agree completely.

However, if it imposes any abstractions that prevent one from knowing the underlying physical behavior, then it shouldn't be there.

The other downside of keeping EigenD (and OSC) is that only apps that support OSC can work with it whereas a simple mechanism that just reports physical information about the Eigenharp allows new apps to be written that don't need to understand the more complicated OSC protocol.


written by: barnone

Thu, 6 Oct 2011 23:31:23 +0100 BST

Getting back to the topic at hand, which is sharing what we find from digging into the code: I'm spending a bit of my lunch hour poking around and, hopefully, learning things.

Was looking at the bootstrap process in eigend a bit.

One nice hint is to run it from the command line by going to ./tmp/bin:
eigend --stdout

[edit] I guess that hint was already mentioned in the Debugging thread

This pipes the log output to stdout and gives you some nice verbose startup information.

I found in ./app_cmdline
and eigend.cpp in ./app_eigend2

The app in ./tmp/bin loads the GUI. Looking through the .cpp, it's calling the eigend backend to bootstrap and load the current setup, etc.

I wanted to run eigend as strictly a command-line app. I wasn't able to run the or .pyc file as it's not finding its dependencies etc.

Any hints on how to best fire up the cmdline version? I guess I could create a new version of eigend.cpp that skips the JUCE GUI and fires up some basics like loading a specific setup.

The ./tmp/bin directory is full of a lot of little command-line utilities. cheatsheet shows the Belcanto reserved words, I guess. There are some interesting things in there like "digeridoo", "strummer" etc.

Some of the apps look like things I don't want to run, especially anything to do with device calibration, firmware loading etc.

plg_simple has the agents. I think I might try to add one myself and fire it up, see what happens.

written by: jim

Fri, 7 Oct 2011 09:59:57 +0100 BST

The command line version is pretty much defunct these days (actually, I thought I'd deleted it)

The best way of starting from the command line is to run

tmp/app/ --stdout --noauto

(The path might be a bit wrong, I'm on a Windows box atm)

That file is actually a link to the tmp/bin file, but Mac apps need to be run inside an App bundle to
function correctly.

You can run this inside gdb or whatever. Running inside the debugger is usually more reliable
than attaching a debugger later.

On windows, you can run tmp/bin/eigend from something like cygwin, or tmp/bin/eigend_con from cmd.
All the gui executables have _con versions which are console subsystem versions.

Attaching visual studio for debugging once eigend is running works well.

written by: barnone

Fri, 7 Oct 2011 20:24:13 +0100 BST

Ok, I have my development environment set up now with these components on Eclipse 3.6.1:

Eclipse CDT
Eclipse PyDev
Eclipse eGit

Don't really need eGit but hey, worth trying out.

I'm very familiar with Eclipse, so it's proving helpful for exploring the code as well.

I'm taking notes and will collect some information to post to the wiki for other developers. Thanks!
