
All,

I'm going to start with an abstract of my project, just for feedback. (See the attached illustration.)

I am currently designing an installation that works like a glorified MIDI controller, driving an interactive exhibit of sounds and visual effects. It will occupy a small existing space or a purpose-built room. The primary sensors will be a 4x4 grid of piezos embedded in floor modules, emulating a common MIDI control surface. That signal will generate MIDI notes, and will also manipulate geometry in Grasshopper to be projected. I know there are other platforms for this kind of audio-video performance, but I would like to see how far I can get in Grasshopper.
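
To make the sensor side concrete, here is a rough sketch of the kind of Arduino code I have in mind for scanning the grid. This is only an illustration, assuming an Arduino Mega (16 analog inputs, one per pad); the pin mapping, threshold, retrigger window, and note numbers are placeholders, not my calibrated values:

// Rough sketch: scan a 4x4 piezo grid on an Arduino Mega and emit
// MIDI note-on messages over USB serial (a serial-to-MIDI bridge on
// the PC turns the bytes into MIDI). All constants are placeholders.

const int NUM_PADS = 16;                 // 4x4 grid, one analog pin per pad
const int THRESHOLD = 100;               // hit detection threshold (0-1023)
const unsigned long RETRIGGER_MS = 50;   // ignore re-hits within this window

const byte NOTES[NUM_PADS] = {           // placeholder note map, C2 upward
  36, 37, 38, 39, 40, 41, 42, 43,
  44, 45, 46, 47, 48, 49, 50, 51
};

unsigned long lastHit[NUM_PADS];         // time of last trigger per pad

void setup() {
  Serial.begin(115200);                  // USB serial to the PC
}

void loop() {
  for (int i = 0; i < NUM_PADS; i++) {
    int level = analogRead(A0 + i);      // A0..A15 on a Mega
    if (level > THRESHOLD && millis() - lastHit[i] > RETRIGGER_MS) {
      lastHit[i] = millis();
      byte velocity = map(level, THRESHOLD, 1023, 1, 127);
      noteOn(0, NOTES[i], velocity);
    }
  }
}

void noteOn(byte channel, byte note, byte velocity) {
  Serial.write(0x90 | channel);          // note-on status byte
  Serial.write(note);
  Serial.write(velocity);
}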

I have succeeded in calibrating the floor pads to send MIDI messages to a program of my choice (I'm working with Live). At this point there are a couple of options for how to integrate all these operations into Grasshopper, and I wanted to throw this to the community to see if anyone has some early guidance.

1. I assume I am going to have to write the MIDI transmit code into the Firefly Firmata sketch. My question is: will I still be able to drive MIDI into Ableton Live via USB while I read the same data into Grasshopper via USB, or will my COM ports be hung up?
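
My understanding so far (which I'd love to have confirmed) is that only one application can open a given COM port at a time, so if Firefly holds the USB serial port for Grasshopper, Live cannot read MIDI from that same port. One workaround I'm considering is to keep Firmata on the USB port and push the MIDI bytes out a second hardware UART instead. A fragment of what that might look like, assuming an Arduino Mega with a MIDI DIN output circuit on Serial1 (the baud rates are assumptions):

// Two independent serial streams, so Firefly and Live never
// compete for the same COM port. Wiring and bauds are assumed.

void setup() {
  Serial.begin(115200);   // USB serial: Firmata data for Firefly/Grasshopper
  Serial1.begin(31250);   // hardware UART: standard MIDI baud, to a DIN jack
}

void sendNoteOn(byte channel, byte note, byte velocity) {
  Serial1.write(0x90 | channel);   // MIDI leaves on Serial1 only,
  Serial1.write(note);             // keeping Serial free for Firmata
  Serial1.write(velocity);
}

(A software serial-port splitter on the PC side might also work, but I haven't tested one.)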

2. Assuming I can independently achieve the sounds I want in Live and the video I want in GH, what seems to be the best course for achieving both simultaneously?
