nted" in space (at instance definition creation phase): indicates the obvious fact that if garbage in > garbage out (try it).
2. Load the GH thing. Task for you: Using Named Views locate the points of interest as described further and make a suitable view. That way you can navigate rather easily around (hope dies last).
3. Your attractors are controlled from here:
The slider in blue picks some attractor to play with. You can use this while the K2 is running.
4. Don't change anything here (think of it as a black box: who cares how it works? nobody actually):
6. Enable the other "black box": job done, your real-life stuff is placed:
6. Enable the solver: your "real-life" things start to bounce around:
7. Go there and play with the slider. A different attractor yields another solution:
8. With real-life things in place, if you disable the C# ... they are instantly deleted and you are back to lines/points and the like:
9. Either with instance definitions or Lines/points change ... er ... hmm ... these "simple" parameters and discover the truth out there:
10. Since these are a "few" and they affect the simulation in a variety of ways ... we need a "self calibrating" system: some mini big Brother that does the job for us. Kinda like applying the brakes safely when it rains (I hate ABS mind).
NOTE: the rod with springs requires some additional code (that deals with NESTED instance definitions) in order to (a) bounce as a whole and at the same time (b) elongate or shrink a bit.
More soon.
…
ng/702/30
EDIT: DK2 works, not with positional tracking yet (14/09/15)
Source is here:
https://github.com/provolot/RhinoRift
Steps:
1) Download these files (also attached below):
https://github.com/provolot/oculus-grasshopper/raw/master/oculus-grasshopper_v0.4.ghx
https://github.com/provolot/oculus-grasshopper/raw/master/OpenTrackRiftGrasshopperUDP.ini
https://github.com/provolot/oculus-grasshopper/raw/master/oculus-grasshopper-test_v0.1.3dm
2) Download OpenTrack - http://ananke.laggy.pk/opentrack/, and setup/install. Once installed, double-click to open.
3) In OpenTrack, load the 'OpenTrackRiftGrasshopperUDP.ini' profile. Click the 'Start' button and move your Rift around - make sure that it looks like the Yaw/Pitch/Roll data is being sent. TX/TY/TZ will all be 0, as Oculus doesn't have absolute positioning data.
4) In Rhino, open the test 3dm. You'll notice that there are two viewports - called 'LeftEye' and 'RightEye'. These have been placed to mimic where the screens should be for the Oculus Rift --- but only when Rhino is in fullscreen mode, with the command 'Fullscreen'. The placement needs to be tweaked, but should work.
If you want to use your own model, you can load your own .3dm file in Rhino, then you can right-click on the viewport name, and go to Viewport Layout > Read from File. If you then load my test file, Rhino should open my two viewports, sized correctly, onto your model.
The placement of these viewports needs to be tweaked; if you find a better viewport layout, upload an empty Rhino file with your viewports, and we can share eye-layout 'templates'!
5) In Grasshopper, open the .ghx definition. Everything that is multiple-grouped is a value that can be changed. Two things here:
- IPD: Set this and convert it to the proper units for your model.
- Left/right viewport names. In this case, leave this as-is, since you're using my example file.
6) Turn on the Grasshopper Timer, if it isn't on already.
7) In the GH definition, toggle 'SyncEyes' to be True. Then, in the left viewport, try orbiting around with the mouse. The 'RightEye' viewport should move around as well, pretty much simultaneously.
8) In OpenTrack, click 'Start', then toggle 'ReadUDP' to be True. You should see the 'OpenTrackInfo' panel fill with data that's constantly changing.
9) Move around the landscape with your camera, and when you settle on a starting view that's ideal, click the triangle of the Data Dam component to 'store' the data.
10) Finally, toggle 'OculusMove' to be true. If all works correctly, both viewports should move based on the Rift's movement.
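For anyone curious what's travelling over the wire in steps 3 and 8: OpenTrack's UDP output sends each pose as a single datagram of six little-endian doubles (TX, TY, TZ, yaw, pitch, roll). A minimal listener sketch is below; the port (4242 is OpenTrack's usual default) and the exact packet layout are assumptions, so check them against your OpenTrack output settings:

```python
import socket
import struct

# Assumed packet layout: six little-endian doubles per datagram,
# in the order TX, TY, TZ, yaw, pitch, roll (degrees).
def parse_pose(datagram):
    tx, ty, tz, yaw, pitch, roll = struct.unpack("<6d", datagram)
    return {"yaw": yaw, "pitch": pitch, "roll": roll}

def listen(port=4242):
    # Port 4242 is OpenTrack's default UDP output port (verify in your profile).
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    while True:
        data, _ = sock.recvfrom(64)   # 6 doubles = 48 bytes
        print(parse_pose(data))
```

With the Rift, only yaw/pitch/roll will carry data (as noted in step 3, TX/TY/TZ stay at 0 because there is no absolute positioning).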
Let me know if you have any problems!
Cheers,
Dan…
Added by Dan Taeyoung at 11:47pm on December 10, 2013
sistance of radiative and convective heat transfer through the _filmCoefficient input on the "Create Therm Boundaries" component. This filmCoefficient in W/m2K represents the "U-Value" of the air film between the edge of the THERM materials and the surrounding environment that is at the specified _temperature. The extra resistance from this air film is why the full construction U-Value that you are getting out of THERM is lower than just the (conductivity of material) / (depth of the material). Accounting for air films is particularly important when you get constructions that have a high overall conductivity (like a single pane window), since almost all of the resistance of such a construction is due to the air films.
To elaborate further, you might have noticed that, in the example files on hydra, I set this filmCoefficient to be either "indoor" or "outdoor", which basically uses some code that I wrote to auto-calculate the film coefficient for you. I take into account both the emissivity of the material at the boundary (which gives you more air film resistance for lower emissivities) as well as the orientation of the boundary in the 3D space of the Rhino model. The code I wrote will take these parameters and match them to those published in ASHRAE Fundamentals, which you can see in table 1 of the first page of this PDF:
http://edge.rit.edu/content/C09008/public/2009%20ASHRAE%20Handbook
I interpolate between these values in the event that your emissivity is not 0.05, 0.2, or 0.9, or the orientation of your boundary is not any one of the 5 that they give.
I know that THERM also has the capability to actually run the radiative and convective formulas that you posted, Mauricio, as opposed to just using a single film coefficient to account for all of this resistance. The running of these formulas is particularly useful if the radiant temperature at the boundary is different from the air temperature. However, as long as you are ok with the assumption that the air and radiant temperatures are the same (which is the case for all of the situations that I have encountered), the film coefficient is perfectly sufficient. If anyone ever has need for this capability of running boundary conditions that have different radiant and air temperatures, please post here and I can think of a way to implement it. I rather like the simplicity of the current interface, though, and I think that I will keep it this way until we understand why someone would need separate radiant and air temperatures.
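The arithmetic behind the air-film effect described above can be sketched in a few lines. This is purely illustrative (the film coefficients below are assumed round numbers in the spirit of the ASHRAE table, not values from Honeybee's code), but it shows why a single pane's overall U-value is dominated by the films:

```python
# Illustrative only: overall U-value of a construction including air films.
# Each film coefficient h (W/m2K) contributes a resistance of 1/h (m2K/W);
# each material layer contributes thickness / conductivity.
def overall_u(layers, h_in, h_out):
    """layers: list of (thickness_m, conductivity_W_per_mK) tuples."""
    r_total = 1.0 / h_in + 1.0 / h_out
    for thickness, conductivity in layers:
        r_total += thickness / conductivity
    return 1.0 / r_total

# Single pane of glass, 6 mm, k = 1.0 W/mK.  Film coefficients assumed
# for illustration: h_in ~ 8.3, h_out ~ 25 W/m2K.
u_no_films = 1.0 / 0.006                                  # ~167 W/m2K
u_with_films = overall_u([(0.006, 1.0)], h_in=8.3, h_out=25.0)  # ~6.0 W/m2K
```

The material itself accounts for only 0.006 m2K/W of the ~0.166 m2K/W total here, which is exactly the "almost all of the resistance is due to the air films" point above.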
-Chris…
and export the geometry out to VVVV to render it LIVE! RawRRRR. In this case, the sequencer is Ableton Live, a digital audio workstation and leading industry standard in contemporary music production.
the good news is that VVVV and Ableton Live Lite are both free.
https://www.ableton.com/en/products/live-lite/
i am not trying to use the ipad as a controller for grasshopper. I wanted to work with a timeline (similar to MAYA or Ableton or any other DAW (digital audio workstation)) inside grasshopper in an intuitive way. Currently there is no way of SEQUENCING your definition the way you want to see it, that I know of.
no more cumbersome export/import workflows... i don't need hyperrealistic renderings most of the time. so much time invested in googling the right way to import, export ... mesh settings ... this workflow works for some and not for others ... that workflow works if ... and still you cannot render it live, nor change the sequence of instructions WHILE THE VIDEO is played. and I think no one wants to present the rhinoceros viewport. BUT the vvvv viewport is different. it is used for VJing and many custom audio-visual installations for events, done professionally. you can see an example of how sound and visuals come together in this post, using only VVVV and ableton. http://vvvv.org/documentation/meso-amstel-pulse
I propose a NEW method. make a definition, wire it to ableton, draw in some midi notes, and see it through VVVV LIVE while you sequence the animation the WAY YOU WANT IT TO BE SEEN DURING YOUR PRESENTATION FROM THE BEGINNING. make a whole set of sequences in ableton, go back and change some notes in ableton, and the whole sequence will change RIGHT IN FRONT of you. yes, you can just add some sound anywhere in the process. or take the sound waves (square, saw, whatever) or take the audio and influence geometric parameters using custom patches via vvvv. I cannot even begin to tell you how sophisticated digital audio sound design technology has become in the last ten years... this is just one example, which isn't even that advanced by today's standards in sound design (and the famous producers would say it's not about the tools at all). http://www.youtube.com/watch?v=Iwz32bEgV8o
I just want to point out that grasshopper shares the same interface with VVVV (1998) and Max for Live, a plug-in inside ableton. AudioMulch is yet another one that shares this interface of plugging components into each other and allows users to create their own sound instruments. vvvv is built on vb, i believe.
so current wish list is ...
1) grasshopper receives a sequence of commands from ableton DONE
thanks to sebastian's OSCglue vvvv patch and this one http://vvvv.org/contribution/vvvv-and-grasshopper-demo-with-ghowl-udp
after this is done, it's a matter of trimming and splitting the incoming string.
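The trimming-and-splitting step can be sketched in a few lines. The message format here ("channel/note/velocity", slash-delimited) is a hypothetical example, not what the gHowl/OSCglue patch actually emits, so adjust the delimiter and field order to whatever your vvvv patch sends:

```python
# Hypothetical incoming string format: "channel/note/velocity",
# e.g. "1/60/127".  Adjust to match your actual vvvv/gHowl output.
def parse_message(raw):
    parts = raw.strip().split("/")
    channel, note, velocity = (int(p) for p in parts)
    return channel, note, velocity

channel, note, velocity = parse_message("1/60/127")
```

Inside a GH scripting component the same split would feed three output parameters instead of a tuple.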
2) translate numeric oscillation from ableton to change GH values
the video below shows what the control interface for both values (numbers) and the midi notes looks like.
https://vimeo.com/19743303
3) midi note in = toggle GH component (this one could be tricky)
for this... i am thinking it would be great if... it were possible to make a "midi learn" function in grasshopper where one can DROP IN A COMPONENT LIKE GALAPAGOS OR TIMER and assign the component to a signal in, in this case a midi note. there are 128 midi notes in total (http://www.midimountain.com/midi/midi_note_numbers.html) and this is only for one channel. there are infinite channels in ableton. I usually use 16.
I have already figured out a way to send a string into grasshopper from ableton live. but the problem is how to get grasshopper to listen, not just take it in, and interpret midi note and cc value changes (usually running from 0 to 127) and perform certain actions.
Basically what I am trying to achieve is this: some time passes, then a parameter is set to change from value 0 to 50, for example. then some time passes again, then another parameter becomes "previewed", then baked. I have seen some examples of hoopsnake but I couldn't tell that you can really control the values in a clear x and y graph where x is time and y is the value. but this would be considered a basic feature of modulation and automation in music production. NVM, it's been DONE by Mr Heumann. https://vimeo.com/39730831
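The x = time / y = value automation lane described above boils down to linear interpolation over a sorted list of breakpoints. A minimal sketch of that idea (nothing here is from Heumann's definition, just the DAW-style envelope concept):

```python
# A minimal automation envelope: breakpoints are (time, value) pairs.
# The value at any time t is linearly interpolated between neighbours,
# exactly like an automation lane in a DAW.
def envelope(breakpoints, t):
    pts = sorted(breakpoints)
    if t <= pts[0][0]:
        return pts[0][1]          # hold first value before the curve starts
    if t >= pts[-1][0]:
        return pts[-1][1]         # hold last value after the curve ends
    for (t0, v0), (t1, v1) in zip(pts, pts[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# Hold 0 for 2 seconds, then ramp from 0 to 50 over the next 2 seconds.
curve = [(0.0, 0.0), (2.0, 0.0), (4.0, 50.0)]
# envelope(curve, 3.0) -> 25.0
```

Driven by a GH Timer, sampling such an envelope each tick gives the "some time passes, then a parameter changes from 0 to 50" behaviour.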
4) send points, lines, surfaces and meshes back out to VVVV
5) render it using VVVV and play with the enormous collection of components in VVVV... it's been around since 1998, for the sake of awesomeness.
this kind of digital operation-hardware connection is usually what's done in digital music production solutions. I did look into midi controller - grasshopper work, and I know it's been done, but that has the obvious limitation of not being precise: it only takes values from 0 to 127. I am thinking that midi can be useful for this because then I can program a very precise and complex sequence with ease from music production software like ableton live.
This is an ongoing design research for a performative exhibition due in Bochum, Germany, this January. I will post definition if I get somewhere. A good place to start for me is the nesting sliders by Monique . http://www.grasshopper3d.com/forum/topics/nesting-sliders
…
The first is that XML requires that there be one root tag, whereas GH does not necessarily require there be one trunk to build its tree. In more GH-specific terms, any XML document would always require that a path be {0;....}, and there could never be a path like {1;...} or {13;....}. The fact that GH rarely does this is inconsequential; because it's possible, it has to be dealt with somehow. The first option is to simply have the component throw an error and not write the XML. That's fine, but in order to use the data you'd have to start remapping or splitting out your tree... not fun. The second option would be to wrap the data in a root tag so that the XML requirements are met; however, this is manipulating the data that you're saving, which I don't like. The third option would be to write each root trunk out into separate files. The easiest option is 2, but I don't like modifying the data, although it would be possible to put an attribute on that root tag to discard it if it was later read back into GH.
The second issue is that in order to write comprehensible XML, you'd need to specify the element names along with the values, which has the potential to be very cumbersome. In your case, you have all your tag names there and are just looking to modify the data that was in those tags, but this is definitely the best case scenario. If the ability is there to write xml, then that means it can come from anywhere in GH, not just from a previously parsed xml doc. Assuming one would go through the hassle of creating names for all their xml elements, inevitably there would be a path that would be missed, so what would you use as the name for that element? Throw an error? Write out something else based on the path? Taking this to the extreme, would it be possible for someone to just supply data with no element names whatsoever?
In thinking about that last question, it would be nice if the GH path could be used for writing out the element names, but unfortunately both curly brackets and semicolons are invalid in element names (ie so <{0;1;0;5}>SomeData</{0;1;0;5}> is invalid because of the {,}, and ; characters. Element names also can't start with a number). So in order to write out an actual GH path, it would need to be transformed in some manner so that the previous example might look something like this <GhPath_0-1-0-5>SomeData</GhPath_0-1-0-5>
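The transformation described in that last paragraph is simple to make round-trippable. A sketch of the encoding (the `GhPath_` prefix is the example's own convention, not an existing GH feature):

```python
import re

# Encode a GH path like "{0;1;0;5}" as a valid XML element name, and back.
# "{", "}", and ";" are illegal in element names, and names can't start
# with a digit, hence the "GhPath_" prefix.
def path_to_element(path):
    return "GhPath_" + "-".join(re.findall(r"\d+", path))

def element_to_path(name):
    return "{" + ";".join(name[len("GhPath_"):].split("-")) + "}"

# path_to_element("{0;1;0;5}")      -> "GhPath_0-1-0-5"
# element_to_path("GhPath_0-1-0-5") -> "{0;1;0;5}"
```

Because the mapping is reversible, a reader component could reconstruct the original tree structure from the element names alone.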
There are a bunch of other potential issues as well that would likely leave writing xml from GH in somewhat limited state. Some of those include enforcing/validating schema and supporting for other xml features(such as CData).
Ultimately, when I first wrote the xml parser I wasn't sure what people's needs were going to be regarding using XML in GH. Based on the two main issues I outlined above, I chose to leave out writing xml for the time being. It's not that it can't be done (far from it, actually); some decisions just have to be made pertaining to those issues. It does seem like it would be something useful, so I'll begin to look into adding it, and if you've got some other suggestions or preferences about which solution would be better, then sound off.
EDIT:
One thing I should note: all of my comments above pertain to saving xml. Once you've parsed an xml file, it's no longer xml; it's just regular data within GH which can be manipulated however you prefer. It would be entirely possible to use GH's tree manipulation and list tools to move chunks of xml around, remap xml, etc.…
esent Digital Process: Generative Design Technologies Workshop; a specialized workshop that will take place in 4 of the most important cities of Mexico [Puebla] [Mexico DF] [Guadalajara] [Leon] in January and February 2012. http://gendesigntech.wordpress.com/
Aimed primarily at architects, industrial designers, interior designers, urban planners, digital artists, students, and design professionals, this workshop's objective is to give participants the knowledge and technological resources that let them develop the elements of a project from conception through to its complete application. Supported by a powerful and flexible set of platforms, participants will learn to generate, analyze, and rationalize complex morphologies, free organic forms, and advanced computational algorithms, as well as to produce photorealistic visualizations applicable to diverse design projects. Over 5 days of intense work, exploration, and feedback, participants will be guided in developing a more dynamic workflow that allows them to exploit the full potential of the tools and strengthen their skills, aptitudes, and capacities. Instructors: Leonardo Nuevo Arenas [Complex Geometry] José Eduardo Sánchez [DesignNest] Daniel Camiro/Luis de la Parra [Chido Studio] http://issuu.com/chidostudiodiseno/docs/digprowork See the program here: http://gendesigntech.wordpress.com/program/ To register please visit: http://gendesigntech.wordpress.com/registro
…
he last nights, let me try to describe it: - disclaimer: I'm an industrial designer; my coding experience can be compared to yours when you were 4 years old :) - disclaimer 2: I made a picture at the end of the post that maybe explains more than my words
the component has 2 inputs (Start Value, End Value) and one output (Picked Value)
this phantom component (which I would refer to as a "dynamic value picker") supports any number of domains on every input -> it works as if they come grafted, from a "longest list" component
The component "at rest" shows only one slider -with question marks on both edges-
For every couple of inputs you connect (1 Start Value connection + 1 End Value connection) it would visually generate a new slider (exactly like a "number slider" component). The main difference from the "number slider" component: this one would show the Start Value and End Value numbers at the edges of each slider thus generated
Right click -> edit on it would recall a window similar to the "number slider" one, with the main difference that only the first part of those options would be present (see attached image for clarity). Whatever slider accuracy you set will affect the whole "dynamic value picker" phantom component (if you set "integer numbers" and for any reason one or more inputs are "floating point numbers", the component automatically rounds the inputs to the nearest integer, and only lets you pick integer numbers in between)
If you suddenly change a "Start Value" or an "End Value" input, the affected slider/sliders in the component will try to stay as close as possible to the same % value they were before (example if the domain was from 5 to 11, integers only, and you first picked the value 8, the slider was exactly in position 50%: when you change the End Value domain to 21 the slider will set itself to 13 - yes, I picked an easy one lol )
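The "stay as close as possible to the same %" behaviour is just a percentage remap between the old and new domains. A sketch of it, using the example above (this is imagined behaviour for the wished-for component, not existing GH code):

```python
# When a domain edge changes, keep the picked value at the same relative
# position (percentage) within the new domain.
def remap_pick(old_start, old_end, old_value, new_start, new_end, integer=True):
    t = (old_value - old_start) / float(old_end - old_start)  # position in [0, 1]
    new_value = new_start + t * (new_end - new_start)
    return int(round(new_value)) if integer else new_value

# Domain 5..11 with 8 picked is exactly 50%; changing the End Value to 21
# re-seats the slider at 13.
# remap_pick(5, 11, 8, 5, 21) -> 13
```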
When you first plug a couple of Start Value + End Value, the slider sets itself to Picked Value = Start Value
It could also be possible to supply negative values as End Value and positive values as Start Value: the slider lets you pick a number in that domain regardless of the numerical order you use
Last thing, but it's just fancy imagination, if you zoom-in the output (Picked Value) connection dot, a little - and + appears (like in other common components), letting you add a new cursor to every existing slider (it could be possible to customize the color of the new cursor to avoid confusion)
This is the exact description of what I would ask to the lamp genie :)
I attach a pic I just did, in the hope to better explain myself: picture link
and of course thank you again for reading this long poem!
…
rring to the above image)
Symbol | Unit | Description
A      | cm2  | Area of Section
Ay     | cm2  | effective Vy shear Area
Az     | cm2  | effective Vz shear Area
Iy     | cm4  | Second Moment of Area (strong axis)
Wy     | cm3  | Elastic Modulus, upper (strong axis)
Wy     | cm3  | Elastic Modulus, lower (strong axis)
Wply   | cm3  | Plastic Modulus (strong axis)
i_y    | cm   | Radius of Gyration (strong axis)
Iz     | cm4  | Second Moment of Area (weak axis)
Wz     | cm3  | Elastic Modulus (weak axis)
Wplz   | cm3  | Plastic Modulus (weak axis)
i_z    | cm   | Radius of Gyration (weak axis)
I have a very similar table which I could import into the Karamba table. But I also have i_v and i_u values, i.e. the radius of gyration about those axes, for instance.
Dimension | Mass [kg/m] | Area [mm2] | Ix (x-x) [mm4] | Wpx (x-x) [mm3] | ix (x-x) [mm] | Iy (y-y) [mm4] | Wpy (y-y) [mm3] | iy (y-y) [mm] | Iv (v-v) [mm4] | Wpv (v-v) [mm3] | iv (v-v) [mm] | Width [mm] | Thickness [mm] | Radius R [mm]
L 20x3 | 0.89 | 113 | 4,000  | 290   | 5.9  | 4,000  | 290   | 5.9  | 1,700  | 200   | 3.9 | 20 | 3 | 4
L 20x4 | 1.15 | 146 | 5,000  | 360   | 5.8  | 5,000  | 360   | 5.8  | 2,200  | 240   | 3.8 | 20 | 4 | 4
L 25x3 | 1.12 | 143 | 8,200  | 460   | 7.6  | 8,200  | 460   | 7.6  | 3,400  | 330   | 4.9 | 25 | 3 | 4
L 25x4 | 1.46 | 186 | 10,300 | 590   | 7.4  | 10,300 | 590   | 7.4  | 4,300  | 400   | 4.8 | 25 | 4 | 4
L 30x3 | 1.37 | 175 | 14,600 | 680   | 9.1  | 14,600 | 680   | 9.1  | 6,100  | 510   | 5.9 | 30 | 3 | 5
L 30x4 | 1.79 | 228 | 18,400 | 870   | 9.0  | 18,400 | 870   | 9.0  | 7,700  | 620   | 5.8 | 30 | 4 | 5
L 36x3 | 1.66 | 211 | 25,800 | 990   | 11.1 | 25,800 | 990   | 11.1 | 10,700 | 760   | 7.1 | 36 | 3 | 5
L 36x4 | 2.16 | 276 | 32,900 | 1,280 | 10.9 | 32,900 | 1,280 | 10.9 | 13,700 | 930   | 7.0 | 36 | 4 | 5
L 36x5 | 2.65 | 338 | 39,500 | 1,560 | 10.8 | 39,500 | 1,560 | 10.8 | 16,500 | 1,090 | 7.0 | 36 | 5 | 5
I have diagonals (bracings) which can buckle in these "non-regular" directions too, and they do. If I could add those values, then in the Karamba model I could assign specific buckling scenarios. I can see another challenge at the ModifyElement component: I will not be able to choose these buckling lengths in these directions.
Do you think this functionality can be added soon, or should I try to find another way to model these members?
Br, Balazs
…
es of the mesh faces as points. Is the component HingePoints reordering the points in any particular manner, i.e. point 1 and point 2 being the opposite corners of the quad faces?
1. Solution exception:Cannot marshal 'parameter #5': Invalid managed/unmanaged type combination (Arrays can only be marshaled as LPArray, ByValArray, or SafeArray).
I have attached the script and a couple of snapshots showing the red component inside the origami cluster.
Aside from this error, I had also noticed that the two last addition components which provide the RestAngle and Strength values to the Kangaroo Physics component were losing the '0' value entered as 'A' (they were getting Null instead). I have replaced the manual entry with a panel entry. However, I still cannot get the Kangaroo Physics component to run the simulation (or the forces might be somehow equal to the initial ones, and hence not getting the system to move in any way).
I am out of ideas as to why this might be happening. I had worked with Kangaroo before and really liked it. This origami component is really promising and I would really like to get it to work on my machine.
The grasshopper version I am using is: 0.9.064. Kangaroo is 00.096.
I would be very grateful if you could have a look into it and let me know your thoughts.
Thanks for your help!
Best
Maria
…
es which you can see below in my mesh repair report I ran on the mesh after baking it.
This is a bad mesh.
Here is what is wrong with this mesh:
- Mesh has 2 non manifold edges. <<------ because of the duplicate face
- Mesh has 1 duplicate face.
- Skipping face direction check because of positive non manifold edge count.
General information about this mesh:
- Mesh does not have any degenerate faces.
- Mesh does not have any extremely short edges.
- Mesh does not have any naked edges.
- Mesh does not have any self intersecting faces.
- Mesh does not have any disjoint pieces.
- Mesh does not have any unused vertices.
Continuing the repair process does rid the duplicate face. I also realized that rhino 5 has the command called "ExtractDuplicateMeshFaces" which works quite nicely.
However, this "method" is not available in rhinocommon currently. So I just wonder who can I ask to add it? It seems it would make sense to be in rhinocommon considering we have methods available for each of the other tests run by mesh repair. The reason I am interested in this command is it seems to work very fast.
Thanks
…
Added by Michael Pryor at 12:53am on December 11, 2014