Grasshopper

algorithmic modeling for Rhino

Hi!

 

I've been working on a really heavy model in GH for two months. In that process, I've sometimes needed to bake geometry at an intermediate step, reference it back into GH, and then disable the first components in my definition so that processing power goes to the next stage.
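
To illustrate, here is a rough GhPython sketch of the bake step I currently do by hand; the layer name "GH_intermediate" and the list input x are just placeholders:

    # GhPython component: bake whatever is plugged into input "x" onto a Rhino layer,
    # so a later definition (or the next stage) can reference it from that layer.
    import Rhino
    import scriptcontext as sc
    import rhinoscriptsyntax as rs

    LAYER = "GH_intermediate"                    # placeholder layer name

    baked_ids = []
    if x:                                        # "x" = list input holding the geometry to bake
        sc.doc = Rhino.RhinoDoc.ActiveDoc        # switch from the GH document to the Rhino document
        try:
            if not rs.IsLayer(LAYER):
                rs.AddLayer(LAYER)
            for geo in x:
                obj_id = sc.doc.Objects.Add(geo)
                rs.ObjectLayer(obj_id, LAYER)
                baked_ids.append(obj_id)
        finally:
            sc.doc = ghdoc                       # always restore the GH context
    a = baked_ids                                # new object ids, for referencing later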

 

Would it be possible to add a button or option to bake + reference geometry or data at some point in the definition? (Layer and grouping options would be great, but not a priority.)

 

Best Regards.

 

Ángel Linares


Replies to This Discussion

 

Yes. VRay's Proxies are interesting. I think the hack is that it works directly with the GPU and sends the info straight to the graphics pipeline, and so is able to handle much larger scenes. Optimised for many parallel pipelines?

 

The whole idea of breaking up the script into semi-independent segments also raises the question of parallel or multiprocessing.

 

If a cluster or component can be processed independently, it should be able to leverage multiple cores and the GPU, and better utilise the extra memory that comes with 64-bit?

 

It may take more memory and run as separate processes, but it would make better use of today's hardware?

 

The other question is whether worthwhile measures, like making things 'volatile', can be automated. I guess it's an LOD question. CityEngine allows transactions to be tagged with an LOD level attribute. The user has the option to define this, and the script will ignore or disable the appropriate transactions.

 

Going back to VRay, I think it also automatically 'culls' or selects the appropriate instance/version of the component based on the view distance. Objects that are further away are of course displayed using the less detailed instances. I wonder if this can be applied to a dependency-graph app like GH?
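
As a toy sketch (not how VRay actually does it), distance-based selection of a level of detail is essentially just a threshold lookup; the distances below are made up:

    # Toy distance-based LOD pick: choose the least detailed version that is
    # acceptable for the current view distance (thresholds are invented).
    def pick_lod(distance, lod_versions, thresholds=(10.0, 50.0, 200.0)):
        """Return the appropriate version; lod_versions is ordered from most to least detailed."""
        for level, limit in enumerate(thresholds):
            if distance <= limit:
                return lod_versions[min(level, len(lod_versions) - 1)]
        return lod_versions[-1]    # beyond the last threshold: coarsest version / proxy

    print(pick_lod(35.0, ["high", "mid", "low", "proxy"]))   # -> "mid"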

 

 

Not sure how the baked geometry lives on in your project. Can you expand?

 

In CATIA, the user is encouraged to think separately in terms of specification (parametrics) and brep (resultant or baked geometry). The user can reference 'brep' geometry as input or as part of an ordered geometry set / powercopy, but it is recommended that this geometry be 'features' and not raw.

 

This is not always possible, for example, when we need to use a subelement like an edge or vertex of an intersection or extrude operation.

 

In any case, CATIA seems to keep both live. There isn't such a big divide between pre- and post-baked geometry. The spec info is different and, being parametric, easier to use as part of a dependency graph.

 

Guessing:

 

If you can already bake some elements, it sounds like you should be able to just bake and skip over / suppress those portions of the graph? Not sure if there is a component for this. Part of User Objects?
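
As far as I know there is no stock component for exactly this, but a hedged GhPython sketch of 'suppressing' the upstream portion, by locking every component inside a named group, could look like the following; the group nickname "stage1" is an assumption:

    # GhPython: lock (disable) every component inside a Grasshopper group whose
    # nickname matches GROUP_NAME, so the finished "stage 1" stops recomputing.
    import Grasshopper

    GROUP_NAME = "stage1"    # assumed nickname of the group that holds the finished stage

    gh_document = ghenv.Component.OnPingDocument()
    for obj in gh_document.Objects:
        if isinstance(obj, Grasshopper.Kernel.Special.GH_Group) and obj.NickName == GROUP_NAME:
            for member in obj.Objects():
                if isinstance(member, Grasshopper.Kernel.IGH_ActiveObject):
                    member.Locked = True   # same effect as right-click > Disable
    # note: changing component state during a solution is touchy; in practice you may
    # want to schedule the change (e.g. via gh_document.ScheduleSolution) instead of doing it inline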

 

It's more than likely that you would still need access to some of the parametric data associated with the baked portion of the graph, without the overhead of having everything live? Maybe all you need is to also 'bake' and attach some tagged arrays that can be read as plain old data downstream. Or, better yet, a self-describing object or datatype (with metadata?) that can be used to support manual edits downstream. Repeatability would mean that some "indexing", hopefully "semantic", is required. I guess this comes with user data?
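
One low-tech way to carry such tags today is user text on the baked objects; a sketch, assuming the object ids from an earlier bake and some made-up keys:

    # GhPython sketch: attach plain key/value metadata to baked Rhino objects
    # so downstream definitions (or manual edits) can read it back later.
    import Rhino
    import scriptcontext as sc
    import rhinoscriptsyntax as rs

    sc.doc = Rhino.RhinoDoc.ActiveDoc
    try:
        for i, obj_id in enumerate(baked_ids):            # ids from a previous bake (assumed to exist)
            rs.SetUserText(obj_id, "gh_branch", str(i))   # made-up keys, purely illustrative
            rs.SetUserText(obj_id, "gh_stage", "stage1")
        # downstream, a definition or script can read the tag back:
        # value = rs.GetUserText(obj_id, "gh_branch")
    finally:
        sc.doc = ghdoc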

 


Ok, so here's what I came up with in one day. A new object called Geometry Cache; it looks like this, depending on how it's connected:

 

 

I do my best to maintain as much information as possible, so if you Bake, then change the layer of some of the objects, then bake again, the new objects should end up in the same layer.

 

All baking and deletion operations go into the Rhino undo buffer so (assuming the undo buffer is not getting flushed) you should always be able to revert to a previous state.
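
For anyone scripting their own bake rather than using the Geometry Cache, the same safety net is available through RhinoCommon's undo records; a minimal sketch, assuming a geometry_list that already holds the items to bake:

    # Wrap a scripted bake in a named Rhino undo record, so a single Ctrl+Z
    # in Rhino removes the whole batch of baked objects again.
    import Rhino

    doc = Rhino.RhinoDoc.ActiveDoc
    undo_sn = doc.BeginUndoRecord("Scripted bake")   # returns a serial number for this record
    try:
        for geo in geometry_list:                    # geometry_list is assumed to hold the items to bake
            doc.Objects.Add(geo)
    finally:
        doc.EndUndoRecord(undo_sn)                   # closes the record started above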

 

Data tree layout is maintained, and when geometry is missing from the Rhino document, nulls will be inserted instead. The only problem so far is that if ALL geometry in a certain list is missing, it might go wrong.
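
That behaviour, keeping the tree shape and padding the gaps with nulls, can be mimicked in a GhPython sketch roughly like this (the branch contents are placeholders):

    # GhPython sketch: rebuild a DataTree, inserting None (null) wherever an item
    # could not be found in the document, so the original layout survives.
    from Grasshopper import DataTree
    from Grasshopper.Kernel.Data import GH_Path

    # placeholder input: branch path -> items retrieved from the document, None where missing
    branches = {(0,): ["crv_a", None, "crv_c"],
                (1,): [None, "crv_e"]}

    tree = DataTree[object]()
    for path, items in branches.items():
        gh_path = GH_Path(*path)
        for item in items:
            tree.Add(item, gh_path)    # None stays in place, so indices and layout survive
    a = tree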

 

The word "circles" is something you have to pick yourself. By default the Cache Object has no name and is therefore non-active.

 

--

David Rutten

david@mcneel.com

Poprad, Slovakia

Bravo!

Great!!!

 

Man, I don't know how to be thankful enough for this component :) All heavy definitions are going to be easy to deal with. The workflow between Rhino and GH is now "automatic" and closed.

 

Best Regards and thank you again :)

 

 

 

 

This discussion starts me wondering about the many ways 'Geometry Cache' might be used. I would appreciate anyone responding to this discussion with interesting ways they have made this special component useful in a definition. It would also be interesting to hear from anyone who found that the use of 'Geometry Cache' created any practical problems downstream.

Thanks,

Stan

I've used it in a couple of projects. One needed to bake 5 different groups of objects generated with Grasshopper several times, edit them in Rhino, and then re-reference them into GH. This was a small housing project.

 

The other was a special project in which I used it to clean up a complex generative definition by dividing it into two definitions: first the generation (about 10 seconds to compute), then baking, referencing the result again, and disabling the first part. I used this definition in the first drafts of an urban park project.

 

Hope this helps you a little bit.

 

Best Regards.

 

Seems like the Geometry Cache has developed in very promising ways!

 

Question: Can the Geometry Cache (GC?) be combined with Clusters?

 

One problem I notice in the office is that a lot of competition teams do not use GH or any other parametric tool because of time limits, unless someone on the team is really, really quick and confident. And even then, he would be a bottleneck because he would be scripting in isolation.

 

Can GC be extended to cache a 'cluster' of nodes, with/without the associated geometry? This would allow another team member to work on that portion of the 'script' concurrently.

 

This would be like a visual equivalent of 'include' statements in programming, where a script can nest/include other external scripts..... very much the way MCAD apps can assemble part models in an assembly. Parameters + algorithms in one part are accessible in another, even though they are stored separately.

 

For example, one scripter could be working on a high-level rig or 'skeleton' script. In a building project, this could be a 3D grid frame. Another teammate may be working on cladding panels or structural framing. Both would 'reference' the skeleton script/3dm file via GC.

 

There would be the option to read in and use the 'skeleton' as reference geometry, which would be tagged with appropriate info like 'floor' and 'grid' numbers by the external script. There may even be an array/datatree of nodes or named points provided that define the grid intersections.
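
As a sketch of what that shared 'skeleton' data might look like on the GH side, here is a hypothetical DataTree of grid-intersection points, one branch per floor, with made-up dimensions:

    # GhPython sketch: a DataTree of grid intersection points, one branch per floor,
    # which a 'cladding' or 'framing' definition could reference from the skeleton script.
    import Rhino.Geometry as rg
    from Grasshopper import DataTree
    from Grasshopper.Kernel.Data import GH_Path

    FLOORS, X_BAYS, Y_BAYS = 3, 4, 5       # made-up grid dimensions
    FLOOR_HEIGHT, BAY = 3.5, 6.0           # made-up spacing

    grid = DataTree[rg.Point3d]()
    for floor in range(FLOORS):
        path = GH_Path(floor)              # branch index doubles as the floor number
        for i in range(X_BAYS + 1):
            for j in range(Y_BAYS + 1):
                grid.Add(rg.Point3d(i * BAY, j * BAY, floor * FLOOR_HEIGHT), path)
    a = grid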

 

Or the script could just be read in and merged into the 'host' script. I guess there will need to be some 'type checking' to ensure the graph doesn't choke.

 

There would be a 'master composition' script/3dm where all the 'part' scripts would be assembled, and GC can already reference the 'dumb' parts of the project that were not worth doing in GH.

 

Relational Modeling, GH-style?

 

 

 

 

Dominic, the future of modeling, I think, is about skeletal modeling. Practices around this now seem to be emerging.

Sounds intriguing..... Any examples of this?

 

This is an overall trend that I am observing. I view the natural design process as the most perfect design process, and the Euclidean process as deeply flawed.

 

The big breakthrough of Grasshopper is that it has liberated architects from the older Euclidean concept of form. Programs have been authoring forms since the beginning of CAD, but Grasshopper made designers comfortable with what was previously happening unseen.


So this is the big shift happening now, but the important lesson from nature is that butterflies are not born as mini butterflies; there is a process that takes an early-stage conception that looks very different from the final-stage result. As design activities get executed by programs, I believe a similar transformation will take place.

But don't ask me how?

 

But don't ask me how?

Maybe, but at some level the DNA needs to be 're-combined', 'spliced' or otherwise modified in terms of what it expresses.... in Euclidean space or some other frame of reference.
