Dynamically link or mass-internalize .ghcluster references

I can use linked .ghcluster files across multiple .gh/.ghx projects on my own computer, but I don't see a good way for my co-workers to open those same .gh/.ghx projects without dragging in each of the .ghclusters by hand and re-wiring all of the I/O.

I see two possible ways to use clusters as libraries across multiple computers without referencing a server or network location (which, as I understand it, causes GH to freeze temporarily).

We use Google Drive, so we each have local copies of all of the files. However, our local GDrive folders live in different locations, since several of us run GH/Rhino in Windows virtual machines. So, right now, I don't see a way to preserve the links to .ghclusters across different machines.

Is there a way to dynamically link the .ghclusters, so that the .gh project looks for .ghclusters within the same folder as the .gh file?
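For what it's worth, here's roughly what I have in mind, as an untested C# sketch (the nickname-matches-filename convention is my own assumption, and I haven't verified that CreateFromFilePath can re-target an already-placed cluster):

```csharp
// Untested sketch: re-link every cluster to a same-named .ghcluster sitting
// next to the open .gh file. Run from a C# script component, where the
// GrasshopperDocument field is provided by the component template.
using System.IO;
using System.Linq;
using Grasshopper.Kernel.Special;

// Requires the definition to have been saved, so it has a folder on disk.
if (!string.IsNullOrEmpty(GrasshopperDocument.FilePath))
{
  string folder = Path.GetDirectoryName(GrasshopperDocument.FilePath);
  foreach (GH_Cluster cluster in GrasshopperDocument.Objects.OfType<GH_Cluster>())
  {
    // Assumption: the cluster's nickname matches its .ghcluster file name.
    string local = Path.Combine(folder, cluster.NickName + ".ghcluster");
    if (File.Exists(local))
      cluster.CreateFromFilePath(local); // point the reference at the local copy
  }
}
```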

Alternatively, is there a way to save a copy of a .gh project in which all of the linked .ghclusters are internalized, without having to internalize dozens of individual clusters by hand in each .gh project? (i.e., use each .ghcluster like a static library?)

We've had great success sharing C# scripts as .gha libraries for use in these same .gh projects, but many of the clusters I'm looking to export and reference use GH visualization features that we'd rather not have to re-script from scratch.

Replies to This Discussion

So I had to do some evil and unholy things to get this to work, but try dropping this script into your definition and setting "run" to true - it should internalize all the referenced clusters in your document. It probably won't do anything for nested clusters, though...
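In outline, the script does something like this (a rough sketch, not the attachment itself; the private field name below is a guess - that's the evil and unholy part - so check GH_Cluster with a decompiler before relying on it):

```csharp
// Rough sketch of the internalizer: blank out each cluster's file reference
// via reflection, so GH treats the contents as internal. "m_filePath" is a
// hypothetical field name - find the real one with a decompiler, and expect
// it to change between Grasshopper versions.
using System.Linq;
using System.Reflection;
using Grasshopper.Kernel.Special;

private void RunScript(bool run, ref object A)
{
  if (!run) return;

  FieldInfo field = typeof(GH_Cluster).GetField(
    "m_filePath",                                   // hypothetical name
    BindingFlags.Instance | BindingFlags.NonPublic);
  if (field == null) { A = "Field not found; check the name."; return; }

  int count = 0;
  foreach (var cluster in GrasshopperDocument.Objects.OfType<GH_Cluster>())
  {
    if (string.IsNullOrEmpty(field.GetValue(cluster) as string))
      continue;                                     // already internalized
    field.SetValue(cluster, string.Empty);          // forget the .ghcluster link
    cluster.ExpireSolution(false);
    count++;
  }
  A = count + " cluster reference(s) internalized.";
}
```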

Nice. Isn't that an idea for some kind of plugin? One could imagine a .gha which dynamically loads other dependencies (whatever those are) when invoked by the GH setup/loading process.

Some time ago I made a Rhino plugin (never really used it much, but hey...) which copied the entire folder structure, with all the files inside, from a network location to the GH Libraries folder. That way you skip the CAS (Code Access Security) restrictions on assemblies loaded from network locations. The plugin has a single command, "update libs", which, depending on the plugin setup, loads either the release or the debug versions.

I'm writing this as an example of solving the dependency problem in GH... given enough interest, and more information about how GH2 will solve this problem, I think I could set up a loose GitHub repo.
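The core of it was nothing fancy - roughly this (the share path and command name here are placeholders from memory):

```csharp
// Sketch of the "update libs" command: mirror a network folder into the local
// Grasshopper Libraries folder, so assemblies load from a local path and the
// CAS network-location restrictions never kick in.
using System;
using System.IO;
using Rhino;
using Rhino.Commands;

public class UpdateLibsCommand : Command
{
  public override string EnglishName
  {
    get { return "UpdateLibs"; }
  }

  protected override Result RunCommand(RhinoDoc doc, RunMode mode)
  {
    string source = @"\\server\gh-libs"; // placeholder network share
    string target = Path.Combine(
      Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
      "Grasshopper", "Libraries");

    CopyRecursive(source, target);
    RhinoApp.WriteLine("Libraries updated from " + source);
    return Result.Success;
  }

  // Copy every file and subfolder, overwriting anything already in place.
  private static void CopyRecursive(string src, string dst)
  {
    Directory.CreateDirectory(dst);
    foreach (string file in Directory.GetFiles(src))
      File.Copy(file, Path.Combine(dst, Path.GetFileName(file)), true);
    foreach (string dir in Directory.GetDirectories(src))
      CopyRecursive(dir, Path.Combine(dst, Path.GetFileName(dir)));
  }
}
```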

I think McNeel are working on a more robust first-class feature for managing dependencies in GH2 - a true package manager.

In fact, the next Rhino WIP should have the first implementation of YAK (that's what we call our package manager). But it's probably not that useful for end-users yet; it's just there so we can all start testing it.

Excellent. Too bad about nested clusters, but this is a great start - thank you!
