We have a problem with referenced clusters. Everything works fine as long as we use clusters locally within our main .gh file.
Exporting a cluster to an external .ghcluster file and then referencing it back into our main .gh file works normally (with full functionality) as long as all the files are on my local hard drive (e.g. "C:"). But as soon as I try to reference a .ghcluster, or even open one saved on an external drive (our company's network drive), Rhino and Grasshopper instantly freeze and crash. Every time. No matter the file size.
We have already checked whether whitespace or dots in the file path could be causing the crashes, but even eliminating them by renaming the folders doesn't fix the problem.
So apparently GH clusters do not support referencing files from external network locations. Does anyone have an idea why? We're stuck here, because we want to significantly increase the complexity of our project's main .gh file and enable more team members to contribute to it via externally referenced cluster files.
Thanks for your help.
I can only confirm that and have no solution or workaround. Is there anyone who is working with referenced clusters in a server-based environment?
We have it working quite nicely here. On one specific, very large and complex project we're using a lot of clusters, and they're linked to a network drive. They are in the same folder as the main .gh file, or at most one level up from where it is saved. The network drive is mapped as Q: and we're using that drive location for them. So that seems to work fine, actually.
In older versions of GH the clusters were crashing quite often, and then the cluster would just disappear from the main file, but that seems to work fine now in R5 and GH 0.9.0014.
I'm sorry, unfortunately I don't have any solutions or clues for getting it working, though.
Thanks for your answer.
We've by now figured out through extensive testing that it is apparently a rights-management problem on our server. While users have full access, Grasshopper seems to be blocked somehow, regardless of our firewall settings.
However we've encountered an additional problem, while testing cluster references on my local hard drive:
If I rename a .ghcluster file that has been referenced into .gh files and then open one of those .gh files, the cluster doesn't work and shows an orange triangle, but I can relink the missing .ghcluster file by clicking "update" and choosing either a new cluster or the renamed old one.
However, if I rename the folder containing my .ghcluster file and then open the .gh file it is referenced into, the cluster is missing completely (it's just gone!), all of its inputs and outputs are severed, and there is no way to relink the cluster to a new .ghcluster file.
Any clues on why this happens?
We've by now figured out that Rhino/GH don't actually crash when referencing files from our server. The programs just freeze. If you have the patience to wait for about 3-10 minutes, they eventually recover and you can continue working normally. This, however, is not very practical...
(Additional information: We have a virtualized Windows SPS environment, might this be the problem? Locally - on my hard drive - it works fine.)
Furthermore, we've discovered the following bug/feature:
We export a cluster and reference it back into our .gh file, then copy the .ghcluster file to a different location and rename the copy (without opening or changing it), then also reference the copied version back into the .gh file. Grasshopper now shows two clusters with two different file paths, but claims that they are both the same ("this cluster occurs twice in this document"). If I double-click one of them, make a change and save, both clusters get changed, even though they are separate .ghcluster files.
This would follow the logic that David laid out in this entry (http://www.grasshopper3d.com/page/clusters09), that GH identifies a cluster not by its file name or location but by its internal ID.
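To illustrate that identity model, here is a minimal, hypothetical sketch (the class and field names are invented for illustration; GH's actual internals aren't shown in David's post): two cluster references count as "the same" when their internal IDs match, regardless of file name or location, which is exactly why a renamed copy of the file still edits together with the original.

```python
import uuid

class ClusterRef:
    """Hypothetical model of a referenced cluster in a GH document."""
    def __init__(self, path, internal_id):
        self.path = path                # where the .ghcluster file lives
        self.internal_id = internal_id  # ID stored inside the file itself

    def is_same_cluster(self, other):
        # Identity is decided by the internal ID, not by file name or path.
        return self.internal_id == other.internal_id

# Copying the file without opening it copies the internal ID unchanged,
# so the original and the renamed copy share one identity.
shared_id = uuid.uuid4()
original = ClusterRef(r"C:\project\joints.ghcluster", shared_id)
renamed_copy = ClusterRef(r"Q:\project\joints_v2.ghcluster", shared_id)

print(original.is_same_cluster(renamed_copy))  # True
```

Under this model, "this cluster occurs twice in this document" is expected behavior: the two file paths are irrelevant to GH, and editing either instance edits the shared definition.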
An addition we would very much appreciate in the next GH update would be the option to right-click a referenced cluster and not only "update" it, but also "relink" it to a new or different source.
Right now you have to rename or delete the .ghcluster file in order to relink a cluster via the update option. You can also overwrite the old cluster and update. However, sometimes we want to keep the old version, or disentangle one of a cluster's many instances and relink just that one, without losing its various inputs and outputs by referencing the new version and reconnecting everything.