
Hi everybody!

A short story (maybe dumb... who cares!):

A: Hey B, it's "false"!

B: Ok.

- 0.01ms later -

A: Hey B, it's still "false"!

B: Ok, I got it.

- 0.01ms later -

A: Hey B, you'll never believe it!... It's still "false"!

B: OK! I got it! Tell me when it changes!

- 0.01ms later -

You got it.

This can happen while dragging a slider: part of a big definition keeps being recalculated (possibly including heavy tasks) over and over, uselessly.

Every GH component "refreshes" whenever any of its inputs refreshes...

These update/expiration "waves" can be inconvenient (http://www.grasshopper3d.com/forum/topics/control-knobs-in-c).

Not if we code around it!

I did something like this, working with a decimal value converted to an integer:

The Data Dam(s) are set to "0.25 sec" so we can see that it keeps updating...

By using a C# component (or something similar) we can "filter" the update waves, passing data along only when it actually changes, etc... a sketch of the idea is below.
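
Something like this (a sketch, not exactly the attached script; the input/output names are made up). Class-level fields in a GH C# script component persist between solutions, so the previous value can be remembered and compared:

    // These fields live in the script's additional-code region; they
    // persist between solutions (until the script is recompiled).
    private int _lastValue;
    private bool _hasRun;

    private void RunScript(double x, ref object A, ref object Changed)
    {
      int current = (int)Math.Round(x); // the decimal-to-integer conversion

      // True only when the rounded value actually differs from last time.
      bool changed = !_hasRun || current != _lastValue;
      _lastValue = current;
      _hasRun = true;

      A = current;       // the filtered value
      Changed = changed; // wire this to a gate / stream filter / Data Dam
    }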

Now, would it be dangerous? In which ways?

Can this be done for every kind of data (geometry, values, booleans, etc.)?

In my ignorance, I would say every component should do this "check" to lighten the load on the CPU...

But more than anything else... has this already been solved in a better way?

In GH2 it would be cool to have a "filter" component.

Maybe I'm just wrong about everything...

Give me your thoughts!

Cya! :D

(C# script attached; there's also some cool component positioning inside, stuff I just learnt)


Replies to This Discussion

This is a possible alternative approach to running solutions; however, it doesn't work well within the current GH SDK. The current approach is to erase all data as soon as something expires, and this wave of erasures propagates throughout the document. All the current data is lost, so when new data comes along there's nothing there to compare it against. In order for your suggestion to work, old data needs to be kept long enough for it to be compared to new data. If, after the comparison, it turns out the new data is identical to the old data, then the old data can be re-used downstream.
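
To make that concrete, here's a toy model (just a sketch, not the actual SDK types) of how an expiration wave erases data before the new result is even known:

    using System.Collections.Generic;

    // Toy model of a GH1-style expiration wave (not real SDK code).
    class Node
    {
      public object Data;                           // result of the last solution
      public bool Expired;                          // already part of this wave?
      public List<Node> Recipients = new List<Node>();

      public void Expire()
      {
        if (Expired) return;    // already swept up in this wave
        Expired = true;
        Data = null;            // old data erased; nothing left to compare against
        foreach (var r in Recipients)
          r.Expire();           // the wave propagates downstream
      }
    }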

Quite apart from the fact that the current system is not set up for such an approach, there are several problems with it. Chief among them:

  1. In addition to calculating a new solution, we now also have to perform a lot of comparisons. I don't know how common it is for a new solution to be able to re-use large chunks of data from a previous solution, but I do know you have to compare all the values before you can be certain that the before and after states are identical.
  2. Comparisons can be complicated or even impossible for certain data types. Sometimes it is also impossible to define a universally agreed-upon equality. If you have two breps with the exact same shape, but the order in which their faces are stored differs, is it still the same brep? How about user data on those breps; is that allowed to differ? If not, then you're in trouble, because plug-in-specific user data is not accessible from a different assembly, so you'd have no way to test for equality. (A sketch of this problem follows below.)
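
To illustrate (my own sketch, not real SDK code; methods as they'd sit in a script component's additional-code region):

    // Equality is cheap and well-defined for numbers (given a tolerance)...
    static bool SameNumber(double a, double b, double tol)
    {
      return Math.Abs(a - b) <= tol;
    }

    // ...but there is no single "right" answer for breps. Two breps can
    // describe the exact same shape while storing their faces in a
    // different order, and plug-in user data cannot even be read from
    // another assembly, so a complete test is out of reach.
    static bool SameBrep(Rhino.Geometry.Brep a, Rhino.Geometry.Brep b)
    {
      if (a.Faces.Count != b.Faces.Count)
        return false; // a cheap necessary condition, far from sufficient
      throw new NotImplementedException("no agreed-upon brep equality");
    }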

The good news is that GH2 will retain solution data longer than GH1. This is necessary because GH2 solutions run on background threads and the old data needs to still be previewable in Rhino viewports while the new solution computes. This at least provides a mechanism by which data before and after a solution could be compared, and even a way for old data to be reinstated without new data having to be computed.

But even in this new scheme I'm hesitant to try and be 'clever' about this, because if you get it wrong you introduce a bunch of really hard-to-detect and hard-to-work-around bugs. And that's not even to mention the increased complexity of the solution logic in general, which is already incredibly convoluted and difficult to debug.

Thanks for the exhaustive answer, David... as expected.

Doing this check all the time would be a heavy load, OK...

Maybe something like an MD5 checksum?
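
For simple, value-like data I imagine something along these lines (a hypothetical helper, not GH SDK code; note that ToString() is lossy for geometry, so a brep would need real serialization before hashing):

    using System;
    using System.Security.Cryptography;
    using System.Text;

    // Hypothetical helper: fingerprint a value so "did it change?" becomes
    // a comparison of two short strings instead of a deep data comparison.
    static string Fingerprint(object value)
    {
      // Fine for numbers/booleans/text; lossy for geometry.
      string text = value == null ? "" : value.ToString();
      byte[] bytes = Encoding.UTF8.GetBytes(text);
      using (MD5 md5 = MD5.Create())
        return BitConverter.ToString(md5.ComputeHash(bytes));
    }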

Still, I don't quite get why all the data needs to be expired and recalculated every time... but I'm certainly far from even imagining the whole complexity of the situation here (:

...maybe what I need could be solved just with smart use of the Data Dam component.
