Grasshopper

algorithmic modeling for Rhino

I have a scripting component that creates a surface, then I use AreaMassProperties to get the area of the surface. The problem is the time it takes to calculate. I used a Stopwatch() to time each event.

It takes:

5 ms to calc the surface
3 ms to create OnMassProperties mp = new OnMassProperties();
and 400-600 ms to run the area calc, outSrf.AreaMassProperties(ref mp, true, false, false, false);

It seems that this is way too long for the area calc. The Brep Area component only takes 50-60 ms to run the exact same calc.
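
For reference, here is roughly how I set up the timing (a minimal sketch; the surface construction itself is omitted, outSrf is the surface my script builds, and Print is just the script component's text output):

System.Diagnostics.Stopwatch sw = new System.Diagnostics.Stopwatch();

sw.Start();
OnMassProperties mp = new OnMassProperties();
sw.Stop();
Print("create mp: " + sw.ElapsedMilliseconds + " ms");   // ~3 ms

sw.Reset();
sw.Start();
// the four flags request (in order): area, first, second and product moments
outSrf.AreaMassProperties(ref mp, true, false, false, false);
sw.Stop();
Print("area calc: " + sw.ElapsedMilliseconds + " ms");    // 400-600 ms here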

To test this I made a C# node in GH and added:

// x is the surface/Brep input of the C# component
OnMassProperties mp = new OnMassProperties();
x.AreaMassProperties(ref mp, true, false, false, false);   // area only
A = mp.Area();

When you compare that to the Area component it evaluates much slower. That may not be a fair test, since the C# node has to go through a compiler, but I also tried the same code inside a custom component that I made in VS and compiled into a .gha, and got the same slow results. That makes no sense, since it should be doing the same thing (with similar code) as the Brep Area component.


Replies to This Discussion

Hi Robert,

I don't know what's going on, but the compilation of the code happens before the script is run, so it will not affect the time it takes to run specific operations inside the script. The only difference between your code and my code is that I compute 2 things instead of 1, which if anything means yours should run faster:

AreaMassProperties(mp, True, True, False, False)

--
David Rutten
david@mcneel.com
Poprad, Slovakia
I cannot figure it out either; the code should run the same. What's odd is that even the 60 ms your Area component takes to run is long. It really should be about 5-10 ms, going by how fast other apps can run area calcs (like GC). Perhaps you can try it in the 0.7 code and see if you get the same results.
I haven't converted the Surface components yet, but it seems like an interesting benchmark to do.

I rather suspect our Area and Volume algorithms have been designed for accuracy rather than speed since nobody expected them to be used as part of a 'real time' environment. I can ask the Seattle office whether or not there's room for easy improvement.

--
David Rutten
david@mcneel.com
Poprad, Slovakia
Sounds great.

Also, there are 2 more overloads of that function which let you define rel_tol and abs_tol as doubles; I assume those mean relative and absolute tolerance. If I set them to 0.001 or to 25 I get no change in speed. I guess if you added an overload with an option that actually adjusts the tolerance level, that would probably speed things up. As a comparison of algorithms, GC calculates the area to 7 decimal places and does so in near real time.
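
For reference, this is roughly what I tried with the tolerance overload (a sketch only; I'm assuming the two extra doubles are simply appended after the moment flags, matching the rel_tol/abs_tol defaults in the underlying openNURBS call):

OnMassProperties mp = new OnMassProperties();

// same call as before, plus the two tolerance arguments
// (assumed to be relative and absolute tolerance)
double rel_tol = 0.001;
double abs_tol = 0.001;
x.AreaMassProperties(ref mp, true, false, false, false, rel_tol, abs_tol);
A = mp.Area();

// loosening these (e.g. to 25) made no measurable difference in speed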

Since Grasshopper works in the realm of real-time feedback, I assume GH users will be pushing Rhino's calculations to the limit. It may warrant optimizing the calcs for some of the more commonly used properties of each element. It may also be useful in the future to create lightweight geometry classes without all the extra overhead, targeting only the properties needed in a real-time environment.

BTW, the fact that you respond so quickly is amazing, and I thank you for all the time, energy and effort you put into GH and its community.
I'm not entirely certain how the tolerance gets treated during the calculations. Tolerance often doesn't mean what people think it means. For example, if you compute a curve|curve intersection, the tolerance isn't an indication of how accurate the answer is, but rather whether or not certain answers are accepted as true intersections.
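
As a quick sketch of that idea (written against the newer RhinoCommon API purely for illustration; the code earlier in this thread uses the older .NET SDK):

using Rhino.Geometry;
using Rhino.Geometry.Intersect;

// Two lines that miss each other by 0.005 units.
LineCurve a = new LineCurve(new Point3d(0, 0, 0), new Point3d(10, 0, 0));
LineCurve b = new LineCurve(new Point3d(5, 0.005, -5), new Point3d(5, 0.005, 5));

// The tolerance decides whether the near-miss is accepted as an
// intersection; it is not a statement about how precisely the
// intersection point itself is computed.
CurveIntersections tight = Intersection.CurveCurve(a, b, 0.001, 0.001);
CurveIntersections loose = Intersection.CurveCurve(a, b, 0.01, 0.01);
// one would expect tight.Count == 0 and loose.Count == 1 here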

Most computations in Rhino will be performed to the limit of double-precision accuracy.

But you're right, Grasshopper has brought to light a lot of bottlenecks in Rhino core performance.

--
David Rutten
david@mcneel.com
Poprad, Slovakia
