There's a cool piece of software called vvvv, I'm sure you know it. One of the amazing things it implements is a command to check the memory consumption of each node. It would be super cool if we could see every component's memory consumption (maybe as a color, or a value)... to optimize definitions and keep GH running smoothly...
I have no idea how difficult this is, but I think it would help a lot in creating better definitions!
Taz, what would you use this information for? It's not like components are processed in parallel, and it's not like there are any unforeseen delays. Components are simply solved as quickly as possible given the constraints of the hardware and the Rhino SDK.
--
David Rutten
david@mcneel.com
Poprad, Slovakia
Permalink Reply by taz on December 29, 2009 at 12:48pm
errr... I thought this was a follow-up to what Suryansh was saying.
Is the critical path (chain) time the same as total run time?
If one wanted to speed up a definition, one could start by reconfiguring components on the critical path. The critical path would be an analytical tool.
Is this overly simplistic in the reality of the computer world?
It works well enough for construction management...
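The analogy to construction scheduling does carry over: if you model a definition as a directed acyclic graph of components with per-component solve times, the critical path is the dependency chain with the largest total time, and it is generally shorter than the total run time (which, in a serial solver, is the sum of all components). A minimal sketch, with made-up component names and timings since Grasshopper exposes no such API here:

```python
# Hypothetical sketch: treat a definition as a DAG of components.
# `times` maps component -> solve time; `edges` lists (upstream, downstream)
# wires. The critical path is the slowest dependency chain.
from collections import defaultdict

def critical_path(times, edges):
    succ = defaultdict(list)
    indeg = defaultdict(int)
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    # Topological order via Kahn's algorithm.
    order = [c for c in times if indeg[c] == 0]
    i = 0
    while i < len(order):
        for v in succ[order[i]]:
            indeg[v] -= 1
            if indeg[v] == 0:
                order.append(v)
        i += 1
    # Longest (slowest) chain ending at each component.
    cost = {c: times[c] for c in times}
    pred = {c: None for c in times}
    for u in order:
        for v in succ[u]:
            if cost[u] + times[v] > cost[v]:
                cost[v] = cost[u] + times[v]
                pred[v] = u
    # Walk back from the most expensive endpoint.
    end = max(cost, key=cost.get)
    path = []
    while end is not None:
        path.append(end)
        end = pred[end]
    return path[::-1], max(cost.values())

# Example: A(5s) and B(2s) both feed C(3s), which feeds D(1s).
path, total = critical_path(
    {"A": 5, "B": 2, "C": 3, "D": 1},
    [("A", "C"), ("B", "C"), ("C", "D")])
# path == ["A", "C", "D"], total == 9, while the serial
# run time is 5 + 2 + 3 + 1 = 11 seconds.
```

So the answer to the question above is no: critical path time equals total run time only if every component lies on one chain. But as an analytical tool it still points at the components worth optimizing first.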
Very sweet. Profiling is one of those things that can open your eyes as to what might be a potentially good way of doing something and a potentially bad way. I think this will be quite helpful in finding out some "best practices". Now how this makes its way into the interface, I have no clue :)
If a user turns this on, couldn't you trigger Grasshopper to solve a few times until the average solve time is within some tolerance? I understand this would be deadly with heavy definitions (which is exactly why people would use it, to trim the heavy tree). But there could be a timeout too... if solving takes too long, exit the solving... hmmmm
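The repeat-until-stable idea above can be sketched as a small timing loop. This is not Grasshopper's profiler; `solve`, `tol`, and `timeout` are illustrative names for a generic benchmark-until-converged routine with the timeout escape hatch described:

```python
import time

def averaged_solve_time(solve, tol=0.05, timeout=5.0, min_runs=3):
    """Run `solve` repeatedly until the running average changes by less
    than a relative tolerance `tol`, or total measurement time exceeds
    `timeout` seconds. Returns (average_seconds, number_of_runs)."""
    samples = []
    start = time.perf_counter()
    avg = None
    while True:
        t0 = time.perf_counter()
        solve()                                  # one full solution pass
        samples.append(time.perf_counter() - t0)
        new_avg = sum(samples) / len(samples)
        elapsed = time.perf_counter() - start
        # Converged: the average barely moved between runs.
        if avg is not None and len(samples) >= min_runs:
            if abs(new_avg - avg) <= tol * new_avg:
                return new_avg, len(samples)
        # Timeout: heavy definition, bail out with what we have.
        if elapsed > timeout:
            return new_avg, len(samples)
        avg = new_avg
```

A usage example would be `averaged_solve_time(lambda: recompute_definition())`, where `recompute_definition` stands in for whatever triggers a full solution. The timeout means a pathological definition costs at most one extra solve past the budget rather than looping forever.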