"Thx for the math!
But what I am wondering about is the memory behavior when passing this data through a set of components. What is also strange: when the definition gets recalculated with fewer elements (e.g. 1000), the memory consumption…"
"Let's do some math on 2.4 million variables:
12,000 x 200 = 2,400,000 values
(double: 64 bit / 8 byte)
2,400,000 x 8 byte = 18.3 MB
If we consider GH_Number (used inside data trees), it has some additional fields:
Type Description = 41 byte
Type…"
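The arithmetic in that post can be checked directly. This is just the raw payload of the doubles (the 18.3 figure is mebibytes); the per-object overhead of wrapper types like GH_Number comes on top and is not modeled here:

```python
branches = 12_000
values_per_branch = 200

# Total number of 64-bit doubles in the tree.
count = branches * values_per_branch          # 2,400,000

# Raw payload: 8 bytes per double.
raw_bytes = count * 8                         # 19,200,000 bytes

# Expressed in binary megabytes (MiB), matching the "18.3 MB" in the post.
mib = raw_bytes / 2**20
print(count, raw_bytes, round(mib, 1))        # 2400000 19200000 18.3
```

So the tree's numeric data alone is modest; the memory pressure described in the thread must come from duplication across components and per-item object overhead rather than the doubles themselves.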
"Added a simple example to show the problem.
In this case it makes no difference, since it is a 130 MB problem (at least on my computer), but I have some files with more components where the memory consumption goes over my 4 GB of RAM.
Thanks for asking. Flux's product focus changed pretty radically and we had to put our second GH plug-in on pause. I'm sure that there will be some exciting developments in the future, but for the time being we…"
"Each parameter creates a shallow copy, but when you modify data deep copies are made first. The datatrees are copied, but that shouldn't result in too much overhead. On the other hand 12000 branches is quite a lot. Can you post a file that…"
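The shallow-copy-until-modified behavior described in that reply can be illustrated with a minimal sketch. This is a hypothetical toy `Tree` class in Python, not Grasshopper's actual implementation; it only shows why passing a tree through many read-only parameters is cheap, while the first modification pays for a full deep copy:

```python
import copy

class Tree:
    """Toy data tree: a list of branches (lists of floats), copied lazily."""

    def __init__(self, branches):
        self._branches = branches
        self._shared = False

    def shallow_copy(self):
        # A downstream parameter just gets a view on the same branch lists.
        clone = Tree(self._branches)
        clone._shared = self._shared = True
        return clone

    def set_value(self, branch, index, value):
        # Deep-copy only when a shared tree is about to be modified.
        if self._shared:
            self._branches = copy.deepcopy(self._branches)
            self._shared = False
        self._branches[branch][index] = value

a = Tree([[1.0, 2.0], [3.0]])
b = a.shallow_copy()      # cheap: no data duplicated yet
b.set_value(0, 0, 9.9)    # triggers a deep copy inside b
print(a._branches[0][0])  # 1.0 — the original tree is untouched
```

With 12,000 branches, the deep copy itself is cheap per tree, but every component that modifies data materializes another full copy, which is consistent with the memory growth reported in the thread.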
"A thing that is kind of troubling is the memory issue when working with large DataTrees (e.g. 12,000 branches with an average of 200 values each). When passing such structures around a few components, my memory gets filled (only 4 GB, but…"
I don't have much time before my ride arrives, but I'll quickly try and respond to each point:
"I've used a lot of 3D programs. I've never encountered one as difficult as grasshopper."