Grasshopper

algorithmic modeling for Rhino

Hi,

First of all, thanks for the great work on HB! It is proving very fun and useful.

I am trying to do an optimization study of the cooling and heating loads of a generic building using Honeybee and Galapagos. The model has two identical floors, each with 6 zones, with the bottom and top floors separated by an adiabatic boundary in the middle. I have sliders driving the thermal properties of the envelope, the window-to-wall ratio (WWR), and the shading depths.

I left Galapagos running overnight and it ran out of RAM (my computer has 16 GB, and Rhino was maxing out at 14.x GB).

At first I thought Galapagos was causing the problem, but then I tried adjusting the sliders manually and found that each slider change would increase RAM usage, and only very rarely decrease it. This was with the EnergyPlus simulation component turned off, so I was essentially just changing the parameters.

With about 17 parameter-slider changes I was able to push RAM usage from 1.3 GB to 4.6 GB. This was with all previews off.

I tried the following:

UndoClear in Rhino - No effect

Solution -> Clear + Recompute - Adds to RAM usage

Is this something to do with Rhino, GH, or HB? It seems as though all previous states are continuously being remembered. Or could it have something to do with how I built the GH definition? (I'm quite new to it, so maybe it is something I am doing.)

Greatly appreciate any help, would really want to be able to do some nice optimization runs. Thanks!

Replies to This Discussion

Hi Timothy,

I can help you more if you share your file. There are ways to optimize RAM usage for repetitive analyses.

Mostapha

Timothy,

I knew it was only a matter of time before someone brought this up.  The heart of the issue is that each time a Honeybee component that alters HBZones runs, an entirely new copy of the zone object is made.  So if you have a large model with many components that alter the zones, and you try to run several iterations of it, you can quickly write a lot of zone copies to memory, which will max out your RAM in the way you describe.
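To make that concrete, here is a minimal Python sketch of the copy-on-alter pattern (purely illustrative; the Zone class and set_construction function are made-up stand-ins, not Honeybee's actual code):

    import copy

    class Zone(object):
        def __init__(self, name, surfaces):
            self.name = name
            self.surfaces = surfaces  # stand-in for heavy geometry data

    def set_construction(zones, construction):
        # Copy-on-alter: every run makes a full new generation of zones,
        # and the old generation stays alive as long as the upstream
        # component's cached output still references it.
        new_zones = [copy.deepcopy(z) for z in zones]
        for z in new_zones:
            z.construction = construction
        return new_zones

With several such components chained in series, each slider change re-runs them all, so a single solution can hold several full copies of the model in memory at once.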

Because of this, I have actually gotten a lot of mileage out of upgrading my computer from 16 GB to 32 GB.  Still, as Mostapha suggests, there are a number of intelligent ways of laying out the GH script to minimize the copies of zones that get written to memory, which will buy you more iterations before maxing out.  If you upload the GH file, we can identify the memory pressure points.
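As a hypothetical illustration of that layout principle, building on the sketch above: chaining three zone-altering components makes three new generations of copies per solution, whereas collapsing the changes into one step makes only one.

    import copy

    def set_attributes(zones, **attrs):
        # Apply every attribute change in a single pass, so one solution
        # produces one new generation of zone copies instead of one
        # generation per zone-altering component.
        new_zones = [copy.deepcopy(z) for z in zones]
        for z in new_zones:
            for key, value in attrs.items():
                setattr(z, key, value)
        return new_zones

    # e.g. set_attributes(zones, construction="WALL-1", infiltration=0.0003)
    # instead of chaining separate construction and infiltration components.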

-Chris

Hi Mostapha and Chris,

Thanks for the quick replies. I am guessing, unfortunately, that there is no way to clear the copies of the "old zones"?

I have uploaded my GH script, and would welcome any suggestions and comments.

Thanks both of you for taking the time to have a quick look.

Tim

Attachments:

There are ways to clear the copies, which is why I needed your file. I'll check your file later today.

Thanks for the help.

I updated the GH file a bit, as I found out that leaving some fields undefined made the entire component (EPWindowMat and EPConstruction) undefined during the simulation.

Tim

Attachments:

Hi Timothy, I started looking into the file. I added a new method to remove older Honeybee zones, but that doesn't really make a huge difference. I made a change in the update EPConstruction component that is making the biggest difference on my system. Can you test the attached file and report back on whether the changes have made any improvement?
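For reference, here is a minimal sketch of one way such a cleanup could work, assuming zones are cached in a dictionary inside scriptcontext.sticky keyed by zone name (the cache key and layout here are assumptions for illustration, not Honeybee's actual internals):

    import scriptcontext as sc

    def remove_old_zones(current_names, cache_key="HBZoneCache"):
        # Evict cached zone objects that the current solution no longer
        # references, so Python can garbage-collect the stale copies.
        cache = sc.sticky.get(cache_key, {})
        for name in list(cache.keys()):
            if name not in current_names:
                del cache[name]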

Attachments:

Hi Mostapha, thanks for the work.

My two observations:

The new component seems to slow the rate of RAM increase slightly, and there seems to be a slightly higher chance that some RAM is freed after changing the parameters.

The script also solves faster after a change to the envelope thermal-property parameters. However, there seems to be no output from the new EPConstruction component?

Tim

Thank you for testing. I wanted to make sure that I'm heading in the right direction. The output issue is a typo. I have a couple of other ideas that should help. Will get back to you soon.

Hi Timothy, I finally had a chance to revisit this issue. Check the attached file and let me know how it works. I re-wrote how Honeybee handles Honeybee objects between components. This is as good as it can get in Rhino 5. In Rhino 6 we can make it even better!
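One way to read "rewriting how objects are handled" is to pass lightweight identifiers between components and keep a single authoritative copy of each object in a shared store, copying a zone only at the moment a component actually mutates it. A hypothetical sketch of that pattern (STORE_KEY, add_to_store, and call_from_store are invented names, not Honeybee's API):

    import copy
    import scriptcontext as sc

    STORE_KEY = "HBObjectStore"  # hypothetical shared store in sticky

    def add_to_store(zones):
        # Keep one authoritative copy of each zone; hand downstream
        # components only the IDs.
        store = sc.sticky.setdefault(STORE_KEY, {})
        for z in zones:
            store[z.name] = z
        return [z.name for z in zones]

    def call_from_store(ids):
        # Downstream components look zones up by ID and copy only the
        # ones they are about to mutate.
        store = sc.sticky[STORE_KEY]
        return [copy.deepcopy(store[i]) for i in ids]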

Attachments:
Mostapha,

I'm glad that we are switching to this method, even if it means taking out the ability to assign values based on orientation on a couple of components. Switching to list access should address a lot of the issues that we experience with large Honeybee definitions.

I was just reflecting on the suggestion of using tree access to assign parameters based on orientation, and I realized that there are a lot of other things people will probably want to use data trees for. I think it is better to have separate components that assign constructions or boundary conditions based on orientation, and I can draw these components up soon.
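For what it's worth, here is a hypothetical sketch of the core logic such an orientation component might use (srf.azimuth, srf.construction, and the dictionary layout are assumptions for illustration):

    def orientation_of(azimuth_deg):
        # Map a surface azimuth (0 = north, measured clockwise in
        # degrees) to a cardinal direction.
        dirs = ["north", "east", "south", "west"]
        return dirs[int(((azimuth_deg + 45.0) % 360.0) // 90.0)]

    def assign_constructions_by_orientation(surfaces, constructions):
        # constructions: e.g. {"south": "SHADED GLZ", "north": "CLEAR GLZ"}
        for srf in surfaces:
            key = orientation_of(srf.azimuth)
            if key in constructions:
                srf.construction = constructions[key]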
-Chris

Hi Chris, sounds good to me. Following Grasshopper's conventions is always a good idea.

Hi Mostapha,

I am facing the same type of issue with a recursive voxel aggregation. Within an Anemone loop, I use Honeybee to test recursive geometries. With 40,000 points, and therefore voxels, to be tested, my memory runs out quickly and Rhino crashes.

I have attached the gh file.

So far I have been trying to pause the loop, save the GH file, close Rhino, reopen it, and start the loop again. However, once the RAM is full, pausing the loop is no longer possible.

Is there a way to clear the cache at each iteration, or some other way around this?

Olivier
