Grasshopper

algorithmic modeling for Rhino

[WIP] is there a way to save fully specified honeybee thermal zones in a component for reuse?

May I ask if there is a way to save Honeybee thermal zones once they are fully specified (adjacencies solved, which can take a long time for a large model, plus zone program, loads, etc.), so that the work can be reused the next time the file is opened, or in a different workflow?

Thanks!


Replies to This Discussion

Hi Mostapha,

Perfect now!!

Opportunities is a great word. Like it a lot.

For now, what are the definitions that are dumped and loaded? Just geometry? I didn't understand what you wrote about the materials/constructions.

So you'd say the best stage to do the dump is after solveAdjacencies? Even then, can some definitions (materials, others) be lost?

Thanks a lot for this one.

-A.

No worries. I hope the paper went well.

thanks! ...well, I "cooked" close to 4k words in 4 days... and I hope the paper will be accepted... 

Thank you very much, Mostapha, for the updated components!

I did a quick test and it seems that the default zone program set for the thermal zones can be "packed" and exported as the .HB file, and it can be loaded back to generate IDF file which contains the schedule, materials and constructions for the default zone program, and the E+ simulation went well with no error.
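The dump-once, load-later workflow described above can be sketched in plain Python. This is only an illustration of the idea, not the actual Honeybee implementation: the zone dictionary, its fields, and the `model.HB` filename are all hypothetical stand-ins for the real serialized objects.

```python
import pickle

# Hypothetical stand-in for a solved Honeybee zone: the real dumped
# objects carry geometry, program, loads, and adjacency information.
zone = {
    "name": "Zone_1",
    "program": "Office::OpenOffice",   # default zone program
    "adjacencies": [("Zone_1", "Zone_2", "wall_3")],
}

# Dump once, after the slow steps (e.g. solveAdjacencies) have run ...
with open("model.HB", "wb") as f:
    pickle.dump([zone], f)

# ... then load instantly in a later session or a different workflow.
with open("model.HB", "rb") as f:
    zones = pickle.load(f)
```

The point is that the expensive setup runs once; reopening the file only pays the cost of deserialization.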

Do you mean there might be error when using self-customized materials, constructions, schedules for the zones during the export and load process using these two components? 

Thanks!

Great! Thanks for letting me know. As long as you're using the default materials, constructions and schedules you should be fine.

It should also work fine if you're loading custom items that you have on your machine and that are referenced in the file.

The issue will only happen when there is custom content that isn't available on the other machine. We can package it with the file later, but that isn't implemented yet.

For now, what are the definitions that are dumped and loaded? Just geometry? I didn't understand what you wrote about the materials/constructions.

So you'd say the best stage to do the dump is after solveAdjacencies? Even then, can some definitions (materials, others) be lost?

Hi Abraham. Everything gets dumped and then loaded by the components. The issue that I mentioned happens because, in the case of materials, constructions and schedules, HBObjects carry the name and the address, not the full data. For instance, if you have a schedule set to c:\yourschedule.csv, it will be saved correctly, but only as a string pointing to that address. This means that if you load the objects on a different machine and try to run the simulation, it will look for c:\yourschedule.csv, and if that file isn't available the simulation will fail. It's the same with custom materials and constructions: HBObjects carry just the name, not the full definition.
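The failure mode described here can be shown with a minimal sketch. The `hb_object` dictionary and the `check_references` helper are hypothetical; the real HBObjects are not dictionaries, but the principle is the same: only a path string travels with the object, so a missing file on the target machine breaks the simulation.

```python
import os

# Hypothetical HBObject: the schedule is stored as a file path (a plain
# string), not as the schedule's full data, as described above.
hb_object = {"name": "Zone_1", "schedule": r"c:\yourschedule.csv"}

def check_references(obj):
    """Return referenced files that are missing on this machine."""
    path = obj["schedule"]
    return [path] if not os.path.exists(path) else []

# On a machine without c:\yourschedule.csv this reports the broken link,
# which is exactly when E+ would fail to run.
missing = check_references(hb_object)
```

A pre-flight check like this is one way to catch the problem before launching a long simulation.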

I can add the option to save the full definitions of materials, constructions and schedules with the file, but it could result in a much larger file. Currently there is no easy way to know whether a material, construction or schedule is custom or comes from the library, so I would have to save all the constructions and schedules with the objects. That also includes RADMaterials.
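The size trade-off between name-only and full packing can be made concrete with a small sketch. The construction fields below (layers, conductivities, thicknesses) are invented for illustration and do not reflect Honeybee's actual data model.

```python
import json

# Name-only packing: tiny file, but the target machine must already
# have "Exterior Wall" in its library.
by_name = {"construction": "Exterior Wall"}

# Full packing: embed the complete definition (hypothetical fields),
# making the file portable at the cost of size.
full = {
    "construction": {
        "name": "Exterior Wall",
        "layers": ["Brick", "Insulation", "Gypsum"],
        "materials": {
            "Brick": {"conductivity": 0.9, "thickness": 0.10},
            "Insulation": {"conductivity": 0.03, "thickness": 0.05},
            "Gypsum": {"conductivity": 0.16, "thickness": 0.0127},
        },
    }
}

size_by_name = len(json.dumps(by_name))
size_full = len(json.dumps(full))
```

Multiply the difference across every construction, schedule and RADMaterial in a model and the file-size concern above becomes clear.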

Thanks Mostapha,

This is clear now.

Most cases you'll run on your own machine. But this is the time to start thinking about shared libraries, as you said: "in the cloud, in the rain, etc.". Opportunities, new opportunities.

-A.

Speaking of rain and its relationship to cloud-computing > http://halfblog.net/2011/11/29/the-telegraph-thought-councillor-tho...

After all these years it still makes me laugh out loud. I feel bad for the guy.

:-)

Hi Abraham, Mostapha,
Really interesting topic indeed. Mostapha in one (of the many) discussions out there we had discussed about the LB_LB component and the path it reads the libraries from. I have been using flux to run "cloud" e+ simulations in multiple desktops but had issues with DL simulations as a batch file is not enough to do so. I have the feeling - need to test - that with these components and an option to assign a path to LB_LB to read the libraries from different directories than the local "c:\ladubug" that usually does now so we can host them on a shared file on a server all sort of analyses will be possible in multiple machines even with custom materials schedules etc. This would be simply taking things to another level! Great work again!

P.S. These components will be significantly useful for free-form structures and other special geometries that tend to push the limits of GH, CPU, RAM and myself during setup! Thanks a lot.

Tasos

Exactly! Now you can transfer the objects and not the files, so the issue with paths goes away.

What makes me even more excited is the opportunity for several people to work on different parts of a definition using these new components and then share their finished Honeybee objects.

In a broader case, while designers work on the geometry side, engineers can set up their definitions for detailed systems on their own machines. That will make what we called "Remote Solving" much easier.

Do you think that we should always package materials, constructions and schedules with HBObjects? Should it be an optional input?

Mostapha

Nice.

I think it would be good to have different "levels" of packing: just geometry, +materials, +schedules, +systems. Even better, maybe the level could be decided on the fly ("I just want the geometry, but I'll define my own materials, even though they are in the package", etc.).
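The "levels of packing" idea above could look something like this sketch. The `pack` function, its flags, and the zone fields are hypothetical; the point is that the consumer decides on the fly which parts of the package travel with the geometry.

```python
def pack(zone, materials=False, schedules=False, systems=False):
    """Build a package containing the geometry plus whichever optional
    layers the caller asked for (hypothetical packing levels)."""
    package = {"geometry": zone["geometry"]}
    for key, wanted in (("materials", materials),
                        ("schedules", schedules),
                        ("systems", systems)):
        if wanted:
            package[key] = zone[key]
    return package

zone = {"geometry": "brep-data", "materials": ["Brick"],
        "schedules": ["Office"], "systems": ["Ideal Air"]}

# "Just the geometry, I'll define my own materials":
geometry_only = pack(zone)
# Geometry plus materials:
with_materials = pack(zone, materials=True)
```

The same flags could also drive the export side, so a file is never larger than the consumer needs.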

Another one: if we target multiple platforms, being able to export/import between them (GH/Dynamo) will be interesting and important.

I'm just trying to fly a little bit :-)

-A.
