Grasshopper

algorithmic modeling for Rhino

Hi All!

As a quick question for Chris and Mostapha, and anyone else who knows: I am trying to run a parametric iteration study, but it is going to take a very long time if the workflow can only run one energy and daylight simulation at a time.

Is there a way to run the generated IDFs as a group, like the EnergyPlus launcher can do with native EnergyPlus?
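One way to get a group run outside of Grasshopper is to queue the generated IDFs yourself and run several EnergyPlus processes side by side. The sketch below assumes the standard EnergyPlus command-line interface (`-w` weather file, `-d` output directory); the install path, weather file, and study folder are placeholders you would replace with your own.

```python
import subprocess
from pathlib import Path
from concurrent.futures import ThreadPoolExecutor

# Hypothetical paths -- point these at your own EnergyPlus install,
# weather file, and the folder holding the generated IDFs.
ENERGYPLUS = r"C:\EnergyPlusV8-4-0\EnergyPlus.exe"
EPW = r"C:\ladybug\weather\city.epw"
STUDY_DIR = Path(r"C:\ladybug\parametric_study")

def build_cmd(idf_path, epw=EPW, out_dir=None):
    """Command line for one EnergyPlus run with its own output folder."""
    if out_dir is None:
        p = Path(idf_path)
        out_dir = str(p.parent / (p.stem + "_run"))
    return [ENERGYPLUS, "-w", str(epw), "-d", out_dir, str(idf_path)]

def run_idf(idf_path):
    """Run one IDF; each run writes to its own folder so files don't collide."""
    cmd = build_cmd(idf_path)
    Path(cmd[4]).mkdir(parents=True, exist_ok=True)  # cmd[4] is the -d folder
    return subprocess.call(cmd)

# Run up to four simulations at once -- E+ is CPU-bound, so match core count.
idfs = sorted(STUDY_DIR.glob("*.idf"))
with ThreadPoolExecutor(max_workers=4) as pool:
    exit_codes = list(pool.map(run_idf, idfs))
```

Giving every run its own output directory matters: parallel E+ runs dumped into one folder overwrite each other's `eplusout.*` files.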


Replies are closed for this discussion.

Replies to This Discussion

Beautiful, thank you Vinu, Anton, and Mostapha. By now I've managed to farm out my company's desktops for E+ analysis of a massive dome-shaped building with a high computation time. I used Flux to communicate the directories and to copy libraries between the different desktops, inspired by this post. I had some issues since E+ wouldn't run files located in a shared folder, so I had to work around that and automatically copy the result files back to the shared folders once an analysis is done. (Any ideas how I could skip this step?)
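The copy-back step described above can be scripted so it runs automatically after each analysis. This is only a sketch of that workaround: the local and shared folder paths and the list of result-file patterns are assumptions, not part of any Ladybug/Honeybee API.

```python
import shutil
from pathlib import Path

# Hypothetical locations: E+ runs from a local folder, and the results
# are pushed back to the shared network folder afterwards.
LOCAL_RUN = Path(r"C:\ladybug\local_run")
SHARED = Path(r"\\server\share\ep_results")

def copy_results_back(local_dir, shared_dir, patterns=("*.csv", "*.err", "*.eso")):
    """Copy matching result files from the local run folder to the share."""
    local_dir, shared_dir = Path(local_dir), Path(shared_dir)
    shared_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for pattern in patterns:
        for f in local_dir.glob(pattern):
            shutil.copy2(f, shared_dir / f.name)  # copy2 keeps timestamps
            copied.append(f.name)
    return copied
```

Calling this at the end of the run script (or from a small watcher that fires when `eplusout.err` appears) removes the manual copying, though it doesn't remove the step itself.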

I was also wondering if LB and HB could check whether the "defaultFolder_" already exists and, if so, not create a new one, as in that case I would not need to copy all the created materials from one machine to the other.

Finally, Mostapha, I was wondering if I could run .bat files for daylight analysis as described here. I tried to run the .bat files created by the "runDaylightAnalysis" component, but unfortunately I have received no results so far.

Thank you again for the posts! 

Best,

Tasos

Hi Tasos,

Very interesting! Happy to hear that you got it to work.

I'm not sure if I understand the first part of the problem.

I assume the issue with running the daylighting simulations is absolute vs. relative file paths. This is an issue that can be addressed.

Should we start a new discussion for each of your questions? They are not really related to this discussion, and more importantly it will be easier for other users to follow the discussion and find the answers later.

Mostapha

Hi Mostapha, 

That is my guess as well for the daylight analysis; I had the same issue with the E+ analysis, but I solved it and can now run analyses without any problem. A "ReRun" component for daylight analysis recipes, which would create a new local folder and run the analysis from there, would probably solve the issue. It would also be great if you could update the "ReRun IDF" component so it has the "WriteIt" input implemented, as in the example file you made for Vinu (the latest release only has the "RunIt" input, if I am correct).

As for the first part of the problem, I (probably falsely) assumed that I need to update the custom EP libraries on the remote machines in the C:\ladybug folder (or in the "defaultFolder_" I ask the LB_LB and HB_HB components to create) so that I can run the E+ analysis on my remote machines with the materials I created for the recipe on my local machine. Based on this assumption, I was wondering if the LB_LB and HB_HB components could first check whether the "defaultFolder_" path that I input already exists before creating a new one, and if it does, read from and write to its existing contents (EP libraries, RAD libraries, etc.) instead of creating new ones.

This way I could have all of the LB_LB and HB_HB components on the remote machines look in one shared "ladybug" folder. Is it a bit clearer now?
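The check-before-create pattern being asked for here is straightforward to sketch. The library file names below are placeholders, not the real Ladybug/Honeybee folder layout; the point is only that an existing shared folder and its contents are left untouched instead of being recreated.

```python
from pathlib import Path

def ensure_default_folder(default_folder,
                          library_names=("EPLibrary.idf", "RADLibrary.rad")):
    """Reuse an existing default folder instead of recreating its libraries.

    File names are illustrative placeholders, not the actual
    Ladybug/Honeybee layout.
    """
    folder = Path(default_folder)
    folder.mkdir(parents=True, exist_ok=True)   # no-op when it already exists
    created = []
    for name in library_names:
        lib = folder / name
        if not lib.exists():                    # keep existing shared libraries
            lib.touch()
            created.append(name)
    return created
```

Pointing every remote machine's components at the same shared path then means the first machine populates the libraries and the rest simply find them already in place.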

Finally, I think the next step for the remote analysis process is to parse and cache the generated .csv files in a web-service application and read them on a local machine without the waiting time. Practically, that means creating a web service that does exactly what the "read_EP" results component does, but on a server (or in the background of a local machine). This would save a significant amount of time, since the generated .csv files (in my case) exceed 1 GB and take a while to be read within GH. I will try to put this together and will update you, but if anyone has experience building web-service apps in Python, that would be of great help!
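Even without a full web service, the slow part (parsing a 1 GB text file every time) can be done once and cached to a compact binary, so later reads inside GH skip the text parsing. This is a minimal sketch of that caching idea; the function name and cache format (a pickle next to the .csv) are my own choices, not anything from the "read_EP" component.

```python
import csv
import pickle
from pathlib import Path

def cache_ep_csv(csv_path, cache_path=None):
    """Parse a (large) EnergyPlus .csv once and cache the columns as a
    pickle; later calls reuse the cache when it is newer than the csv."""
    csv_path = Path(csv_path)
    cache_path = Path(cache_path) if cache_path else csv_path.with_suffix(".pkl")
    if cache_path.exists() and cache_path.stat().st_mtime >= csv_path.stat().st_mtime:
        with open(cache_path, "rb") as f:
            return pickle.load(f)          # fast path: skip text parsing
    with open(csv_path, newline="") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    columns = {h: [row[i] for row in data] for i, h in enumerate(header)}
    with open(cache_path, "wb") as f:
        pickle.dump(columns, f)
    return columns
```

A background process on the server could run this as soon as a simulation finishes, so that by the time the local machine asks for results only the small cached file needs to travel.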

Tasos

 

Hi Tasos,

There is a Honeybee_Refine Daylight Simulation component, but it doesn't do exactly what you're looking for. Re-running a daylight analysis is not as straightforward as re-running an IDF file, since the inputs can be very different depending on the analysis type. That's not to say it isn't possible. If I get a chance to re-write the libraries, we can add a pkg file that tracks all the input files for a study and re-creates the analysis as needed.

ReRun IDF is now updated and looks like a standard Honeybee component (https://github.com/mostaphaRoudsari/Honeybee/commit/d6daf39cea34e9d...).

Back to the libraries: Ladybug and Honeybee should not overwrite or re-create the files if you already have them in the folder. Let me know if this is not the case and we will address it.

I'm waiting on the OpenStudio team for your last wish. They are already creating a web service that can take care of what you mentioned above. Once they have it done, you can use it alongside the OpenStudio component.

Hi Vinu,

I've tried playing with your file, and whatever I do, I can't get the component that runs batch files in parallel to work; they simply don't run. Would you be able to do a quick Skype call with me so that I can see how it works on your system?

Please add me on Skype. My Skype name is anton.szilasi.

Thanks,


Anton

Anton, the analysis runs in the background, which means you won't see any pop-up window. How did you set it up on your machine?
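For anyone confused by the missing pop-up: on Windows, whether a launched batch file shows a console window is controlled by the process creation flags. This sketch shows one way to toggle that behavior when launching a .bat yourself; the helper names are mine, and the flag only exists on Windows builds of Python, so it is guarded by an OS check.

```python
import os
import subprocess

def hidden_window_flags():
    """Creation flags that suppress the console pop-up on Windows;
    empty elsewhere (CREATE_NO_WINDOW only exists on Windows Python)."""
    if os.name == "nt":
        return {"creationflags": subprocess.CREATE_NO_WINDOW}
    return {}

def run_batch(bat_path, show_window=False):
    """Run a .bat file and wait for it; hide the console by default."""
    kwargs = {} if show_window else hidden_window_flags()
    return subprocess.call(["cmd.exe", "/c", str(bat_path)], **kwargs)
```

Passing `show_window=True` gives the familiar pop-up, which can be reassuring as a sign the simulation is actually running, at the cost of a window per process when running many in parallel.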

Hi Mostapha, it feels uneasy without the pop-up, and it also was not running or giving me data initially. So I Boolean-toggled the ReRun IDF component and now it runs fine for me. Am I doing it wrong? I have posted my file above.

I assume you're doing it right if it is running, so I wouldn't be concerned about the changes you have made.

Thank you Mostapha!

Ah, I didn't realize that. Maybe it would be better if it ran with a pop-up.

Sure Anton, we can Skype tomorrow! I'm in Pacific time, though. Could you let me know the time?

OK, just add me; I haven't received your Skype request yet.
