Grasshopper

algorithmic modeling for Rhino

Create multiple batch files so I can run them in parallel on a 128-processor server

Hello everyone,

I'm going to calculate annual glare for 100 Rhino views. I would like to create 100 batch files so I can run them in parallel on a server that has 128 CPUs. I have two questions:

1- What is the best way to create 100 batch files, one for each view?

The only thing I can think of is to create a list of the names of my views, connect it to the List Item component, and then animate the item index slider. However, each time the slider moves a simulation starts, and I have to close the DOS window to get a batch file; this means closing 100 windows, one at a time. Is there any way Honeybee can write the batch files without running the simulations?
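While waiting for a built-in option, one workaround is to write the batch files yourself from the list of view names. This is only a sketch: the template below is a placeholder, and the folder layout and Radiance commands are assumptions — you would copy the real contents from one Honeybee-generated .bat file and parameterize the view name.

```python
import os

# Placeholder template -- inspect one Honeybee-generated .bat file and
# paste its real contents here; only the view name should vary per file.
TEMPLATE = (
    "@echo off\n"
    "cd /d C:\\ladybug\\{view}\\annualSimulation\n"
    "rem ... Radiance commands for view {view} go here ...\n"
)

def write_batch_files(view_names, out_dir):
    """Write one .bat file per view without launching any simulation."""
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    for view in view_names:
        path = os.path.join(out_dir, view + ".bat")
        with open(path, "w") as f:
            f.write(TEMPLATE.format(view=view))
        paths.append(path)
    return paths
```

Each generated file can then be run later, on any machine, without a single simulation window popping up on yours.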

2- I read somewhere that Radiance uses a single processor. Does this mean that I can run all 100 batch files simultaneously on a server with at least 100 CPUs and get results quickly?

Thanks

Rania


Replies to This Discussion

Rania,

You ask this question at a good time, as Mostapha and I are in the process of making the components more amenable to applications such as this.  Whatever final workflow we end up suggesting, I am confident it will involve setting up sliders to run through design spaces for cases like yours.  So setting up the slider and a list of views will be relevant to your situation no matter what.

Animation of sliders works but you can also use the "Fly" component to run through all of the combinations of multiple sliders.

I think we will soon add an option to run the energy/daylight simulation components all of the way to generating all files (possibly including batch files) without executing them.  That way, you won't have to close out of the window each time.

We will also write something to execute all of the files for you, but this part may vary depending on the parallel-processing platform used.  In any case, your logic about 100 batch files on a 100-CPU computer makes sense as long as the operating system of this computer knows to run each batch file on a separate thread (which I imagine it does, but I haven't worked much with supercomputers yet).
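The "execute all of the files" part can be sketched in a few lines of Python: launch every batch file at once without waiting, and let the OS scheduler spread the independent processes across the available cores. The demo commands below are stand-ins for the real `view_001.bat` paths, which would vary per setup.

```python
import subprocess
import sys

def run_parallel(commands):
    """Launch every command at once, then wait for all of them.
    Each command is an independent process, so the OS scheduler
    distributes them across the available cores."""
    procs = [subprocess.Popen(cmd, shell=True) for cmd in commands]
    return [p.wait() for p in procs]  # exit codes, in order

# Demo with trivial commands standing in for "view_001.bat" etc.
codes = run_parallel(
    ['"{}" -c "print({})"'.format(sys.executable, i) for i in range(4)]
)
```

An exit code of 0 for every entry means every simulation process finished cleanly; anything else flags a view worth re-running.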

Stay tuned and I will let you know when we add in the option to generate all files from the simulation components.

-Chris

Not sure if this helps, but the MPI platform has options such as bind-to-core and by-socket, which bind processes to specific cores. I know they're used in CFD studies; they could be used here as well(?)

Kind regards,

Theodore.

Theodoros,

I've never used supercomputers before; this is my first time! I'll look into these commands; I think they will be extremely helpful in my situation.

Thanks for the help. 

Best,

Rania

Chris,

This is great; parallel processing is crucial when you need to perform hundreds of simulations. I'll stay tuned!

For the time being, I'll just close the 100 pop-up windows manually.

Best,

Rania

Just a note about Theodore's parallel processing suggestion: because the parallel processing of the daylight studies is so simple (it's literally just one core per simulation), I don't think that you should have to use those commands.  These commands seem very relevant to CFD parallel processing, however, since you often need the cores to talk to one another when running parallel CFD studies.  As such, overriding the default tendency to run everything on one or a few cores for the CFD seems to be necessary.

Chris, thanks for the note. I'm still trying different things and preparing the batch files before I start working on the server. I'll definitely keep you updated.

Just a quick question: are the batch files created right when the DOS window appears OK to use? In other words, do the contents of the batch files written at the beginning of a simulation stay the same after the simulation is done? I'm planning to close the pop-up DOS windows as soon as they appear, and I'm not sure whether the batch files being written are valid to run later.

Rania,

Batch files are the same both before and after the simulation.  Closing the window will not change the batch file.

-Chris

Hi Chris,

How does this discussion relate to the upcoming connection with Parallel Works?

Awesome! Thanks.

I just wanted to share my experience performing 400 annual glare analyses on a monster 128-CPU server. So here it goes:

1- The server is based on Amazon's EC2 service. It has 128 vCPUs and 1.9 TB of RAM.  I think I'm going to start a GoFundMe campaign to buy one for myself :)

2- The server costs about $13 an hour. I get free access to supercomputers through my university and xsede.org because I earned an NSF honorable mention last March; however, the supercomputers available through both resources are a little complicated for me to use, as opposed to the one available from Amazon, which has Windows Server 2012 already installed.

3- I wanted to run 400 annual glare simulations for 400 different views. 

4- I tried to perform an annual glare simulation for one view on my Dell XPS, which has an Intel Core i7-6700HQ processor and 16 GB of memory. The simulation took 2 hours to complete; the Radiance parameter ab was set to 6.

5- I wanted to obtain the batch file for each view so I could run them on the server. So I used the Fly component to run all 400 simulations and closed the cmd windows. That wasn't bad (for me at least) because I asked my son to do this job for me; he was just glad to help :)

6- I created one batch file using this cmd command:

dir /s /b *.bat > runall.bat

This created a file with the path to each .bat file. I edited this file in Notepad++ to add the word "start" at the beginning of each line, using the Find and Replace dialog.

7- I split my newly created batch file into 3 batch files, each with about 130 file paths, each prefixed with "start".

8- Installed Radiance on the server.

9- Ran the first batch file on the server. This started 130 cmd windows performing my simulations; CPU usage was anywhere between 90% and 100%, and about 105 GB of RAM was used.

10- It took about 5 hours to complete all 130 simulations. I expected them all to run in 2 hours, but I can't complain, because this would have taken about 260 hours on my laptop. After those simulations were done, I ran the second and then the third batch file (about 15 hours total).

11- I got 400 valid .dgp files. Couldn't be happier!
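Steps 6 and 7 above (prefixing each batch path with "start" and splitting the list into three runner scripts) can also be automated instead of done by hand in Notepad++. A minimal Python sketch, assuming the `runall_N.bat` file names and the chunk count of 3 as illustrative choices:

```python
import os

def make_runners(bat_paths, out_dir, n_chunks=3):
    """Split a list of .bat paths into n_chunks runner scripts.
    Each line is prefixed with `start` (the empty "" is the window
    title) so Windows launches every batch file concurrently."""
    os.makedirs(out_dir, exist_ok=True)
    lines = ['start "" "{}"'.format(p) for p in bat_paths]
    size = -(-len(lines) // n_chunks)  # ceiling division
    runners = []
    for i in range(n_chunks):
        chunk = lines[i * size:(i + 1) * size]
        if not chunk:
            break
        path = os.path.join(out_dir, "runall_{}.bat".format(i + 1))
        with open(path, "w") as f:
            f.write("\n".join(chunk) + "\n")
        runners.append(path)
    return runners
```

Running `runall_1.bat` on the server then opens one cmd window per simulation, exactly as in step 9.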


Hi Rania, congrats! Thank you for sharing your experience. It made me smile! At the same time, it confirms that we really need to cloud-enable Honeybee so that users like you can explore computationally intense studies without having to close 400 command windows manually. Good luck with your research.

Woohoo!  Very exciting, Rania!  Thank you so much for sharing. We will definitely start putting in some capabilities to make it easier for running cloud simulations like this.
