plug-ins can be found at this website:
http://www.food4rhino.com/?ufh
The plug-in we were going to focus on yesterday was Kangaroo, a form-finding and mesh-editing plug-in. The instructions for downloading it can be found here, although all computers in the lab should already have it loaded.
http://www.food4rhino.com/project/kangaroo?ufh
Your assignment for next week is found below.
Assignment #6
1. You are to model a membrane shading structure for the WAAC outdoor courtyard adjacent to 1001 Prince Street (seen below).
This is an abstract model; no measurements are required. A basic model would be the walls of the various buildings and the ground. Spend a few minutes in the space and get a feeling for what a shading membrane would look like there.
2. Use the Kangaroo plug-in to make the mesh. I would look at the tutorials below (do Part 2 of the Designalyze tutorial as a minimum).
http://www.designalyze.com/tutorial/columbia-meshing-spring-2013-class-04
https://www.youtube.com/watch?v=9i8dGtcQxMk
https://www.youtube.com/watch?v=yY5WU_8L4S8
There is also a manual that comes with the download. It should be on the shared folder at the WAAC; if not, just download it for free from the Food4Rhino website. If form finding is new to you, the sketch below shows the basic principle.
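Here is a minimal sketch, in plain Python with made-up mesh data, of the kind of spring relaxation a solver like Kangaroo performs. Kangaroo's actual solver and component inputs differ, so treat this only as an illustration of the principle: edges act as springs, anchored vertices stay put, and the free vertices settle into a relaxed shape.

    # Toy spring relaxation on a tiny quad mesh (edge (0, 2) braces the quad).
    # All data below is made up for illustration.
    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
    pos = {0: [0.0, 0.0, 0.0], 1: [1.0, 0.0, 0.0],
           2: [1.0, 1.0, 0.0], 3: [0.0, 1.0, 0.0]}
    anchors = {0, 1}      # pinned vertices (the supports)
    rest = 0.8            # target edge length; shorter than now, so the mesh pulls taut
    k, steps = 0.1, 200   # spring stiffness and iteration count

    for _ in range(steps):
        forces = {v: [0.0, 0.0, 0.0] for v in pos}
        for a, b in edges:
            d = [pb - pa for pa, pb in zip(pos[a], pos[b])]
            length = sum(c * c for c in d) ** 0.5
            pull = k * (length - rest) / length   # positive when stretched
            for i in range(3):
                forces[a][i] += pull * d[i]
                forces[b][i] -= pull * d[i]
        for v in pos:
            if v not in anchors:
                pos[v] = [p + f for p, f in zip(pos[v], forces[v])]

    print(pos)  # relaxed positions of the free vertices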
3. It does not matter what your membrane structure looks like, but try to use your eye as a designer. Think about things such as sun angles, the experience both within the courtyard and from the windows above, support locations on the walls, etc.
Take a screenshot of the completed model. Do not save it to Dropbox; bring it to class next week, March 23rd.
Post here if you have any questions. See you all next week!
…
e.github.io/hydra/viewer?owner=chriswmackey&fork=hydra_2&id=Outdoor_Microclimate_Map
Thank you very much in advance!
1. Why is the underground zone representing the ground defined as a plenum zone? By default, an office zone program is assigned. Will this affect the outside surface temperature of the ground plenum zone and, in turn, the outdoor microclimate map calculation?
2. I assume the GroundMaterial construction, composed of five layers of 200 mm concrete, is assigned to the ground plenum zone to approximate a thick concrete ground surface. But why is this construction assigned to the zone using both the Set EP Zone Construction and the Set EP Zone Underground Construction components? Will the surfaces of this zone be automatically recognized as underground surfaces based on their positions relative to the default XY plane?
3. Why is a brep connected to the distFromFloorOrSrf input of the Indoor View Factor Calculator component, when that input expects a number according to its annotation?
4. Why is the outdoor comfort analysis recipe used with the indoor comfort analysis component?
5. Why are OutdoorComfResult and DegFromNeutralResult two CSV files with PPD and PMV values, if the PMV/PPD thermal comfort model is only applicable to indoor, air-conditioned spaces?
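For reference on question 5: in the standard (ISO 7730) comfort model, PPD is a fixed function of PMV, which is why the two values always travel together in result files. A quick check of the relationship:

    import math

    def ppd_from_pmv(pmv):
        # ISO 7730: PPD is computed directly from PMV
        return 100.0 - 95.0 * math.exp(-(0.03353 * pmv**4 + 0.2179 * pmv**2))

    print(ppd_from_pmv(0.0))  # 5.0 -- even a neutral PMV leaves 5% dissatisfied
    print(ppd_from_pmv(1.0))  # ~26 -- "slightly warm"

Whether applying that indoor model outdoors is meaningful is exactly what the question asks.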
…
20%3ATopic%3A1241658&page=2#comments), I have some questions regarding how the lighting control in HB/Daysim works.
1) Which specific sensor point (out of many) in the "lightingControlGroup" should fulfill the "lightingSetpoint" requirement (300 lux)?
In other words: does the average illuminance calculated from all the "sensorPoints" in the "lightingControlGroup" (instead of any single sensor point) have to fulfill the "lightingSetpoint" requirement?
If this is the case: why is the lighting schedule obtained from DSElectricalLightingUse different from the lighting schedule obtained by post-processing all the hourly illuminance results at all the sensor points? (Please see the attached GH file, edited from Mostapha's version.)
2) What is the difference between hooking up "lightingControlGroups_" and not, as shown by the RED arrowhead in the attached file? (The results are different as well.)
3) If I have a large space (like a 40 m × 70 m sports hall) and want to know its annual lightingControlProfiles, do I need a "lightingControl" component hooked up to "readAnnualResultsI" in order to calculate them?
If so, how many "testPoints" should I choose as "sensorPoints_" in the "lightingControl" component?
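For what it's worth, here is a sketch of one plausible reading of question 1: dimming against the group's average illuminance. Daysim's actual control algorithm (occupancy, control type) is more involved, so this only pins down the averaging question, not the real schedule:

    setpoint = 300.0  # lux

    def dimming_fraction(sensor_lux):
        # Fraction of lighting power needed if the group dims against the
        # AVERAGE illuminance of all its sensor points (one reading of the docs).
        avg = sum(sensor_lux) / len(sensor_lux)
        if avg >= setpoint:
            return 0.0               # daylight alone meets the setpoint
        return 1.0 - avg / setpoint  # electric light tops up the shortfall

    hour_lux = [420.0, 180.0, 250.0]   # hypothetical hourly sensor readings
    print(dimming_fraction(hour_lux))  # averages ~283 lux -> ~0.056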
Thank you in advance for your kind help!
Best,
Ding…
uld help me to optimize the script so it works reliably.
In the end, the script should work with only the following inputs:
- Top-Curve
- Bottom-Curve
- accuracy (like poly count)
- is the bowl an open or closed structure?
This is an example of a good result:
From here it's probably best if you open both attached files, so you understand the problems.
1. Bug:
The offset direction of the bottom curve sometimes needs to be set up by hand.
The script uses "Loft" on a set of three-point arcs to create the transitions. "Arc 3Pt" needs a "Point B" on the offset of the bottom curve. Sometimes the offset is inverted, so I need to change it by hand.
The rule to make it work correctly is: "The offset of the bottom line goes in the same direction as the top line, but at the same height as the bottom line."
How can I implement this in GH? (One possible approach is sketched below.)
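One way to automate that rule, sketched in GHPython (input names are assumptions): offset the bottom curve both ways and keep whichever candidate ends up closer to the top curve.

    import Rhino.Geometry as rg

    def pick_offset(bottom_crv, top_crv, dist, plane, tol=0.001):
        # try both offset directions
        style = rg.CurveOffsetCornerStyle.Sharp
        candidates = []
        for d in (dist, -dist):
            result = bottom_crv.Offset(plane, d, tol, style)
            if result and len(result) == 1:
                candidates.append(result[0])
        if not candidates:
            return None

        def gap_to_top(crv):
            # average distance from a few sample points to the top curve
            total = 0.0
            for i in range(5):
                pt = crv.PointAtNormalizedLength(i / 4.0)
                ok, t = top_crv.ClosestPoint(pt)
                total += pt.DistanceTo(top_crv.PointAt(t))
            return total

        # the correct offset "goes toward the top line", i.e. has the smaller gap
        return min(candidates, key=gap_to_top)

    # a = pick_offset(x, y, 1.0, rg.Plane.WorldXY)  # x, y wired as curve inputs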
2. Bug:
The floor generation needs a lot of guessing at the right index numbers of lists.
The script uses two "Deconstruct Brep" components to find the actual bottom curve of the created transition brep, and "Patch" is used to create a floor from this curve.
If the bowl is an open structure, the script creates a line between the endpoints of the bottom curve to close it, in order to create a trimmed "Patch". But again, you have to set up the right index numbers by hand...
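Instead of hardcoding indices, the bottom edges can be selected by height. A GHPython sketch (input names assumed):

    import Rhino.Geometry as rg

    def bottom_edges(brep, tol=0.01):
        # keep every edge whose midpoint sits at (or near) the lowest Z found
        mids = [e.PointAtNormalizedLength(0.5) for e in brep.Edges]
        z_min = min(p.Z for p in mids)
        return [e.ToNurbsCurve() for e, p in zip(brep.Edges, mids)
                if p.Z - z_min <= tol]

    # curves = bottom_edges(x)  # x: the transition brep
    # floor = rg.Brep.CreatePatch(curves, 10, 10, 0.01)  # patch them into a floor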
3. Bug:
If the bowl is an open structure and the endpoints of the top line and the bottom line are the same, the lofting does not work. At the moment I use a script that finds duplicate points in the list and deletes them.
But the result is that the loft does not start at the beginning or the end. Here is an image.
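One possible fix, as a sketch (GH point inputs assumed): cull only consecutive duplicates within a tolerance, so the first and last points survive and the loft still spans the full run.

    def cull_consecutive(points, tol=0.001):
        # drop a point only if it sits on top of the previously kept one
        kept = [points[0]]
        for pt in points[1:]:
            if pt.DistanceTo(kept[-1]) > tol:
                kept.append(pt)
        return kept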
I have only a little experience in gh, but i really want to learn more.
Thank you all for your help!…
the following image of a hut.
I do not have experience using Kangaroo to simulate forces, but I have made a test using multiple random components on a flat surface to fake the effect I'm going for. See the image below.
The main issue I'm having is that the original file used for my test surface relied on Box Morph and the Variable Pipe command. Box Morph is a bit touchy on a curved surface, and it is not as elegant as I would like (i.e. I want all the hair diameters to be perfectly circular and uniform in size). Variable Pipe also does not align the base of the hair with the existing surface, which means I have to offset the surface and then trim the excess of my pipe, leading to heavy code and the file crashing.
So I'm trying to rebuild the "hairs" using a new method:
1) Subdivide the surface
2) Find the midpoint of each surface and then create a straight line perpendicular to it
3) Move a point along the straight line (between the start and end points) in the z direction, and then create a NURBS curve through this point and the start and end points
4) Create a circle at the base of each curve, and then two more circles: one at the middle point (I think I set the parameter to 0.9) and one at the end of the curve
5) The problem: now I am trying to sweep along these three circles and the NURBS curve to create a bent hair/pipe that is flush with the conic surface, but it does not work.
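Since the goal is perfectly circular, uniform hairs, a constant-radius pipe on each bent rail could replace the three-circle sweep entirely. A GHPython sketch (point inputs and numbers are assumptions):

    import math
    import Rhino.Geometry as rg

    def make_hair(base_pt, tip_pt, bend=0.3, radius=0.05, tol=0.001):
        # rail through the base, a pushed-up midpoint, and the tip
        mid = rg.Point3d((base_pt.X + tip_pt.X) / 2.0,
                         (base_pt.Y + tip_pt.Y) / 2.0,
                         (base_pt.Z + tip_pt.Z) / 2.0 + bend)
        rail = rg.Curve.CreateInterpolatedCurve([base_pt, mid, tip_pt], 3)
        # a pipe keeps every cross-section circular and the same size
        pipes = rg.Brep.CreatePipe(rail, radius, False, rg.PipeCapMode.Round,
                                   True, tol, math.radians(1.0))
        return pipes[0] if pipes else None

The pipe starts exactly at the base point on the surface, so no offsetting and trimming should be needed.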
If someone can help that would be amazing. I've included my original surface test file and my new file where I am rebuilding using the sweep command. Below is a drawing of what I'm trying to achieve.
…
g by a given diameter and some shapes.
What I've done:
1. Diameter definition: find the final object diameter.
2. Rail definitions: from the initial rails and the diameter definition, I found a scaling factor in order to create the final rails with the same form as the initial rails but at the wanted distance from the center.
I find the scaling factor using the ray and the distance of a point of the initial rail projected onto the XZ plane.
3. Shape definitions: like the rail definitions. I used more than one shape because sometimes I need particular forms along the path.
The Sweep 2 Rail component gives me a poor result.
I tried the options inside the component to manipulate the data, but I haven't found a solution.
If I bake the final rails and shapes, the Sweep 2 Rail works well in Rhino (selected object), but not in GH (red object).
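A common cause of a GH Sweep2 failing where Rhino's _Sweep2 succeeds is section curves whose directions (and seams, for closed sections) disagree; the Rhino command fixes these interactively, the component does not. A hedged GHPython sketch that aligns the sections against the first one before sweeping:

    import Rhino.Geometry as rg

    def aligned_sweep2(rail1, rail2, shapes, tol=0.001):
        ref = shapes[0]
        fixed = [ref]
        for crv in shapes[1:]:
            c = crv.DuplicateCurve()
            if not rg.Curve.DoDirectionsMatch(ref, c):
                c.Reverse()  # flip sections that run the opposite way
            fixed.append(c)
        return rg.Brep.CreateFromSweep(rail1, rail2, fixed, False, tol)

    # breps = aligned_sweep2(r1, r2, sections)  # rails and sections wired in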
I attach the RH and GH files for anyone who can help me. Thanks!
Filippo…
rking with. I am an architecture student as well, so please bear with me :).
I am currently working on a high-rise building. All elements (that is, the core, slabs, and columns) will be analyzed as made of reinforced concrete. I do not want to optimize the reinforcement distribution; I will create a material close to reinforced concrete's properties.
I think I understand how to create and assemble models made of beams (the COLUMNS in my model) in Karamba, but I get totally lost when it comes to combining them with shells (the CORE and SLABS in my model).
I would like to optimize the use of material (volume or mass) with:
A) slab deflection limited to 3 cm (GRAVITY + LIVE LOAD)
B) the top of the building cannot "lean out" (horizontal deflection from WIND LOAD) more than 1/500 of its height
I post my questions below:
1) I would like to apply the wind load on the bigger exterior walls of the building. What would be the best method to do that? I thought about applying the load at the level of the slabs as a uniform line load (marked blue in the model). A uniform line load needs to be supplied with a beam ID. How can I simulate that? Would I have to add beams on the slab edges for this to work correctly? If yes, how would I connect them with the slab so that all elements transfer the loads together? Also, in that case, how do I convert the wind pressure from kN/m² to kN/m? (A worked example of the conversion follows below.)
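On the last point, the conversion is just the pressure multiplied by the tributary height each floor level collects (half a storey above plus half a storey below). The numbers here are assumptions for illustration:

    wind_pressure = 1.0  # kN/m^2, assumed design pressure on the facade
    storey_height = 3.5  # m, assumed floor-to-floor height

    # each slab-edge beam collects a full storey height of facade
    line_load = wind_pressure * storey_height
    print(line_load)     # 3.5 kN/m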
2) I know that the live load I want to apply is 4 kN/m². How do I apply such a load to the mesh so that the results are reliable? It is hard to turn it into point loads, as the mesh faces (and the points where the loads are applied) would have to be 1 × 1 m, if I understand correctly.
3) How would you place supports under the core part?
4) I do not want to vary the slab/core sections; I would like to find the minimal values so that the mentioned conditions are met, for example 25 cm slabs, 40 × 40 cm columns, 50 cm core walls. I wouldn't like the slab and core to have different heights in different places. Is it possible to use the "Optimize Cross Section" component, or should I use Galapagos for that?
Sorry for such a long post.
Thank you for your time and help…
Added by Wujo to Karamba3D at 1:11pm on September 27, 2017
what they really mean by that, as in what buttons to push, so I assume it's a Windows Path entry?
2.) Modify PATH
Add the install location on the path, this is usually: C:\Program File\IronPython 2.7
But on 64-bit Windows systems it is: C:\Program File (x86)\IronPython 2.7
As a check, open a Windows command prompt, go to a directory (which is not the above), and type:
> ipy -V
PythonContext 2.7.0.40 on .NET 4.0.30319.225
Tutorial on setting a Windows environmental variable (path):
http://www.computerhope.com/issues/ch000549.htm
But this fails to point out that PATH already contains many entries separated by semicolons, so if I merely add a new variable called "path" it's likely that I will destroy existing program function. There's no info on how to just tack on another entry, and the Windows 7 edit box doesn't even show the whole collection, but one item (!), so I copied the existing PATH into a text editor to see the whole collection, and added the C:\Program Files (x86)\IronPython 2.7 entry after an added semicolon, correcting for an Enthought page typo of no 's' on the end of "Program Files". I also checked the other entries, and many pointed to old missing directories, so I deleted those.
...and the test fails: "ipy" is not recognized as a command, even though the path now shows up when using "path" in the Windows CMD window (that is, if I copy it all by right-clicking and paste it into a text editor to really view it all). I can run it from the source directory just fine.
The rabbit hole was indeed deep. Using the Task Manager (Ctrl-Alt-Delete) to kill Explorer, then Run in the menu to restart "Explorer", along with restarting the Windows CMD window, worked. I can now invoke IronPython ("ipy") via the command line from any directory. For the "path" I edited PATH in the System Variables, not the User Variables. No, you don't have to type that whole crazy line above just to test the path variable: just "ipy" (and Ctrl-Z to quit IronPython) in a CMD window invoked by typing "cmd" into the Start menu search box.
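A quick way to double-check the entry from any Python session, using only the standard library (the target directory matches the one added above):

    import os

    # new CMD windows inherit the updated PATH; already-open ones do not,
    # hence the Explorer restart described above
    target = r"C:\Program Files (x86)\IronPython 2.7"
    entries = [e.strip() for e in os.environ["PATH"].split(";") if e.strip()]
    print(target in entries)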
From the CMD line this step did work fine:
3.) ironpkg
Bootstrap ironpkg, which is a package install manager for binary (egg-based) Python packages. Download ironpkg-1.0.0.py and type:
> ipy ironpkg-1.0.0.py --install
Now the ironpkg command should be available:
> ironpkg -h
(some useful help text is displayed here)
But of course Step 4 fails, giving pages of what seem to be error messages:
C:\Users\Nik>ironpkg scipy
Traceback (most recent call last):
  File "C:\Program Files (x86)\IronPython 2.7\lib\site-packages\enstaller\utils.py", line 92, in write_data_from_url
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 126, in urlopen
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 397, in open
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 509, in http_response
...
Why can't I just download NumPy as a normal file, and thus also make it easy for other users to install when they use my scripts? This is just crazy and lazy. The Enthought developer has turned this into a computer game, with a missing registration link, and then the last step spits out errors with utterly no information on how to fix it manually.
This Step 4 error is covered here:
http://discourse.mcneel.com/t/trying-to-import-numpy-in-rhino-python-but-im-getting-this-error-cannot-import-multiarray-from-numpy-core/12912/16…
Added by Nik Willmore at 2:36pm on October 11, 2015
about it.
2. Nick's comment below got me thinking about unit testing for clusters. Being able to work with data flowing in from outside the cluster, or having multiple states to test against, could be really cool. Creating definitions that were valid across a general cross-section of possible input parameters was a significant issue for us. It was all too easy to write the definition as if we were drawing (often we were working from sketches) and then have it fail when the input parameters changed slightly.
4. I wasn't thinking about threading the solver itself. I was thinking along the lines of some IDEs I've seen which compile your project while you type it. I know that threading within components and at the RhinoCommon level is a freaking hard problem that has been discussed at length already. (Although when, 5-10 years from now, it's finished, it will be very cool.)
Let's say the solver is threaded and the canvas remains responsive. As soon as you make a change to the GH file, the solver needs to be terminated, as it is now computing stale data.
What if the solver was a little more atomic, like a server? A GH file is just a list of jobs, with the order of the jobs and the info needed to do them rigidly defined, right? The UI could pass the solver stuff to do and store the results back in the components on a component-by-component basis (I have no idea what the most efficient way to do this is in reality; I'm just talking conceptually). This might even allow running multiple solvers, exploiting at least the parallelism that might be built into a given GH file (not within components, but rather solving non-interdependent branches of components simultaneously; a toy sketch of the idea follows below). This type of parallelism would more than make up for the performance hit you alluded to for separating the UI and the solver (at least for most of the definitions I write).
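To make that concrete, here is a toy sketch of the idea in Python; the component graph and names are made up, and GH's real solver works nothing like this. Treat the definition as a dependency graph and solve each "wave" of non-interdependent components in parallel:

    from concurrent.futures import ThreadPoolExecutor

    # made-up dependency graph: C needs A; D needs A and B; E needs C and D
    deps = {"A": [], "B": [], "C": ["A"], "D": ["A", "B"], "E": ["C", "D"]}

    def solve(comp):
        return "result of " + comp  # stand-in for real component work

    done = {}
    with ThreadPoolExecutor() as pool:
        while len(done) < len(deps):
            # every component whose upstream results exist is ready to run
            ready = [c for c in deps
                     if c not in done and all(d in done for d in deps[c])]
            # one wave of independent components, solved simultaneously
            for comp, res in zip(ready, pool.map(solve, ready)):
                done[comp] = res

    print(done)

Results land back component by component, so partially solved branches are already usable while the rest of the graph is still running.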
I was imagining a couple of scenarios:
a) Writing a parallel module: the solver starts chewing away, you see it working, you know it's done 1/3 of the work; if you have something to do at that point, you could connect up to some of the already-calculated parameters and write something in parallel to the main trunk, which is still being solved.
b) Skipping modifications: you need to make a series of interventions at different intervals along a section of code. Sure, you could freeze a section of downstream code, make modifications so you can observe the effects more quickly, unfreeze a bit more, and repeat until you're done, then unfreeze that big chunk at the end to make sure you haven't blown anything up. Just letting it resolve as far as it can while you sit there waiting for inspiration seems a lot more intuitive to me, though.
On a file which takes 15 minutes to solve that's no big deal, but you certainly don't want to add a 20-millisecond delay to a solution which only takes 30 milliseconds.
You also wouldn't notice it at that point :-) Perhaps for things where it would really make a difference, like Galapagos interactivity, it could be disabled; or could the existing "speed" setting just absorb this need? Since the vast majority of the time GH spends solving is on files under active development, not on finished code, I think qualitative performance is probably more important than quantitative performance (again, with cases like Galapagos needing to be accommodated). In our case the code only had to "work" once, since its output went to a CNC machine to make a one-off project, and it didn't really matter if it took 15 seconds or 15 hours for the final run.
Lastly, I have no way to predict how long a component is going to take. I can probably work out how far along in steps a component is, but not how far along in time.
That's OK. From a user's point of view, just seeing a percentage tick along once in a while would be nice reassurance that the thing is just slow and has not, in fact, crashed. Maybe there could be two modes of display: a simple percentage version for unpredictable code, and, for those of us able to calculate the time taken by our algorithm from the number of input parameters, a countdown in seconds or minutes or whatever.
I think a good place to start with these sorts of problems is to keep on improving clusters, ... etc. etc.
I totally agree.
…
Added by Dieter Toews at 7:53pm on September 4, 2013