o use these extensions in order to integrate numerous tools for analysis and simulation in the architectural process.
This course aims to develop a link between the virtual model and the real context through structural or environmental simulations, using other dedicated software or plug-ins. Through this link the virtual model acquires physical properties that can further modify and adapt the initial model. This creates feedback loops that can optimize the design, producing an object responsive to environmental conditions.
Curriculum
Mesh subdivision with Weaverbird, continuous surfaces without NURBS
Genetic optimization with Galapagos, optimal search
Physical environment feedback with Diva and Geco, solar and day lighting analysis
Adding physical properties with Kangaroo Physics, interactive form-finding
Linking the parametric model with structural analysis using Karamba, structural performance simulation
Extracting data with Firefly and Kinect, 3D scanning and human movement tracking
Exchange of information between Grasshopper and other applications with gHowl: links to internet feeds or Excel files.
Schedule:
Module 04 / Grasshopper intermediate & advanced (24 h)
11 Oct – 26 Oct 2013
Fri: 16–20
Sat: 10–14
Language: Romanian
Organized by:
OAR Bucureşti – Romanian Order of Architects, Bucharest Branch
Trainers:
Ionuț Anton, idz arhitectura (ART-Authorised Rhino Trainer)
Daniela Tănase, idz arhitectura (ART-Authorised Rhino Trainer)
https://www.facebook.com/cursurigrasshopperrhinoceros
http://www.oar-bucuresti.ro/anunturi/2013/02/27/d/…
Added by Dana Tanase at 2:49am on September 5, 2013
ld work.
For example, there's a grid shell and I've got a number of control points (for example 3) that can move up and down. Depending on the control points, I get some forms that are structurally good and some that are bad.
In my office we've got a GH component which converts the geometry into structural members, solves for the structural forces through external software called Sofistik, and then returns some values to GH, for example maximum bending moments (like Karamba).
Now I want to create an optimization component, or something like that, to minimize e.g. the bending moments in the given geometry.
Let's start with how the component works.
So suppose I have three control points that can only move in the z-direction:
P1(0,0,Z1), P2(10,0,Z2), P3(5,5,Z3)
They only depend on Z, so everything depends on Z1 to Z3, which have a range between 0 and 10, for example.
First I want to get some (between 9 and 15) random particles; one particle consists of these 3 different Z's.
So, for example, the first particle Part1 is [Z1=10, Z2=5, Z3=7],
the second particle Part2 is [Z1=7, Z2=1, Z3=9],
and so on.
I created these start particles in a cluster; see the attached file.
I also tried this in C#, but thought it would be easier in GH.
After I've got the start particles, I want to output the first particle and evaluate the target value in GH from its Z's. For that I had to take the first branch and graft it (see the earlier discussion).
Afterwards I want to save this target value, which depends on the first starting particle. Then I want to output the second starting particle, evaluate its target value, and store it, and so on until the last starting particle has had its target value assigned.
Then I want to pair the particles with their target values, e.g. part1: t=0.9, part2: t=1.8, ...
Then I want to define neighborhoods, i.e. the count of the expected local minima.
These neighborhoods can look like this: each neighborhood has to include not less than 3 particles, and the particles have to be next to each other.
E.g. if there are 12 particles and I want to look for 3 local minima, I need 3 or 4 neighborhoods. Then I would take 3 neighborhoods, because the more particles in one neighborhood, the better.
So the count of neighborhoods would be N = min{⌊(count of particles)/3⌋, N_min}.
How to define these neighborhoods I don't know at the moment; I think you have to look at the distances between the particles. E.g. part1 = (9,9,9) and part2 = (9,9,8) are next to each other, but part3 = (1,1,2) is far away.
Then each start particle is set as partx_localbest.
And in each neighborhood the best of these local bests is Part_NyBest (the best is the one with the smallest target value).
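As a sketch of the distance-based grouping described above (my own greedy approach, not something from the attached files): particles are lists of Z values, each neighborhood is seeded with an unassigned particle and filled with its nearest unassigned neighbors, and N = min(⌊count/3⌋, wanted) so every neighborhood keeps at least 3 particles.

```python
import math

def distance(p, q):
    """Euclidean distance between two particles (sequences of Z values)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def make_neighborhoods(particles, n_wanted):
    """Greedily group particle indices into neighborhoods of nearby points.

    N = min(len(particles) // 3, n_wanted), so each neighborhood
    ends up with at least 3 particles, as described in the post."""
    n = min(len(particles) // 3, n_wanted)
    size = len(particles) // n            # particles per neighborhood
    remaining = list(range(len(particles)))
    hoods = []
    for _ in range(n - 1):
        seed = remaining.pop(0)
        # pick the (size - 1) unassigned particles closest to the seed
        remaining.sort(key=lambda i: distance(particles[seed], particles[i]))
        hoods.append([seed] + remaining[:size - 1])
        remaining = remaining[size - 1:]
    hoods.append(remaining)               # leftovers form the last neighborhood
    return hoods
```

With 12 particles and 3 wanted minima this yields 3 neighborhoods of 4, matching the example above.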
Loop:
Now I want to create new particles. These particles don't change their Z values randomly; they change their Z values depending on Part_NyBest and partx_localbest. Therefore a new velocity has to be evaluated:
v_partx_new = 0.792 * v_partx_old + 1.5 * random(0,1) * (partx_localbest - partx) + 1.5 * random(0,1) * (Part_NyBest - partx)
The new particles will then be partx_new = partx + v_partx_new.
The new particle partx_new is set to partx and then sent to the output.
Then the target value of part1 has to be caught; afterwards part2 can be put out and its target value caught, and so on.
Then partx_localbest has to be found by comparing the old partx_localbest and its target value with the new partx and its target value. If the target value of the new partx is smaller than that of partx_localbest, then partx_localbest becomes the new partx.
This has to be done for each partx. Afterwards, the same for the neighborhood bests (the best of all partx_localbest in one neighborhood).
End the loop when the velocities get small.
Output all Part_NyBest.
Output all target values of the Part_NyBests.
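The loop described above can be sketched outside Grasshopper as a small self-contained Python program. This is a sketch of standard particle swarm optimization, not the actual component: the target function is a dummy stand-in for the external Sofistik/Karamba evaluation, a single global neighborhood replaces the several neighborhoods described above, and the coefficients (0.792, 1.5, 1.5) are the ones from the velocity formula.

```python
import random

def target(z):
    # dummy stand-in for the external structural evaluation;
    # known minimum at (3, 3, 3)
    return sum((zi - 3.0) ** 2 for zi in z)

def pso(n_particles=12, dims=3, lo=0.0, hi=10.0, iters=200):
    """Minimal particle swarm: personal bests plus one global best."""
    pos = [[random.uniform(lo, hi) for _ in range(dims)]
           for _ in range(n_particles)]
    vel = [[0.0] * dims for _ in range(n_particles)]
    local_best = [p[:] for p in pos]
    local_val = [target(p) for p in pos]
    g = min(range(n_particles), key=lambda i: local_val[i])
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dims):
                r1, r2 = random.random(), random.random()
                # velocity update from the post's formula
                vel[i][d] = (0.792 * vel[i][d]
                             + 1.5 * r1 * (local_best[i][d] - pos[i][d])
                             + 1.5 * r2 * (local_best[g][d] - pos[i][d]))
                # move, keeping each Z inside its 0..10 range
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = target(pos[i])
            if v < local_val[i]:          # update personal best
                local_val[i], local_best[i] = v, pos[i][:]
        g = min(range(n_particles), key=lambda i: local_val[i])
    return local_best[g], local_val[g]
```

In GH the inner `target(...)` call would be replaced by the round trip through the external solver, which is exactly the part that needs the output/catch mechanism described above.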
So the inputs have to be:
the start particles, if they are given through the attached cluster;
the device for the target value, like in the attached gh file from David Rutten that I found in the discussions;
the count of neighborhoods.
And the outputs:
the particle output for evaluation;
all Part_NyBest;
all target values of the Part_NyBests.
I hope I didn't forget anything, and I hope it isn't too badly garbled. Sorry for my bad English, by the way ;-)
For more explanation of how the PSO works in other programs, there's a workflow script attached (is that what it's called?). I think for GH it should be changed a little, as I tried to do in my explanations.
So if you can help me with some parts, or if you have any advice, that would be great; otherwise, thank you nevertheless!!!!
Thankfully there's no limit on the words in the discussions :-D
Best, Heiko
…
ity...? How to define these parameters and simulate them? How to simulate and evaluate form? How to work with the Evolutionary Solver inside Grasshopper3d? How to evaluate the end data and choose the fittest geometry? How to optimize geometry to increase the overall energy efficiency of a project!?
»»» Rhinoceros 5 + Grasshopper 3D & Sub-Plugins *required grasshopper plugins: Elk, LadyBug + Honeybee, Mesh edit (uto tools), Mesh+, Weaverbird, Human, TT Toolbox, Lunchbox, Horster tools, Exoskeleton & Cytoskeleton
>>>Please download and install Rhino + GH3D & Sub-Plugins before workshop start!<<<
with Igor Mitrić
DIGITAL FABRICATION BASICS - 3D SCANNING AND 3D PRINTING
The workshop will provide an overview of the current state of 3D scanning and 3D printing technologies, with a focus on affordable and practical devices for research and development of new design projects. Attendees will use a 3D scanner to generate a 3D model in virtual computer space, remodel it, and prepare it for the 3D printer.
with Roberto Vdović
OPTIONAL FIELD TRIP - ON SITE ENERGY MEASUREMENTS (19.6.2015)
with Benedikt Borišič and Veronika Madritsch
Participants will receive CERTIFICATES of knowledge acquired at the workshop for each section. Participation is FREE! The number of places is LIMITED!
MORE INFORMATION AND APPLICATION WWW.LIVECONST.EU
WORKING SCRIPTS
Day 1 City.gh
Day 2 Lady Bug.gh
Day 3 Galapagos
Record Galapagos.gh
Day 3_First Half.gh
Day 3 Second Half.gh
Tower from any Curve.gh
…
three categories, each one corresponding to a different shapeType_ input:
- polygons (shapeType_ = 0): anything consisting of closed polygons: buildings, grass areas, forests, lakes, etc.
- polylines (shapeType_ = 1): non-closed polylines such as streets, roads, highways, rivers, canals, train tracks...
- points (shapeType_ = 2): any point features, like trees, building entrances, benches, junctions between roads... store locations: restaurants, bars, pharmacies, post offices...
So basically, when you run the "OSM shapes" component with shapeType_ = 2, you will get a lot of points. If you would like to get only 3d trees, you run the "OSM 3D" component and it will create 3d trees from only those points which are in fact trees. You can also check which points are trees by looking at the exact location on openstreetmap.org. For example:
Or use the "OSM Search" component, which will identify all trees among the points, regardless of whether 3d trees can be created or not. However, when it comes to 3d trees there is a catch:
Sometimes the geometry which Gismo streams from OpenStreetMap.org does not contain a "height" key, or it does contain it but the value for that key is missing. OpenStreetMap is a freely editable map database, so anyone with internet access and a free registered account on openstreetmap.org can add features (like trees) to the map database. However, regular people often do not have the height-measuring devices needed for specific objects such as trees. So the "OSM 3D" component will generate 3d trees from only those tree points which contain a valid "height" key. However, a small workaround is to input a domain (range) into the randomHeightRange_ input of the "OSM 3D" component (for example: "5 to 10"):
This will result in the creation of the other 3d trees, the ones without a defined height, by randomizing their height. The randomHeightRange_ input can also be applied to 3d buildings, and that is definitely something I need to write a separate article on.
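The fallback described above can be illustrated with a small Python sketch. This is an illustration of the logic only, not Gismo's actual code; the `tags` dict and the `(lo, hi)` tuple are my own stand-ins for the component's inputs.

```python
import random

def tree_height(tags, random_height_range=None):
    """Return a height for a tree point: use the OSM "height" tag when it
    is present and valid, otherwise fall back to a random height from the
    supplied range, or return None (meaning: no 3d tree is created)."""
    h = tags.get("height")
    try:
        return float(h)                      # valid "height" key
    except (TypeError, ValueError):          # key missing or value empty
        if random_height_range is not None:  # e.g. (5, 10) for "5 to 10"
            lo, hi = random_height_range
            return random.uniform(lo, hi)
        return None
```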
In the end it may be that nobody mapped the trees in the area you are looking for.
After you map a tree on openstreetmap.org, it will instantly be available to you or any other user of Gismo. I will add some tutorials in the future on how this can be done, but probably not in the next couple of weeks.
Let me know if any of this helps, or if I completely misunderstood your issue.…
Added by djordje to Gismo at 3:52am on February 8, 2017
n due at the end of March. I am hoping to see if I can do this as a sort of "HIVE MIND" experiment with one or two or more posters to the forum. I have uploaded two files to http://www.formpig.com/nine_bar-FAR and I have the following goals:
1. To "kinematically iterate" various formal building envelopes based upon a 50' x 100' lot that "conform" to the nine bar linkage geometry.
2. This lot would have "setbacks" consisting of two 5' side setbacks, a 10' rear yard setback and a 25' front yard setback. max height on the structure is 32' and the allowable overhangs into the setbacks are 2'. I would like to find a way to use the "nine bar geometry" to construct a series of iterations for "floors", "walls" and "ceilings", which would then be tied to a volumetric (cubic volume), or a total square footage (perhaps based upon two horizontal section cuts) which was based upon a given number that I will provide per local building code.
3. Laid on top of this we would also have "McMansion ordinance" requirements based upon the enclosed PDF. I expect to have this "tent restriction" data in digital form to upload to the FTP shortly.
It would be up to you, individually or collectively, to determine how best to position this "in the real world" based upon the lot, setbacks, zoning requirements, etc. For instance, perhaps the nine-bar configuration has its vertices coplanar with the 50' x 100' x 32' envelope restrictions and the chosen volume is then "trimmed" by the setback requirements. Or perhaps the nine-bar configuration is generated completely within the setbacks, or perhaps it is generated 2' outside of the setbacks so as to take advantage of the 2' overhang allowance, etc.
*
Given an opportunity to develop the work in a second phase, we would have a chance to tie this into various efficiencies such as a Bill of Materials (wall, floor, and ceiling square-foot calculations), envelope-to-volume calculations, solar panel efficiencies (solar orientation and envelope geometry), etc., etc. (I'd love to get suggestions for this.)
*
I've become /really/ convinced that this would be a /really/ interesting entry, based upon my just finishing Kas Oosterhuis' "Towards a New Kind of Building: A Designer's Guide for Non-Standard Architecture". In an ideal world I was hoping that it would be possible to hash this out discussion-wise and then literally pass it around on the list after someone eventually makes the first move by tossing out a rough ghx script. My expectation would be to finalize it rapidly in the next two weeks. Something of a contemporary version of a design charrette.
However, I realize this may not be workable, so if you have experience in this arena, and particularly if you think this is a brief that is straightforward enough to be almost literally implemented in Grasshopper, please contact me with any wage and/or contract fee requirements.
I'm getting a bit of a late jump on this but my hope is that with the right participant(s) that I can thrash it together quick enough for the first round.
info@formpig.com…
case for sure (started by Giorgio a couple of days before). I've got involved because I explore ways to "relax" shapes on NURBS (say, patterns created by Lunchbox or "manually") without using any kind of mesh (more explanations soon).
Here are 5 test cases (the SDK appears not to have some "thicken surface" thing... thus the algorithm that finds the "whole" shapes is rather naive) VS 2 Kangaroo "methods", plus the why-bother (he he) option as well.
If the goal is to "fit" these shapes within the NURBS... does it work so far? No, I'm afraid (it appears that the "springs" used are not the proper ones, or [Kangaroo1 option] the lines that pull should have originated from valence-2 points only).
Tricky points:
1. Internalize appears to have a variety of serious issues (see Input inside the definition) - load the Rhino file first (but even so...).
2. Pull to surface is deactivated - this is not the issue here (and it's very slow).
3. Since Starling/WB alter the related order of curves and points, the issue here (pulling points to curves) is to match apples to apples, and that's what Anemone does: from chaos to order. This means that prior to activating Kangaroo you should double-click the Anemone start component in order to "sort" the curves properly.
But... the fact is that the results are pathetic:
more soon
best, Peter…
ing results, and I think it is based on the assumption of small displacements. That's why I want to try LaDeform.
But in doing this I met some problems. I tried it on the small examples that are provided with Karamba:
1. LaDeform in load-controlled behavior
I know Karamba has mainly been created for form-finding and not for properly precise calculations, but I'd like to evaluate the deformations of my structure under certain loads (load-controlled). It is said to leave MaxDisp at its default value (-1).
[Rhino view for deflection of the rope]
In this example, derived from a Karamba example (Large_Deformation_Rope.gh), the program shows different ways to get approximately equal maximum deflection. But, digging into it, I realized the load multiplier for gravity differs from one model to another (-3.237 for Analyze TH1 and -134 for LaDeform). So what is the point of the example if quite similar deflection shapes are not obtained under the same loadings? (quite different loadings, indeed)
Doesn't it show, on the contrary, that the LaDeform algorithm does not work properly, if you need to change the load multiplier?
The Grasshopper file is shown below.
2. MaxDisp
When I use the model in "max disp" mode, I control the deformation, but how can I get the value of the virtual force exerted (which I don't know, because the displacement is now imposed)? What is its link with the imposed deflection?
Otherwise I can't figure out how to use it with displacement-controlled loading.
3. Iterative process
As it seems impossible to use the LaDeform process directly, I tried to test it by iterations, as you recommend on the forum, saying that it is equivalent to an iterative Analyze Th1 process.
I tried to reproduce this loading, but the result is not very encouraging, as you can see. The Rhino file shows the progressive loading, with the corresponding Grasshopper files, where I
- disassemble the model,
- get the previous deformed model
- put in another part of the load,
- re-assemble and then calculate it on the previous deformed shape.
Do you have any idea why the answer is not the same? (LaDeform seems to give about 5 times less for the same loadings.) (And even when controlling it by displacements, the shapes do not fit what the principle of the algorithm would suggest.)
[RhinView for Iterative process]
First step by analyze Th1, and result by LaDeform
4. Analyze Th1 after LaDeform?
Some Karamba tutorials show an Analyze Th1 analysis performed immediately after a large-deformation calculation. What is the reason for this? It sometimes seems to change the result considerably. What is the sense of such an operation? Would it mean that LaDeform is not trustworthy?
→ My question is then: is there a way to make the use of LaDeform for purposes other than form-finding affordable and coherent? If I am mistaken in using it, where?
Thank you very much for your help,
…