th Galapagos and I'm already accustomed to using Mass Addition for the fitness. But this can't solve my current problem. I'll try to explain it as best I can.
Goal:
I have a tower façade to cover with modules. Each floor is divided by a radial structural grid. Let's say, for the example, that each floor is divided into 9 parts.
I want to put modules on each part in order to cover the whole length.
The thing is that each floor is different (so finding the solution for one floor is not useful). So we decided to use three kinds of module:
Module A is 793 mm long.
Module B is 893 mm long.
The special module has a size between 0.714 and 0.872 m.
Each part of the façade must be filled with 0 to 10 of Module A, 0 to 10 of Module B and 2 special modules. The special module size can change for each part.
Solution for one part
I've built a Galapagos solution which changes these three parameters (number of Module A, number of Module B and size of the special module) in order to minimize the difference between the total length of the modules and the length of the façade part.
This works very well.
Solution in Grasshopper:
Before running Galapagos:
After running Galapagos:
The problem
I now have to generalize the idea to the total number of parts. In my example I'll show 9 parts, but the real number is several hundred.
I changed the Number Sliders into Gene Pools. For instance, the first Gene Pool contains 9 integers for the 9 counts of Module A (one number for each part).
I used Mass Addition for the fitness.
And then you can guess the problem: basically, the solution works, but it's very slow. It takes so long for 9 parts that I can't imagine the time for the whole tower.
Why? Simply because Galapagos doesn't understand that the number of Module A for part X has no influence on part Y. So, for each part, it tries to change every parameter (3x9 in my example) instead of changing only the three parameters which affect that part.
Thus, with a large number of parts (600-700), it's impossible to even get close to a solution.
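To show how independent the parts are: each part could even be solved on its own by a tiny brute force, without any evolutionary solver. Here is a minimal standalone C# sketch (the part lengths are made up for the example; this is not my Grasshopper definition):

// Per-part solver: because part X has no influence on part Y, each part
// can be solved independently instead of with one huge coupled genome.
using System;

class FacadePartSolver
{
    const double A = 0.793, B = 0.893;        // fixed module lengths in metres
    const double SMin = 0.714, SMax = 0.872;  // allowed special-module size

    static void Main()
    {
        // hypothetical part lengths; in the real definition they come from the grid
        double[] partLengths = { 5.21, 7.84, 9.63 };
        foreach (double length in partLengths)
            Solve(length);
    }

    // Brute force: only 11 x 11 = 121 combinations per part, so even
    // 700 parts are solved in a fraction of a second.
    static void Solve(double length)
    {
        int bestA = 0, bestB = 0;
        double bestS = SMin, bestErr = double.MaxValue;
        for (int nA = 0; nA <= 10; nA++)
        {
            for (int nB = 0; nB <= 10; nB++)
            {
                // for fixed counts, the best special size is the clamped remainder
                double s = (length - nA * A - nB * B) / 2.0;
                s = Math.Max(SMin, Math.Min(SMax, s));
                double err = Math.Abs(nA * A + nB * B + 2.0 * s - length);
                if (err < bestErr)
                {
                    bestErr = err; bestA = nA; bestB = nB; bestS = s;
                }
            }
        }
        Console.WriteLine("L = {0}: {1} x A, {2} x B, special = {3:F3} m (error {4:F4} m)",
                          length, bestA, bestB, bestS, bestErr);
    }
}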
Is there someone here who can help me? Please? :)
Thanks in advance,
Marc
Architect and structural engineer
http://parametriclab.eklablog.com/…
ies a step further towards informative models: how to extract data through a parametric process, and design analysis which leads to performance-based schemas. The workshop will cover some advanced modeling techniques in Grasshopper along with some useful Grasshopper plugins (GECO, WEAVERBIRD, KANGAROO and more). An introduction to Ecotect analysis will also be included. The workshop is dedicated to intermediate Grasshopper users (knowledge of Grasshopper equivalent to that gained in the Parametricism WS or higher is preferred). Knowledge of Ecotect is a plus but not necessary.
Schedule:
Deadline for registration: May 13, 2013
Workshop starts: Thursday, May 16, 2013 - 5:30 pm
The workshop consists of 10 lectures; each lecture lasts 3 hours. 3 lectures per week (Sun, Tues & Thur)
---------------------------------------------------
Fees: 600 L.E
You have to fill in the registration form below to reserve a place. We only have a few places available.
---------------------------------------------------
Prerequisite:
- Students should bring their own laptops
---------------------------------------------------
Registration form:
https://docs.google.com/forms/d/1qd7cTRi8fGJ3OiVPjiNzHA0ZRmXI2qCvk1CUQ-X_4H8/viewform
You can view previous Parametric workshops, student work & presentations here:
Previous workshop
https://www.facebook.com/events/469048376477647/
https://www.facebook.com/media/set/?set=a.548388031851299.1073741826.470747186282051&type=1
https://www.facebook.com/events/178326265647678/…
me work I was doing on DP in GH. Here are my conclusions:
- As Rhino is not a constraint-based modeller, assembly design without plugins (RhinoWorks or similar) is just not possible. So as long as constraints are not present in Rhino... no constraints, no AEC.
- The list management that GH offers is 10,000 times more efficient and user-friendly. So a good step would be to link all the list-management tools to a GH-like interface. In fact, for all operations that do not concern assembly (wireframe generation, for example), GH is way ahead in terms of speed, IF you're not dealing with geodesic curves or parallels on a surface, or possibly boolean operations, which are a real weakness of Rhino in terms of precision and stability. You can also build amazing synchronised attribute datatrees quite easily in GH, which you can then synchronise via Excel with a massive CATIA-based product without problems. It can easily save you a few days of work.
- Rhino does not handle pre-computation of geometry without actually loading that geometry, so today you will not be able to work on a product bigger than 2 GB (maybe 3) in Rhino in any way, even in Rhino 5 64-bit with 16 GB of RAM. Together with the constraints issue, I really think this is the second weak point of Rhino.
- As Jon said, I think Rhino has to be understood as a sketch-oriented application for construction (this is not pejorative; it's what I personally prefer), in the sense that its usefulness is to allow exploration of design possibilities, which you can of course link afterwards with whatever you want; but too many basic options are missing from Rhino for it to be really viable for AEC. I personally don't want to see geometrical sets appear in Rhino; they are absolutely useless considering Grasshopper's evolution towards clusters, for example.
After that, in purely technical terms I would say that:
1) Possible, partially already working --> clusters (waiting for updates) / nested definitions + SQL for attribute management across several working definitions.
2) --> I think there are two ideas here: a) exporting some dead geometry into a tree of files (this can be done quite easily with LocalCode, but it will remain dead). You can also create a definition based on dead geometry and update that geometry using the geometry cache. Of course, if this geometry is automatically exported via LocalCode from a preceding definition, then when you update the upper definitions the modification propagates through your whole model. Personally, I think it is best not to do it in Rhino. b) Otherwise, it is just synchronisation of public attributes attached to existing parts/products, as I described previously.
3) Geometry Cache. You can also auto-loop your file by loading/unloading your definition's input geometry with LocalCode and some VB.
But maybe I am wrong on some points of course.
Best,
Thibault.
…
ng in Grasshopper?
As a general recommendation for developers in Grasshopper who are writing a performance-sensitive part of their library (please note: often the performance-sensitive part is very limited), write it in C#, or maybe even C, or maybe even assembly :). Of course, the closer to the machine you are, the easier it will be to harness every minimal optimization. However, there is always a compromise between "getting things done" and "making them best", and this boundary is not easy to pin down, right?
If you want significant speed improvements for numerical calculations, I would at least recommend developing a compiled component in C# using Visual Studio or SharpDevelop. The reason is: in order to provide the line number of possible errors, Grasshopper compiles C# scripts in debug mode! They will be much less optimized than what is possible even with today's technology. This does not preclude keeping the project open-source, if that is one of your goals.
Regarding the actual list:
1) Yes, the implied loop will probably be slower than just a simple for loop. This is because Grasshopper code has to keep track of more things than the ones you would consider with your knowledge of your very special case. However, a factor of 10 is simply not acceptable and is likely a symptom of something else. In fact, I think I remember fixing a bug around that in the Rhino WIP. However, it appears to still be slower there as well. I've added a bugtracking item here.
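To illustrate, with list access the script is invoked once and you control the loop yourself; a minimal sketch of a C# script component body (assuming the input xs is set to List access; the squaring is just a placeholder operation):

// With List access, RunScript is invoked once for the whole list
// instead of once per item (the implied loop).
private void RunScript(List<double> xs, ref object A)
{
    var result = new List<double>(xs.Count);
    for (int i = 0; i < xs.Count; i++)   // plain for loop, no per-item bookkeeping
        result.Add(xs[i] * xs[i]);       // placeholder per-item operation
    A = result;
}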
2) If you are able to do all the casts that are involved yourself, and do them the way Grasshopper does, please write your code that way. For example, if you supply a curve to an input with a number hint, Grasshopper computes the length of the curve. So somewhere there has to be an "if" that checks whether the input is a curve (or some similar construct). This aid for designers is what slows down the hinted input.
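As a sketch of the kind of check meant here (my simplification, not Grasshopper's actual code path):

// Simplified idea of a number-hint cast: a curve arriving where a number
// is expected is converted to its length; otherwise a plain conversion.
static double ToNumber(object input)
{
    if (input is Rhino.Geometry.Curve crv)
        return crv.GetLength();             // curve supplied to a number hint
    return System.Convert.ToDouble(input);  // plain numeric conversion otherwise
}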
3) Grasshopper has to keep side effects at bay. For example, say components B and C are both connected to outputs of A. If you edit data in component B, and that data came from A, you of course expect the data in C to be unchanged. This means that, even for plain lists of numbers, Grasshopper has to perform a deep copy of the output for each input. Otherwise, what happens if B sorts the list and C finds the index of the smallest number? This could be improved if GH components had some way of flagging themselves as non-data-mutating (constant). Because Grasshopper has no way of copying special types that you supply yourself, using them will likely speed things up; but be aware of possibly very annoying side effects creeping in if the data is not immutable. Another option is performing the copy "optimally", only where you need it, because you know where your data is used. This is not information that is available to GH at present.
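To make the aliasing problem concrete, here is a minimal standalone C# sketch (plain .NET lists, not Grasshopper data types) of the sort-versus-index conflict just described:

// Why outputs are deep-copied: if B and C shared the same list from A,
// a mutation in B would silently corrupt what C sees.
using System;
using System.Collections.Generic;

class SideEffectDemo
{
    static void Main()
    {
        var a = new List<double> { 3.0, 1.0, 2.0 };   // output of component A

        // Without a copy, B and C alias the same list.
        var b = a;                                     // B's input (shared!)
        var c = a;                                     // C's input (shared!)

        b.Sort();                                      // B sorts "its" data...
        Console.WriteLine(c.IndexOf(3.0));             // ...and C now sees 3.0 at index 2, not 0

        // With a deep copy per input (what GH does), C is unaffected.
        var a2 = new List<double> { 3.0, 1.0, 2.0 };
        var b2 = new List<double>(a2);                 // independent copy for B
        b2.Sort();
        Console.WriteLine(a2.IndexOf(3.0));            // still 0
    }
}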
Does this help?
Thanks again for your input,
Giulio
--
Giulio Piacentino
for Robert McNeel & Associates
giulio@mcneel.com…
rring to the above image)
Column | Symbol | Unit
Area of Section | A | cm2
effective Vy shear Area | Ay | cm2
effective Vz shear Area | Az | cm2
Second Moment of Area (strong axis) | Iy | cm4
Elastic Modulus, upper (strong axis) | Wy | cm3
Elastic Modulus, lower (strong axis) | Wy | cm3
Plastic Modulus (strong axis) | Wply | cm3
Radius of Gyration (strong axis) | i_y | cm
Second Moment of Area (weak axis) | Iz | cm4
Elastic Modulus (weak axis) | Wz | cm3
Plastic Modulus (weak axis) | Wplz | cm3
Radius of Gyration (weak axis) | i_z | cm
I have a very similar table which I could import into the Karamba table. But I also have i_v or i_u values, i.e. the radius of gyration about the principal axes, for instance.
Dimension | Mass [kg/m] | Area [mm2] | Ix (x-x) [mm4] | Wpx (x-x) [mm3] | ix (x-x) [mm] | Iy (y-y) [mm4] | Wpy (y-y) [mm3] | iy (y-y) [mm] | Iv (v-v) [mm4] | Wpv (v-v) [mm3] | iv (v-v) [mm] | Width [mm] | Thickness [mm] | Radius R [mm]
L 20x3 | 0.89 | 113 | 4,000 | 290 | 5.9 | 4,000 | 290 | 5.9 | 1,700 | 200 | 3.9 | 20 | 3 | 4
L 20x4 | 1.15 | 146 | 5,000 | 360 | 5.8 | 5,000 | 360 | 5.8 | 2,200 | 240 | 3.8 | 20 | 4 | 4
L 25x3 | 1.12 | 143 | 8,200 | 460 | 7.6 | 8,200 | 460 | 7.6 | 3,400 | 330 | 4.9 | 25 | 3 | 4
L 25x4 | 1.46 | 186 | 10,300 | 590 | 7.4 | 10,300 | 590 | 7.4 | 4,300 | 400 | 4.8 | 25 | 4 | 4
L 30x3 | 1.37 | 175 | 14,600 | 680 | 9.1 | 14,600 | 680 | 9.1 | 6,100 | 510 | 5.9 | 30 | 3 | 5
L 30x4 | 1.79 | 228 | 18,400 | 870 | 9.0 | 18,400 | 870 | 9.0 | 7,700 | 620 | 5.8 | 30 | 4 | 5
L 36x3 | 1.66 | 211 | 25,800 | 990 | 11.1 | 25,800 | 990 | 11.1 | 10,700 | 760 | 7.1 | 36 | 3 | 5
L 36x4 | 2.16 | 276 | 32,900 | 1,280 | 10.9 | 32,900 | 1,280 | 10.9 | 13,700 | 930 | 7.0 | 36 | 4 | 5
L 36x5 | 2.65 | 338 | 39,500 | 1,560 | 10.8 | 39,500 | 1,560 | 10.8 | 16,500 | 1,090 | 7.0 | 36 | 5 | 5
I have diagonals (bracings) which can buckle in these "non-regular" directions too, and they do. If I could add those values, then in the Karamba model I could assign specific buckling scenarios... I can see another challenge at the ModifyElement component: I will not be able to choose these buckling lengths in these directions.
Do you think this functionality could be added soon, or should I try to find another way to model these members?
Br, Balazs
…
as one element.
Thank you
Comment by karamba on October 7, 2014 at 11:27pm
Hello Patricio, divide the beams in such a way that each boundary vertex of the shell becomes an endpoint of a beam segment.
Best, Clemens
Comment by Llordella Patricio on October 8, 2014 at 8:30am
Hi Clemens,
I did what you suggested, but now the Assemble element doesn't work properly. Could you please tell me how to fix it? Thanks in advance, Patricio
8-10-14losa%20cadena.gh
Comment by karamba on October 8, 2014 at 11:59am
Hi Patricio, if you flatten the 'Elem'-input at the 'Assemble'-component the definition works. The triangular shell elements have linear displacement interpolations whereas the beam deflections are exact. In order to get correct results you should refine the shell mesh.
Best, Clemens
Comment by Llordella Patricio on October 9, 2014 at 8:35am
Hello, I succeeded in creating the mesh for the slab and built the beam segments, but when I look at the deformations they are not what I expected, because the beam deforms together with the slab.
Thanks for the help
PS: maybe I'm using the program for a type of structure for which it is not the most appropriate, judging from the examples of other structures. But this type of structure is what students are taught.
best regards
Patricio
9-10-14%20Example%201.gh
Comment by karamba on October 9, 2014 at 10:46am
You could use the 'Mesh Edges'-component to retrieve the naked edges and turn them into beams - see attached file:91014Example1_cp.gh
Best regards,
Clemens
Comment by Llordella Patricio on October 15, 2014 at 3:41pm
Dear Clemens,
I was doing a rough estimate of the deformation, and I cannot reach the same result with Karamba. When I make a rough estimate for the beams alone, Karamba's result and mine are very similar; I think the problem appears when I connect the shell, because then the results no longer agree.
I sent the GH file, and an image of the calculation
The structure is concrete. The result I get is 0.58 cm.
Thank you, Patricio
15-10-14%20Example.gh
Comment by karamba yesterday
Dear Patricio,
try to increase the number of shell elements. As mentioned in the manual, they are linear elements. A mesh that is too coarse leads to a response which is stiffer than that of the real structure.
Best,
Clemens
…
troduction to its parametric modeling plugin, Grasshopper.
With this kind of tool we can think about forms beyond boxes when designing, because we will be able to control very complex geometries with complete rigor.
In the following video we can see an example produced during a course previously taught in Madrid by the instructor, Francisco Tabanera, in which an interpretation of BIG's project for the National Library of Kazakhstan is carried out.
Interpretation of BIG's National Library of Kazakhstan: http://www.youtube.com/watch?v=YLldO-SxgPw
Throughout the course, several examples will be developed which all attendees will be able to complete, since no prior knowledge is needed to follow along.
The course will be held at the Arquitecton offices in Barcelona, with the following schedule:
SCHEDULE
Saturday, March 1
From 9:30 to 13:30
Saturday, March 1
From 15:30 to 19:30
The course is limited to a maximum of 9 students, so that each of them gets the most out of it.
The course costs €90. Students and unemployed people receive a 10% discount. You can secure a place with an initial payment of €25 as a reservation.
Sign up here…
1 assembly. I think something is mixed up with the referenced dll, but I do not know what. I am using .NET 4 with Visual Studio 2010. It was possible, for example, to bind opennurbs.dll as an assembly in Rhino 4 and older Grasshopper around mid-2010. Now in Rhino 5 and the new Grasshopper it throws errors. I think some assemblies are mixed up. I cannot use Rhino.IO.3dmfile.Read as it does not have all the functions I need for building a proper database.
Here is a buggy code snippet for getting the layer tree.
// pre-define
OnTextLog dump_to_stdout = new OnTextLog();
dump_to_stdout.SetIndentSize(2);
OnTextLog dump = dump_to_stdout;
OnFileHandle dump_fp = null;
OnXModel model = new OnXModel();
bool bVerboseTextDump = true;
string sFileName = @"C:\test.3dm";

// open the file containing the opennurbs archive
OnFileHandle archive_fp = OnUtil.OpenFile(sFileName, "rb");
if (archive_fp == null)
{
    dump.Print(" Unable to open file.\n");
    //continue;
}
dump.PushIndent();

// create an archive object from the file pointer
OnBinaryArchive archive = new OnBinaryFile(IOn.archive_mode.read3dm, archive_fp);

// read the contents of the file into "model"
bool rc = model.Read(ref archive, dump);

// close the file
OnUtil.CloseFile(archive_fp);

RMA.OpenNURBS.ArrayOnLayer layer_array = model.m_layer_table;

// read the layer structure: collect each layer's id and its parent's id
List<System.Guid> list_int = new List<System.Guid>();
List<System.Guid> list_parent = new List<System.Guid>();
for (int i = 0; layer_array.Count() > i; i++)
{
    list_int.Add(layer_array[i].m_layer_id);
    list_parent.Add(layer_array[i].m_parent_layer_id);
}

// calculate the nesting depth of every layer
List<int> list_depth = new List<int>();
for (int i = 0; list_int.Count > i; i++)
{
    list_depth.Add(tree_decompose(i, list_int, list_parent));
}
A = list_depth; // GH script component output
// returns the depth of the layer at 'position' by walking up the parent chain
int tree_decompose(int position, List<System.Guid> list_guuid, List<System.Guid> list_parent)
{
    if (position == 0) return 0;
    if (list_parent[position] == System.Guid.Empty) return 0;
    int temp_position = list_guuid.IndexOf(list_parent[position]);
    return 1 + tree_decompose(temp_position, list_guuid, list_parent);
}
…
t, you can see 6 (+) signs for what you can add (A, B, C, P, Q, R).
Let's say you add A = 90 and B = 50.
Now you can't add the third angle (because C = 180 - (90 + 50) is already determined as an output).
What you can add at this point is P, Q or R.
You choose to add P = 10.
There is no longer any possibility to add Q or R.
All component outputs now give us the data.
2. Triangle with P, Q, R
When you zoom into the component, you can see 6 (+) signs for what you can add (A, B, C, P, Q, R).
Let's say you add P = 15 and Q = 20.
Now if you add R, the component outputs all the angles and edge lengths.
If R > P + Q, the component throws a warning (should it be > or >=?).
You cannot add A, B or C anymore.
3. Triangle with P, Q and C
When you zoom into the component, you can see 6 (+) signs for what you can add (A, B, C, P, Q, R).
Let's say you add P = 15 and Q = 20.
Now if you add C (the angle), the component outputs all the angles and edge lengths.
You cannot add A, B or R anymore.
To make it all easier, disable the possibility to internalize the data.
Tolerance issue... maybe always round the angles down, with 0.1 precision?
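For reference, the math behind cases 2 and 3 is just the law of cosines; a standalone C# sketch (the names and the degenerate-edge check are my assumptions, not an existing component):

// Solver math for cases 2 and 3 above (law of cosines).
using System;

static class TriangleSolver
{
    // Case 2: all three edges P, Q, R known -> all angles (in degrees).
    // Throws when the triangle inequality fails (the warning discussed above;
    // ">=" also rejects the degenerate, zero-area case).
    public static (double A, double B, double C) FromEdges(double p, double q, double r)
    {
        if (p >= q + r || q >= p + r || r >= p + q)
            throw new ArgumentException("Each edge must be shorter than the sum of the other two.");
        double a = Deg(Math.Acos((q * q + r * r - p * p) / (2 * q * r))); // angle A opposite edge P
        double b = Deg(Math.Acos((p * p + r * r - q * q) / (2 * p * r))); // angle B opposite edge Q
        return (a, b, 180.0 - a - b);                                     // angles sum to 180
    }

    // Case 3: edges P, Q and the included angle C -> third edge R,
    // after which case 2 gives the remaining angles.
    public static double ThirdEdge(double p, double q, double cDegrees)
    {
        double c = cDegrees * Math.PI / 180.0;
        return Math.Sqrt(p * p + q * q - 2.0 * p * q * Math.Cos(c));
    }

    static double Deg(double radians) => radians * 180.0 / Math.PI;

    static void Main()
    {
        double r = ThirdEdge(15, 20, 60);   // case 3 with P = 15, Q = 20, C = 60
        var angles = FromEdges(15, 20, r);  // then case 2 recovers A and B
        Console.WriteLine($"R = {r:F3}, A = {angles.A:F1}, B = {angles.B:F1}, C = {angles.C:F1}");
    }
}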
…
ight be able to provide more insight). Whenever you run a new simulation in Radiance, it is not always necessary to re-write all of the initial simulation files from scratch. These initial simulation files include both a .rad geometry file and a separate .pts file that contains the test-point locations. If all that you are changing in a given parametric run is the location of the test points (as in your case), it is not necessary to re-write (or re-interpret) the entire .rad geometry file. My guess is that there is some type of check for this built into either the code Mostapha wrote or the Radiance functions that Mostapha is calling. As such, it seems that the .rad geometry file is not being completely re-written (or re-interpreted by Radiance) when all that you change is the test points, and this actually seems to be saving you an extra 10 seconds each time you run the component without changing the materials or the building geometry. Other times (like when you plug in custom radParameters), it seems that it re-writes (or re-interprets) the .rad geometry file from scratch, since this file is probably affected by customized rad parameters.
So far, if this explanation holds, it seems there would be no concern on your end, but I also recognize that the difference between these long and short simulations is giving you radiation results that are ever so slightly different from each other (by my estimate, they differ by about 0.2%). Compared to the other kinds of assumptions the Radiance model makes, though, these are mere rounding errors that probably originate from the number of decimal places in the vertices of the .rad geometry file. Rather than worrying about whether your simulations produce matching rounding errors, I would encourage you to instead contemplate how closely your Radiance results match reality, given all of the assumptions you are making about the climate (with the epw file for a "typical" year) and the number of light bounces in the Radiance simulation. To give you an example, I ran your model with a higher-quality simulation (3 ambient bounces) and this gives results that differ by 1.1% from the original simulation you were running with only 2 ambient bounces (practically an order of magnitude larger than 0.2%).
To address your unease I will say that, for a long time, I also felt uneasy any time that I encountered something that seemed unpredictable in software that I was using. Once I started coding my own stuff, though, I realized quickly that unpredictable behavior is an unavoidable aspect of all software. There is always a tradeoff between accurate results and the time it takes to get them, which produces a multitude of possible ways to arrive at a solution. Add into this complex situation the fact that you might have an almost infinite number of possible inputs to a given set of code.
Because of the unpredictable multitude of cases, there is no application that is completely free from limitations and assumptions. In this light, what ends up being more important than the actual calculation method used is the social infrastructure that is in place to help understand what is being run under the hood, hence why both Radiance and Honeybee are open source and why we try to build a robust community of support through forums like this one!
-Chris…