ld work.
For example, there's a grid shell and I've got a number of control points (for example 3) that can move up and down.
Depending on the control points, I get forms that are structurally good and some that are bad.
In my office we've got a GH component which converts the geometry into structural members, solves the structural forces and so on through an external software called Sofistik, and afterwards returns some values to GH, for example maximum bending moments (like Karamba).
Now I want to create an optimization component, or something like that, to minimize e.g. the bending moments in the given geometry.
Let's start with how the component should work.
Say I have three control points that can only move in the z-direction:
P1(0,0,Z1), P2(10,0,Z2), P3(5,5,Z3)
They only vary in Z, so everything depends on Z1 to Z3, which have a range between 0 and 10, for example.
First I want to generate some random particles (between 9 and 15); one particle consists of these 3 different Z values.
So for example the first particle Part1 is [Z1=10, Z2=5, Z3=7]
and the second particle Part2 is [Z1=7, Z2=1, Z3=9]
and so on.
I created these Start Particles in a Cluster. See attached file.
I also tried this in C#, but thought it would be easier in GH.
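Just to show what I mean by start particles, here is a tiny Python sketch (the function name and the ranges are only examples of mine; in the attached cluster this is done with native GH components):

import random

def make_start_particles(count, dims=3, z_min=0.0, z_max=10.0):
    # one particle = one list of Z values, e.g. [Z1, Z2, Z3]
    return [[random.uniform(z_min, z_max) for _ in range(dims)] for _ in range(count)]

start_particles = make_start_particles(random.randint(9, 15))
print(start_particles)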
After I've got the start particles, I want to output the first particle and evaluate the target value in GH with its Z values. Therefore I had to take the first branch and graft it (see the discussion before).
Afterwards I want to save this target value, which depends on the first start particle. Then I want to output the second start particle, evaluate its target value and store it, and so on until the target value of the last start particle has been assigned.
Then I want to pair each particle with its target value, e.g. part1: t=0.9, part2: t=1.8, ...
Then I want to define neighborhoods, i.e. the count of the expected local minima.
These neighborhoods could look like this: each neighborhood has to include at least 3 particles, and the particles have to be close to each other.
E.g. if there are 12 particles and I want to look for 3 local minima, I could use 3 or 4 neighborhoods. I would take 3 neighborhoods, because the more particles in one neighborhood, the better.
So the count of neighborhoods would be N = min(floor(particle count / 3), count of expected local minima).
How to define these neighborhoods I don't know at the moment. I think it has to be based on the distance between the particles. E.g. part1 with (9,9,9) and part2 with (9,9,8) are next to each other, but part3 with (1,1,2) is far away.
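One possible way to group them by distance, sketched in Python (this is only my assumption of a method, basically a small k-means on the Z vectors; note it does not yet enforce the minimum of 3 particles per neighborhood):

import random
import math

def distance(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def make_neighborhoods(particles, n_hoods, iterations=20):
    # start with n_hoods random particles as centroids
    centroids = random.sample(particles, n_hoods)
    groups = [[] for _ in range(n_hoods)]
    for _ in range(iterations):
        groups = [[] for _ in range(n_hoods)]
        for p in particles:
            # assign each particle to the nearest centroid
            idx = min(range(n_hoods), key=lambda i: distance(p, centroids[i]))
            groups[idx].append(p)
        # move each centroid to the mean of its group (keep the old one if the group is empty)
        centroids = [[sum(c) / len(g) for c in zip(*g)] if g else centroids[i]
                     for i, g in enumerate(groups)]
    return groups

particles = [[random.uniform(0.0, 10.0) for _ in range(3)] for _ in range(12)]
n_hoods = min(len(particles) // 3, 3)   # N = min(floor(count / 3), expected local minima)
for hood in make_neighborhoods(particles, n_hoods):
    print(hood)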
Then each start particle is set as its own Partx_localbest.
And in each neighborhood the best of these local bests is Part_NyBest (the best is the one with the smallest target value).
Loop:
Now I want to create new particles. These particles don't change their Z values randomly; they change their Z values depending on Part_NyBest and Partx_localbest. Therefore a new velocity factor has to be evaluated:
v_Partx_new = 0.792 * v_Partx_old + 1.5 * random(0,1) * (Partx_localbest - Partx) + 1.5 * random(0,1) * (Part_NyBest - Partx)
The new particles will then be Partx_new = Partx + v_Partx_new.
The new particle Partx_new will be assigned to Partx and then sent to the output.
Then the target value of Part1 has to be caught; afterwards Part2 can be output and its target value caught, and so on.
Then Partx_localbest has to be updated by comparing Partx_localbest and its target value with the new Partx and its target value. If the target value of the new Partx is smaller than that of Partx_localbest, then Partx_localbest becomes the new Partx.
This has to be done for each Partx. Afterwards the same is done for each neighborhood's best (the best of all Partx_localbest in one neighborhood).
End the loop when the velocities get small.
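To make the loop more concrete, here is a rough standalone Python sketch of the update step (the names, the 0 to 10 clamping, the iteration cap and the stop threshold are my own choices, and the target function is just a dummy placeholder for the value that would really come back from the Sofistik component):

import random

W, C1, C2 = 0.792, 1.5, 1.5   # inertia and acceleration factors from the formula above
V_STOP = 0.01                 # "velocity gets small" threshold (my own choice)

def target(part):
    # dummy placeholder; in the real definition this value comes back from Sofistik
    return sum((z - 4.0) ** 2 for z in part)

def update(part, vel, local_best, hood_best):
    # v_new = W*v_old + C1*rand(0,1)*(localbest - part) + C2*rand(0,1)*(hoodbest - part)
    new_vel = [W * v
               + C1 * random.random() * (lb - p)
               + C2 * random.random() * (nb - p)
               for p, v, lb, nb in zip(part, vel, local_best, hood_best)]
    # part_new = part + v_new, clamped to the allowed Z range 0..10
    new_part = [min(max(p + v, 0.0), 10.0) for p, v in zip(part, new_vel)]
    return new_part, new_vel

parts = [[random.uniform(0.0, 10.0) for _ in range(3)] for _ in range(12)]
vels = [[0.0, 0.0, 0.0] for _ in parts]
local_best = [p[:] for p in parts]
hood_best = min(local_best, key=target)        # single neighborhood in this sketch

for _ in range(500):                           # iteration cap as a safety net
    for i in range(len(parts)):
        parts[i], vels[i] = update(parts[i], vels[i], local_best[i], hood_best)
        if target(parts[i]) < target(local_best[i]):
            local_best[i] = parts[i][:]
    hood_best = min(local_best, key=target)
    if all(abs(v) <= V_STOP for vel in vels for v in vel):
        break                                  # end loop when the velocities get small

print(hood_best, target(hood_best))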
Output all Part_NyBest.
Output all target values of the Part_NyBests.
So the inputs have to be:
Start particles, if they are provided through the attached cluster.
A device for the target value, like in the attached .gh file from David Rutten that I found in the discussions.
Count of neighborhoods
And the outputs:
The particle currently sent out for evaluation
All Part_NyBest
All target values of the Part_NyBests
Hope I didn't forget anything, and hope it isn't too garbled. Sorry for my bad English by the way ;-)
For more explanation of how the PSO works in other programs, there's a workflow script attached (is that what it's called?). I think for GH it should be changed a little, as I tried to describe in my explanations.
So if you can help me with some parts or have any advice, that would be great; otherwise thank you nevertheless!!!!
Thankfully there's no word limit in the discussions :-D
Best, Heiko
…
' Collect the edge curves of every Brep face into a tree (one branch per face),
' then join each branch back into face boundary curves.
Dim ptree As New DataTree(Of Curve)

For i As Integer = 0 To x.Faces.Count - 1 Step 1
  Dim actpath As New GH_Path(i)
  Dim actf As BrepFace = x.Faces(i)
  Dim intlist As New List(Of Integer)(actf.AdjacentEdges)

  For j As Integer = 0 To intlist.Count - 1 Step 1
    Dim tempedg As BrepEdge = x.Edges(intlist(j))
    Dim tempcrv As Curve = tempedg.DuplicateCurve()
    ptree.Add(tempcrv, actpath)
  Next
Next

Dim outputCrvs As New List(Of Curve)
For i As Integer = 0 To ptree.BranchCount - 1 Step 1
  outputCrvs.AddRange(Rhino.Geometry.Curve.JoinCurves(ptree.Branch(i)))
Next

a = outputCrvs
…
import Rhino
import Rhino.Geometry as rg

# scriptcontext
import scriptcontext as sc

from System.Collections.Generic import IEnumerable
#######################################

c = rs.coercecurve(crvs)
print(c)
print type(c)

# define Corner None style
CurveOffsetNoneStyle = 0
loftTypeNormal = 0

working_plane = rg.Plane.WorldXY
d = 2000
tol = 0.0001

c.PointAtStart
c.PointAtEnd

# create outside and inside offset curves
c_l = c.Offset(working_plane, d, tol, CurveOffsetNoneStyle)
c_r = c.Offset(working_plane, -d, tol, CurveOffsetNoneStyle)
base_crvs = [c_l, c_r]

a = rg.Brep.CreateFromLoft(curves=base_crvs, start=None, end=None, loftType=loftTypeNormal, closed=True) # what the fuck IEnumerable?????
When running it in Grasshopper I got an error:
Runtime error (ArgumentTypeException): expected IEnumerable[Curve], got list Traceback: line 33, in script
So how do I create an IEnumerable list for CreateFromLoft() to use in Python? It looks like it's a C# collection somehow. I tried to search for usage on MSDN but it seems too Greek for me. Many thanks!!!!
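Maybe something like this is what it wants; this is just my guess from the RhinoCommon signature Brep.CreateFromLoft(curves, start, end, loftType, closed), untested. Offset() actually returns an array of curves, so base_crvs holds arrays rather than single Curve objects; flattening them into a typed .NET List seems to be the usual trick:

import Rhino.Geometry as rg
from System.Collections.Generic import List

# flatten the Offset() results (arrays of curves) into one typed list of Curves
loft_input = List[rg.Curve]()
for item in base_crvs:
    if item:
        for crv in item:
            loft_input.Add(crv)

# Point3d.Unset instead of None for the optional start/end points
a = rg.Brep.CreateFromLoft(loft_input, rg.Point3d.Unset, rg.Point3d.Unset,
                           rg.LoftType.Normal, True)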
…
e Workshop and Conference will be a gathering of the global community of innovators and pioneers in the fields of architecture, design and engineering.
The event will be in two parts: a four-day Workshop, 14-17 July, and a public conference beginning with the Talkshop on 18 July, followed by a Symposium on 19 July. The event follows the format of the highly successful preceding events sg2010 Barcelona, sg2011 Copenhagen, sg2012 Troy, and sg2013 London.
sg2014: Hong Kong
Image: Cities without Ground - Adam Frampton, Jonathan D Solomon and Clara Wong
URBAN COMPACTION
Large cities thrive on density and diversity. But beyond the energy and pollution advantages of the elevator over the automobile, complex issues are at play in concentrating population and built infrastructure in contemporary high-rise cities. How do you meet the challenges of system design for high quality compact urban environments?
Designing for high and increasing density in cities is a complex and wicked problem that calls for innovative approaches to modelling in diverse areas of the city’s dynamics.
sg2014 Challenge: Urban Compaction
WORKSHOP
The SG Workshop is a unique creative cauldron attracting attendees from across the world of academia, professional practice as well as many of the brightest students. The Workshop is open to 100 applicants who come together for four intensive days of design and collaboration.
The annual Workshop is organised around Clusters. Clusters are hubs of expertise comprising people, knowledge, tools, materials and machines. The Clusters provide a focus for Workshop participants working together within a common framework.
We now have an open call to submit proposals for Workshop Clusters
call for clusters
CONFERENCE
Talkshop Conference Day One
After four intense days of innovative work, the first day of the conference, the Talkshop, offers an opportunity for critical reflection on what has been accomplished in the Workshop. Talkshop will be an opportunity to open debates, pose questions, challenge orthodoxies, and propose new ideas.
Talkshop will feature informal and open discussions between Cluster participants, leading practitioners and emerging talents in digital design, offering inside perspectives on how the landscape of computational design is reshaping built form.
Symposium Conference Day Two
The second day of the conference, the Symposium, will feature invited keynote speakers showcasing major projects and research from around the globe that mark out the territory of the year's Challenge. The Symposium is a unique opportunity to hear insights into the challenges ahead for the discipline.
Interwoven throughout the day will be reports and highlights from each Workshop Cluster, giving an opportunity to view work created during the previous four days of intensive collaboration, design and development.
More information about the conference, including speakers, to be posted soon.
www.Smartgeometry.org…
1. From the Thermal Comfort Indices component, Comfort Index 11 (TCI-11): MRT = f(Ta, Tground, Rprim, e)
with:
- Ta = DryBulbTemperature coming from the ImportEPW component
- Tground = f(Ta, N), where N comes from the totalSkyCover input. Tground influences the long-wave radiation emitted by the ground in the MRT calculation.
- Rprim, defined as the solar radiation absorbed by a nude man = f(Kglob, hS1, ac)
- ac is the clothingAlbedo in % (bodyCharacteristics input)
- I can't find any definition of Kglob and hS1 in the code. Could you please tell me what those values refer to? Probably the globalHorizontalRadiation, but how?
- e = vapour pressure calculated from Ta and the Relative Humidity input
Do you agree that in this case the MRT does not depend on these inputs: location, meanRadiantTemperature, dewPointTemperature and wind speed? And that it does not depend on the other bodyCharacteristics either, like bodyPosture, age, sex, met, activityDuration...?
Is the MRT calculated by the TCI-11 method the mean radiant temperature of a vector pointing vertically with a sky view factor of 100%? For ParisOrly epw,
2. From the SolarAdjustedTemperature component (which seems to be used more in the UTCI calculation examples on Hydra than TCI-11).
In contrast to TCI-11, this component distinguishes diffuse and direct radiation and contextualizes the calculation thanks to the _ContextShading input, right? It can also be applied to a mannequin thanks to the CumSkyMatrix, and thus evaluate the non-uniformity of radiation exposure.
This component does not seem to consider the influence of vapour pressure on the result. Is it then more precise to put the MRT output (from the TCI) into the meanRadTemperature input of SolarAdjustedTemperature?
The default groundReflectivity is set to 0.25. Is groundReflectivity taken into account in the Tground or MRT calculation in the TCI component? If yes, what groundReflectivity is assumed?
Does the default clothing albedo of 37% (TCI-11 bodyCharacteristics) correspond to a clothing absorptivity of 63%?
If the CumSkyMatrix input is not supplied, I get 9 results for the mannequin. Where do those points/results come from?
If the CumSkyMatrix input is supplied, I suppose the 482 results correspond to a calculation method similar to the radiation analysis component, averaged over the analysis period. Right? But I don't understand why the mannequin is composed of 481 faces while meshFaceResult gives 482 results.
Finally, what is the link between the mesh results, the solarAdjustedMRT and the effective radiant field? Is there a paper with a detailed explanation of the method?
3. Here are some results for the ParisOrly EnergyPlus weather data; you can find the Grasshopper definition attached. There is no shading in this simulation, and the MRT coming from the Thermal Comfort Indices component is very different from the solar adjusted MRT. Why such a big difference, and which of the results should be plugged into the UTCI calculation component?
Results for ParisOrly.epw, M,D,H: 1,1,12
Ta: 6.5°C
rh: 100%
globalHorizontalRadiation: 54 Wh/m2
totalSkyCover: 10
MRT (TCI-11): 1.2°C

_CumSkyMtxOrDirNormRad = directNormalRadiation: 0 Wh/m2
diffuseHorizontalRad: 54 Wh/m2
_meanRadTemp = Ta
solarAdjustedMRT: 10.64°C
MRTDelta: 4.14°C

_CumSkyMtxOrDirNormRad = CumulativeSkyMtx
diffuseHorizontalRad: 54 Wh/m2
_meanRadTemp = Ta
solarAdjustedMRT: 10.47°C
MRTDelta: 3.97°C

_CumSkyMtxOrDirNormRad = CumulativeSkyMtx
diffuseHorizontalRad: 54 Wh/m2
_meanRadTemp = MRT (TCI-11)
solarAdjustedMRT: 5.17°C
MRTDelta: 3.97°C
Thanks a lot for your help.
Regards,
Aymeric
…
art of the optimization process. The main strategy is simple: a deep search of the design space and assessment/visualization of building performance using the above-mentioned components.
However, I also wanted to get a quantitative idea of the performance of the different combinations of design parameters, and I wanted to do this within the framework of a search process, without using any of the well-known (and certainly superior) optimization tools like Octopus. To do that I created this ridiculously simple definition, replicating the desirability function for multivariate optimization introduced by Harrington (1965).
The math is quite rudimentary:
- in the case of minimization of a parameter, the function takes the form
d(min) = 0, if x > B
d(min) = [(x - B)/(A - B)]^s, if A <= x <= B
d(min) = 1, if x < A
- in the case of maximization of a parameter, the function takes the form
d(max) = 0, if x < A
d(max) = [(x - A)/(B - A)]^s, if A <= x <= B
d(max) = 1, if x > B
where:
A, B, and s are chosen by the user.
A and B are the lower and upper limits for a specific parameter and are therefore completely subjective. Additionally, s is a value that makes the criterion easier or more difficult to satisfy (e.g. lower s values during minimization assign greater importance to a criterion, etc.).
Additional variables can be added by reproducing (i.e. copy/pasting) the individual functions and assigning a new parameter, limits and s values. The overall desirability of the model is given by a simple geometric mean of all the variables (the example is developed for 2 but you can quickly understand the logic).
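For anyone who prefers reading the math as code, here is a minimal standalone Python sketch of the same logic (the names and the example limits are mine; the actual GH definition uses native components):

def d_min(x, A, B, s=1.0):
    # desirability when the goal is to minimize x: 1 below A, 0 above B
    if x > B:
        return 0.0
    if x < A:
        return 1.0
    return ((x - B) / (A - B)) ** s

def d_max(x, A, B, s=1.0):
    # desirability when the goal is to maximize x: 0 below A, 1 above B
    if x < A:
        return 0.0
    if x > B:
        return 1.0
    return ((x - A) / (B - A)) ** s

def overall_desirability(ds):
    # simple geometric mean of the individual desirabilities
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# e.g. minimize one parameter (A=50, B=120) and maximize another (A=40, B=70)
print(overall_desirability([d_min(80.0, 50.0, 120.0, s=2.0), d_max(55.0, 40.0, 70.0)]))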
Again, this is a very primitive definition, so there are no checks for any kind of errors (e.g. A and B set to the same value, variable values not connected to the geometric mean calculation, not getting the length of the parameter list, etc.). Also, I am well aware that a beginner-to-medium level of programming could turn this definition into something actually useful (if it is indeed useful at all). Finally, I am not even sure of the appropriateness of this method for the kind of analyses we perform with LB/HB. Despite all that, it seemed very simple to use, extremely easy to understand, and a really good way to put me to sleep. I hope someone finds it interesting!
Kind regards,
Theodore.…
types. Equations currently working:
Constant f(x) = c
Linear f(x) = ax+b
Parabola f(x) = a(x-h)² + k
Polynomial f(x) = a + bx² + cx³ + ...
Hyperbola f(x) = (ax + b) + (d/(x - c))
Reciprocal f(x) = 1/((x - b)^p) + c
Logarithm f(x) = log[base](x-b) + c
Cosine f(x) = a*cos(f(x-b)) + c
Sinc f(x) = a(sin(f(x-b))/x) + c
Gaussian
Block Wave
Sawtooth Wave
TriangleWave
Perlin Noise up to 8 octaves
Interpolation of N points using various interpolation schemes: {Nearest neighbour, Linear, Cubic, Akima, Bulirsch-Stoer, Equidistant polynomial, Floater-Hormann, Neville polynomial}
Rhino Curve (not quite sure yet how to expose control-points on this one)
Grasshopper Expression
Bezier spans, i.e. N sequential points and tangents (still working on this one actually).
I could add more types such as tan, arctan, hyperbolic trig functions, square-roots, etc. etc. but I've got enough for testing purposes now.…
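Written out as plain functions (reading a as amplitude, f as frequency, b as horizontal shift and c as vertical offset, which is only my reading of the list above), the cosine and sinc entries would look roughly like:

import math

def cosine(x, a, f, b, c):
    # Cosine: a*cos(f*(x - b)) + c
    return a * math.cos(f * (x - b)) + c

def sinc(x, a, f, b, c):
    # Sinc: a*(sin(f*(x - b)) / x) + c   (undefined at x = 0 as written)
    return a * (math.sin(f * (x - b)) / x) + c

print(cosine(1.0, 2.0, 3.0, 0.5, 1.0))
print(sinc(1.0, 2.0, 3.0, 0.5, 1.0))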
ial Folders > User Object Folder
It'll appear in the last tab, "User", in the GH UI.)
It has some tolerance, because a "splitting point" can never be exactly at distance = 0 from the target curve, so I've used the Curve Closest Point component to find the point (or better, the parameter) at which to make the fragments, but this is obvious...
Cluster inputs:
- List of splitting points (will be flattened > no support for trees)
- List of curves to be split (will be flattened > no support for trees)
- Distance: a point farther away from a curve than this will not split it (0.0001 units as default) (flattened input, only the first value is used)
- Boolean: if true (default) points split EVERY curve within distance, if false only the nearest curve within distance
Cluster output:
- List of curves: the new split curves (fragments joined) replace the original curves to keep the original list order. If you need the fragments, explode the output curves.
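In case someone prefers scripting it, the core idea could be sketched roughly like this in a GH Python component (my rough equivalent of the logic, not what the cluster actually contains; crvs, pts and tol stand for the same inputs as above):

import Rhino.Geometry as rg

def split_at_points(crvs, pts, tol=0.0001):
    result = []
    for crv in crvs:
        # collect the closest-point parameters of all points within the distance threshold
        params = []
        for pt in pts:
            ok, t = crv.ClosestPoint(pt)
            if ok and crv.PointAt(t).DistanceTo(pt) <= tol:
                params.append(t)
        fragments = crv.Split(params) if params else None
        if fragments:
            # joined fragments replace the original curve so the list order is kept
            result.extend(rg.Curve.JoinCurves(fragments))
        else:
            result.append(crv)
    return result

a = split_at_points(crvs, pts)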
If you have any problems, let me know...
bye :D…
ng the shape with a plane, and thus you get 2 blades at a time, one from one side and one from the opposite side.
To sort them, do it like this: Dispatch (with the predefined pattern "P" = 0, 1) to get the even blades in list "A" and the odd blades in list "B"; then merge the 2 lists together to get the sorted list again.
Watch out, this solution works only for this particular situation...
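Just to show the logic of the Dispatch + Merge trick in a few lines of Python (with made-up blade names; the actual fix uses the native components):

blades = ["A0", "B0", "A1", "B1", "A2", "B2"]   # interleaved: one blade per side each time
side_a = blades[0::2]            # Dispatch list "A": even items
side_b = blades[1::2]            # Dispatch list "B": odd items
sorted_blades = side_a + side_b  # Merge the two lists back together
print(sorted_blades)             # ['A0', 'A1', 'A2', 'B0', 'B1', 'B2']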
To "scout" inside your definition, to understand the order of a list containing geometries, do something like this:
You should always be aware of what your datas are, how sorted, the tree structure, ecc ecc...
Cya!
:D
…
estand if there is something wrong with what I am doing or if I need to set up my definition/components differently.
When the slider is set to its default I get a single set of XY coordinates (2D domain 0 to 1 / 0 to 1), but when I switch it to a bezier curve nothing is generated.
I think this has something to do with the domain and the series; I tried to split the 2D domain through a Divide Domain component, but the point is that there is nothing coming out of the MD slider.
Is it possible to generate sets of values out of the MD slider? Could anyone help me do what I want?
thanks in advance, regards
Filippo…