grid size 3 = 2.7 mins
grid size 2 = ??? (memory peaks and Rhino freezes)
However, now that I have switched the unit of the Rhino file to feet,
grid size 3 = 18 mins,
which makes sense, I suppose, since the analysis has to work with a smaller tolerance.
The image below is what I got after 18 mins. The fact that I have joined the individual units with a solid union may also make it take longer: you can see the mesh triangulation not only around the corners of the masses but also in between different units (look at the top level and you will see it).
oh, and I also have very little disk space left.
I would like to share the file, but right now it's a big mess and has a lot of stuff that is unrelated to this particular memory issue, like Revit interoperability and urban modelling. The definition is also set up so that it needs an Excel file that feeds what you see in the lower left corner, the wing mass scales. To compare design studies I am animating the index of the list component that feeds the different scales of the wings and the widths of the floor plates you see. You can see it in my video here. I will try to clean it up a bit when I get a chance, but it seems like grid size 3 might work as a starting point.
When I get around to extracting values from the mesh vertices and actually applying different facade designs driven by the parameters, I will know better what grid size might be necessary.
…
ld work.
For example there's a grid shell and I've got a number of control points (for example 3) that can move up and down.
Depending on the control points I get forms that are structurally good and some that are bad.
In my office we've got a GH component which converts the geometry into structural members and solves for the structural forces and so on through an external software called Sofistik, and afterwards gives some values back to GH, for example maximum bending moments (like Karamba).
Now I want to create this optimization component or something like that to minimize e.g. the bending moments in the given geometry.
Let's start with how the component should work.
Say I have three control points that can only move in the z-direction:
P1(0,0,Z1), P2(10,0,Z2), P3(5,5,Z3)
They only depend on Z, so everything depends on Z1 to Z3, each of which has a range between 0 and 10, for example.
First I want to get some (between 9 and 15) random particles; one particle consists of these 3 different Z's.
So for example the first particle Part1 is [Z1=10, Z2=5, Z3=7]
and the second particle Part2 is [Z1=7, Z2=1, Z3=9]
and so on.
I created these Start Particles in a Cluster. See attached file.
I also tried this in C#, but thought it is easier in GH.
After I've got the start particles, I want to output the first particle and evaluate the target value in GH using its Z's. Therefore I had to take the first branch and graft it (see the discussion before).
Afterwards I want to save this target value, which depends on the first start particle. Then I want to output the second start particle, evaluate its target value, and store it, and so on until the target value of the last start particle has been assigned.
Then I want to pair each particle with its target value, e.g. part1: t=0.9, part2: t=1.8, ...
Then I want to define neighborhoods or the count of the expected local minima.
These neighborhoods could look like this: each neighborhood has to include not less than 3 particles, and the particles have to be next to each other.
E.g. if there are 12 particles and I want to look for 3 local minima, I need 3 or 4 neighborhoods. Then I would take 3 neighborhoods, because the more particles in one neighborhood, the better.
So the count of neighborhoods would be N = min(particleCount/3, N_minima).
How to define these neighborhoods I don't know at the moment. I think the distances between the particles have to be searched. E.g. part1 at (9,9,9) and part2 at (9,9,8) are next to each other, but part3 at (1,1,2) is far away.
Then each StartParticle is set to Partx_localbest.
And in each neighborhood the best of these local bests is Part_NyBest. (The best is the one with the smallest target value.)
Loop:
Now I want to create new particles. These particles don't change their Z-values randomly; they change their Z-values depending on Part_NyBest and Partx_localbest. Therefore a new velocity has to be evaluated: v_Partx_new = 0.792*v_Partx_old + 1.5*random(0,1)*(Partx_localbest - Partx) + 1.5*random(0,1)*(Part_NyBest - Partx)
The new particles will then be partx_new=partx+v_Partx_new.
The new Particle partx_new will be set to partx and then set in the output.
Then the target value of part1 has to be caught; afterwards part2 can be put out and its target value caught, and so on.
Then Partx_localbest has to be found by comparing Partx_localbest and its target value with the new Partx and its target value. If the target value of the new Partx is smaller than that of Partx_localbest,
then Partx_localbest becomes the new Partx.
This has to be done for each Partx. Afterwards the same for the neighborhood bests (the best of all Partx_localbest in one neighborhood).
Endloop if velocity gets small.
Output all part_NxBest
Output all targetvalues of the part_NxBests.
So in the Input there have to be:
The start particles, if they are given through the cluster attached.
The target-value definition, like in the attached .gh file from David Rutten that I found in the discussions.
Count of neighborhoods
And in the output
Output particle for evaluation
Output all part_NxBest
Output all targetvalues of the part_NxBests
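For what it's worth, the whole loop described above can be sketched as a small standalone prototype. This is only a sketch under my own naming (`pso_minimize`, `objective`, and the simple chunked neighborhoods are my assumptions, not your GH setup); in reality `objective` would be the Sofistik/GH target-value evaluation, and the constants 0.792 and 1.5 come straight from the velocity formula above.

```python
import random

def pso_minimize(objective, dim=3, lo=0.0, hi=10.0, n_particles=12,
                 n_neighborhoods=3, max_iter=100, v_eps=1e-3):
    """Minimal local-best PSO sketch following the post's constants."""
    # random start particles, e.g. [Z1, Z2, Z3] each in [lo, hi]
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                       # Partx_localbest
    pbest_val = [objective(p) for p in pos]           # stored target values
    # fixed neighborhoods: adjacent chunks of at least 3 particles
    size = n_particles // n_neighborhoods
    hood = [min(i // size, n_neighborhoods - 1) for i in range(n_particles)]

    for _ in range(max_iter):
        # Part_NyBest: index of the best local best in each neighborhood
        nbest = {}
        for i in range(n_particles):
            h = hood[i]
            if h not in nbest or pbest_val[i] < pbest_val[nbest[h]]:
                nbest[h] = i
        for i in range(n_particles):
            g = pbest[nbest[hood[i]]]
            for d in range(dim):
                # v_new = 0.792*v_old + 1.5*r*(localbest - x) + 1.5*r*(NyBest - x)
                vel[i][d] = (0.792 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (g[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:                    # update Partx_localbest
                pbest_val[i], pbest[i] = val, pos[i][:]
        if max(abs(v) for vv in vel for v in vv) < v_eps:
            break                                     # "Endloop if velocity gets small"
    return pbest, pbest_val
```

The stopping criterion here is the velocities getting small, matching your "Endloop" step; in GH the same loop would need Anemone, Hoopsnake, or a scripted component, since the canvas itself has no cycles.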
Hope I didn't forget anything, and hope it isn't crushed too badly. Sorry for my bad English, by the way ;-)
For more explanation of how PSO works in other programs, I've attached a workflow script (is that what it's called?). I think for GH it should be changed a little bit, as I tried in my explanations.
So if you can help me with some parts, or if you have any advice, that would be great; otherwise, thank you nevertheless!!!!
Thankfully there’s no limit for the words in the discussions :-D
Best, Heiko
…
three categories, each one corresponding to a different shapeType_ input:
- polygons (shapeType_ = 0): anything consisting of closed polygons: buildings, grass areas, forests, lakes, etc.
- polylines (shapeType_ = 1): non-closed polylines such as: streets, roads, highways, rivers, canals, train tracks ...
- points (shapeType_ = 2): any point features, like: trees, building entrances, benches, junctions between roads... store locations: restaurants, bars, pharmacies, post offices...
So basically when you run the "OSM shapes" component with shapeType_ = 2, you will get a lot of points. If you would like to get only 3d trees, you run the "OSM 3D" component and it will create 3d trees from only those points which are in fact trees. You can also check which points are trees by looking at the exact location on openstreetmap.org. For example:
Or use the "OSM Search" component, which will identify all trees among the points, regardless of whether 3d trees can be created or not.
However, when it comes to 3d trees there is a catch:
Sometimes the geometry which Gismo streams from OpenStreetMap.org does not contain a "height" key, or it does contain it but the value for that key is missing.
OpenStreetMap is a freely editable map database, so anyone with internet access and a free registered account on openstreetmap.org can add features (like trees) to the map database. However, regular people often do not have the height-measuring devices needed for specific objects such as trees.
So the "OSM 3D" component will generate 3d trees from only those tree points which contain a valid "height" key.
However, a small workaround is to input a domain (range) into the randomHeightRange_ input of the "OSM 3D" component (for example the following one: "5 to 10"):
This will result in the creation of the other 3d trees, the ones which do not have a defined height, by randomizing their height. The randomHeightRange_ input can also be applied to 3d buildings, and it is definitely something I need to write a separate article on.
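The fallback logic above can be sketched roughly like this (a hypothetical re-implementation for illustration only; `tree_height` and its arguments are my own names, not Gismo's actual API):

```python
import random

def tree_height(osm_tags, random_height_range=None):
    """Pick a 3d-tree height from OSM tags; fall back to a random
    height when the 'height' key is missing or its value is empty."""
    raw = osm_tags.get("height")
    if raw not in (None, ""):
        return float(raw)              # valid "height" key: use it
    if random_height_range is not None:
        low, high = random_height_range  # e.g. (5, 10) for "5 to 10"
        return random.uniform(low, high)
    return None                        # no height: the tree is skipped
```

Without a range the tree without a valid "height" key simply yields no 3d geometry, which matches the behavior described above.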
In the end it may be that nobody mapped the trees in the area you are looking for.
After you map a tree on openstreetmap.org, it will instantly be available to you or any other user of Gismo. I will be adding some tutorials in the future on how this can be done, but probably not in the next couple of weeks.
Let me know if any of this helps, or if I completely misunderstood your issue.…
Added by djordje to Gismo at 3:52am on February 8, 2017
n due at the end of March. I am hoping to see if I can do this as a sort of "HIVE MIND" experiment with one, two, or more posters to the forum. I have uploaded two files to http://www.formpig.com/nine_bar-FAR and I have the following goals:
1. To "kinematically iterate" various formal building envelopes based upon a 50' x 100' lot that "conform" to the nine bar linkage geometry.
2. This lot would have "setbacks" consisting of two 5' side setbacks, a 10' rear yard setback, and a 25' front yard setback. Max height on the structure is 32', and the allowable overhangs into the setbacks are 2'. I would like to find a way to use the "nine bar geometry" to construct a series of iterations for "floors", "walls", and "ceilings", which would then be tied to a volumetric measure (cubic volume) or a total square footage (perhaps based upon two horizontal section cuts), based upon a given number that I will provide per local building code.
3. Laid on top of this we would also have "mcmansion ordinance" requirements based upon the pdf enclosed. i expect to have this "tent restriction" data in digital form to upload to ftp shortly.
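To make the brief's numbers concrete, the basic envelope arithmetic works out as below (a quick sketch with my own variable names; the 2' overhang here simply enlarges the upper-floor footprint on all sides and ignores corner conditions and the mcmansion "tent" restrictions):

```python
# Buildable-envelope arithmetic for the 50' x 100' lot (all feet).
LOT_W, LOT_D = 50.0, 100.0          # lot width x depth
SIDE, REAR, FRONT = 5.0, 10.0, 25.0 # setbacks per the brief
MAX_H, OVERHANG = 32.0, 2.0         # height limit and overhang allowance

# ground footprint after setbacks
foot_w = LOT_W - 2 * SIDE           # 40 ft
foot_d = LOT_D - FRONT - REAR       # 65 ft
ground_area = foot_w * foot_d       # 2,600 sq ft
max_volume = ground_area * MAX_H    # 83,200 cu ft, before overhangs

# upper floors may cantilever 2 ft into the setbacks
over_w = foot_w + 2 * OVERHANG      # 44 ft
over_d = foot_d + 2 * OVERHANG      # 69 ft
over_area = over_w * over_d         # 3,036 sq ft
```

So the by-right ground footprint is 2,600 sq ft, and the overhang allowance lets upper floors grow to roughly 3,036 sq ft: the margin a nine-bar configuration could exploit.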
It would be up to you individually or collectively to determine how best to position this "in the real world" based upon the lot, setbacks, zoning requirements, etc. For instance, perhaps the nine bar configuration has its vertices coplanar with the 50' x 100' x 32' envelope restrictions and then the chosen volume is "trimmed" by the setback requirements. Or perhaps the nine-bar configuration is generated completely within the setbacks, or perhaps it is generated 2' outside of the setbacks so as to take advantage of the 2' overhang allowance on the setbacks, etc.
*
Given an opportunity to develop the work in a second phase we would have an opportunity to tie this into various efficiencies such as Bill of Materials (wall floor and ceiling square foot calculations), envelope to volume calculations, solar panel efficiencies (solar orientation and envelope geometry) etc, etc (love to get suggestions for this).
*
I've become /really/ convinced that this would be a /really/ interesting entry, based upon my just finishing Kas Oosterhuis' "Towards a New Kind of Building: A Designer's Guide for Non-Standard Architecture". In an ideal world I was hoping that it would be possible to hash this out discussion-wise and then literally pass it around on the list after someone eventually makes the first move by tossing out a rough ghx script. My expectation would be to finalize it rapidly in the next two weeks. Something of a contemporary version of a design charrette.
However, I realize this may not be workable, so if you have experience in this arena, and particularly if you think this is a brief that is straightforward enough to be almost literally implemented in Grasshopper, please contact me with any wage and/or contract fee requirements.
I'm getting a bit of a late jump on this but my hope is that with the right participant(s) that I can thrash it together quick enough for the first round.
info@formpig.com…
ices to regulate light and view while simultaneously producing both ornamental as well as material effects. The workshop will make extensive use of our Digital Fabrication equipment, coupled with Parametric Patterning techniques in Grasshopper for Rhinoceros. In a fast-paced and hands-on learning environment, participants will explore issues pertaining to the Coordination of Fabricated Parts through Unique Object Attributes, Baking Objects with User-Defined Attributes, Nesting Optimization with Rhinonest for Grasshopper, as well as the precise creation and manipulation of Computational Geometry through parametric modeling interfaces.
The workshop will begin with an examination of a set of Parametric Schemas as a means of identifying a suite of conceptual approaches to the regulation of light and view. Site-Specific Influencers such as Solar, Shading, Air Flow, and Viewing requirements will serve as Catalysts for the Parametric Articulation of a series of screening devices through shifting, perforating, and graduating Patterns. Emphasis will be placed on consistent organization of data through Lists and Data Trees and best practices for Professional Workflow Integration, File Modularity, and Data Visualization in Grasshopper.
SCREENING will focus on Grasshopper, supplemented by the add-on Rhinonest, as a means of fully integrating fabrication logics into Parametric workflows. The workshop is structured to allow each participant time to iteratively develop design prototypes, moving quickly from digital design environments to material artifacts and back again. As the next installment in the modeFab series, participants in this workshop will be introduced to and work directly with a large-format CNC Laser Cutter to develop scaled and 1:1 Component Assemblies. This workshop, particularly well suited for intermediate users, offers an in-depth and rigorous expansion on the topics of Algorithmic Design, Computational Geometry, and Parametric Modeling through the lens of Digital Fabrication and Prototyping with Grasshopper. As part of a larger online infrastructure, modeLab, this workshop provides participants with continued support and knowledge to draw upon for future learning.
Attendance will be limited to provide each participant maximum dedicated time with instructors.
Participants should be comfortable with the fundamental concepts of parametric design and general usage of Grasshopper.
Topics:
- Parametric Design :: Fundamental Concepts and Essential Skills
- Data Structures :: Working with Lists and Data Trees
- Patterning Logics :: Shifting, Perforating, and Gradients
- Pressures + Influencers :: Working with Attractors, Image Mapping, and Data Sampling Strategies
- Material Strategies :: Folding, Laminating, and Lapping
- Detailing :: Connection Logics and Tolerance
- Coordination :: Logical Naming, Cut Order, and Sheet/Part Management
- Fabrication Workflows :: Layouts and Nesting with Rhinonest…
year, international teams located in key cities around the globe explore a common agenda with projects that are deeply embedded in diverse local conditions. Because of this, participants have an international laboratory to test their design hypotheses, understanding how design conclusions derived locally can be tested and evolved globally in different cities where other teams reside. This intensive two-week course connects each participant to ongoing research agendas in robotics, simulation, physical computing, parametric design, digital fabrication, and other relevant emerging design methodologies. Specific emphasis is placed on understanding the multiscalar implications of design conclusions, thus advancing critical research on the application of new technologies in design.
HYPER CITIES
The way we describe and understand cities today is radically changing, and alongside this change there is also a radical transformation in the tools we use to design them. Cities call for a different approach towards the development of new multi-scalar strategies in urban design and planning solutions. Cities can be described as systems of networked ecologies: a series of co-dependent aggregations revolving around environmental mitigation, land-use organization, communication and service delivery. These generate a complexity that can be organized through technology, laws, political pressures, disciplinary desires, environmental constraints and social interaction. In fact networked ecologies embody the dominant form of organization today: the network, be it telematic, physical or even social.
GSS16 will focus on the potentials of this network to work not only at an urban scale, but also across diverse cities, interconnecting and expanding them. These will ultimately create a dynamic and interactive system of “HYPER- CITIES”: A variety of city-sensors (digital or analogue) processing and transferring information in explicit manifestations, interrelating with the collective environment.
The GSS16 will be directed by IaaC in collaboration with multiple Nodes participating in the course from and in different parts of the globe. All the Nodes engaged in the GSS16 will be challenged to define a 1:1 urban machining intervention transferring and digitally interrelating multiple sets of data collected from hyper-connected cities. The goal is to work as a globally distributed campus to generate a hyper-network of interventions that communicate to each other both locally and globally.
…
Added by Aldo Sollazzo at 3:17am on February 23, 2016
ion, extract structural data, produce 2d drawings, and exchange data with other external software. Nemo also includes free tools to create parametric shapes, such as Naca profiles, hydrofoils, keels, rudders, blade propellers, and sail plans.
Born in 2018 as an academic research project at ENSTA Bretagne, Nemo has since grown up immersed in professional naval architecture practice with L2Onaval.
Since 2021, Nemo is available for purchase with commercial or educational licenses. The following license levels are provided to fit every need, depending on the user's activity:
Free (Designer)
Level 1 (Section + Hydrostatics + Visualization)
Level 1 + 2 (Section + Hydrostatics + Visualization + Resistance + Structure)
We can also help you make best use of our software, provide project guidance, establish specific workflow and create custom tools.
Requirements
Microsoft Windows 10 or Apple macOS 12 Monterey:
McNeel Rhinoceros 7 SR26
(Other Rhinoceros, Windows, and macOS versions have not been tested but may work)
Additional info
Food4Rhino Download
Discourse Forum
Facebook Page
Linkedin Page
Nemo Website
Credits
Author: Mathieu VENOT
Contributors: Paul POINET, Laurent DELRIEU
…
make sure I add this information to groundTerrain_ inputs in the next few days.
So if you are using the "Gismo Terrain Generator" component (formerly the "Ladybug Terrain Generator 2" component), only the following types are allowed for the groundTerrain_ input:
type_ = 2 (surface with rectangular edges)
type_ = 3 (surface with circular edges)
If you are using the "Ladybug Terrain Generator" component, then only:
type_ = 1 (surface with rectangular edges)
is allowed.
As for the terrain not being colored when it is created as a surface, you can additionally analyse it with the "Terrain Analysis" component set to the Elevation analysis type. It can even be colored for rendering afterwards by using the "OSM Render Mesh" component. Check the attached file below.
Have in mind that in urban areas the "Ladybug Terrain Generator" component produces much more precise terrain than the "Gismo Terrain Generator" component. On the other hand, the latter component can generate much larger terrain areas (up to 10,000 square kilometers, at least in theory).
The reason why the component might still work even though a terrain mesh has been added to the groundTerrain_ input is probably that once the groundTerrain_ input fails to convert the mesh to a brep, it ends up equal to None. The component then treats the groundTerrain_ input as empty and runs as if nothing had been added to it (the buildings are laid down on a flat plane with 0,0,0 as the plane origin).
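If that silent fallback turns out to be undesirable, a guard along these lines could make the failure explicit (purely a hypothetical sketch; `coerce_ground_terrain` and `to_brep` are my own names, with `to_brep` standing in for whatever coercion returns None on failure, as described above):

```python
def coerce_ground_terrain(ground_terrain, to_brep):
    """Coerce the groundTerrain_ input, but complain loudly instead of
    silently treating a failed mesh-to-brep conversion as 'no terrain'."""
    if ground_terrain is None:
        return None  # input really is empty: buildings go on the 0,0,0 plane
    brep = to_brep(ground_terrain)  # returns None when conversion fails
    if brep is None:
        raise ValueError(
            "groundTerrain_ could not be converted to a brep; "
            "supply a surface, not a mesh")
    return brep
```

That way a user who plugs in a mesh gets an error rather than buildings quietly laid out on a flat plane.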
Thank you once again for all the testing you are doing!!! It really makes Gismo a better plugin!!…
Added by djordje to Gismo at 12:45pm on February 8, 2017
ck body with view factor 1 for the exterior. Accordingly, I have set this as the default whenever anyone plugs in the word 'outdoor' for the film coefficient, or plugs in a convective film coefficient greater than 10 W/m2K (which is almost certainly an outdoor condition). You can see the changes here on the github and, if you update your components to sync with the github, they will now work in this manner:
https://github.com/mostaphaRoudsari/honeybee/commit/8804bbdc65bc26a2eef97f5ab358a3191b8b6b12
I'll update the example files with these new components soon.
Furthermore, for the sake of giving complete control over to people using the THERM components, I have added an extra input for a "Custom Radiant Environment" (customRadEnv_) and an extra component by the same name to generate what is needed for this input:
This allows you complete control over the radiation model, view factor, radiant temperature of the environment, and the emissivity of the environment. If you leave the viewFactor input blank, it will assume an autoEnclosure model but, if you specify a viewFactor, it will use the black body model along with that view factor.
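As I read it, the decision is: blank viewFactor means the autoEnclosure model, while a numeric viewFactor switches to the black-body model with that factor. A tiny sketch of that branch (hypothetical names only, not the actual THERM/Honeybee code):

```python
def pick_radiation_model(view_factor=None, emissivity=0.9):
    """Blank viewFactor -> AutoEnclosure; a numeric viewFactor -> BlackBody
    with that view factor (sketch of the behavior described above)."""
    if view_factor is None:
        return {"model": "AutoEnclosure", "emissivity": emissivity}
    return {"model": "BlackBody", "viewFactor": float(view_factor),
            "emissivity": emissivity}
```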
Finally, I confirmed that the Constant Heat Flux Boundary Condition is mostly intended to account for solar radiation. So I added an input for this on the boundary condition component.
I'll post back here once I get the chance to update the example files.
Thanks again,
-Chris…
ed to do:
FOA_Bundle_Tower.pdf
The tower height is a variable
The degrees of symmetry in plan is variable from 2 to 10 (2 bundles up to 10 bundles; the actual project has 4 bundles made from 8 individual towers or tubes).
The overall radius or diameter of the circle on which each tower is located is a variable
The tower should match the overall topology of the Bundle Tower: each tube should alternate between touching its neighboring tube on the left and right twice.
The number of floors is a variable
Overall tower height: 500 m
Floor-to-floor height: 4.5 m (I recommend that you increase this to 10 m while testing)
Each tube's plan has an area of roughly 1000 m2
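A first step could be to pin down the derived quantities and the tube layout numerically. Here is a minimal sketch (my own variable names; the touching condition assumes equal tubes on a single ring, which only approximates the alternating left/right touching topology of the actual Bundle Tower):

```python
import math

# Parameters from the brief
n_tubes = 8              # degrees of symmetry in plan: 2 to 10
tower_height = 500.0     # m
floor_to_floor = 4.5     # m
tube_plan_area = 1000.0  # m2 per tube

n_floors = int(tower_height // floor_to_floor)     # whole floors that fit
tube_radius = math.sqrt(tube_plan_area / math.pi)  # circular-plan tube radius

def tube_centres(n, ring_radius):
    """Centres of n tubes evenly spaced on a circle of the given radius."""
    return [(ring_radius * math.cos(2 * math.pi * i / n),
             ring_radius * math.sin(2 * math.pi * i / n))
            for i in range(n)]

# For adjacent tubes to touch, the chord between neighbouring centres
# must equal two tube radii: 2*R*sin(pi/n) = 2*r, so R = r / sin(pi/n).
ring_radius = tube_radius / math.sin(math.pi / n_tubes)
centres = tube_centres(n_tubes, ring_radius)
```

From there, each centre could drive a vertical stack of floor curves in GH, with the alternating touch handled by modulating each tube's centre per floor.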
This is what I have got so far:
foa tower.ghx
I just need guidance because I am so lost. Thank you.
…