w how. Thanks for that. Now I do have some questions.
1. I am using the area weight tool. I am first calculating the volume of the form, then multiplying that value by its density, so for concrete I am using 2400 kg/m^3 x volume. I then divide that number by the area of the membrane that is supporting the mass, which gives me my area weight (a rough sketch of this arithmetic follows point 2 below). It seems to be working well, but I want to verify that this is the correct workflow. I also want to verify that gravity should be turned off, since I am assuming it is already accounted for within the weight component.
2. I am finding that the new triangular element tool works much better than trying to use EA/L as input for the springs from mesh. Even when I set the timestep, subiterations, and drag, I still have issues getting very stiff materials to work. On the new finite elements tool I wanted to verify that E is in pascals, and to ask whether psi can be entered if I am working in imperial units. From what I am seeing, the materials are deforming more than expected, and to get less deformation and stretch in the mesh area I am finding that E needs to be increased beyond the true material values, often by a factor of 10 or 100.
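To make the two points above concrete, here is a rough sketch of the arithmetic (the numbers are placeholders, not values from my actual definition):

# Rough sketch of the area-weight arithmetic from point 1, plus the psi-to-Pa
# conversion asked about in point 2. All numbers are placeholders.

concrete_density = 2400.0   # kg/m^3
form_volume = 0.5           # m^3 of concrete sitting on the membrane (placeholder)
membrane_area = 4.0         # m^2 of membrane supporting that mass (placeholder)

mass = form_volume * concrete_density   # kg
area_weight = mass / membrane_area      # kg/m^2, the value fed to the area weight tool

# If E must be in pascals, an imperial value would need converting first:
PSI_TO_PA = 6894.76
E_psi = 3.0e6                           # placeholder modulus in psi
E_pa = E_psi * PSI_TO_PA                # ~2.07e10 Pa

print(area_weight, E_pa)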
I am going to describe my problem, and I will gladly share the definition if you'd prefer looking it over. Basically I have an inflated membrane at a certain pressure, made of a particular material, with a certain volume of concrete on top of it. My goal is to review the displacements as the concrete is applied over the membrane and find the proper pressures to apply to keep it free from deformation. I am including a picture from a project in which we used Kangaroo and attempted to deal with such issues; it was a class sponsored by Cloud9 architecture held at Art Center College of Design, where I was one of the instructors. Hopefully this illustrates the problem. To summarize, any example file that shows the best way to implement real material properties and unit-based forces would be a helpful reference and would be greatly appreciated.
…
ts in extreme aliasing effects that carry into the 3D realm as regular steps along what should be smooth surfaces.
After sleeping on it, I realized I hadn't yet tried fast Unary Force on fine quad meshes from the standard Grasshopper meshing system that includes the meshing options component.
Bingo! It's fast now. Workable. I don't need super fine meshing since I'm no longer fighting aliasing, but I can still use rather fine local meshes, since Unary Force lets Kangaroo do a simple thing just in the Z direction rather than apply a full 3D force.
After only a minute or so of Kangaroo initialization that slows the interface, each of a dozen needed cycles takes half a second, FOR THE ENTIRE GRAPHIC.
I just set the timer to 1 second so I can move around the interface, and I double click the Windows taskbar timer shut-off to enjoy the result.
WHILE RUNNING VIA TIMER, IF I CHANGE A SPRING/FORCE SETTING IT SUFFERS NO DELAY AT ALL AND JUST ALTERS THE OUTPUT OVER TIME. I can change Unary Force from 20 to 100 and immediately see the bigger areas balloon like crazy:
It's fast enough overall to play with, yet the individual steps are slow enough that it's fun to watch the hysteresis as it overshoots back from 100 to 20 Unary Force, going concave in the middle of bulges then back to more shallow hills.
A force of 1000 is a bit disturbing, I wonder if I can tamp it down with greater spring strength or will that just give me the same result as before?
Looks like it's the same, just the ratio matters. Makes sense I guess. At one point it blew up though. Hitting the reset button...a minute later it blows up again...and just doesn't like huge numbers, so I don't see an advantage playing with bombs. The high mesh strength is pulling the mesh apart.
With low Unary Force and moderate mesh tension, you get flat tops, as if the overall force on the mesh, fighting its anchored edge vertices, is enough to displace it, but the surface itself is too stiff to care about local gravity.
Then you have less flat areas as you increase Unary Force:
Weird, there *is* some sort of absolute effect, rather than just a relative one, between Unary Force and spring stiffness, since now I'm getting flat tops even in the extreme:
Oh, wait, strike that, I may be seeing but a single step with the timer off, subject to hysteresis. With the timer back on...it can sit there a minute...not locked up but just idling...until you see the Display > Widgets > Profiler time start cycling up to near half-minute numbers...which makes you want to hit the reset button...and indeed that locks the interface for another initialization...and yes, it was merely hysteresis, not an equilibrium result. My former flat tops may have been due to that too, since I was using the Windows taskbar timer disabler. The lesson is that you can obtain different results by using a long timer setting and stopping it before it equilibrates.
This script is a keeper, fast and fun after the relatively mild Kangaroo initialization period is over.
The uniform, mostly quad meshing is all done in Grasshopper too, from any flat surface with holes, especially from images of shapes traced with Potrace to give surfaces with holes.
Could I switch to hex meshes from triangular meshes to do the same thing with fewer vertices?
Are there other forces I can add to smooth the bulging? Letting things bulge is not so bad if you then just scale down the result in Z afterwards (though perhaps the same result could be had with lesser force):
Also, can this same thing be done with possibly faster Kangaroo 2?…
Added by Nik Willmore at 10:02pm on February 21, 2016
ly this is a Rhino.Python problem and not a Grasshopper issue, but it could apply to both!
I was trying to take a simple example of moving a ball around and see how it could be animated through Rhino.Python. The code works great in wireframe, with no memory issues at all. However, when I switch the view to Shaded or Rendered, things go south pretty quickly. The RAM usage of Rhino, which was steady around 350 MB (ish), now grows every frame; after a minute or so it is in the GBs and never drops, even after the script has stopped. What gives? Clearly this must be possible, because Bongo does something similar when it does animations. Check out my code below and I would love to hear your thoughts.
import time
import rhinoscriptsyntax as rs
import Rhino
height = 100
width = 100
x = 0
y = 0
xspeed = .1
yspeed = .3
start_time = time.time()
end_time = 60
run_time = 0
sphere = rs.AddSphere((x,y,0), 5)
while run_time < end_time:
    # Advance the tracked position and bounce off the +/- width/2 and +/- height/2 bounds
    x = x + xspeed
    y = y + yspeed
    if x > width/2 or x < -width/2:
        xspeed = xspeed * -1
    if y > height/2 or y < -height/2:
        yspeed = yspeed * -1
    # Move the sphere by one step and let Rhino process its message queue
    rs.MoveObject(sphere, (xspeed, yspeed, 0))
    Rhino.RhinoApp.Wait()
    run_time = time.time() - start_time
…
uts an instance of a class type, which I refer to as Class Instance A for this discussion post.
Component B has a for-loop through which it does a few things and makes "n" variations (in this case, just 3) of Class Instance A. A simplified/generalized structure of Component B is as follows:
a = []
for i in range(n):
    variation = DoSomething(StartShape)
    a.append(variation)
    for obj in ghenv.Component.OnPingDocument().Objects:
        if obj.Name == "MakeAssembly":
            obj.ExpireSolution(False)
I am not sure if I am using the ExpireSolution method correctly here. I also found Grasshopper.Instances.ActiveCanvas.Document.NewSolution(False) in some other discussions, but it doesn't seem to solve this problem; or maybe I have the placement of ExpireSolution or NewSolution wrong. Upon pressing F6 and then F5, Component B keeps updating Class Instance A instead of "resetting" it to its original condition. The for-loop in Component B must receive the original Class Instance A every time in order to spit out variations. I need to find a way to expire the solution for Component A only, at the end of each for-loop. Eventually I need it to loop 100+ times, so manually pressing F5 is not really an option... Below is an image of what should happen:
What is the best way to tackle this problem (in either Grasshopper-Python or IronPython), so that the same original Class Instance A is fed in every time the loop is run in Component B?
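For what it's worth, here is a sketch of the kind of pattern I have pieced together from other discussions (so it may well not be the right approach): expire the target component by name, but let Grasshopper schedule a fresh solution instead of forcing one from inside the running solve.

import Grasshopper

gh_doc = ghenv.Component.OnPingDocument()

def expire_target(doc):
    # Callback run by Grasshopper once the current solution has finished
    for obj in doc.Objects:
        if obj.Name == "MakeAssembly":   # or obj.NickName, depending on how it is labeled
            obj.ExpireSolution(False)    # mark it stale without recomputing immediately

# Schedule a new solution shortly after the current one ends; the callback expires the target
gh_doc.ScheduleSolution(5, Grasshopper.Kernel.GH_Document.GH_ScheduleDelegate(expire_target))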
Thank you! …
aching my skill set here, but bear with me.
I want to create an animated facade of squares which rotate depending on a sequence of grey-scale images. I've got pretty far thanks to many discussions here, but have hit a wall with exporting my animated model to 3ds Max.
Here's my GH script - it's a patchwork of 3 or 4 different things, incorporating Centipede at the start and end to get the animation.
All good and it works! It produces animations which I can sequence for presentations too, thanks to its BMP export, which is sort of a side product.
The problem I have is that the OBJs it produces go wildly wrong when imported into Max. E.g. in Rhino it looks like this:
But when I import them into Max it looks like this:
and as it animates it just gets longer and smaller.
NOW I reckon it might be because my model in Grasshopper is 100 separate geometries and I'd like it to be a single one - but I've not achieved that.
Does anyone have any ideas how to solve this? I would like my end result to look like this rendered still from Max, but animated.
Thank you all! This also uses Firefly, so you might need that installed to see how my file works.
…
Added by chris parrott at 10:34am on September 11, 2015
…ber, 9:00 am to 8:00 pm. This workshop is aimed primarily at architects and designers interested in learning parametric and generative design applied to the generation and rationalization of complex geometries for implementation in different design processes. The course will cover the basic concepts and methodology for tackling various design problems through the development of algorithmic tools with a visual programming language and the development of digital fabrication schemes. No prior knowledge of Rhinoceros 3D or programming is required; prior CAD knowledge is desirable. Students: 2,500 MXN. Professionals: 3,000 MXN.
RENDER CONTEST - 100% SCHOLARSHIP - Parametric & Generative Architecture & Design Grasshopper Workshop.
- Post your render at www.facebook.com/3dmetrica - The render with the most likes wins. - Voting deadline: September 15, 2012.
Information and registration: workshop@3dmetrica.com 04455 28790084 www.3dmetrica.com www.facebook.com/3dmetrica
…
hops, design sessions & symposia across 5 cities in India. We encourage all architecture & design students and professionals to join us in this novel experimentation event and aid in 'Filling The Void'; Void in Architecture, Void in our Cities, Void in Education. REGISTRATIONS ARE OPEN NOW.
rat[LAB] Computational Design Tour - INDIA
Agenda // Filling The Void
1 country // 5 cities // 1 agenda // 100+ students // 25+ professionals // 5 exhibitions // 1 publication
The void is typically defined as null, invalid, empty or redundant, and is psychologically perceived as a ‘negative’. Through years of development in India there has been organic urban growth and inorganic architectural growth, which has led to the formation of voids in both a physical and a metaphorical sense. There also exist voids as gaps between architecture, cities, education and technology. ‘Filling The Void’ looks at the void as an opportunity, a potential, and a driver of change for architecture & design education in India.
// Cities & Dates*
Mumbai – 22nd June to 24th June 2015 (Monday to Wednesday)
Chennai - 29th June to 1st July 2015 (Monday to Wednesday)
Bengaluru – 3rd July to 5th July 2015 (Friday to Sunday)
Chandigarh - 16th July to 18th July 2015 (Thursday to Saturday)
New Delhi – 6th August to 8th August 2015 (Thursday to Saturday)
*Venue details are published on rat[LAB] website.
// Registration Dates
// Early-bird Registrations Open: 08 May 2015
// EXTENDED Early-bird registrations End: 05 June 2015
// General Registrations End: 15 June 2015 (Or till seats last)
…
reaky thing consisting of triangulated "modules" (i.e. an assembly out of this, this and that) where the exterior edges ARE always under tension (= SS 304/316 cables OR nylon) and the interior ones MAY be under compression (= steel, aluminum, wood, carbon) OR ... some of them ... may be under tension. Bastardized T trusses deviate a bit from theory ... but who cares? (not me anyway). T trusses have many variants (but as the greatest ever said: Less is More).
2. Large scale T for AEC is the art of the pointless since it costs around the GNP of Nigeria. Here are some indicative components from a module of a multi-adjustable TX system costing (the module) ~ the price of my Panigale (Google that):
The above was mailed to a friend who has MIT (yes, that MIT: the top dog) in sight ... therefore he needs some appropriate "credentials", he he.
3. The distance that separates the above from the demo TDT node provided is around 666.666 miles - but we don't care: we are after Art, not some testimony to vanity.
4. On purpose I've used a smallish ring to give you a clear indication of constraint numero uno in truss design: CLASH matters.
5. You'll need:
(a) A decision related with the tensioners (classic Norseman + SS cables or nylon machined thingies?).
(b) A machinist who can do elementary stuff (like the adapters) and can weld this to that (the "ring" for instance). His abilities must be 1 on a scale of 100. If the fella has a computer (not a CRAY) and knows what 3dPDF is (hmm) ... well ... use that to communicate with him PRIOR to designing anything: he must agree on the parts BEFORE the whole is attempted (as a design in GH or in some other app).
(c) A carpenter with a wood lathe for the obvious. BTW: BEFORE doing any TDT attempt > ask the carpenter about the available wood strut sizes. Contrary to popular belief, DO NOT varnish the wood (use exterior alkyd/oil stains from some top maker like the notorious US company PPG).
http://www.ppgpaints.com/products/paints-stains-data-sheets
(d) Good quality cigars (and espresso) plus some classic music (ZZTop, PFloyd, Cure, Stones, U2 etc etc) during the assembly.
(e) Faith in the Dark Side (see my avatar).
May the Force (the dark option) be with you.…
Angeles, which has 12% of the year made comfortable, and Shiraz, Iran, which also has 12% comfortable (assuming default parameters).
Jerusalem also makes sense to me. There is only a maximum possible 9% of the year that is inside the polygon (you'll see this if you set the timeConstant to a very high number). The default strategyPar makes 6% of these hours comfortable, with the remaining 3% not having cool enough temperatures in the previous hours. This seems reasonable to me.
I could be convinced to change the default time constant to 12 hours (instead of 8), as I know that 12 is the default of Climate Consultant, but that seemed really idealized to me. In my opinion you'll need really high exposed mass and insulation without much internal heat gain to make conditions stable for more than 8 hours.
As for the solarHeatCapacity, I get changes when I drop it down to 10 W/m2 or boost it up to 100 W/m2. It's definitely a parameter that operates on an "order of magnitude" scale, and little tweaks to it won't change the results too much. You can think of this number as representative of a lot of other physical properties: most notably the depth of the space being passively heated and the thermal mass of that space's materials that participate in heat exchange over the time constant. Climate Consultant uses a default assumption of 30 W/m2 but, from my calculations, this likely assumes a space with a facade-to-floor area ratio greater than 1. If we say that we need to raise the temperature of 10 cm of an exposed concrete floor by 1 K per hour for passive heating purposes, and we have a facade-to-floor area ratio of 1:
Required solar flux = ((1 facade-to-floor ratio) x (0.1 m concrete depth) x (2400 kg/m3 concrete density) x (880 J/kg-K concrete specific heat capacity) x (1 K temperature rise)) / 3600 seconds/hour
This lands you at a required solar flux of roughly 58 W/m2, which is almost twice the 30 W/m2 Climate Consultant default. While we might say that not all 10 cm of concrete participates over the course of a default 8-hour time constant (most of the action is probably within the first 5 cm), we also have to account for things like transmittance of solar through the window, which, for triple pane, is probably only half of the incident solar. So 50 W/m2 seemed to be a more reasonable rule of thumb from my perspective, essentially assuming a facade-to-floor ratio of roughly 1 with 5 cm of concrete participating in an 8-hour heat exchange and a little more than half of the solar heat getting through a fully glazed window.
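If it helps, here is the same back-of-the-envelope arithmetic written out as a small script (the inputs are just the assumptions stated above):

# Back-of-the-envelope check of the required solar flux discussed above.
# Assumptions: 1:1 facade-to-floor ratio, 10 cm of concrete, heated by 1 K over one hour.
concrete_depth = 0.1        # m
concrete_density = 2400.0   # kg/m^3
concrete_cp = 880.0         # J/(kg*K)
delta_T = 1.0               # K of temperature rise
seconds_per_hour = 3600.0

# With a facade-to-floor ratio of 1, flux per m2 of facade equals flux per m2 of floor
required_flux = (concrete_depth * concrete_density * concrete_cp * delta_T) / seconds_per_hour
print(required_flux)        # ~58.7 W/m2, versus the 30 W/m2 Climate Consultant default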
Let me know if that makes sense or if you have any suggestions,
-Chris…
that both the ASHRAE and European Adaptive models were derived from surveys of awake occupants. While the topic has not been investigated as well as it should be, the few adaptive-style surveys of sleeping occupants that have been conducted show that people tend to desire significantly cooler temperatures when they are sleeping as opposed to when they are awake.
Notably, Chapter 8 of Humphreys' recently published book on Adaptive Comfort (https://books.google.com/books?id=lOZzCgAAQBAJ&printsec=frontcover&dq=Adaptive+Thermal+Comfort+Foundations+and+analysis&hl=en&sa=X&ved=0ahUKEwi6npqSi__KAhUJMj4KHf7SCXMQ6AEIKjAA#v=onepage&q=Adaptive%20Thermal%20Comfort%20Foundations%20and%20analysis&f=false) provides some interesting insights into this. In a 1973 survey, Humphreys found that the quality of sleep started to deteriorate at temperatures above 24-26C regardless of the time of year, and that there was no clearly determinable lower limit to comfortable sleeping temperatures (in other words, people were fine at 12C if they were given enough blankets). He surveyed only British occupants who were sleeping in traditional beds with mattresses and a wide range of blankets. This is important because the nature of the findings is such that the comfort temperatures would be very different if the survey participants had been sleeping in a hammock or in closer contact with the ground (both popular practices for a number of cultures living in warmer climates). Traditional mattresses cut the ability to radiate body heat in half as compared to a standing human body, and I would venture a guess that this is a big reason why much cooler temperatures are desired while sleeping on mattresses as opposed to standing awake/upright.
So for your case, if you want to account for a time of the day when occupants are sleeping on mattresses, I would change the comfort temperature for these hours down to 24C. Otherwise, if you are trying to show the comfortable hours of awake people in your space, your current 100% comfortable nighttime hours are a better estimate. I have also noticed that nighttime temperatures become comfortable in extreme weeks of hot/dry climates. This is what is happening in this extreme week simulation of Los Angeles' San Fernando Valley here:
https://www.youtube.com/watch?v=WJz1Eojph8E&index=3&list=PLruLh1AdY-Sj3ehUTSfKa1IHPSiuJU52A
I will put in the ability to set custom values for comfort temperatures into the Adaptive Comfort Recipe soon so that you can test out a 'sleeping comfort temperature' if you would like. I have created a github issue for it here:
https://github.com/mostaphaRoudsari/Honeybee/issues/486
I was not so convinced by Nicol's argument about humidity on those pages as I was when I saw the correlations of both operative temperature and effective temperature to surveyed comfort votes in real buildings. Humphreys shows these correlations on page 106 of the book I linked to above. Notably, the correlation of Effective Temperature to comfort votes (0.257) is slightly worse than the correlation of just Operative Temperature (0.265). In other words, trying to account for humidity actually weakened the predictive power of the metric. This difference in correlation is not so great as for me to discount an Adaptive comfort model based on Effective Temperature (as de Dear once proposed). However, the correlations of PMV (0.213) and SET (0.185) to comfort votes are so poor that I now use the PMV model only with great caution.
The reason for the decreased importance of humidity may be multi-faceted, whether it's Nicol's explanation or another. Still, the data suggest that we are probably better off ignoring humidity when forecasting comfort and should only consider it when evaluating conditions of extreme heat stress, where people's primary means of losing heat is through sweating.
-Chris…