ity of the contemporary built environment and society. I think it makes very clear a concern of mine about how parametric design tools can help respond to the complexity of generating, through design, a circular urban metabolism within a highly complex system that includes city/nature no longer in the classic distinction of nature as something set apart and sublime, but as a whole, where even the very idea or concept of nature works against the symbiosis that ought to exist between these two traditionally separate domains.
2. I was struck by the clarity of the analogy he draws between genotype/phenotype and adaptive components on populated surfaces. To me this is the essence of parametric design, and it also reflects the idea of the "field" very well. The anchoring or constraint of the component and its structures can then generate infinite variability and differentiation, insofar as it holds associative relationships that adapt to conditions varying in time/place/form. This idea, I believe, lies at the heart of a personal concern of mine: how to effectively design for sustainability. Without a doubt, the most sustainable thing would be not to design something twice, but rather to adapt and respond to change.
3. Another point that struck me is the abolition of functional sub-systems, which breaks with the modern paradigm. We would then be facing one large system in which, from start to finish, all components are effectively interrelated, with the understanding that none remains stationary or insensitive to change if another component varies.
Best regards,
…
one bug on our end.
First, you set the natural ventilation setpoint to be only one degree greater than the heating setpoint. This frequently causes the building to expend heating energy reaching the setpoint only to throw away this heat a few minutes later when the indoor temperature rises by one degree. Generally, I would leave at least 3 degrees between these two setpoints unless you are using some special types of schedules that ensure this conflict does not happen.
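To see why the 1-degree gap hurts, here is a deliberately crude toy model (all numbers invented, and not an EnergyPlus calculation) comparing the heat thrown away by natural ventilation for a 1-degree versus a 3-degree deadband:

```python
def wasted_heat(heat_sp, vent_sp, hourly_gains):
    """Toy zone model: returns (heating energy spent, heat dumped by venting),
    both in degree-hours. All physics here is invented for illustration."""
    temp = heat_sp
    heated = wasted = 0.0
    for gain in hourly_gains:
        if temp < heat_sp:              # thermostat tops the zone back up to setpoint
            heated += heat_sp - temp
            temp = heat_sp
        temp += gain                    # solar/internal gain pushes the temperature up
        if temp > vent_sp:              # windows open; excess heat is dumped outside
            wasted += temp - vent_sp
            temp = vent_sp
        temp -= 2.0                     # envelope losses between hours

    return heated, wasted

# 1-degree gap vs 3-degree gap, same modest hourly gains
h1, w1 = wasted_heat(20.0, 21.0, [1.5] * 24)
h3, w3 = wasted_heat(20.0, 23.0, [1.5] * 24)
```

With the narrow deadband the zone heats up to the setpoint and then vents part of that same heat away almost every hour; with the wider deadband the gains are absorbed without any venting.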
Second, I do not know what you were trying to do with so many solve adjacency components but, when you hook up only one zone to this component and set "removeAdjacencies" to true, you will overwrite all of the previous boundary conditions that you set in the surface-by-surface method. These were the boundary conditions that you were sending into EnergyPlus in the file you uploaded:
As you can see, you were blowing your heating load out of proportion by getting rid of your originally-set adiabatic walls. In the attached GH file, I use just one solve adjacencies component and these are your resulting boundary conditions:
Finally, there was a bug in the function that checks the normal direction of surfaces input to the energy model and Mostapha just fixed this one this past week (https://github.com/mostaphaRoudsari/Honeybee/issues/365). This was causing the solar gain calculation to be incorrect in your model, which, admittedly, is the major reason why the heating loads between the north and south facades were not different. As you see in the attached GH file, you now get very different heating loads for the north and south facades:
I hope that all of this helped and, again, sorry for the delay.
-Chris…
pManager.AddPoint3dParameter("Points", "P", "description", GH_ParamAccess.item);
pManager.AddCurveParameter("Curves", "C", "description", GH_ParamAccess.item);
Register Output:
pManager.AddPoint3dParameter("Nodes", "N", "description", GH_ParamAccess.item);
pManager.AddPoint3dParameter("Tails", "T", "description", GH_ParamAccess.item);
Tails are the NurbsCurve start points, and that is where I am getting 3 points instead of the expected 2.
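I can't tell from the post alone why a third point appears, but if two of the curve segments share a coincident start point (e.g. after a split), deduplicating within a tolerance would reveal it. A plain-Python sketch, with tuples standing in for Point3d and `dedup_points` a hypothetical helper, not an SDK call:

```python
def dedup_points(points, tol=1e-6):
    """Drop any point that coincides with an earlier one within tol."""
    unique = []
    for p in points:
        # keep p only if it is farther than tol from every point kept so far
        if all(sum((a - b) ** 2 for a, b in zip(p, q)) > tol ** 2 for q in unique):
            unique.append(p)
    return unique

# hypothetical tails: the first and third point are coincident within tolerance
tails = dedup_points([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1e-9)])
```

If the count drops to 2 after deduplication, the extra tail is a coincident duplicate rather than a genuinely distinct curve start.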
Many Thanks!
…
other ordering mechanisms. If you feed layer names in explicitly, output is grouped by layer. If you feed multiple types or multiple object name filters, output is grouped by these instead, following standard data tree behavior. There are a number of ways to achieve layer-based sorting of objects using existing components:
Letting anything be organized by the layer-palette order is extremely sticky, since it can be re-ordered on the fly by various sort mechanisms. See more on this topic in my conversation with Tim Halvorson in the comments on the Human Group page. As it stands now I do not alphabetize layers in the layer table component as you say - I use the layer index, so that as new layers are added/rearranged, the entire order of the set doesn't change in unpredictable ways. If layer sort order is preferable to you, you can use the simple one-line script I provided Tim in order to retrieve the sort index and use it to sort your layers. My priority for both object order and layer order in the components is to minimize unnecessary changes/refreshes/event listening by returning everything exactly as the SDK gives it to me.
(3) I've added the Include Locked and Include Hidden options to the latest release attached to this post. However, this only pays attention to hidden objects, not hidden layers. I do not want to include a toggle for "Include Hidden Layers" because this would force the dynamic pipeline to expire every time a layer's visibility changed, which I do not want; for most purposes this would cause unnecessary recomputing. However, with the components as they exist now, you can drive the dynamic pipeline with a layer table component set to auto-update, like so:
This has the effect I think you are after, which is to ignore objects on layers that are hidden - and recognize them as soon as the layer is turned on.
…
to do once I figured out how you use only a small portion of each of my generated curves to make the 360 degree Loft surface. I had a huge AHA! moment when I realized the complete Loft surface really only needs a small portion of the generated curves rotated around to form a closed (except for top & bottom) surface. That is a major new insight for me and I appreciate you pointing it out.
I also tweaked the Twist angle parameter a bit so the resulting positive and negative Twist surfaces, when combined, yielded a result that was closer to my original shape. This is when I discovered something very interesting.
When I baked/exported the result using just one of the 2 twisted surfaces I got an STL file that had no errors, that 3D Builder was able to simplify from a 37 MB file to a 3 MB file, and that sliced A-OK. But, when I combined the left and right twisted surfaces, I was back with my same set of problems: the exported STL file had many errors, could not be fixed, and did not slice properly.
I went back to my original layout that uses the complete set of generated curves to create the Loft surface and found I got exactly the same results: using only one twisted surface worked fine, but nothing worked when the left and right twisted surfaces were combined. By nothing I mean I tried all the standard methods (GH Join and SUnion, Rhino Union/Join, etc.). What I think this means is that the Loft surface behaves the same, and apparently is the same, regardless of whether it is generated by rotating strips or by using complete closed curves.
Furthermore, I am guessing the problems with the combined/exported STL file made from both left and right twisted surfaces have to do with overlapping/coincident parts of each one, like the top & bottom planar surfaces and some of the wiggly parts.
If I am correct about this, then it suggests to me that there is some sort of glitch in Rhino's STL export function. This is surprising to me, since I thought an STL file only paid attention to the external shape of things and did not know or care about any inside stuff. Of course this is all conjecture on my part, but at least for now it seems it will be impossible for me to actually print the double-twisted geometry.…
Added by Birk Binnard at 3:52pm on September 23, 2016
be in your definition is impossible.
The 2 (or 3) sides of the cube's corner pieces can never be the same color (see image):
because they are actually part of the same piece (they always move together).
So in your definition it is as if you have removed the stickers from the cube and replaced them randomly, which results in an unsolvable cube...
In order to start with a properly scrambled cube, I believe you could start with a solved cube and perform a large number of random rotations on it (just like you would in real life).
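That scrambling idea can be sketched as generating a random move sequence in standard face-turn notation (U, D, L, R, F, B, with ' and 2 suffixes); this is just the sequence generator, it does not model the cube state itself, and the move-count of 25 is an arbitrary choice:

```python
import random

FACES = "UDLRFB"
SUFFIXES = ["", "'", "2"]  # quarter turn, inverse turn, half turn

def random_scramble(n_moves=25, seed=None):
    """Generate a scramble sequence, never turning the same face twice in a row."""
    rng = random.Random(seed)
    moves, last_face = [], None
    for _ in range(n_moves):
        # avoid repeating the previous face, since e.g. "R R'" cancels out
        face = rng.choice([f for f in FACES if f != last_face])
        moves.append(face + rng.choice(SUFFIXES))
        last_face = face
    return moves

scramble = random_scramble(25, seed=1)
```

Applying such a sequence to a solved cube always yields a reachable (and therefore solvable) position, which is exactly what random re-stickering cannot guarantee.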
On another subject:
"There are over 43 quintillion legal positions of the Rubik’s Cube.
It would take thirteen hundred million years to see every position if you were able to view one thousand per second.
If we stacked 43 quintillion pennies, the stack would be tall enough to reach the sun and return to the earth four thousand billion times."
source: http://b.chrishunt.co/how-many-positions-on-a-rubiks-cube
So, trying to brute-force the Rubik's cube is definitely not the way to go... :)
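The quoted viewing-time figure is easy to sanity-check; the exact count of legal cube states is 43,252,003,274,489,856,000:

```python
POSITIONS = 43_252_003_274_489_856_000   # exact number of legal cube states
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# viewing 1000 positions per second, as in the quote
years_to_view_all = POSITIONS / 1000 / SECONDS_PER_YEAR
```

This works out to roughly 1.37 billion years, matching the quote's "thirteen hundred million years".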
Of course there are a number of programming algorithms for solving the cube (examples), but I don't know how easy it would be to implement them in GH....
Best of luck and please keep us posted!
Nikos
…
Added by nikos tzar at 10:42am on January 31, 2017
supplied _values of _keys" notice. I tried running the "OSM 3D" component first with the groundTerrain_ input. As I did not get the above notice message, I closed Rhino entirely to cut the waiting time. Then I tried running it without the groundTerrain_ input, and in some 15 minutes I got the following buildings:
I think I may understand what was causing the problem: when one takes a large radius, it covers a large area, and with this area comes a large amount of information (keys and values). You can get hundreds of keys, or thousands. What can happen is that these hundreds of keys exceed the shapefile's capacity to store keys. So basically, in the case of a 750 meter radius, your "height" or "building:levels" keys somehow slipped beyond this allowable capacity. In the case of 800 meters they were somehow allowed in (a bit of a bad term, sorry) before the allowable capacity was reached; this depends on the number of keys whose names alphabetically precede "h" and "b". The best way to solve this issue is to know which data you actually need, and use the "OSM Keys" component to generate the list of needed keys. In this way, only those keys that you need will be used; the others will be disregarded. You do not even have to use the "OSM Keys" component if you know exactly which specific keys you need. Check the attached file below: I grouped the "OSM Keys" solution as "a" and a custom-defined list of keys as "b".
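The idea of restricting the export to only the keys you need can be sketched in plain Python (hypothetical tag dictionaries, not Gismo's actual code):

```python
def filter_tags(osm_tags, required_keys):
    """Keep only the OSM tags we actually need; everything else is dropped,
    so the shapefile's key capacity is never exhausted by irrelevant keys."""
    return {k: v for k, v in osm_tags.items() if k in required_keys}

# a made-up tag set for one building footprint
tags = {"addr:street": "Main St", "building": "yes",
        "building:levels": "4", "height": "12", "roof:shape": "flat"}

needed = filter_tags(tags, {"height", "building:levels"})
```

With an explicit required-key list, "height" and "building:levels" can never be crowded out by hundreds of unrelated keys, regardless of the radius.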
2) The component running time might now be cut with the picked "requiredKeys_" input I mentioned at the end of the previous part 1).
3) The "OSM 3D" component's "randomHeightRange_" input is supposed to do exactly that: randomly create 3D buildings (or 3D trees) when there are no valid "height" or "building:levels" tags. I have just changed one line of the "OSM shapes" component code; I wonder if it would cause any problem on your PC. Please let me know if the LocationGrabber03_Gismo2.gh file works.…
Added by djordje to Gismo at 2:34pm on February 11, 2017
Giulio's latest bakeAttribute, so it also sets a specified layer color?
Thanks,
Phillip
Reply by Giulio Piacentino 1 hour ago
Hi Phillip
If possible, you should try to modify layer colors independently from baking. A layer can have only one color, but many objects.
To modify a layer color, use something along these lines:
if (!string.IsNullOrEmpty(layer) && !color.IsEmpty)
{
    int n = RhinoDoc.ActiveDoc.Layers.Find(layer, true);
    if (n < 0) return;
    Rhino.DocObjects.Layer l = RhinoDoc.ActiveDoc.Layers[n];
    l.Color = color;
    A = l.CommitChanges();
}
Can I also ask you to start a new discussion next time? I hope this helps,
- Giulio
…
mple:
I wish to populate a rectangle with some random points, but I need them to be more dense at the base of the rectangle and then linearly getting more and more sparse towards the top.
This is how I worked around it:
1) first I have created a triangular prism,
2) then I've populated its volume with some random points
3) and finally I've projected them on the plane I'm wishing to populate.
But I don't really like the final result, since the points are not as nicely spaced as those produced by the "Populate 2D" component. They look kind of "clumpy":
Do you have any better idea?
The best thing would be to be able to put a grayscale bitmap underneath and use it as a "density map"...
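The density-map idea can be prototyped with rejection sampling: draw uniform points in the rectangle and keep each one with a probability given by a density function. Here the density is a linear gradient (densest at the base), but a grayscale bitmap lookup would slot into the same place. This is a pure-Python sketch, not a Grasshopper component:

```python
import random

def populate_with_density(width, height, n_points, density, seed=None):
    """Rejection sampling: keep a uniform sample (x, y) with
    probability density(x, y), which must return a value in [0, 1]."""
    rng = random.Random(seed)
    pts = []
    while len(pts) < n_points:
        x, y = rng.uniform(0.0, width), rng.uniform(0.0, height)
        if rng.random() < density(x, y):
            pts.append((x, y))
    return pts

# linear gradient over a 20 x 10 rectangle: probability 1 at the base, 0 at the top
linear = lambda x, y: 1.0 - y / 10.0
pts = populate_with_density(20.0, 10.0, 500, linear, seed=42)
```

Note that rejection sampling does not give the even, blue-noise spacing of Populate 2D; adding a minimum-distance check (dart throwing) to the accept step would get closer to that look while still honoring the density map.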
Here you have the .gh file I made:
prism.gh
Thank you very very much for the help! :)
By the way:
While I was preparing my 3d random distribution of points I've spotted a weird behaviour of the random command:
Even if the seeds are all different, for some values of them the points still belong to some common planes...
To solve that I had to jitter the output of one of the Random components.
I suppose this is a weakness of the pseudorandom generator implemented in the random component, isn't it?…
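I don't know which generator Grasshopper's Random component actually uses, but points falling on common planes is the classic weakness of linear congruential generators. The infamous RANDU generator shows the effect exactly, and it can be verified in a few lines of pure Python:

```python
def randu(seed, n):
    """The infamous RANDU LCG: x_{k+1} = 65539 * x_k mod 2**31."""
    xs, x = [], seed
    for _ in range(n):
        x = (65539 * x) % 2 ** 31
        xs.append(x)
    return xs

xs = randu(1, 100)

# Every consecutive triple satisfies x[k+2] - 6*x[k+1] + 9*x[k] = 0 (mod 2**31),
# because 65539**2 = 6*65539 - 9 (mod 2**31), so all triples fall on a small
# family of parallel planes in 3D.
flat = all((xs[k + 2] - 6 * xs[k + 1] + 9 * xs[k]) % 2 ** 31 == 0
           for k in range(98))
```

So if the component's generator has a similar structure, jittering one of the coordinate streams, as you did, or seeding each axis from an independent generator, is a reasonable workaround.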
ilion.
Then I sketched the outline curves in Rhino with a few control points. The building is symmetric, so I only drew one side. But I'm not sure what is better for a Voronoi: a sharp or a soft surface? Or do I need points?
So I have some questions:
1. How can I loft the curves correctly? My problem is that if I divide my curves to get more control points, Grasshopper automatically changes my curve. That's OK, but then I have a problem with a short curve that fitted the long one before, but can't connect after the division.
So I tried duplicating the long curve and splitting it, but with the Shatter component it doesn't work; it always cuts the curve somewhere.
2. My next problem: the curves in Rhino should be my main construction, which is always visible, so I decided to offset the curves to get a column. But I don't know how to orient the offset curves along the XYZ axes.
3. Once I have the surfaces, how can I build a Voronoi that is offset and maybe has some different thicknesses? :D
It would be really great if someone can help me. I tried a lot, but not everything is simple.
Sorry for my bad English.
Thx, Max
Here are my files:
FCP_MAX_GH_konstruktion_1.3dm
FCP_MAX_GH_konstruktion_1.gh
…