Other than that:
1. Tensegrity is a "static" thingy in the sense that you use some module (let's call it "mode") and repeat. Writing code that INVENTS new modes for T trusses (Pulitzer/EMMY/Nobel on sight, he he) ... I would strongly suggest forgetting that THIS VERY MOMENT.
2. Applying some T "mode" to something (see my examples in the above thread where I use surfaces for the T nodes) is another animal. If you intend to use Kangaroo to "relax" that something (NOT the T itself), well ... you can do it, but it has nothing to do with T.
3. The Kangaroo def provided is a "way" to test the "rigidity" of the T in use. It's a "post-processing" thing, NOT a T-solving thing.
4. I have a terrible feeling: are you saying that (a) without knowing a thing (or two) about C#, (b) without knowing K1/K2, (c) with limited GH experience ... your goal is to write from scratch an FEA ("Femap") thingy that ALSO does node "relaxation"? If so ... well ... what about skydiving (without a parachute) or that classic Russian roulette "game"?
PS: shown double tetra (classic) and XFrames (classic) T trusses applied in open and closed surfaces.
But of course these are abstract, stupid "arrangements", utterly out of the question in real life: read CAREFULLY the discussion in the thread provided above AND also study the attached 3dPDF (showing one system out of many available) in order to get the gist of what real life means (Note: EVEN if no real parts are used ... the node calculation is different from the abstract "star" connections pictured above - by "star" I mean that cables meet at a single point in space without any "offset" etc etc).
Moral: Seppuku
Plan Z: Skype ASAP
…
a machine that is light and very sturdy. I have taken my MacBook Pro all around the world, carry it with me every day, even dropped it a few times, and it's still totally fine. It's thin and light.
2) You get some actual support for your hardware even a few years down the line. My MacBook Pro is from 2012 and I can still walk into any Apple Store and get help with it, which I have done many, many times in different places around the world - I never had to show a receipt or was charged any money for help. There is no PC/laptop manufacturer in the world with anything close to that, because companies like Asus, Dell, etc. bring out dozens of new versions of laptops every year, so it's much harder to get them serviced after a few years.
3) This is the most important one, which people usually forget when they say that MacBooks are overpriced: resale value. If you have ever tried to sell an old PC/laptop (I have a few times), you will know how little value they have even after just 2-3 years. MacBooks retain their value very well, and even after 4 years you can still get 50% of your original price.
4) Of course you can install Windows on it and it runs perfectly. I have macOS and Windows on it and both run absolutely fine. On the Windows side I have Rhino+GH, Maya and a few others. Having Windows is good, because some software still only runs on Windows (looking at you, 3DSMax!). Most other software also runs on macOS. In the interest of sanity it is great to have an alternative to Windows for all the day-to-day stuff, like Mail, Calendar, Photos, Presentations, etc., that just always works.
5) As for performance: yes, MacBook Pros don't necessarily have the latest and greatest in graphics cards (the rest is on par with PC laptops), but unless you want to play games you will not need it. V-Ray RT can do GPU rendering, but you won't get great performance from a notebook GPU anyway, and it doesn't make sense to do rendering on a laptop (especially since you have a workstation). You could get one of the older MacBook Pro Retina Late 2013 or Mid 2014 models with the Nvidia GTX 750M, which will be usable for rendering with V-Ray RT, though of course not with huge performance. Better to invest in a good used graphics card for your workstation, like an Nvidia GTX 980 Ti, which is the best value for money for GPU rendering right now (lots of used ones available).
So at least consider also getting a MacBook Pro. You can buy refurbished models (depending on where you are) and they are like new but a lot cheaper, or even get an older one that's used. It will be a worthwhile investment.
Take it from someone who has used dozens of PCs and Macs in his lifetime and does the IT support here at work (where we also use both).
I still have my MacBook Pro Retina from 2012 and it's still running perfectly, super fast, and I can use Rhino and GH for huge files, do GPU rendering with Octane Render, and all sorts of other heavy computing stuff.
Hope that helps.…
Added by Armin Seltz at 11:12am on September 19, 2016
phere with the maximum number of triangles, but not much more than a defined threshold.
I scaled that mesh just to fit the Rhino grid, but that is not mandatory. What is useful is to scale the mesh non-uniformly (Scale NU). It could be done before the cellular modifier is applied, after it, or both before and after; the 3 options are possible in the script. If you don't need them, just put 1 in the scale sliders.
The ellipsoid mesh is then populated with points; I put 2 independent populations to randomize a bit further. For each vertex of the mesh, the closest distance to the populated points is calculated.
Here is an illustration in color of this distance.
This distance is then used to calculate a bump. If the domain for the bump goes from negative values to 0, it carves the mesh; otherwise it bumps/inflates it.
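The distance-to-bump step can be sketched in plain Python as follows. This is a minimal illustration, not the actual GH script: the mesh is assumed given as parallel lists of (x, y, z) vertices and unit normals, and names like `bump_mesh` and `bump_domain` are invented here.

```python
import math

def closest_distance(vertex, points):
    """Distance from a mesh vertex to the nearest populated point."""
    return min(math.dist(vertex, p) for p in points)

def bump_mesh(vertices, normals, points, bump_domain=(-0.3, 0.0)):
    """Move each vertex along its normal by an amplitude obtained by
    remapping its closest-point distance into bump_domain. A domain that
    runs from negative values to 0 carves the mesh; a positive domain
    bumps/inflates it instead."""
    dists = [closest_distance(v, points) for v in vertices]
    d_min, d_max = min(dists), max(dists)
    lo, hi = bump_domain
    moved = []
    for v, n, d in zip(vertices, normals, dists):
        # normalize the distance to 0..1, then remap into the bump domain
        t = (d - d_min) / (d_max - d_min) if d_max > d_min else 0.0
        amp = lo + t * (hi - lo)
        moved.append(tuple(vi + amp * ni for vi, ni in zip(v, n)))
    return moved
```

With the default carving domain (-0.3, 0.0), vertices closest to the population get the full negative offset and the farthest vertices stay put; flipping the domain to positive values inflates instead.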
Some images to illustrate the difference between populating 100 points as one or as two populations.
Here are some images to illustrate the application of scale before carving or after.
The next phase applies noise. At the moment I don't find it good.…
Basically, I want to create a workflow to automatically subdivide a building mass envelope geometry into different floors, which will be further subdivided into perimeter zones and core zones.
But I encountered an error with a particular building mass geometry (a quite regular form) which doesn't work with the split building mass component (see items 4 & 5 below):
The workflow is:
1. import building mass geometry:
2. divide the building mass into floors (one zone per floor) using one of the two different methods depending on whether the floor surface has holes or not:
3. use the split building mass component to further divide the zone for each floor into perimeter zones and core zone:
4. I tested several building forms that work with this workflow, as shown below, except for one form, C05, which is a courtyard block with small tower blocks on top of it:
5. in the last step, there is an error from the split building mass component saying "solution exception: index out of range: 0" ...
So, I wonder if this error is related to the split building mass component or to the way the building mass geometry is created.
Appreciate your kind advice!
Thank you!…
d simulate the bending process of a flat steel sheet in order to get the same shape. This can be really interesting so we can evaluate the material behaviour, the deformation of the cross section, and explore big deformations in the mechanical analysis of materials.
I am not a mechanical engineer nor a civil engineer; I'm an architect, and my interest is the construction method and extracting the necessary information to consider fabricating the project.
I'm having conceptual challenges with the methodology for this simulation, so I will post a small overview of what I've done.
1.- Understanding the Geometry.
This is a sculpture by the Venezuelan/Hungarian/German artist Zoltan Kunckel (KuZo).
The shape is achieved by bending a square sheet of stainless steel that has been pre-cut with a water jet. After being bent manually, the different lashes are pulled in the opposite direction. New curvatures are produced once everything is deployed.
2.- Reproducing the Shape digitally.
Using Karamba, I built a definition to reproduce the shape produced by physical stress. This model served to find the deformations that occur when a set of loads is applied to a mesh. Following this process will allow us to find a coherent and more natural cross section, so that we can then re-shape it by simulating the bending process of a piece of ductile material.
3.- Discretizing curve
Reducing the model to its simplest elements is a key aspect of finite nonlinear analysis. Once our shape is defined, we can subdivide it along its principal given curve.
At this point I have already found the desired curve.
I think the best strategy to simulate bending the steel sheet into this shape is to rationalize the curve and divide it, finding the tangents of the curves that compose this sort of parabola, but I don't know how to parametrize that in GH.
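One way to sketch the "divide the curve and find its tangents" step outside GH is plain numerical sampling. This is only an illustration under assumptions: the parabola below is a stand-in for the curve extracted from the model, and the tangents are estimated by central differences rather than by any exact GH method.

```python
import math

def curve(t):
    """Stand-in parametric curve: a parabola in the XZ plane."""
    return (t, 0.0, t * t)

def discretize_with_tangents(f, n, t0=0.0, t1=1.0, h=1e-5):
    """Sample the curve at n+1 parameters and estimate the unit tangent
    at each sample by a central difference."""
    samples = []
    for i in range(n + 1):
        t = t0 + (t1 - t0) * i / n
        p = f(t)
        a, b = f(t - h), f(t + h)
        # central-difference derivative, then normalize to a unit tangent
        v = tuple((bi - ai) / (2 * h) for ai, bi in zip(a, b))
        length = math.sqrt(sum(c * c for c in v))
        samples.append((p, tuple(c / length for c in v)))
    return samples
```

In GH the same idea maps to Divide Curve (or Evaluate Curve), whose outputs include the tangent vectors at the division points.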
Please, if someone has a better idea about this process, I'll be glad to read suggestions.
Tomás Mena
…
rrect, the heat balance of a zone is always 0 = Qcool/heat + Qinf + Qvent + Qtrans + Qinternalgains + Qsol. These parameters also correspond with the readEPresult component. However, if I sum up these values there is a slight deviation.
The deviation is greater during daytime and in winter, suggesting it has something to do with the heating values.
Attached you'll find an image of the EnergyPlus outputs that I use and the resulting .CSV file that I constructed. In it you'll see that the balance does not add up.
Am i missing some energy flows?
Thanks for the help.
Hour[h]  Qbal[kWh]  Qint[kWh]  Qsol[kWh]  Qinf[kWh]   Qvent[kWh]  Qtrans[kWh]  Tair[°C]  Tdrybulb[°C]  DIFFERENCE[kWh]
1        3.039357   0.137702   0          -0.253218   -0.321929   -2.000028    20        5.1           0.601884
2        3.107099   0.125462   0          -0.247457   -0.315484   -1.881276    20        4.6           0.788344
3        3.181073   0.119342   0          -0.261765   -0.334485   -2.473788    20        4.3           0.230377
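To make the deviation concrete, here is the first hourly row summed in plain Python, assuming Qbal is the Qcool/heat term of the balance above. If the balance closed exactly, the residual would be zero; instead it reproduces the DIFFERENCE column.

```python
# Heat balance check for hour 1, using the values from the table above.
# Assumption: Qbal is the Qcool/heat term of
#   0 = Qcool/heat + Qinf + Qvent + Qtrans + Qinternalgains + Qsol.
row = {
    "Qbal":   3.039357,   # heating/cooling energy reported by E+
    "Qint":   0.137702,   # internal gains
    "Qsol":   0.0,        # solar gains
    "Qinf":  -0.253218,   # infiltration
    "Qvent": -0.321929,   # ventilation
    "Qtrans": -2.000028,  # transmission
}

residual = sum(row.values())   # would be 0 if the balance closed
print(round(residual, 6))      # prints 0.601884, matching DIFFERENCE
```

So the DIFFERENCE column is exactly the energy not captured by these six outputs, i.e. whatever flow (or storage term) is missing from the sum.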
…
can divide a surface with any imaginable pattern. 3. Here I provide a way to do it via Lunchbox ... it works, but it is fixed, so we need to play with data trees in order to create the appropriate pattern for each case. 4. The other component is a C# thing that does many things besides dividing any collection of points with numerous patterns (see the ANDRE model I made for you). 5. You have to decompose a polysurface into pieces in order to work on the subdivisions. 6. I am also giving another definition that could act as a tutorial on how to process point sets via standard GH components and classic methods.
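As a rough illustration of the point-set handling in item 6, here is a plain-Python sketch (no GH; the surface is a hypothetical stand-in) that evaluates a u x v grid of points and groups it into quad panels, one nested list per panel, which is essentially what a GH data tree holds per branch.

```python
def surface(u, v):
    """Stand-in parametric surface: a simple paraboloid patch."""
    return (u, v, u * u + v * v)

def uv_grid(f, nu, nv):
    """Evaluate the surface on an (nu+1) x (nv+1) grid of parameters."""
    return [[f(i / nu, j / nv) for j in range(nv + 1)] for i in range(nu + 1)]

def quad_panels(grid):
    """Group the grid into corner quadruples, one 'branch' per panel."""
    panels = []
    for i in range(len(grid) - 1):
        for j in range(len(grid[0]) - 1):
            panels.append([grid[i][j], grid[i + 1][j],
                           grid[i + 1][j + 1], grid[i][j + 1]])
    return panels
```

Other division patterns (diamonds, staggered quads, etc.) are just different ways of walking the same grid of points.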
Let me know if any of this seems unclear to you: if so, I could write a definition using classic GH components - but you would lose the division-pattern variations.
best, Peter
…
ers can be applied from the right-click Context Menu of either a component's input or output parameters. With the exception of <Principal> and <Degrees> they work exactly like their corresponding Grasshopper Component. When an I/O Modifier is applied to a parameter, a visual Tag (icon) is displayed. If you hover over a Tag, a tooltip will be displayed showing what it is and what it does.
The full list of these Tags:
1) Principal
An input with the Principal Icon is designated the principal input of a component for the purposes of path assignment.
For example:
2) Reverse
The Reverse I/O Modifier will reverse the order of a list (or lists in a multiple path structure).
3) Flatten
The Flatten I/O Modifier will reduce a multi-path tree down to a single list on the {0} path.
4) Graft
The Graft I/O Modifier will create a new branch for each individual item in a list (or lists).
5) Simplify
The Simplify I/O Modifier will remove the overlap shared amongst all branches. [Note that a single branch does not share any overlap with anything else.]
6) Degrees
The Degrees Input Modifier indicates that the numbers received are actually measured in Degrees rather than Radians. Think of it as a preference setting for each angle input on a Grasshopper Component, stating that you prefer to work in Degrees. There is no Output option, as this is only available on Angle Inputs.
7) Expression
The Expression I/O Modifier allows you to change the input value by evaluating an expression such as -x/2, which will halve the input and make it negative. If you hover over the Tag, a tooltip will be displayed with the expression. Since the release of GH version 0.9.0068, all I/O Expression Modifiers use "x" instead of the nickname of the parameter.
8) Reparameterize
The Reparameterize I/O Modifier will only work on lines, curves and surfaces, forcing the domains of all geometry to the [0.0 to 1.0] range.
9) Invert
The Invert Input Modifier works in a similar way to a NOT gate in Boolean logic, negating the input. A good example of when to use this is on [Cull Pattern], where you wish to invert the logic to get the opposite result. There is no Output option, as this is only available on Boolean Inputs.
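As a rough mental model of the tree modifiers above, here is a plain-Python sketch that represents a data tree as a dict mapping path tuples to item lists. It mirrors the behavior described for Flatten, Graft and Simplify; it is an illustration, not Grasshopper's actual implementation (in particular, Simplify here only strips a shared path prefix).

```python
def flatten(tree):
    """All items merged onto the {0} path, in path order."""
    out = []
    for path in sorted(tree):
        out.extend(tree[path])
    return {(0,): out}

def graft(tree):
    """One new branch per individual item."""
    out = {}
    for path, items in tree.items():
        for i, item in enumerate(items):
            out[path + (i,)] = [item]
    return out

def simplify(tree):
    """Remove the path prefix shared by all branches; a single branch
    shares no overlap with anything else, so it is left alone."""
    paths = list(tree)
    if len(paths) < 2:
        return dict(tree)
    prefix = 0
    # grow the prefix while every path agrees and keeps >= 1 element
    while all(len(p) > prefix + 1 and p[prefix] == paths[0][prefix]
              for p in paths):
        prefix += 1
    return {p[prefix:]: v for p, v in tree.items()}
```

For example, grafting {0: [a, b]} yields branches {0;0} and {0;1} with one item each, and simplifying {5;0} and {5;1} drops the shared leading 5.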
…
ing the maps to the broader community.
At the moment, there are just a few known issues left that I have to fix for complex geometric cases but they should run smoothly for most energy models that you generate with Honeybee. Within the next month, I will be clearing up these last issues and, by the end of the month, there will be an updated youtube tutorial playlist on the comfort tools and how to use them.
In the meantime, there's an updated example file (http://hydrashare.github.io/hydra/viewer?owner=chriswmackey&fork=hydra_2&id=Indoor_Microclimate_Map) and I wanted to get you all excited with some images and animations coming out of the design part of my thesis. I also wanted to post some documentation of all of the previous research that has made these climate maps possible and give out some much-deserved thanks. To begin, this image gives you a sense of how the thermal maps are made by integrating several streams of data from EnergyPlus:
(https://drive.google.com/file/d/0Bz2PwDvkjovJaTMtWDRHMExvLUk/view?usp=sharing)
To get you excited, this youtube playlist has a whole bunch of time-lapse thermal animations that a lot of you should enjoy:
https://www.youtube.com/playlist?list=PLruLh1AdY-Sj3ehUTSfKa1IHPSiuJU52A
To give a brief summary of what you are looking at in the playlist, there are two proposed designs for completely passive co-habitation spaces in New York and Los Angeles.
These diagrams explain the Los Angeles design:
(https://drive.google.com/file/d/0Bz2PwDvkjovJM0JkM0tLZ1kxUmc/view?usp=sharing)
And this video gives you an idea of how it thermally performs:
These diagrams explain the New York design:
(https://drive.google.com/file/d/0Bz2PwDvkjovJS1BZVVZiTWF4MXM/view?usp=sharing)
And this video shows you the thermal performance:
Now to credit all of the awesome people that have made the creation of these thermal maps possible:
1) As any HB user knows, the open source engines and libraries under the hood of HB are EnergyPlus and OpenStudio, and the incredible thermal richness of these maps would not have been possible without these DoE teams creating such a robust modeler, so a big credit is definitely due to them.
2) Many of the initial ideas for these thermal maps come from an MIT Masters thesis completed a few years ago by Amanda Webb, called "cMap". Even though these cMaps only took into account surface temperature from E+, it was the viewing of her radiant temperature maps that initially touched off the series of events that led to my thesis, so a great credit is due to her. You can find her thesis here (http://dspace.mit.edu/handle/1721.1/72870).
3) Since A. Webb's thesis, there were two key developments that made the high resolution of the current maps believable as a good approximation of the actual thermal environment of a building. The first is a PhD thesis by Alejandra Menchaca (also conducted here at MIT) that developed a computationally fast way of estimating sub-zone air temperature stratification. The method, which works simply by weighing the heat gain in a room against the incoming airflow, was validated by many CFD simulations over the course of Alejandra's thesis. You can find her final thesis document here (http://dspace.mit.edu/handle/1721.1/74907).
4) The other main development since A. Webb's thesis that made the radiant map much more accurate is a fast means of estimating the radiant temperature increase felt by an occupant sitting in the sun. This method was developed by some awesome scientists at the UC Berkeley Center for the Built Environment (CBE), including Tyler Hoyt, who has been particularly helpful to me by supporting the CBE's Github page. The original paper on this fast means of estimating the solar temperature delta can be found here (http://escholarship.org/uc/item/89m1h2dg), although they should have an official publication in a journal soon.
5) The ASHRAE comfort models under the hood of LB+HB are all derived from the javascript of the CBE comfort tool (http://smap.cbe.berkeley.edu/comforttool). A huge chunk of credit definitely goes to this group, and I encourage any other researchers who are getting deep into comfort to check out the code resources on their github page (https://github.com/CenterForTheBuiltEnvironment/comfort_tool).
6) And, last but not least, a huge share of credit is due to Mostapha and all members of the LB+HB community. It is because of the resources and help that Mostapha initially gave me that I learned how to code in the first place, and the knowledge of a community that would use the things I developed was, by far, the biggest motivation throughout this thesis and all of my LB efforts.
Thank you all and stay awesome,
-Chris…