itects are at the hub of a number of different specialties, and their work affects many different people. It's not as though an architect is a painter, whose work may offend or upset the occasional viewer. As an architect you have a responsibility to produce quality work. How can anybody trust you with this responsibility if you're taking a purely artistic approach? What guarantees do you have that your client's money won't be spent on a poorly designed project if you can provide no rationale for why your design is the way it is?
2. What is any sense in purely architectural discourse?
I don't get it. Discourse is there to flesh out problems and agree on solutions. It might not always accomplish that, but what's the difference between talking about architecture as opposed to any other topic?
3. strictly looked, can be determined sense generally in a purely architectural discourse?
I'm sorry I don't understand.
4. What is purely architectural discourse?
I imagine it's having a discussion where you only talk about architecture?
5. What is Funktionalismus or Rationalismus without philosophical support?
Functionalism and Rationalism are ideologies. Some would even call them methodologies. They are inherently philosophical things as they are nothing more than a collection of ideas and views. As a society we've decided that a certain level of rationalism is a good thing. The Enlightenment continued this trend after the Dark Age hiatus and it quickly led to a large number of very tangible benefits for almost everyone.
I'm not arguing for or against Functionalism as an architectural style. I'm asking for a measure of rationalism in our academic process.
6. Would not be the pure functional fulfilment empty ?
Let's find out. In the meantime I'll settle for a little functionalism.
7. Would be not a critical position on the promise of purely rational algorithms applied?
Algorithms and algorithmic design are rational in the sense that they do not allow for ambiguity. But that doesn't make them rational in the real-world sense; these are not the same kind of 'rational'. I can make an algorithm that produces total nonsense, but does so completely reliably. I can also use an algorithm in a setting for which it wasn't intended, thus invalidating the results.
This is actually the crux of the problem. Which algorithms does one use to solve a problem and what data do they require? If you can't answer this question or if you do not understand the algorithms you are using (at least on a superficial level) then I'd say you have no business using them.
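To make the "reliably wrong" point concrete, here is a small sketch (mine, not from the discussion): plain arithmetic averaging is a perfectly deterministic algorithm, but applied to compass bearings — a setting it wasn't intended for — it fails completely reliably:

```python
import math

def naive_mean(angles_deg):
    # Arithmetic mean: deterministic and reliable -- and wrong for bearings.
    return sum(angles_deg) / len(angles_deg)

def circular_mean(angles_deg):
    # A mean that respects the circular nature of compass bearings.
    x = sum(math.cos(math.radians(a)) for a in angles_deg)
    y = sum(math.sin(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(y, x))   # in (-180, 180]

bearings = [350.0, 10.0]        # two headings just either side of north
print(naive_mean(bearings))     # 180.0 -- due south: reliable nonsense
print(circular_mean(bearings))  # ~0.0  -- due north, as expected
```

Both functions are unambiguous and repeatable; only one is rational for the data at hand.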
--
David Rutten
david@mcneel.com
Tirol, Austria…
Added by David Rutten at 12:48pm on August 19, 2013
an almost planar tissue (your case) can cause a variety of issues, up to an unworkable state (metal parts/components grow in size as well for no reason). See the forces estimated by FF below.
2. Therefore I strongly suggest considering Plan B: (a) mastermind a secondary "anchor" capability in order to achieve a far more stable system; (b) use a mount design that can support this (and comply with your attractor concept). Here's a variable custom mount system (mostly machined AND not cast) that is suitable for the scope (Rhino reads the stp file OK .... but makes a colossally big file - thus I attach the original here).
3. At first sight lots of things in this system appear "odd". For instance: is it stable? Why are these double cables used? How far can it be adjusted? (that's a classic case for feature-driven parametric design - not doable with Rhino).
4. This concept (strut axis exported only) is tested in FORMFINDER and some other far more complex membrane apps that I use quite often (not RhinoMembrane). Here is what FF tells us:
Observe a different kind of "stress" when this is converted to a radial type:
5. If you insert the stp file into the Rhino file provided (exactly as exported from FORMFINDER - no mods of mine of any kind) you'll see what goes where (and why). That way the usage of double cables is rather obvious (and a lot of other things - for instance the way that the struts achieve "equilibrium"; see the slots in the base mount plate).
6. If this approach is worth considering, your definition requires some serious rethinking (far simpler/more manageable, with the drawback that the real parts are "static": they can adjust only as far as this particular solution allows them to - controlling them parametrically is clearly impossible with the current state of R/GH capabilities).
All in all: this case works because the cables push the anchor points downwards and the struts push them upwards.
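That last sentence is just vertical equilibrium at each anchor point. A hedged sketch with entirely made-up numbers (nothing taken from the actual FF output):

```python
import math

# Purely illustrative: one strut pushing an anchor point straight up,
# two symmetric cables pulling it down at 60 degrees from vertical.
# Equilibrium means the vertical components cancel out.
strut_push = 10.0                        # kN, upward (assumed value)
cable_tension = 10.0                     # kN per cable (assumed value)
angle_from_vertical = math.radians(60.0)

# strut push (up) minus the vertical parts of both cable pulls (down)
net_vertical = strut_push - 2 * cable_tension * math.cos(angle_from_vertical)
print(f"net vertical force: {net_vertical:.6f} kN")   # ~0 -> in equilibrium
```

If the cables were slackened (lower tension) or re-angled, the net force would no longer be zero and the anchor would move — which is exactly why the mount needs adjustability.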
more in a while
…
now.
This V4 can sense if you feed it your points and uses these instead of p1,p2,p3 (it's a prelude to V5, which uses DataTrees of points, making any surface subdivision a reality). Do the following: sample a triad of your points (NOT internalized) and feed the C#. Then ... start dragging these Rhino points around (the C# responds accordingly). See any difference?
The topology:
Well, the whole fractal logic (in this case) is to have 3 pts on hand (call them p1,p2,p3: red, green, blue), then project the "right" one, say p3, onto the Line (p1,p2) > do this > do that ... blah blah.
But ... what p3? That's the 1M question. Here, for instance, the right p3 (blue) is (by accident) the 3rd point entered (the "projection" recursive logic is obvious):
but if you drag the points around a bit, p3 is now different (the C# does this by synchronously sorting the triangle angles per point VS points). Numbers are used to indicate that "shift": (0 for the new p1, 1 for the new p2, 2 for the new p3 ... etc). Compare with the initial points (red = ex p1, green = ex p2, blue = ex p3).
and again different:
The 1M question:
In fractal thinking the big thing is when to stop: I could obviously control that with a counter ... but here the requirement is the minimum tile size (within an unpredictable number of recursions): this is what the stop logic used does.
The 1B question:
So ... implementing fractal logic (against DataTrees of points) in a parametric environment ... raises a lot of questions: because each time the size of the start triad varies ... whilst the stop condition is constant: meaning that with a little bit of "good" luck you can reach an incredibly high number of tiles (computer out of memory > Adios Amigos).
Obviously I'm taking all possibilities into account, especially big projects > big facades > millions (or zillions) of tiles > Armageddon > ....
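The recursion described above (project p3 onto the line p1-p2, recurse on the two halves, stop at a minimum tile size) can be sketched roughly like this in Python. The relabelling here simply picks the vertex opposite the longest edge — a simplified stand-in for the angle-sorting the C# does — and a hard tile budget guards against the out-of-memory scenario:

```python
import math

def longest_edge_split(tri):
    """Relabel so (p1, p2) is the longest edge and p3 the vertex opposite
    it (a simple stand-in for the C#'s angle-sorting step)."""
    a, b, c = tri
    opps = [((a, b), c), ((b, c), a), ((c, a), b)]
    (p1, p2), p3 = max(opps, key=lambda e: math.dist(*e[0]))
    return p1, p2, p3

def subdivide(tri, min_size, tiles, max_tiles=100_000):
    """Project p3 onto the line (p1, p2), recurse on the two halves.
    The stop rule is the tile size -- plus a hard tile budget, because a
    constant stop size against a varying start size can blow up memory."""
    if len(tiles) >= max_tiles:
        raise MemoryError("tile budget exhausted")
    p1, p2, p3 = longest_edge_split(tri)
    if math.dist(p1, p2) < min_size:
        tiles.append(tri)
        return
    # foot of the perpendicular from p3 onto segment (p1, p2)
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = ((p3[0] - p1[0]) * dx + (p3[1] - p1[1]) * dy) / (dx * dx + dy * dy)
    f = (p1[0] + t * dx, p1[1] + t * dy)
    subdivide((p1, f, p3), min_size, tiles, max_tiles)
    subdivide((f, p2, p3), min_size, tiles, max_tiles)

tiles = []
subdivide(((0.0, 0.0), (4.0, 0.0), (1.0, 2.0)), 0.5, tiles)
print(len(tiles), "tiles")
```

Halve `min_size` and the tile count roughly quadruples — which is exactly how a "constant stop, varying start" setup runs a big facade out of memory.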
more soon
…
ng in Grasshopper?
As a general recommendation for developers in Grasshopper who are writing a performance-sensitive part of their library (please note: often the performance-sensitive part is very limited): write it in C#, or maybe even C, or maybe even assembly :). Of course, the closer to the machine you are, the easier it is to harness all the small optimizations. However, there is always a compromise between "getting things done" and "making them best", and this boundary is not very easy to pin down, right?
If you want significant speed improvements for numerical calculations, I would at least recommend developing in C# in a compiled component using Visual Studio or SharpDevelop. The reason is: in order to provide the line number of possible errors, Grasshopper compiles C# scripts in debug mode! They will be much less optimized than what is possible even with today's technology. This does not preclude keeping the project open-source, if that is one of your goals.
Regarding the actual list:
1) Yes, the implied loop will probably be slower than just a simple for loop. This is because Grasshopper code has to keep track of more things than the ones you could be considering with your knowledge of your very special case. However, a factor of 10 is simply not acceptable and is likely a symptom of something else. In fact, I think I remember fixing a bug around that in the Rhino WIP. However, it appears to be still slower there as well. I've added a bugtracking item here.
2) If you are able to do all the casts that are involved, and do them as Grasshopper does, please write your code that way. For example, if you supply a curve to an input with a number hint, Grasshopper computes the length of the curve. There has to be an "if" somewhere that checks whether the input is a curve (or some similar construct). This aid for designers is what slows down hinted input.
3) Grasshopper has to keep side effects at bay. For example, components B and C are both connected to outputs of A. If you edit data in component B, and that data came from A, you of course expect that data to be unchanged in C. This means that, even for lists of numbers, Grasshopper has to perform a deep copy of the output for each input. Otherwise, what happens if B sorts the list and C finds the index of the smallest number? This could be improved if GH components had some way of flagging themselves as non-data-mutating (constant). The fact that, by supplying special types, Grasshopper has no way of performing copies will likely speed things up. But be aware of possibly very annoying side effects creeping in if data is not immutable. Another option is performing the copy "optimally", just where you need it, because you know where your data is used. This information is not available to GH at present.
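A minimal Python sketch of why that deep copy matters (illustrative only — this is not Grasshopper's actual code, just the sort-vs-index scenario from point 3):

```python
import copy

# Output of component A, consumed by both B and C:
data_from_A = [5, 2, 9, 1]

# Without a copy, B's sort would mutate the very list C receives,
# so C's "index of the smallest number" would be silently wrong.

# With per-input deep copies, B and C each work on their own data:
for_B = copy.deepcopy(data_from_A)
for_C = copy.deepcopy(data_from_A)
for_B.sort()                          # B sorts its own copy
idx = for_C.index(min(for_C))         # C still sees the unsorted data
print(for_B, idx)  # [1, 2, 5, 9] 3
```

A component that could declare itself non-mutating (constant) would let the framework skip the copy safely — which is exactly the improvement suggested above.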
Does this help?
Thanks again for your input,
Giulio
--
Giulio Piacentino
for Robert McNeel & Associates
giulio@mcneel.com…
onents to the latest version and, as you can see, everything works fine:
Over the next week, I am going to be adding several new capabilities to the Adaptive model in LB+HB that are not an official part of the ASHRAE or ISO standards but are endorsed by the experts and researchers who helped build those standards. Mostapha, I will be sure to have the component give a comment any time these un-standardized methods are used, and I will be clear that I have made them a part of LB because I have found these insights from new research to be particularly helpful to design processes for passive architecture. Also, I think many of us recognize that both ASHRAE and ISO were initially founded to produce standards for conditioned or refrigerated spaces. Among the features that I will be adding in:
1) You will have the option of using either the American ASHRAE adaptive model or the ISO EN-15251 model (see the CBE's comfort tool for a visual of the differences - http://comfort.cbe.berkeley.edu/).
2) In addition to a different comfort polygon, the European standard also uses a "running mean" outdoor temperature instead of the average monthly outdoor temperature. This "running mean" is computed by looking at the average temperatures over the last week, weighting each daily average temperature by how recent it is. This makes more sense to me than the ASHRAE method and addresses the issue that you bring up, Alejandro. Needless to say, the updated adaptive model will allow you to use either a running mean or average monthly temperature with either the American or European polygon.
3) The WIP adaptive chart currently has an option for a "levelOfConditioning". This input allows you to make use of research that was conducted alongside the initial development of the adaptive model, which showed that the findings did not contradict the PMV model when people were surveyed in fully conditioned buildings. This parallel research ended up producing a different correlation between outdoor and desired indoor temperatures, with a much shallower slope than the official adaptive model for fully naturally ventilated buildings. The levelOfConditioning input allows you to make a custom correlation for full natural ventilation, full conditioning or (presumably) somewhere in between for a mixed-mode building. This levelOfConditioning will become an official input for all LB components using the adaptive model (not just the chart, as at the moment).
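For reference, the "running mean" in (2) can be sketched as follows. This uses the approximate seven-day weighted form commonly quoted for EN 15251 (alpha = 0.8); treat the exact weights as an assumption to check against the standard rather than a definitive implementation:

```python
def running_mean_outdoor_temp(daily_means):
    """Exponentially weighted running mean of outdoor temperature.
    Approximate seven-day form often quoted for EN 15251 (alpha = 0.8);
    daily_means is ordered oldest to newest, so daily_means[-1] is
    yesterday's average."""
    weights = [1.0, 0.8, 0.6, 0.5, 0.4, 0.3, 0.2]   # most recent day first
    last7 = daily_means[::-1][:7]                    # yesterday backwards
    return sum(w * t for w, t in zip(weights, last7)) / sum(weights)

# A made-up week of daily average temperatures (degC), oldest to newest:
week = [18.0, 19.0, 21.0, 22.0, 20.0, 23.0, 24.0]
print(round(running_mean_outdoor_temp(week), 2))
```

Because recent days dominate, a warm spell yesterday pulls the running mean well above the simple weekly average — the behaviour that makes it respond faster than a monthly mean.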
At the end of all of this, I will put together a new video series on Adaptive comfort so that we are all on the same page about how to use the model.
-Chris…
Send Feedback
Defines enumerated values for all implemented corner styles in curve offsets.
Namespace: Rhino.Geometry
Assembly: RhinoCommon (in RhinoCommon.dll) Version: 5.1.30000.12 (5.0.20693.0)
Syntax
C#
public enum CurveOffsetCornerStyle
Visual Basic
Public Enumeration CurveOffsetCornerStyle
Members
Member name | Value | Description
None        | 0     | The default value.
Sharp       | 1     | Offsets and extends curves with a straight line until they intersect.
Round       | 2     | Offsets and fillets curves with an arc of radius equal to the offset distance.
Smooth      | 3     | Offsets and connects curves with a smooth (G1 continuity) curve.
Chamfer     | 4     | Offsets and connects curves with a straight line between their endpoints.
…
... er ... hmm ... I would strongly suggest Plan B:
How to get the gist of C# in just 123 (+1) easy steps (I've already posted that 3-4 times if memory serves well):
Step 0: get rid of the computer (press the OFF button), buy some cigars:
Step 1: get the cookies
The bible PlanA: C# In depth (Jon Skeet).
The bible PlanB: C# Step by step (John Sharp).
The bible PlanC: C# 5.0/6.0 (J/B Albahari) > my favorite
The reference: C# Language specs ECMA-334
The candidates:
C# Fundamentals (Nakov/Kolev & Co)
C# Head First (Stellman/Greene)
C# Language (Jones)
Step 2: read the cookies (computer OFF)
Step 3: re-read the cookies (computer OFF)
...
Step 120: re-read the cookies (computer OFF)
Step 121: turn ON computer
Step 122: do something
Step 123: shut down computer permanently, forget all that
May The Force (the Dark Option) be with you.…
reaky thing consisting of triangulated "modules" (i.e. an assembly out of this, this and that) where the exterior edges ARE always under tension (= SS 304/316 cables OR nylon) and the interior ones MAY be under compression (= steel, aluminum, wood, carbon) OR ... some of them may be under tension. Bastardized T trusses deviate a bit from theory ... but who cares? (not me, anyway). T trusses have many variants (but as the greatest ever said: Less is More).
2. Large scale T for AEC is the art of the pointless, since it costs around the GNP of Nigeria. Here are some indicative components from a module of a multi-adjustable TX system costing (the module) ~ the price of my Panigale (Google that):
The above was mailed to a friend who has MIT (yes, that MIT: the top dog) in sight ... therefore he needs some appropriate "credentials", he he.
3. The distance that separates the above from the demo TDT node provided is around 666.666 miles - but we don't care: we are after Art, not some testimony to vanity.
4. On purpose I've used a smallish ring to give you a clear indication of constraint numero uno in truss design: CLASH matters.
5. You'll need:
(a) A decision related with the tensioners (classic Norseman + SS cables or nylon machined thingies?).
(b) A machinist who can do elementary stuff (like the adapters) and can weld this to that (the "ring" for instance). His abilities must be 1 in a scale of 100. If the fella has a computer (not a CRAY) and he knows what 3dPDF is (hmm) ... well ... use that way to communicate with him PRIOR to designing anything: he must agree on the parts BEFORE the whole is attempted (as a design in GH or in some other app).
(c) A carpenter with a wood lathe for the obvious. BTW: BEFORE doing any TDT attempt > ask the carpenter about the available wood strut sizes. Contrary to popular belief, DO NOT varnish the wood (use exterior alkyd/oil stains from some top maker like the notorious US company PPG).
http://www.ppgpaints.com/products/paints-stains-data-sheets
(d) Good quality cigars (and espresso) plus some classic music (ZZTop, PFloyd, Cure, Stones, U2 etc etc) during the assembly.
(e) Faith to the Dark Side (see my avatar).
May the Force (the dark option) be with you.…
l, you can find examples of parametric design using LB/HB, specifically the HB component pollinator workflows.
In these examples, a GH component (data recorder) is used to locally store either input parameters or output values of different model configurations and transmit them to pollinator. I can imagine, depending on how your facade is made parametric in GH, that you could save those input parameters (e.g. angle of surfaces or height of extrusion) and output variables for each iteration (e.g. annual shading).
This is a search process through the design space. I do think that if you set up the model this way, then it would be OK that the components in the PV workflow reset after each iteration, as the results would be saved. There is even a really good visualization platform Mostapha has shared to go along with pollinator.
You can find examples of these workflows in the forum; simply search for pollinator. I have one that I shared somewhere as well; although it was doing rudimentary things, it should help.
This design-space approach is a bit different from the optimization approach utilizing components like Galapagos. It gives you an idea of the space of possible different designs and allows you to compare alternatives. Plus, it usually allows me to avoid all these issues of losing results between components in the workflow.
I also find it very handy and much more efficient than simply letting a component optimize everything for me. However, the run time can increase almost exponentially (or is it geometrically? I am always bad at this) with the range and number of your input parameters. So, if each square on the wall has more than a couple of input values for a few input parameters, I would expect this to take a long time. Thankfully, the components in the workflow will let you know exactly how many iterations to expect.
If this method is interesting to you and you follow it, I would suggest a few things to hasten the process, like utilizing only the squares above and to the sides of the PV panel (since the others won't really affect shading), selecting just 2 or 3 characteristic angles for extrusions, and perhaps approximating energy production through annual shading numbers (since I imagine they have an almost linear relationship).
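The design-space idea above can be sketched outside GH as a plain grid search. The response function here is entirely made up, standing in for the LB/HB simulation; the point is that inputs and outputs are recorded per iteration and the grid size is known in advance:

```python
from itertools import product

def annual_shading(angle, extrusion):
    """Stand-in for the simulation -- in GH this would be the LB/HB
    result. Purely made-up response surface for illustration."""
    return 100.0 - 0.5 * angle - 8.0 * extrusion

# A few characteristic values per parameter keeps the grid small:
angles = [0, 30, 60]          # degrees (assumed candidate values)
extrusions = [0.2, 0.4]       # metres (assumed candidate values)

# The "data recorder" idea: store inputs + outputs for every iteration
records = [
    {"angle": a, "extrusion": e, "shading": annual_shading(a, e)}
    for a, e in product(angles, extrusions)
]
print(len(records), "iterations")     # 3 x 2 = 6, known before running
best = min(records, key=lambda r: r["shading"])
print(best)
```

Each extra parameter multiplies the iteration count, which is the "almost exponential" growth mentioned above — hence the advice to prune squares and pick only a few characteristic angles.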
I do hope that I have understood what you want to do and that the above information helps. I'm sure Djordje will give much better feedback on the specifics of the PV workflow. I will try to keep this page saved so that I can send over the example once I'm back at work in the middle of next week.
Good luck!
Kind regards,
Theodore.
…
st variety of papers (mostly related to LIDAR airborne-sampled clouds) ... but ... hmm ... no code (other than some "abstract" algos that may (or may not) work). Reason? That's a very hot cake these days: from reverse engineering to DARPA-funded future defense systems and up to cruise-missile pattern recognition algos.
The solution (obviously doable only via code) is so-called flat hard clustering ... where points are sampled into clusters based on the coplanarity "rule". For large amounts, recursive octree subdivisions (an oriented box divided into 8 "partitions") are used, and then pts are processed in parallel (and then clusters are re-evaluated in order to "absorb" other clusters with the same plane A,B,C,D vars, etc etc).
See what's happening in a very carefully made test point collection:
3.7 ms and the "ideal" clustering (7 search loops VS the theoretical max threshold of 42M):
Depending on the pts' "preparation" ... considerably more time/search loops are required ... and ... well ... also "valid" clusters (4 points and up) are made:
So "ideally" speaking in your case:
1. Mesh face center points (or alternatively: mesh vertices) are sampled into a pts collection.
2. Hard flat coPlanarity clustering is attempted yielding pts/planes in equivalent DataTrees.
3. Planar Breps are made with respect to the planes (like the black things captured above) and sampled, say, into a breps List.
4. The method Brep[] solids = Brep.CreateSolid(breps); is used to attempt to create your desired "engulfing" brep. Mind you, this method is very slow (other waaaay faster approaches are also available).
…