switch entirely to code for the K2 part.
1. When you have some idea in mind you MUST always have a Plan B (see the 2-stage definition as in the other one [bridging]). In fact achieving a "uniform" result is a bit tricky via K (and maybe, just maybe, Plan A is a "better" approach). That said, NOT all the K2 thingies required are used in this V1.
2. Plan A (stop before K2) can yield some more suitable "conical" "shapes" with an indeterminate Loft profile (NOT included: a challenge for you, he he) like the one used in the bridging example. Notice that Plan A is real-time (and when doing parametric stuff that's goal Numero Uno).
3. Authenticity of the soap opera series is assured since there's only sardines provided as instance definitions.
…
division Points) is the first, the intermediate or the last, and then they add the vectors (v1 for the first adjacent face, v2 for the other) depending on the even/odd index. As you may know: point + vector = a new "shifted" point. Flipping the v2 vector (the last boolean in the C#) allows you to visualize the angle between the 2 adjacent faces (per edge).
To tell you the truth ... for me ... well .. the above few lines are far easier (and way faster) to write than the "equivalent" thing with components. Blame the Dark Side I guess (and ~500+K lines of code).
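For anyone who wants to picture the even/odd shift without the C#, here's a minimal stand-in in plain Python (simple 3-tuples instead of RhinoCommon types; `v1`/`v2` are the per-face offset vectors and `shift_division_points` is just an illustrative name):

```python
# Sketch of the even/odd shift idea: each division point gets v1 or v2
# depending on its index; point + vector = a new "shifted" point.

def add(p, v):
    return (p[0] + v[0], p[1] + v[1], p[2] + v[2])

def shift_division_points(points, v1, v2, flip_v2=False):
    """Offset alternate division points toward each adjacent face."""
    if flip_v2:
        v2 = (-v2[0], -v2[1], -v2[2])   # the "last boolean" trick
    out = []
    for i, p in enumerate(points):
        v = v1 if i % 2 == 0 else v2    # even index -> first face, odd -> other
        out.append(add(p, v))
    return out

pts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
shifted = shift_division_points(pts, (0.0, 0.0, 1.0), (0.0, 1.0, 0.0))
print(shifted)  # even points move in z, odd points move in y
```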
Jokes apart:
Thickening a given freaky crenelated Brep Face is not that easy: a tooth needs some space for the glue AND the sides need to be "tapered" in order to fit each other perfectly. Meaning: an offset for the crenelated closed Curve AND another crenelated Curve (at the bottom). Then you join the Lofted Brep between the 2 Curves with the 2 planar Breps (top/bottom).
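If it helps to picture the "tapered sides" part, here's a toy sketch in plain Python (a square stands in for the crenelated curve, the 0.9 factor is arbitrary, and scaling toward the centroid is only a crude stand-in for a proper curve offset):

```python
# Crude sketch: derive the bottom profile by pulling the top profile
# toward its centroid, so the lofted sides lean inward and the teeth
# can nest. Plain 2D tuples, no Rhino types involved.

def centroid(poly):
    n = len(poly)
    return (sum(p[0] for p in poly) / n, sum(p[1] for p in poly) / n)

def taper(poly, factor):
    """Scale a closed 2D profile toward its centroid (0 < factor < 1)."""
    cx, cy = centroid(poly)
    return [(cx + (x - cx) * factor, cy + (y - cy) * factor) for x, y in poly]

top = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]  # stand-in profile
bottom = taper(top, 0.9)                                # lower loft curve
print(bottom)
```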
Additionally: well ... try to imagine "inserting" pieces into an already assembled combo > glue would do what glue does AND if the vault is made, say, from plywood ... well ... an aesthetic disaster is assured.
If the plywood is treated with some proper penetrating oil (AVOID varnish AT any cost) ... then the oil would "contaminate" the sides and glue won't work.
Other than that, I predict lots of broken teeth (in the "sensitive" areas), not to mention broken nerves.
Here's me admiring a "similar" experimental vault that I made a million years ago:
…
pper-elk-and-openstreetmap/
This has been mostly very successful, but I have a few questions/suggestions:
1) Is it possible to extract relations? For example, the York Museum Gardens are described by a relation, and at the moment aren't detected by Elk. If not, is this something you have in the pipeline?
2) Is there any way to get building height data? (i.e. for all polylines with "building", return the value from the "height" tag, if available) I have a workaround that involves using GenericOSM with K = "height", though this will return false positives if the OSM file contains any non-building objects with the height tag.
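For what it's worth, the false-positive problem can be avoided by filtering on both tags at once. A rough sketch of the idea in plain Python (dicts stand in for OSM elements' tags; this is not Elk's or GenericOSM's actual API):

```python
# Only return a height when the element is actually tagged as a building,
# so non-building objects carrying a "height" tag are skipped.

elements = [
    {"building": "yes", "height": "12.5"},
    {"man_made": "mast", "height": "30"},   # would be a false positive
    {"building": "yes"},                    # building without height data
]

def building_heights(elems):
    """Heights (as floats) for elements that have BOTH tags."""
    return [float(e["height"]) for e in elems
            if "building" in e and "height" in e]

print(building_heights(elements))  # [12.5]
```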
Thanks in advance for your help,
James…
ruses could follow. Then cones are made and some other things. You can move the cones around via the equivalent slider. If the cones "touch" then ... well ... test it and see what's happening.
2. Interactive capability is not present: assume that you have 666 paths/cones > by what means do you think you could control what's happening? By adding 666 sliders? (not in a million years).
3. Rhino is amusing with regard to the solid union Method. Depending on Karma you can get this:
or that :
4. Leaving aside N3 ... if the real-time response goes AWOL with just 8 cones, what would the situation be if you add 666 cones? This is the reason for using K to solve this ... obviously with "some" compromises, yielding "vault" stuff like this:
or like that (an Alien billiard (C)(Tm)(US patent pending) for planet Zorg):
Moral: stick to the Soap_opera approach.…
he normal vector (per quad/tri set).
Random means that the vector length can vary between a desired min/max and sin distortion means ... hmm ... something "equivalent".
Clash detection (and topology validation) means that a variety of trigonometry-based checks are performed in order to ensure that tubes and the other paraphernalia stay clear of each other AND the drilling axes in the balls are doable. But this is only half the bacon: roofing material (say VMZINC and the likes) usually imposes several restrictions (for instance: you can't "bend" a VM sheet beyond a given angle etc etc).
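One of those trigonometry checks could be sketched like this (plain Python; the 25-degree limit is an arbitrary placeholder, NOT a real VMZINC spec value):

```python
# Reject a node if the angle between two adjacent tube axes exceeds
# what the roofing sheet can take.
import math

MAX_BEND_DEG = 25.0  # placeholder limit, not from any real spec sheet

def angle_deg(v1, v2):
    """Angle between two 3D vectors, in degrees."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def bend_ok(axis_a, axis_b):
    return angle_deg(axis_a, axis_b) <= MAX_BEND_DEG

print(bend_ok((1, 0, 0), (1, 0.2, 0)))  # ~11.3 deg -> True
print(bend_ok((1, 0, 0), (0, 1, 0)))    # 90 deg -> False
```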
Then coordinate systems (Planes) are made that are used in classic Plane-to-Plane transformations: i.e. take an instance definition related to a part of the system (sleeve, cone etc etc), always defined at WorldXY, and put it in the desired plane. This has severe limitations in Rhino (actually it's undoable and thus I very rarely do this type of work in Rhino/GH). What limitations, you may ask? Well ... imagine fully parametric feature-driven solids (the nuts and the bits of the system) that MUST comply (NOT yielding some clash, that is) with the whole topology OR they must change OR the topology must change: stuff for CATIA and the likes.
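The Plane-to-Plane part itself is just a change of basis. A minimal stand-in in plain Python (not Rhino's actual orient transform, just the arithmetic behind it, assuming the target plane's axes are orthonormal):

```python
# Re-express a point modelled at WorldXY in a target plane's frame:
# origin + x*xaxis + y*yaxis + z*zaxis.

def orient(point, origin, xaxis, yaxis, zaxis):
    """Map a WorldXY point into the target plane (orthonormal axes assumed)."""
    px, py, pz = point
    return tuple(origin[i] + px * xaxis[i] + py * yaxis[i] + pz * zaxis[i]
                 for i in range(3))

# Target plane: origin (10, 0, 0), rotated 90 degrees about Z
o, x, y, z = (10, 0, 0), (0, 1, 0), (-1, 0, 0), (0, 0, 1)
print(orient((2, 0, 0), o, x, y, z))  # (10, 2, 0)
```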
Then the structural department evaluates the whole thingy and gives the OK - if not another variant is attempted etc etc.
Then connectivity trees are made that relate nodes to edges etc etc ... in order to assemble the thing in real-life.
Then it's roof/envelope time ... and that is a pure nightmare.
BTW: having some NVidia Kepler K4200 (and up) is a must ... otherwise ... well ... dealing with 20++K "objects" is not that trivial. NEVER attempt to do that type of stuff using, say, a laptop and the likes.…
teger that is smaller than or equal to x.
floor(0/2) = floor(0) = 0
floor(1/2) = floor(0.5) = 0
floor(2/2) = floor(1) = 1
floor(3/2) = floor(1.5) = 1
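The same table in Python, where `//` is floor division and matches `math.floor(a / b)` for positive divisors:

```python
import math

# 0//2, 1//2, 2//2, 3//2, ... pairs of indices collapse to one value
halves = [n // 2 for n in range(6)]
print(halves)                        # [0, 0, 1, 1, 2, 2]
print(math.floor(3 / 2) == 3 // 2)  # True
```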
ahhhhh ... so if you make a path {0;k} out of paths {0;i}, {0;j}, ... the index will be sorted automatically 0,1,2,... hmmmm ...
Is that "it" about the path mapper? ... What other typical functions are used for this tool? ... Thank you man!
EDIT:
Ahhh ... Unless I use (i) in the mapping (This is the Mapping: {A;(B\2)+1}) I will not need this index (i) at all ... Got it man ...
The BIG question is:
Is this "Integer division" the one and only operation, that a path mapper does?!
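To make the {A;(B\2)+1} behaviour concrete, here's a toy simulation in plain Python (NOT the real Path Mapper, just its integer arithmetic applied to dict keys standing in for branch paths):

```python
# Items in branches {A;B} are regrouped under {A;(B\2)+1}, so pairs of
# source branches merge into one target branch.

def remap(tree):
    out = {}
    for (a, b), items in sorted(tree.items()):
        out.setdefault((a, (b // 2) + 1), []).extend(items)
    return out

tree = {(0, 0): ["a"], (0, 1): ["b"], (0, 2): ["c"], (0, 3): ["d"]}
print(remap(tree))  # {(0, 1): ['a', 'b'], (0, 2): ['c', 'd']}
```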
…
pattern, where pairs of points on a piece of fabric are stitched together (1 to 2, 2 to 3, etc...)
The result is a bas-relief puckered surface, as seen below
My question is, does anyone have thoughts about components that would simulate the material qualities/behaviors of fabric via a mesh or surface in GH? I have used K2 to constrain curve length, which is similar to what I want to do; however, I need to constrain the dimensions of a mesh or surface (fabric) instead. I would also need to be able to dictate which direction the pinched/gathered fabric would go (up vs down).
I don't need any assistance in creating patterns or points for puckered iterations; I have plenty of ideas regarding that. I also don't need to get into minute material quality differences (trying to simulate how silk vs wool puckers). I just want to use the definition to give me a general understanding (pucker shape, direction) of how a 2D surface or mesh "fabric" can be transformed by "stitching" points together. I would be grateful for some component/plugin suggestions. I am exploring MeshEdit, Mesh+, K2, and Toolbox at the moment.
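As a toy illustration of the stitching idea (plain Python, not Kangaroo and not its API): edge springs resist length change while a zero-rest-length "stitch" spring pulls two chosen points together, which is essentially the goal-based relaxation K2 performs on a mesh.

```python
# Simple spring relaxation on a strip of points: neighbour edges try to
# keep their rest length, the stitch pulls its two points to distance 0.

def relax(pts, edges, stitch, rest, steps=200, k=0.1):
    pts = [list(p) for p in pts]
    for _ in range(steps):
        for i, j in edges + [stitch]:
            goal = 0.0 if (i, j) == stitch else rest
            dx = pts[j][0] - pts[i][0]
            dy = pts[j][1] - pts[i][1]
            d = (dx * dx + dy * dy) ** 0.5 or 1e-9
            f = k * (d - goal) / d          # attract if too long, repel if short
            pts[i][0] += f * dx; pts[i][1] += f * dy
            pts[j][0] -= f * dx; pts[j][1] -= f * dy
    return [tuple(p) for p in pts]

strip = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
out = relax(strip, edges=[(0, 1), (1, 2), (2, 3)], stitch=(0, 3), rest=1.0)
gap = abs(out[3][0] - out[0][0])  # stitched ends pulled well below 3.0
print(gap)
```

In K2 the equivalent pieces would be length goals on the mesh edges plus a collider/attractor between the stitched vertices; the up-vs-down direction would come from an extra out-of-plane bias force.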
Many thanks -
-K
…
erature is measured, i.e. tempMeasureHeight on the Dragonfly_Reference EPW Parameters component.
As for calculating the Reference height: where are you getting the "common sizing rule of 'twice the top building height'"? I can't find that rule anywhere... I think keeping the default value of 150m, as the component advises, would be better.
Something that helped me a lot, and I think will be of interest to you, is this thesis: https://dspace.mit.edu/handle/1721.1/99251?show=full. Specifically, chapter 4 has a set of sensitivity analyses for UWG input parameters, which shows you the % impact of varying the various parameters in terms of heating/cooling energy and temperature change. In general, it found that varying the meteorological parameters (like the boundary layer inputs) deviated simulated temperature by less than 0.5 K, for less than 43 hours of the year, from the original empirical measurement. Which is why the study concludes it's safe to keep the meteorological inputs at their default values.
An important caveat here is that the Reference height is a slight exception to this, as it had an effect on Boston temperatures beyond that limit (pg 43). That being said, I spent some time trying to figure out how the 150m value was calculated, and as far as I can tell, it's based on an atmospheric simulation model referenced in Bueno's thesis. So unless you have access to your own vertical temperature dataset, I think keeping the meteorological values as they are is a good rule of thumb.
S…