roduct operator
<d> becomes the degrees operator
<deg> becomes the degrees operator
<degrees> becomes the degrees operator
<2> becomes the square operator
<square> becomes the square operator
<3> becomes the cube operator
<cube> becomes the cube operator
<project> becomes the Pull operator
<pull> becomes the Pull operator
<push> becomes the Push operator
<dist> becomes the Distance operator
<distance> becomes the Distance operator
!= becomes the Inequality operator
<> becomes the Inequality operator
<almost> becomes the Near Equality operator
<almostequal> becomes the Near Equality operator
<approx> becomes the Near Equality operator
<approximately> becomes the Near Equality operator
<similar> becomes the Near Equality operator
~ becomes the Near Equality operator
Functions:
<sum> becomes the Summation function
<sumtotal> becomes the Summation function
<summate> becomes the Summation function
<prod> becomes the Multiplication function (a very woody sort of operator)
<product> becomes the Multiplication function
<average> becomes the Average function
<mean> becomes the Average function
<hmean> becomes the Harmonic Mean function
<harmonicmean> becomes the Harmonic Mean function
<gmean> becomes the Geometric Mean function
<geometric> becomes the Geometric Mean function
<geometricmean> becomes the Geometric Mean function
Constants:
<pi> becomes the PI symbol
<phi> becomes the PHI (golden ratio) symbol
<inf> becomes positive infinity
<infinity> becomes positive infinity
<1/2> becomes the one-over-two symbol
<0.5> becomes the one-over-two symbol
<half> becomes the one-over-two symbol
<1/4> becomes the one-over-four symbol
<0.25> becomes the one-over-four symbol
<quarter> becomes the one-over-four symbol
<3/4> becomes the three-over-four symbol
<0.75> becomes the three-over-four symbol
<threequarters> becomes the three-over-four symbol
<1/3> becomes the one-over-three symbol
<third> becomes the one-over-three symbol
<2/3> becomes the two-over-three symbol
<twothirds> becomes the two-over-three symbol
--
David Rutten
david@mcneel.com
Poprad, Slovakia…
Added by David Rutten at 3:01am on October 3, 2010
one bug on our end.
First, you set the natural ventilation setpoint to be only one degree greater than the heating setpoint. This frequently causes the building to expend heating energy reaching the setpoint only to throw away this heat a few minutes later when the indoor temperature rises by one degree. Generally, I would leave at least 3 degrees between these two setpoints unless you are using some special types of schedules that ensure this conflict does not happen.
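The deadband rule above is easy to encode as a sanity check before exporting a model; this is a hypothetical helper of my own, not part of Honeybee:

```python
def check_setpoints(heating_setpoint, nat_vent_setpoint, min_deadband=3.0):
    """Return True if the natural ventilation setpoint leaves enough of a
    deadband above the heating setpoint to avoid heat-then-vent cycling."""
    return nat_vent_setpoint - heating_setpoint >= min_deadband

# The situation described above: only 1 degree C of separation.
print(check_setpoints(20.0, 21.0))   # False - too tight
print(check_setpoints(20.0, 23.0))   # True - 3 degrees of deadband
```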
Second, I do not know what you were trying to do with so many solve adjacency components, but when you hook up only one zone to this component and set "removeAdjacencies" to true, you will overwrite all of the previous boundary conditions that you set in the surface-by-surface method. These were the boundary conditions that you were sending into EnergyPlus in the file you uploaded:
As you can see, you were blowing your heating load out of proportion by getting rid of your originally-set adiabatic walls. In the attached GH file, I use just one solve adjacencies component and these are your resulting boundary conditions:
Finally, there was a bug in the function that checks the normal direction of surfaces input to the energy model and Mostapha just fixed this one this past week (https://github.com/mostaphaRoudsari/Honeybee/issues/365). This was causing the solar gain calculation to be incorrect in your model, which, admittedly, is the major reason why the heating loads between the north and south facades were not different. As you see in the attached GH file, you now get very different heating loads for the north and south facades:
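For anyone curious, the geometric test behind a normal-direction check like that is small; this is just the idea in miniature (my own sketch, not Honeybee's actual code):

```python
def points_outward(face_center, face_normal, zone_centroid):
    """True if the face normal points away from the zone centroid,
    i.e. the surface faces outward as EnergyPlus expects."""
    to_face = [c - z for c, z in zip(face_center, zone_centroid)]
    dot = sum(n * t for n, t in zip(face_normal, to_face))
    return dot > 0.0

# A wall at x = 5 of a zone centred at the origin should face +x:
print(points_outward((5, 0, 0), (1, 0, 0), (0, 0, 0)))   # True
print(points_outward((5, 0, 0), (-1, 0, 0), (0, 0, 0)))  # False
```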
I hope that all of this helped and, again, sorry for the delay.
-Chris…
3dParameter("Points", "P", "description", GH_ParamAccess.item);
pManager.AddCurveParameter("Curves", "C", "description", GH_ParamAccess.item);
Register Output:
pManager.AddPoint3dParameter("Nodes", "N", "description", GH_ParamAccess.item);
pManager.AddPoint3dParameter("Tails", "T", "description", GH_ParamAccess.item);
Tails are the NurbsCurve start points, and that is where I am getting 3 points instead of the expected 2 points.
Many Thanks!
…
to do once I figured out how you use only a small portion of each of my generated curves to make the 360 degree Loft surface. I had a huge AHA! moment when I realized the complete Loft surface really only needs a small portion of the generated curves rotated around to form a closed (except for top & bottom) surface. That is a major new insight for me and I appreciate you pointing it out.
I also tweaked the Twist angle parameter a bit so the resulting positive and negative Twist surfaces, when combined, yielded a result that was closer to my original shape. This is when I discovered something very interesting.
When I baked/exported the result using just one of the 2 twisted surfaces I got an STL file that had no errors, that 3D Builder was able to simplify from a 37 MB file to a 3 MB file, and that sliced A-OK. But, when I combined the left and right twisted surfaces, I was back with my same set of problems: the exported STL file had many errors, could not be fixed, and did not slice properly.
I went back to my original layout that uses the complete set of generated curves to create the Loft surface and found I got exactly the same results - using only one twisted surface worked fine, but nothing worked when the left and right twisted surfaces were combined. By nothing I mean I tried all the standard methods (GH Join and SUnion, Rhino Solid/Union, Join, etc.). What I think this means is that the Loft surface behaves the same, and apparently is the same, regardless of whether it is generated by rotating strips or by using complete closed curves.
Furthermore, I am guessing the problems with the combined/exported STL file made from both left and right twisted surfaces have to do with overlapping/coincident parts of each one - like the top & bottom planar surfaces and some of the wiggly parts.
If I am correct about this then it suggests to me that there is some sort of glitch in Rhino's STL Export function. This is surprising to me since I thought an STL file only paid attention to the external shape of things, and did not know or care about any inside stuff. Of course this is all conjecture on my part, but at least for now it seems it will be impossible for me to actually print the double-twisted geometry.…
Added by Birk Binnard at 3:52pm on September 23, 2016
be in your definition is impossible.
The 2 (or 3) sides of the cube's corner pieces can never be the same color (see image:)
because they are actually part of the same piece (they always move together).
So in your definition it is as if you have removed the stickers from the cube and replaced them randomly, which results in an unsolvable cube...
In order to start with a properly scrambled cube, I believe you could start with a solved cube and perform a large number of random rotations on it (just like you would in real life).
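A scramble of that sort is simple to generate programmatically; a minimal sketch (plain Python, outside GH, using the usual face-turn notation):

```python
import random

FACES = "UDLRFB"            # the six faces of the cube
MODIFIERS = ["", "'", "2"]  # quarter turn, counter-turn, half turn

def random_scramble(n=25, rng=random):
    """Generate n random face turns, never turning the same face twice
    in a row (two consecutive turns of one face collapse into one)."""
    moves, last = [], None
    while len(moves) < n:
        face = rng.choice(FACES)
        if face == last:
            continue
        moves.append(face + rng.choice(MODIFIERS))
        last = face
    return " ".join(moves)

print(random_scramble())  # e.g. something like "R U' F2 D ..."
```

Applying such a sequence to a solved cube guarantees a legal (solvable) state, unlike re-assigning stickers at random.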
On another subject:
"There are over 43 quintillion legal positions of the Rubik’s Cube.
It would take thirteen hundred million years to see every position if you were able to view one thousand per second.
If we stacked 43 quintillion pennies, the stack would be tall enough to reach the sun and return to the earth four thousand billion times."
source: http://b.chrishunt.co/how-many-positions-on-a-rubiks-cube
So, trying to brute-force the Rubik's cube is definitely not the way to go... :)
Of course there are a number of programming algorithms for solving the cube (examples) but I don't know how easy it would be to implement them in GH....
Best of luck and please keep us posted!
Nikos
…
Added by nikos tzar at 10:42am on January 31, 2017
isseminated at the firms I've worked at:
Always write your scripts as though someone else is going to have to use and debug them without any instruction from you. This is kind of an overall governing principle that drives a lot of the other best practices.
Structure your definitions left to right. This way it is clear what is dependent on what, what executes in what order, and makes it easy to use the "Moses" tool (alt-click and drag on the canvas to spread components apart) to insert intermediate functionality to an already-existing definition.
For a given functional group (a set of components that does a well-defined thing), keep all the data going into the group as labeled parameters on the left, and everything going out of the group as labeled parameters on the right. This is the intent of my "Best Practicizer" tool in Metahopper (now a menu item instead of a component). In essence, you're treating each group as though it's about to be clustered - you've defined what it does, what its inputs are, and what its outputs are. This makes troubleshooting much easier - if something is going wrong, you can easily isolate which group is causing the trouble by looking at inputs and outputs.
3b. If you're grabbing some value or data (e.g. "STEP COUNT") many times from elsewhere in the definition, don't make a bunch of long wires that connect all the way back to the original source - grab that data into ONE labeled parameter and then connect all your inputs that need it to that - it makes for one long wire instead of 20.
Annotate, annotate, annotate. Label your params (and if you're in icon mode, switch them manually to text). Label your groups. Use scribbles to mark larger regions of functionality. Use panels for "instructions" wherever it might not be clear how someone is supposed to use your tools.
Avoid "wireless" (Hidden Wires) connections. If you MUST use them, make sure you create params at both ends with matching names so it's clear what the data represents and where it comes from.
Cluster where possible. It's extremely helpful to isolate functional groups into clusters - it makes debugging faster and easier, since you don't have to wait for the whole definition to recompute when making small edits to the inside of a cluster, and it sets you up well to create code that can be re-used later on. However, don't take whole definitions and cluster them. As a rule of thumb, if a cluster has more than ~10 inputs, it should probably be broken into multiple clusters. There is a slight performance impact when clustering, because unlike an un-clustered group of components, which only executes the parts of the definition where something has changed, any time ANY input to a cluster changes, the WHOLE cluster re-computes. Because of this, a cluster shouldn't generally wrap any groups of components that are not related / don't connect with each other.
Color code your groups. Many firms develop a standard around group coloring so that it's easy to understand what parts of a definition are doing what kind of task. For instance, at Woods Bagot where I work, we have different colors for component groups that highlight inputs, outputs, rhino references, baking, and visualization. You may find that a different set is useful to you, but having a consistent standard can improve legibility.
That's my 2c. At the end of the day, everyone works a little bit differently, and that's unavoidable (and not even a bad thing!) As long as you keep #1 in mind, all the rest will follow.
…
ther math and logic. i can usually conceptualise what i want to do and cobble some semi working thing together but don't know which components to use and how to patch it. so i'm super happy to have someone who knows what he's doing to find this interesting.
and i'm glad you mention the fanned frets again, there is one input parameter that's still missing for the multiscale frets to be fully parametric: the angle of the nut, or which fret should be straight. it depends a bit on personal preferences and playing posture what is more comfortable, so being able to adjust this easily would be cool. again i have no idea how the maths for that works, or if you can just rotate each fret the same amount around its middle point. either a fret number (for the straight fret) or a simple slider from bridge to nut should do as the input.
Here are the two extremes and the middle ground:
i've been thinking today, while analysing your patches and cleaning up my mess, about what exactly the monster should do.
Here are the input parameters needed, i think it's the complete list
scale length low E string
scale length high e string
fret angle/straight fret
string width at nut
string width at bridge
number of frets
fretboard overhang at nut (distance from string to fretboard bounds)
fretboard overhang at last fret
string gauges
string tensions
fretboard radius at nut (for compound radius fretboard radius at bridge is calculated with the stewmac formula)
fretwire crown width
fretwire crown height
action height at nut (distance between bottom of string and fretwire crown top)
action height at last fret
pickup 1 neck position
pickup 2 middle position
pickup 3 bridge position
nut width
the pickup positions should be used to draw circles for the magnet poles on each string so they are perfectly aligned and can be used for the pickup flatwork construction. ideally they would need a rotation control aligning the center line of the pickup so it's somewhere between the last fret angle and bridge angle. personally i do this visually depending on the design i'm looking for, some people have huge theories on pickup positioning but i don't believe in it.
that should result in everything needed to quickly generate all the necessary construction curves or geometry for nut/fingerboard/frets/pickups. this is the core of what makes a guitar work, the more precise this dynamic system is the better the guitar plays and sounds.
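for what it's worth, the core fret maths is compact: with equal temperament, the distance from the nut to fret n on a string of scale length L is L·(1 − 2^(−n/12)), and a multiscale board just applies that with a different L on the bass and treble sides. Here is a rough sketch of that construction in plain Python; the "straight fret" is handled by sliding the treble side longitudinally so the chosen fret lines up, which is my own assumption about how to model it, not necessarily how the existing patches do it:

```python
def fret_distance(scale_length, n):
    """Distance from nut to fret n (12-tone equal temperament)."""
    return scale_length * (1.0 - 2.0 ** (-n / 12.0))

def fanned_frets(scale_bass, scale_treble, straight_fret, n_frets, width):
    """Return fret lines as ((x_bass, 0), (x_treble, width)) endpoint pairs.

    The treble side is shifted longitudinally so that the chosen fret
    is perpendicular to the strings ("straight")."""
    shift = (fret_distance(scale_bass, straight_fret)
             - fret_distance(scale_treble, straight_fret))
    lines = []
    for n in range(1, n_frets + 1):
        xb = fret_distance(scale_bass, n)
        xt = fret_distance(scale_treble, n) + shift
        lines.append(((xb, 0.0), (xt, width)))
    return lines

# 27" to 25.5" fan, straight 7th fret, 24 frets, 52 mm string spread
frets = fanned_frets(27.0, 25.5, 7, 24, 52.0)
```

a quick sanity check: fret 12 always lands at exactly half the scale length on each side, and at the chosen straight fret both endpoints share the same longitudinal coordinate.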
i posted another thread trying to understand how i could use datasets from spreadsheets, databases, csv to organize the input parameters. What would make sense for the strings for example is to hook into a spreadsheet with the different string sets, i attached one for the d'Addario NYXL string line which basically covers all combos that make sense.
The string tension is an interesting one, and implementing it would sure be overkill albeit super interesting to try. it should be possible to extrapolate from the scale length of each string what the tension for a given string gauge would be, so that you could say 'i want a fully balanced set' or 'heavy top light bottom' and it would calculate which SKU from d'Addario would best match the required tension. All the strings listed in the spreadsheet are available to buy as single strings.
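the tension part would not need much extrapolation, since d'Addario publishes a unit weight for every string, and the formula from their string tension guide is T = UW × (2 × L × F)² / 386.4, with UW in lb/inch, L the scale length in inches and F the pitch in Hz. A small sketch; the unit weight below is what i read off d'Addario's chart for a plain .010, so treat the number as illustrative:

```python
def string_tension(unit_weight, scale_length, frequency):
    """Tension in lbs from d'Addario's published formula.

    unit_weight  -- lb per inch of string (from the manufacturer's chart)
    scale_length -- vibrating length in inches
    frequency    -- pitch in Hz
    """
    return unit_weight * (2.0 * scale_length * frequency) ** 2 / 386.4

# Plain .010 high E (E4 = 329.63 Hz) on a 25.5" scale:
t = string_tension(0.00002215, 25.5, 329.63)
print(round(t, 1))  # roughly 16 lbs, in line with d'Addario's chart
```

on a multiscale board you would just feed each string its own interpolated scale length and search the spreadsheet for the gauge whose tension best matches the target.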
i'm trying to reorganize everything which helps me understand it. i just discovered the 'hidden wires' feature which is great since once i understood what a certain block does or have finished one of my own, i can get the wires out of the way to carry on undistracted. a bit risky to hide so many wires but it makes it so much easier not to get completely lost :-)
btw, the 'fanned fret' term is trademarked, some guy tried to patent it in the 80's which is a bit silly since it has been done for centuries. there is a level of sophistication above this as well, check out http://www.truetemperament.com/ and that really is something else. it really is astounding how superior the tuning is on those wigglefrets, the problem is that it's rather awkward for string bending and also you can't easily recrown or level the frets when they are used. …
y anyway ;))
Since 2014 i began to get back into the construction biz for a dozen main reasons, one of them being the greatly increased availability of this kind of software "power", and robotics.
the first project, finished by Q1 2015, was focused on the development of a parametric block for construction. (almost surely the first parametric product designed in Uruguay, and probably among the first few of this kind globally...)
Far from being a complicated model - in fact the standard model is extremely simple - the key thing is that it is fully parametric...
dimensions, materials, textures, colors... and so on
second key thing is that the main common component of the blocks (an EPS core) is robotically machined...
the blocks are the base of a construction system (oriented mainly - though not restricted only - to residential buildings) that
- is based on digital models, meant to be used in parametric models of buildings
- is lab tested and proven to be 1.5 times as compression-resistant as traditional bricks and blocks (self-supporting up to two-story buildings)
- has recently proved (due to its size) to be 300% more efficient than classic construction and 200% more efficient than steel frame (per our country's official figures)
check it out here
--
https://drive.google.com/file/d/0B1TRxxgF_sEnQnZrTkZGbUx3cmM/view
--
- and it's aimed to be mass produced and handled by robots...
this project ended on 1H 2016
and i filed 4 patents in the process.
3 of them of mechanical devices designed as extensions for a cnc machine i own
and the fourth (the patent related specifically to the blocks) included a dozen innovations (believe me... i have almost 15 yrs in the biz, and it's cool stuff...)
throughout the project I've been working with Inventor, even knowing in advance it would lack the kind of features I wanted, so i had to program many things... (lisp, VB, etc.... all the same species of - prehistoric - animals) to leverage the tool to the sky - and far beyond... -
but it was a valid alternative at the time because it allowed the implementation of some form of parametric models, had a local representative and some supposedly skilled guys in the neighbourhood....
but life is hard... and none of the latter two rendered me any significant help
so I had to take the tour myself...
- mind i never regret to do things that others cant -
and finish what i start
this one was a great project by many measures... and ended with more results than the ones committed to accomplish...
... some more history here ....
then, because of a customer who brought a ZHA project (!) to quote..., I crossed paths with Rhino, and then met GH again, noticing to my great joy and pleasure what kind of animal it had developed into...
since money talks I'm investing hard on getting up to the expectations, and beyond as i usually do...
and thats how we met..
2017-2018 it's the time frame to build two robots. first one is a prototype to handle the k-nano blocks in the production process, delivery AND at the construction site ( a "smart crane" we nicknamed...)
the other one is the first prototype of robot to assist in the fabrication (smart blocker we called it to be creative ! ;))
then by 2018-2019 i'll be making a "kinda contour crafter" machine to complete the pie :) (you'll be interested on this..)
i guess you already know what all this has to do with GH...
i already have all the components i can imagine to do almost all i ever wanted to do in relation to this set of projects
but almost all in a single tool !!!
i can design, animate, render, optimize, simulate and even robotic simulate..
so, i have to ask...
is there a chance you might be interested in helping us in some projects we are starting on march and june 2017 (8 and no more than 18 months of duration respectively) ?
sent you a friend request, for the case you might be interested to continue by e-mail...
in any case many thanks for your help and inspiration !
best regards !
long happy marriage, and large figures bank account !
…
mponent that calculates view factors from a point or plane to a set of surfaces. The component uses a ray-tracing method that increases in accuracy as you increase the viewResolution (I recommend using a value of 2 to ensure that your view factor error is below 1%). Because of the ray-tracing method, the component should be able to handle any surface geometry that you put into it. Importantly, if you input a point, the view factors will be calculated without a bias in direction, but inputting a plane will calculate the view seen by a surface in that plane, which includes the directional bias induced by the direction the surface faces. This image gives you a sense of how the component works, showing you the surface temperatures from EnergyPlus mapped onto the sphere of view:
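The underlying idea of a ray-traced view factor is easy to illustrate outside the component: shoot uniformly distributed rays from the test point and count the fraction that land on each surface. Here is a toy sketch of my own (not Ladybug's code) for a point at the centre of a cube, where each of the six faces should come out at 1/6:

```python
import math
import random

def random_direction(rng=random):
    """Uniformly distributed unit vector (normalised Gaussian sample)."""
    while True:
        v = [rng.gauss(0.0, 1.0) for _ in range(3)]
        n = math.sqrt(sum(c * c for c in v))
        if n > 1e-9:
            return [c / n for c in v]

def cube_face_view_factors(n_rays=200000, rng=random):
    """From the centre of a cube, a ray exits through the face whose axis
    component is largest in magnitude; tally the hits per face."""
    hits = {f: 0 for f in ("+x", "-x", "+y", "-y", "+z", "-z")}
    axes = "xyz"
    for _ in range(n_rays):
        d = random_direction(rng)
        i = max(range(3), key=lambda k: abs(d[k]))
        hits[("+" if d[i] > 0 else "-") + axes[i]] += 1
    return {f: h / n_rays for f, h in hits.items()}

vf = cube_face_view_factors()
# all six factors cluster around 1/6, and they always sum to 1
```

Increasing the ray count plays the same role as viewResolution: the Monte Carlo error shrinks roughly with the square root of the number of rays.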
Ladybug_Radiant Asymmetry Discomfort - A component that calculates the percentage of people dissatisfied (PPD) from radiant asymmetry. It uses the formulas that Christian has posted on this discussion, which are also the formulas published in ASHRAE 55:
https://www.scribd.com/doc/194468127/Ashrae-Std-55-2010
And in the CBE comfort tool:
http://comfort.cbe.berkeley.edu/
https://github.com/CenterForTheBuiltEnvironment/comfort_tool/blob/master/static/js/local-discomfort.js#L11-L14
With this component, you can convert the temperature difference across a plane:
... to a percent of people dissatisfied (PPD) from radiant asymmetry:
... and then to a temporal plot of radiant discomfort risk:
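For reference, the curves behind that conversion are simple logistic fits in the asymmetry delta-t. This is my own transcription of the equations as they appear in ISO 7730 / ASHRAE 55 and the CBE code linked above; please verify the coefficients against the standard before relying on them:

```python
import math

def radiant_asymmetry_ppd(delta_t, condition):
    """Percent people dissatisfied from a radiant temperature asymmetry
    of delta_t (C). Coefficients per ISO 7730 / ASHRAE 55 as I read them;
    each fit is only valid within the noted delta_t range."""
    if condition == "warm_ceiling":   # delta_t < 23
        return 100.0 / (1.0 + math.exp(2.84 - 0.174 * delta_t)) - 5.5
    if condition == "cool_wall":      # delta_t < 15
        return 100.0 / (1.0 + math.exp(6.61 - 0.345 * delta_t))
    if condition == "cool_ceiling":   # delta_t < 15
        return 100.0 / (1.0 + math.exp(9.93 - 0.50 * delta_t))
    if condition == "warm_wall":      # delta_t < 35
        return 100.0 / (1.0 + math.exp(3.72 - 0.052 * delta_t)) - 3.5
    raise ValueError("unknown condition: %s" % condition)

# A 10 C asymmetry is far more uncomfortable from a cool wall
# than from a warm wall:
cool = radiant_asymmetry_ppd(10.0, "cool_wall")
warm = radiant_asymmetry_ppd(10.0, "warm_wall")
```

Note how the four fits differ: people are most sensitive to warm ceilings and cool walls, which is why those curves rise fastest.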
If you sync with the github, you can find these components under the WIP section.
Here you can find an example file that shows you how to estimate radiant asymmetry discomfort using these components and the results of an EnergyPlus simulation:
http://hydrashare.github.io/hydra/viewer?owner=chriswmackey&fork=hydra_2&id=Radiant_Asymmetry_Discomfort&slide=0&scale=1.0000000000000004&offset=64.23627386566636,-8.784339845403167
Let me know if you have any feedback, Christian. If you get the chance to ask Olesen the question about the wall asymmetry, please let us know.
Thanks again for all of the info that you have posted here.
-Chris…