re are major changes and enhancements.
HONEYBEE
More Flexible Workflow - Many small modifications were made to support a more flexible workflow, such as the ability to separate a zone created with masses2Zones into editable HBSrfs that can be recombined. For the energy components, it is now possible to plug custom constructions directly into the components that set the zone constructions without writing them first into the library. For the daylighting components it is now possible to change all of the materials of specific surface types at once.
Support for Complex Geometry - Many small bugs for complex geometry have been fixed including the ability to import energy results correctly for curved NURBS surfaces as well as unconventional window configurations. Also, the intersectMasses component now almost always succeeds in splitting all of the surfaces of adjacent zones, no matter how complex the intersection is.
Automatic Download Issues Fixed - Many users who faced issues with not having “gendaymtx.exe” or who had trouble syncing with our github know that we faced an issue with automatic background downloads. This release fixes those download issues.
Air Walls - Honeybee EnergyPlus models now officially support air walls (or virtual partitions) in a basic implementation. Now, any time that you use the air wall construction or set a surface type to “air wall,” the air between adjacent zones will be automatically mixed. At present, this mixing is just a constant flow based on the surface area between zones connected by air walls multiplied by an adjustable “flow factor.” It is important to stress that this basic air mixing does not use the EnergyPlus Airflow Network, although the groundwork laid in this release will eventually allow for an implementation of the Airflow Network in future releases. As such, the present air mixing is only suitable for multi-zone conditions where there is no significant buoyancy-driven flow between zones.
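To make the mixing assumption concrete, here is a minimal sketch of the calculation described above. The function name and default factor are illustrative, not taken from the Honeybee source:

```python
def air_mixing_flow(shared_area_m2, flow_factor=0.1):
    """Constant mixing flow (m3/s) across an air wall.

    shared_area_m2: area of the air wall shared by two adjacent zones.
    flow_factor: adjustable factor in m3/s per m2 of air wall
                 (the default here is only an example value).
    """
    return shared_area_m2 * flow_factor

# e.g. a 3m x 4m opening between two zones:
print(air_mixing_flow(12.0))  # ≈ 1.2 m3/s
```

Because the flow is constant, the mixing does not respond to temperature differences between the zones, which is why the release notes caution against using it where buoyancy-driven exchange dominates.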
Natural Ventilation - To go along with the new potential introduced by air walls, there has been a basic implementation of EnergyPlus’s natural ventilation objects in a new component called “Set EP Airflow”. The current setup allows for three possible types of natural ventilation: 1) natural ventilation through windows (with auto-calculated flow based on window area, outdoor wind speed/direction, and stack effects), 2) custom wind and stack objects that can be used to model things such as chimneys off of single zones, and 3) constant, fan-driven natural ventilation.
Additional Thermal Mass - The capability to add additional thermal mass to zones has been added. This is useful for factoring in the mass of indoor furniture or heavy interior objects such as chimneys.
New Utility Components - Abraham has added a couple of useful components to help calculate lighting loads based on bulb types and target lighting levels, as well as a converter from ACH to the m3/s-m2 that the other HB components accept. In this vein, there is also a component for adding the resistance of air films to HB constructions.
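The ACH conversion is simple enough to sketch directly (the function name is illustrative; the actual component may differ in details):

```python
def ach_to_m3s_per_m2(ach, zone_volume_m3, floor_area_m2):
    """Convert air changes per hour (ACH) to m3/s per m2 of floor area,
    the flow unit that the Honeybee components accept.

    One air change per hour moves the full zone volume in 3600 s.
    """
    return ach * zone_volume_m3 / (3600.0 * floor_area_m2)

# e.g. 0.5 ACH in a 3m-tall zone with 100 m2 of floor (300 m3):
print(ach_to_m3s_per_m2(0.5, 300.0, 100.0))  # ≈ 0.000417 m3/s-m2
```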
Improved and Editable Ideal Air Loads System - The EnergyPlus Ideal Air System now goes through an automatic sizing period at the start of the simulation based on the extreme weeks of the weather file. Furthermore, the ability to adjust many of the parameters of the ideal air loads system has been added with a new “Set Ideal Air Loads Parameters” component. The component allows you to add heat recovery, air-side economizers, and demand-controlled ventilation.
OpenStudio Export Update - The OpenStudio workflow is still largely under development but this release includes a version with a working VAV and PTHP system template for those curious about experimenting. Note that not all of the new features available for the basic “Run Energy Simulation” component are available for the OpenStudio component (such as air walls, natural ventilation, or additional thermal mass).
Microclimate/Indoor Comfort Maps - Blossoming from initial experiments with the radiant temperature map, a workflow for looking into sub-zone microclimate and indoor comfort has been initiated. All components for this are presently under the Honeybee WIP tab but, over the next month, they will complete their development phase and move into the rest of the tabs. If you are interested in testing them when they are ready, please let Chris know. For a teaser of the intended capabilities, see this video: (https://www.youtube.com/watch?v=fNylb42FPIc&list=UUc6HWbF4UtdKdjbZ2tvwiCQ)
LADYBUG
Monthly Bar Chart - After much demand from multiple parties, a new component to create monthly bar and line charts has been added. The component is particularly useful for plotting the outputs of the “Average Data” component like monthly EPW data or averaged monthly-per hour data. It also supports daily data and any type of Energy simulation results.
Wind Profile - To go along with the new capabilities of natural ventilation in Honeybee, Ladybug now has a fully fleshed-out Wind Profile component that allows you to visualize how wind speed changes with height in relation to your building geometry. The component is geared to understanding the conditions of prevailing wind and will be useful in the future for setting up CFD models. Credit goes to Djordje Spasic for adding in all of the new capabilities. In a similar vein, the appearance of the wind rose has also been improved thanks to suggestions from Alejandra Menchaca.
Faster Solar Adjusted Temperature - Thanks to the SolarCal method from the Center for the Built Environment at UC Berkeley (http://escholarship.org/uc/item/89m1h2dg), the solar adjusted temperature component now includes an option for a much faster calculation that produces results very close to those originally obtained with the genCumSky component. Instead of using the cumulative sky, the component can now accept the direct and diffuse radiation from the ImportEPW component. Over a whole year, this essentially takes a calculation that used to take a half-hour and shrinks it down to 10 seconds. Thanks again to those at UC Berkeley for keeping their work open source!
Instructions - Last but not least, [it took me almost two years to understand this but finally] we have a text file that describes the installation step by step and is far easier to modify than a video. You can find it in the zip file. Credit goes to Chris!
We also want to welcome Anton, Patrick and Sandeep to the team. Anton has kicked off his development by working on a component to import and visualize epw ground temperature data and he will continue to develop components to bring reliable precipitation data into Ladybug. With this basis, he will go on to implement Honeybee components for ground heat storage, earth tubes, rain collection and hot water systems. Patrick and Sandeep are working on integrating Honeybee with the Energy Performance Calculator.
As always let us know your comments and suggestions.
Enjoy!…
ndard length elements without any cutting, and using only simple connections, such as cable ties or scaffold swivel couplers.
To summarize the approach I present here:
Design an initial shape
Remesh this form so that the edges are all roughly the length of the tubes we will use to build the structure
Rotate and extend the edges of this mesh to create the crossings
Apply a relaxation to optimize the positions of the tubes for tangency
demo_reciprocal_structures.gh
Initial form
In this example I show how to apply this system to a simple sphere. You can replace this with any arbitrary shape. It can be open or closed, and have any topology.
Remeshing
The new ReMesher component takes an input mesh, and a target edge length, and iteratively flips/splits/collapses edges in order to achieve a triangulated mesh of roughly equal edge lengths.
Press the Reset button to initialize, then hold down the F5 key on your keyboard to run several iterations until it has stabilized. (F5 just recomputes the solution, and this can be a quick alternative to using a timer)
Once the remeshing is complete, bake the result into Rhino and reference it into the next part of the definition (I recommend doing this rather than connecting it directly, so that you don't accidentally alter the mesh and recompute everything downstream later).
Alternatively you can create your mesh manually, or using other techniques.
Rotate and Extend
We generate the crossings using an approach similar to that described by Tomohiro Tachi for tensegrity structures here:
http://www.tsg.ne.jp/TT/cg/FreeformTensegrityTachiAAG2012.pdf
Using the 'Reciprocal' component found in the Kangaroo mesh tab, each edge is rotated about an axis through its midpoint and normal to the surface, then extended slightly so that they cross over.
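For those curious about the underlying geometry, here is a standalone sketch of this rotate-and-extend step using Rodrigues' rotation formula. This is an illustration of the operation described above, not the Kangaroo component's actual code:

```python
import math

def rotate_about_axis(v, axis, angle):
    # Rodrigues' rotation of vector v about a unit axis:
    # v*cos(a) + (axis x v)*sin(a) + axis*(axis . v)*(1 - cos(a))
    ax, ay, az = axis
    c, s = math.cos(angle), math.sin(angle)
    dot = ax * v[0] + ay * v[1] + az * v[2]
    cross = (ay * v[2] - az * v[1],
             az * v[0] - ax * v[2],
             ax * v[1] - ay * v[0])
    return tuple(v[i] * c + cross[i] * s + axis[i] * dot * (1 - c)
                 for i in range(3))

def rotate_and_extend(p0, p1, normal, angle, scale):
    """Rotate edge (p0, p1) about the axis through its midpoint along
    'normal' (unit surface normal), then extend it about the midpoint
    by 'scale' so neighbouring edges cross over."""
    mid = tuple((a + b) / 2.0 for a, b in zip(p0, p1))
    half = tuple(b - m for b, m in zip(p1, mid))
    half = rotate_about_axis(half, normal, angle)
    half = tuple(h * scale for h in half)
    return (tuple(m - h for m, h in zip(mid, half)),
            tuple(m + h for m, h in zip(mid, half)))
```

Each mesh edge keeps its midpoint; only its direction and length change, which is why adjusting the angle flips the fans between clockwise and counter-clockwise.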
By changing the angle you can change whether the fans are triangular or hexagonal, and clockwise or counter-clockwise.
Choose values for the angle and scaling so that the lines extend beyond where they cross, but not so far that they clash with the other edges.
Note that each rod has 4 crossings with its surrounding rods.
There are multiple possibilities for the over/under pattern at each 'fan', and which one is used affects the curvature:
A nice effect of creating the pre-optimization geometry by rotating and extending mesh edges in this way is that the correct over/under pattern for each fan gets generated automatically.
Optimization for tangency
We now have an approximate reciprocal structure, where the lines are the centrelines of our rods, but the distances between them where they cross vary, so we would not actually be able to easily connect the rods in this configuration.
To attach the rods to form a structure, we want them to be tangent to one another. A pair of cylinders is tangent if the shortest line between their centrelines is equal to the sum of their radii:
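This tangency condition is easy to check numerically. Since the crossings occur away from the rod ends, the distance between the infinite centrelines is a reasonable proxy for the segment-segment distance (a sketch for checking results, not code from the definition):

```python
def line_line_distance(p1, d1, p2, d2):
    """Shortest distance between two (non-parallel) infinite lines,
    each given by a point and a direction vector."""
    # cross product of the two directions
    cx = d1[1] * d2[2] - d1[2] * d2[1]
    cy = d1[2] * d2[0] - d1[0] * d2[2]
    cz = d1[0] * d2[1] - d1[1] * d2[0]
    norm = (cx * cx + cy * cy + cz * cz) ** 0.5
    # project the vector between the base points onto the common normal
    w = (p2[0] - p1[0], p2[1] - p1[1], p2[2] - p1[2])
    return abs(w[0] * cx + w[1] * cy + w[2] * cz) / norm

def are_tangent(p1, d1, r1, p2, d2, r2, tol=1e-6):
    """Two cylinders are tangent when the centreline distance
    equals the sum of their radii."""
    return abs(line_line_distance(p1, d1, p2, d2) - (r1 + r2)) < tol

# two perpendicular rods of radius 0.025 whose axes are 0.05 apart:
print(line_line_distance((0, 0, 0), (1, 0, 0), (0, 0, 0.05), (0, 1, 0)))  # 0.05
```

This is the same check described further below: for equal rods, tangency means the centreline distance is twice the rod radius.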
Achieving tangency between all crossed rods in the structure is a tricky problem - if we move any one pair of rods to be tangent, we usually break the tangency between other pairs, and because there are many closed loops, we cannot simply start with one and solve them in order.
Therefore we use a dynamic relaxation approach, where forces are used to solve all the tangency constraints simultaneously, and over a number of iterations it converges to a solution where they are all met. The latest Kangaroo includes a line-line force, which can be used to pull and push pairs of lines so that they are a certain distance apart. Each rod is treated as a rigid body, so forces applied along its length will cause it to move and rotate.
The reciprocal component uses Plankton to find the indices of which lines in the list cross, which are then fed into the force for Kangaroo. We also use springs to keep each line the same length.
If the input is good, when we run the relaxation (by double clicking Kangaroo and pressing play), the rods should move only a little. We can see whether tangency has been achieved by looking at the shortest distance between the centerlines of the crossing rods. When this is twice the rod radius, they are tangent. Wait for it to solve to the desired degree of accuracy (there's no need to wait for 1000ths of a millimeter), and then press pause on the Kangaroo controller and bake the result.
The radius you choose for the pipes, curvature of the form and length of the edges all affect the result, and at this stage you may need to tweak these input values to get a final result you are happy with. If you find the rods are not reaching a stable solution but are sliding completely off each other, you might want to try adding weak AnchorSprings to the endpoints of the lines, to keep them from drifting too far from their original positions.
For previewing the geometry during relaxation I have used the handy Mesh Pipe component from Mateusz Zwierzycki, as it is much faster than using actual surface pipes.
To actually build this, you then need to extract the distances along each rod at which the crossings occur, and whether it crosses over or under, mark the rods accordingly, and assemble (If there is interest I will also clean up and post the definition for extracting this information). While this technique doesn't require much equipment, it does need good coordination and numbering!
There is also a ReciprocalStructure user object component that can be found in the Kangaroo utilities tab, which attempts to apply steps 3 and 4 automatically. However, by using the full definition you have more control and possibility to troubleshoot if any part isn't working.
The approach described here was first tested and refined at the 2013 Salerno Structural Geometry workshop, led by Gennaro Senatore and myself, where we built a small pavilion using this technique with PVC tubes and cable ties. Big thanks to all the participants!
Finally - this is all very experimental work, and there are still many unanswered questions, and a lot of scope for further development of such structures. I think in particular - which of the relative degrees of freedom between pairs of rods are constrained by the connection (sliding along their length, bending, and twisting) and how this affects the structural behaviour would be interesting to examine further.
Steps 3 and 4 of the approach presented above would also work with quad meshes, which would have different stability characteristics.
There is also the issue of deformation of the rods - as the procedure described here solves only the geometric question of how to make perfectly rigid straight cylinders tangent. The approach could potentially be extended to adjust for, or make use of the flexibility of the rods.
I hope this is useful to somebody. Please let me know if you do have a go at building something using this.
Any further discussion on these topics is welcome!
Further reading on reciprocal structures:
http://vbn.aau.dk/files/65339229/Three_dimensional_Reciprocal_Structures_Morphology_Concepts_Generative_Rules.pdf
http://www3.ntu.edu.sg/home/cwfu/papers/recipframe/
http://albertopugnale.wordpress.com/2013/04/05/form-finding-of-reciprocal-structures-with-grasshopper-and-galapagos/
…
can toggle these modes from either the Canvas Toolbar or the Remote Control Panel, or via the shortcuts Ctrl+1, 2 or 3
These are pretty self explanatory so I will keep it brief:
No Preview will completely switch off the preview of the Grasshopper Objects in the Rhino Viewports.
Wireframe Preview, similar to Disable Meshing, will disable any render meshes but keep any curves or edges visible.
Shaded Preview will shade the preview...
There are two more Icons in this section of the Display Menu:
Selected Only Preview
Preview Settings
Also available on the Canvas Toolbar.
Selected Only Preview is a useful feature for following what your definition is doing at stages along the process, without having to switch all previews off and manually turn individual ones back on as you go.
Without Selected Only Preview Toggled
With Selected Only Preview Toggled:
Preview Settings is the area within Grasshopper where you can modify the colours - including transparency - Grasshopper uses to display objects in the Rhino Viewport.
The first thing you should do before altering any settings is to drag the Default Colours onto the green plus sign to add them to the Presets. This will enable you to restore them easily.
For future reference the default settings are:
Normal = Hue: 0º, Sat: 100, Val: 59, A:100
Selected = Hue: 120º, Sat: 100, Val: 59, A:100
Apart from matters of taste, this feature is particularly useful for anyone who is colour blind[2]:
To restore a colour from the preset list, drag it from the right-hand panel onto either the Normal or Selected option on the left.
[2] There is a very interesting discourse topic on the McNeel Forums about Red/Green Colour Blindness.
work carried out by Jørgen Holo
…
guages I'd recommend all use the RhinoCommon SDK and thus all have access to the same functionality.
How long would it take me to understand and write my own code?
If you already know how to program, it probably won't take too long. If you're past the hurdle of what it means to declare and assign variables, how conditionals and loops work and what scope is, you've already rounded the hardest corner.
Is it even worth it?
That really depends. "Learn programming" is clearly not blanket good advice. Most people out there do not have to learn programming to be happy with their lives and successful in their careers. For some people it can make a small difference, and for a few people it can make a huge difference. If you feel you're in the 'some' category then this is indeed a question you have to answer. Note that the investment for learning programming is a continuous process. Unless you keep up your skills and learn about new stuff that becomes available, you'll lose the ability to write successful code over time.
Where do I start?
Step 1 is to answer the previous question. It is unlikely that anyone besides yourself can answer it, but you can start by making a list of things you do manually now that may be programmable. Then make a list of the things you are unable to do now but which you might be able to do with programming. If while looking at these lists your reaction is: "meh", the answer is probably no.
Step 2 is to pick a language. This is again a very personal thing; there's no wrong answer, because there's no right answer.
Step 3 is to start learning this language. My experience is that the best way to learn a programming language is to try and solve a real problem that you understand very well. If the problem statement is nebulous or poorly understood, you'll be learning two things and that's a recipe for unnecessary frustration.
Here are my thoughts on language:
Python: I don't use Python myself, though I can sort of read it while moving my lips. I don't particularly like Python: the indentation sensitivity stresses me out, and I find the lack of type-safety disturbing. However it is a good language for mathematical/scientific programs. There are lots of additional code libraries you can easily import that will ease the development of mathematically intense algorithms.
C#: I like C# very much, but it does suffer from geekerosis. A lot of the keywords used in the language are not self-explanatory (abstract, sealed, virtual). For me this is no longer a problem as I've memorised what they all mean. C# is designed to be an efficient language to write, rather than an easy one to learn.
The great thing about C# though is that there's a huge amount of material out there for learning it. It is one of the most popular, mature and modern languages you can hope to pick.
VB: I learned VBScript as my first language, and then moved on to VB5, VB6 and VB.NET. It is somewhat more friendly than C#, and functionally it is almost identical. The switch from VB to C# is reasonably low-threshold and there are excellent tools for translating VB code to C# and vice versa.
Since you already know some Python, it probably makes the most sense to continue on that path. If you want to switch, C# is more like Python than VB, so C# would be my next suggestion.
As for where to get information... you have 4 major options when developing code for Rhino.
If it's a question about the language itself, StackOverflow is a great resource. It can be a pretty hostile place for beginner questions, but I find that most of the questions I'm asking have been asked already, and the answers on SO tend to be good. In fact, when I google my questions, the first few hits are usually SO posts.
If it's a question about the Rhino SDK or Grasshopper, you can ask it either on the GH forums (where we are now), or on Discourse. We're not as quick on the draw as SO, but we do know about Rhino.
If you're looking for a basic explanation of what a keyword or a type is for, perhaps with an example, MSDN is the best first choice. In fact if you google the name of a .NET type, the first hit is almost always an MSDN page.…
Added by David Rutten at 2:03pm on December 3, 2014
umbrella of Urban Heat Island (UHI) and I am going to try to separate them out in order to give you a sense of the current capabilities in LB+HB.
1) UHI as defined as a recorded elevated air temperature in an urban area:
If you have access to epw files for both an urban area and a rural area, you can use Ladybug to visualize and deeply explore the differences between the two weather files. Ladybug is primarily a tool for weather file visualization and analysis and it can be very helpful for understanding the consequences of UHI on strategies for buildings or on comfort. This said, if you do not have both rural and urban recorded weather data or you want to generate your own weather files based on criteria about urban areas (as it sounds like you want to do), this definition might not be so helpful.
2) UHI defined as an elevated air temperature, but viewed as a phenomenon that can be computer-modeled, resulting primarily from urban canyon geometry, building materials, and (to a lesser degree) anthropogenic heat:
This definition seems to fit more with the type of thing that you are looking for, but it is unfortunately very difficult and computationally intensive, such that we do not currently have anything within Ladybug to do this right now. I can say that the state of the art for this type of modeling is an application called Town Energy Budget (TEB), and this is what all of the advanced UHI researchers that I know use (http://www.cnrm.meteo.fr/surfex/spip.php?article7). Unfortunately for those trying to use it in professional practice, it can take a while to get comfortable with, and it currently runs exclusively on Linux (this does mean that it is open source, though, and that you can really get deep into the assumptions of the model). A couple of years ago, a peer of mine translated almost all of TEB into Matlab, making it possible to run it on Windows if you have Matlab. He wrapped everything together into a tool called the Urban Weather Generator (UWG), which can take an epw file of a rural area and warp it to an urban area based on inputs that you give for building height, materials, vegetation, anthropogenic heat, etc. I would recommend looking into this for your project, although bear in mind that it is not open source like the original TEB tool and that you may need to get a (very expensive) copy of MATLAB (http://urbanmicroclimate.scripts.mit.edu/uwg.php).
3) UHI as defined by a thermal satellite image of an urban area depicting an elevated average radiant environment that reaches a maximum at the city center and changes with land use:
This is the definition of UHI that I am most familiar with and was the basis of much of my past research. I feel that it is also a definition of UHI that is a bit more in line with where a lot of contemporary UHI research is headed: away from the notion of UHI as a macro-scale meteorological phenomenon that is averaged as an air temperature over a huge area, and towards one that accepts that different land uses have different microclimates and (importantly) different radiant environments. While the air temperature difference between urban and rural areas usually does not exceed 1-4 C, the radiant environment can be very different (on the order of 10-15 C differences). The best way to understand UHI in this context is with thermal satellite images, for which there is a huge database of publicly available data on NASA's glovis website (http://glovis.usgs.gov/) or their ECHO website (http://reverb.echo.nasa.gov/reverb/#utf8=%E2%9C%93&spatial_map=satellite&spatial_type=rectangle). I tend to use thermal data from the LANDSAT 5-8 and ASTER satellites in my research. Unfortunately, there is a lot of bad data with heavy cloud cover mixed in with the really good stuff, and it can take some time to find good images. Also, there aren't too many programs that read the GeoTIFF file format that you download the data as. I know that ArcGIS will read it, a program called ENVI will read it, and I think the open source QGIS can also read it. I have plans to write a set of components to bring this type of data into Rhino and GH (I may get to it a few months down the line).
4) UHI as a notion of "Urban Microclimate" that can be computer-modeled, with consideration of local differences and the local radiant environment:
This is where a lot of my research has led and, thankfully, is an area where Honeybee can help you out a lot. EnergyPlus simulations can output outside building surface temperatures, and these can be very helpful for getting a sense of the radiant environment around individual buildings. Right now, I am focusing just on using this data to fully model the indoor environments of buildings, as you see in this video:
https://www.youtube.com/watch?v=fNylb42FPIc&list=UUc6HWbF4UtdKdjbZ2tvwiCQ
I have plans to move this methodology to the outdoors once I complete this initial application to the indoors. For now, you can use the "Surface result reader" and the "color surfaces based on EP result" components to get a sense of variation in the outside temperature of your buildings.
I hope that this helped,
-Chris
…
ahams's question about how shades are accounted for in the simulation/thermal map and Theodore's thought that just accounting for shades in the E+ run was sufficient. I think that it may be clearest to explain what is going on with this infographic:
As the graphic shows, the thermal maps are made from 4 key types of inputs. The radiant temperature map is formed through a consideration of both the temperature of the surfaces surrounding the occupants and the direct solar radiation that might fall onto the occupants through un-shaded windows. The first surface temperature effect is easily computable from your energy simulation results and the HBZone geometry. The second, however, is calculated by seeing how sun vectors pass through the windows of the zones and uses the SolarCal method of the CBE team (http://escholarship.org/uc/item/89m1h2dg) to compute an MRT delta resulting from solar radiation. This delta is then added to the initial values computed through surface temperature view factors. When you do not connect your shading brep geometry, internal furniture breps, or outdoor context geometry that might block sun to the additionalShading input, the thermal map will assume that sun can pass unobstructed through the window or through indoor furniture to fall onto occupants. It is important to stress that the EnergyPlus simulation does not account for blind geometry or internal furniture as actual geometry, but only as numerical abstractions of surface area and material properties. So we need you to plug in the actual geometry of these things when we compute the MRT delta resulting from sun falling directly onto people.
Next, to clear up the definition of window transmissivity. The important thing to clarify here is that, whether it refers to the transmittance of glass or to the amount of sun coming through a fine screen of blinds, the value is multiplied by the radiation falling on the occupant and thus has a direct correlation to the MRT delta from sun falling on occupants. So, if you set transmissivity to zero, the sun falling on the occupants will not be considered in the calculation and, if you set the transmissivity to 1, the assumption is that there is no window (or the window glass is 100% clear). So, Abraham, your definition of it as a coefficient is appropriate.
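To make the arithmetic concrete, here is a deliberately simplified sketch of how these pieces combine. The linear view-factor weighting and the function names are assumptions made for illustration; the actual components use the full SolarCal method and view-factor calculation:

```python
def point_mrt(view_factors, surface_temps, solar_mrt_delta, transmissivity):
    """Simplified MRT at one occupant location:
    a view-factor-weighted average of surrounding surface temperatures,
    plus a solar MRT delta scaled by the window transmissivity.
    view_factors must sum to 1. (Illustrative only; the real components
    compute the solar delta with SolarCal.)"""
    base = sum(f * t for f, t in zip(view_factors, surface_temps))
    return base + transmissivity * solar_mrt_delta

# e.g. two surfaces at 20 C and 26 C seen equally, with a 6 C solar
# delta through glass of 0.7 transmissivity:
print(point_mrt([0.5, 0.5], [20.0, 26.0], 6.0, 0.7))  # ≈ 27.2
```

Setting transmissivity to 0 here removes the solar term entirely, which matches the behaviour described above.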
Normally, I would just recommend that you leave this value at the default 0.7, which corresponds to the transmittance of the default glass material in Honeybee. However, there are 4 cases in which you might consider changing it:
1) You are not using the default Honeybee glazing material, in which case, you should change the transmissivity to be equal to this new value.
2) You have a lot of really small blind/shade geometries and you do not want the view factor component to take several minutes to trace sun vectors through the detailed shade geometry, so you are ok with using a simple abstraction instead of plugging shade breps into additionalShading. In this case, you might estimate the average percentage of radiation coming through the blind geometry (maybe with some simple Ladybug radiation studies or with your intuition about the amount of sun blocked by the shades). You will then multiply this by the transmittance of your glass and input this value to the component.
3) Your blinds for your Honeybee simulation are dynamic, in which case plugging shade breps into additionalShading is not going to work because the component will assume that those shades are always there. In this case, you should be plugging a list of 8760 values into the transmissivity that correspond to when the shades are pulled. When the blinds are completely up, the value should be the transmittance of your window and, when they are down, the value should be the window transmittance multiplied by the fraction of light coming through the shades.
4) You have shades/blinds but they are transparent or not completely opaque. The additionalShading_ input assumes that all shade geometry is opaque, so you cannot use it to account for such shades. Accordingly, you will need to account for them through the transmissivity.
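For cases 2 and 3 above, the arithmetic can be sketched like so. This is an illustration only; the function name and the hour-indexing convention are my own assumptions, not part of the actual components:

```python
def transmissivity_schedule(glass_transmittance, shade_fraction, blinds_down_hours):
    """Build an 8760-value hourly transmissivity list for dynamic blinds:
    the bare glass transmittance when blinds are up, multiplied by the
    fraction of light passing through the shade when they are down.

    blinds_down_hours: hours of the year (0-8759) when blinds are pulled.
    """
    down = set(blinds_down_hours)
    return [glass_transmittance * (shade_fraction if h in down else 1.0)
            for h in range(8760)]

# e.g. 0.7 glass with blinds that pass 30% of light, pulled 12:00-16:00 daily:
sched = transmissivity_schedule(
    0.7, 0.3,
    [d * 24 + h for d in range(365) for h in range(12, 16)])
print(sched[12])  # ≈ 0.21 (blinds down at noon of day 1)
```

The same multiplication (shade fraction times glass transmittance) gives the single static value for case 2.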
In the future, I may try to pull more information about blinds and glass properties off of the HBzones inside the view factor component but, for now and for the next few months, the above describes how it works.
Theodore, for curved geometry, I think that your safest bet is going to be planarizing the Rhino geometry before you turn it into an HBZone (so you just divide the curved surface into a few vertical planar panes of glass that approximate the curve well enough). This is essentially what the runSimulation component does for you automatically (it meshes the geometry as you see here: https://www.youtube.com/watch?v=nMQ2Pau4q6c&index=12&list=PLruLh1AdY-SgW4uDtNSMLeiUmA8YXEHT_). If I were to figure out a way to incorporate shades in this automatic meshing workflow, your EnergyPlus simulation would take a very long time to run, and I am not even sure that the result would be that accurate with the way E+ abstracts shades. So I don't think that it's really worth it over just planarizing the geometry yourself.
Lastly, I won't be able to figure out the problem with your current run, Theodore, unless I get the GH file from you. Make sure that you are using all up-to-date components.
-Chris…
Because the Adaptive methodology is founded upon the notion that there are hundreds of social factors that influence comfort, and that the best we can do to forecast comfort is to find variables with good correlations to these social factors (like outdoor temperature), the premise that this published Adaptive model holds regardless of cultural norms is dangerous. Notably, the founders of the adaptive model have stressed that this particular linear correlation that you cite comes from recent surveys of buildings where people have both the ability to open windows AND a great freedom to dress down. Hypothetically, if occupants were able to open the windows in Abraham's building but the cultural norm was that everyone was expected to wear multi-layered suits or dresses (as in historic Britain), a different correlation between outdoor temperature and comfort temperature would exist. In fact, historical European comfort surveys show that people likely preferred cooler temperatures in buildings (about 1-2C cooler) than today's occupants. Accordingly, after recognizing this social premise in the Adaptive model, I have built in a few ways to adjust/alter the version in Ladybug based on the literature I have read (even though these alterations are not a part of any official ASHRAE or European standard).
Abraham, you might have to be a bit more specific about how you would like to adjust the Adaptive comfort model for me to help your particular case and this may lead to me adding in new functionality. For the time being, I can tell you that the 'Ladybug_Adaptive Comfort Parameters' component is going to be your friend and I would recommend using the Adaptive Comfort Chart to visualize how you are changing the model. You can plug these 'Adaptive Comfort Parameters' into the 'Adaptive Comfort Recipe' component to have the microclimate analysis run with these parameters. Here are a few examples of how to alter the model:
1) Mixed-Mode Building - Humphreys and the European Adaptive comfort team derived two separate correlations:
One for naturally ventilated buildings:
and one for conditioned buildings:
The dimensionless value between 0 and 1 for _levelOfConditioning allows you to create different correlations depending on whether occupants have complete freedom of dress and window operability (0) or have slight restrictions like in a mixed mode building (0.5, for example):
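To make the blend concrete, here is a minimal Python sketch of how a `_levelOfConditioning` value between 0 and 1 could interpolate between two linear correlations of the form T_neutral = slope * T_out + intercept. The naturally ventilated coefficients follow the ASHRAE 55 adaptive model; the conditioned-building coefficients are illustrative placeholders, and the coefficients Ladybug actually uses may differ, so check the component source.

```python
def neutral_temperature(t_out, level_of_conditioning=0.0,
                        nv=(0.31, 17.8), cond=(0.09, 22.6)):
    """Blend between two adaptive correlations.

    nv   -- (slope, intercept) for naturally ventilated buildings
            (ASHRAE 55 values).
    cond -- (slope, intercept) for fully conditioned buildings;
            illustrative placeholder, not necessarily what Ladybug uses.
    level_of_conditioning -- 0 = free-running, 1 = fully conditioned.
    """
    c = level_of_conditioning
    slope = (1 - c) * nv[0] + c * cond[0]
    intercept = (1 - c) * nv[1] + c * cond[1]
    return slope * t_out + intercept
```

With c = 0 you recover the naturally ventilated correlation; a mixed-mode building (c around 0.5) lands between the two.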
2) Changing the Response Time of Occupants - There has been a bit of a debate in the literature about whether it is better to use the average monthly temperature or a weekly running mean temperature. The avgMonthORRunningMean input allows you to adjust this like so:
Average Month:
Running Mean:
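For anyone curious about the difference between these two inputs, here is a sketch of both averaging schemes in Python. The weekly weights follow the simplified running-mean formula published in the European adaptive standard (EN 15251); whether Ladybug uses this exact simplified form or the full exponential series is an assumption to verify against the source.

```python
def weekly_running_mean(last7):
    """Simplified 7-day running mean outdoor temperature
    (EN 15251 form).  last7 -- daily means, most recent first."""
    weights = [1.0, 0.8, 0.6, 0.5, 0.4, 0.3, 0.2]
    return sum(w * t for w, t in zip(weights, last7)) / sum(weights)

def monthly_average(daily_means):
    """Plain average of the month's daily mean temperatures."""
    return sum(daily_means) / len(daily_means)
```

The running mean reacts within days to a warm or cold spell, while the monthly average smears that spell across the whole month; that is the "response time" being debated.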
3) Greater Temperature Range Tolerance - While this last one is actually part of the European and ASHRAE adaptive standards, you can adjust the range of the comfort band with either the 'eightyOrNintetyComf' input or the comfortClass input like so:
Ninety Percent Comfortable
Eighty Percent Comfortable
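The 80%/90% toggle maps directly onto the published ASHRAE 55 adaptive bands, which can be sketched as follows (the function name is my own; the 0.31 slope, 17.8 intercept, and the 2.5/3.5 degree offsets are the standard's values):

```python
def adaptive_comfort_band(t_prevailing_out, ninety_percent=False):
    """ASHRAE 55 adaptive comfort band.

    Returns (lower, upper) operative temperature limits in C for the
    given prevailing outdoor temperature.  The 90% acceptability band
    is +/- 2.5 C around the neutral temperature; the 80% band is
    +/- 3.5 C, i.e. a wider tolerance.
    """
    neutral = 0.31 * t_prevailing_out + 17.8
    offset = 2.5 if ninety_percent else 3.5
    return neutral - offset, neutral + offset
```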
Abraham, let me know if you would like more controls over the model or if this is enough to do what you are thinking of. This example file allows you to construct the images I have above:
http://hydrashare.github.io/hydra/viewer?owner=chriswmackey&fork=hydra_2&id=Adaptive_Comfort_Chart&slide=0&scale=1&offset=0,0
-Chris…
ther math and logic. i can usually conceptualise what i want to do and cobble some semi-working thing together, but i don't know which components to use or how to patch them. so i'm super happy to have someone who knows what he's doing find this interesting.
and i'm glad you mention the fanned frets again. there is one input parameter that's still missing for the multiscale frets to be fully parametric: the angle of the nut, or which fret should be straight. it depends a bit on personal preferences and playing posture what is more comfortable, so being able to adjust this easily would be cool. again, i have no idea how the maths for that works, or if you can just rotate each fret the same amount around its middle point. either a fret number (for the straight fret) or a simple slider from bridge to nut should do as the input.
Here are the two extremes and the middle ground:
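On the maths question: it does not need to be a rotation at all. Each string's fret positions follow the standard equal-temperament formula, and sliding each string along its own axis so the chosen fret lines up at the same position on both outer strings makes that fret come out straight; every other fret slants by however much the two scale lengths disagree. A sketch (function names are my own):

```python
def fret_distance(scale, n):
    """Distance from the nut to fret n on a string of the given scale
    length, using the equal-temperament 12th-root-of-2 rule."""
    return scale * (1.0 - 2.0 ** (-n / 12.0))

def fanned_fret_lines(bass_scale, treble_scale, n_frets,
                      straight_fret, spread):
    """Endpoints of each fret line for a multiscale fretboard.

    The chosen straight fret sits at x = 0 on both outer strings, so
    it comes out perpendicular; all other frets slant.  spread is the
    bass-to-treble distance across the board (the y axis).
    """
    lines = []
    for n in range(n_frets + 1):  # fret 0 is the nut
        xb = fret_distance(bass_scale, n) - fret_distance(bass_scale, straight_fret)
        xt = fret_distance(treble_scale, n) - fret_distance(treble_scale, straight_fret)
        lines.append(((xb, 0.0), (xt, spread)))
    return lines
```

Changing `straight_fret` slides the whole fan along the neck, which covers the "two extremes and the middle ground" with a single slider.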
i've been thinking today, while analysing your patches and cleaning up my mess, about what exactly the monster should do.
here are the input parameters needed; i think this is the complete list:
scale length low E string
scale length high e string
fret angle/straight fret
string width at nut
string width at bridge
number of frets
fretboard overhang at nut (distance from string to fretboard bounds)
fretboard overhang at last fret
string gauges
string tensions
fretboard radius at nut (for a compound-radius fretboard, the radius at the bridge is calculated with the StewMac formula)
fretwire crown width
fretwire crown height
action height at nut (distance between bottom of string and fretwire crown top)
action height at last fret
pickup 1 neck position
pickup 2 middle position
pickup 3 bridge position
nut width
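One way to keep this list manageable as it grows is to collect it into a single parameter structure that the rest of the definition reads from. A sketch of that, with field names and defaults of my own invention:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FretboardSpec:
    """The input parameters from the list above.  All names and
    default values here are illustrative, not from the definition."""
    scale_low_e: float              # mm, bass-side scale length
    scale_high_e: float             # mm, treble-side scale length
    straight_fret: int              # which fret is perpendicular
    string_width_nut: float         # outer string spread at the nut
    string_width_bridge: float      # outer string spread at the bridge
    n_frets: int = 24
    overhang_nut: float = 3.0       # string to fretboard edge
    overhang_last_fret: float = 3.0
    radius_nut: float = 305.0       # mm (about 12")
    crown_width: float = 2.5        # fretwire crown
    crown_height: float = 1.2
    action_nut: float = 0.4         # string bottom to crown top
    action_last_fret: float = 2.0
    string_gauges: List[float] = field(default_factory=list)
```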
the pickup positions should be used to draw circles for the magnet poles on each string so they are perfectly aligned and can be used for the pickup flatwork construction. ideally they would need a rotation control aligning the center line of the pickup so it's somewhere between the last-fret angle and the bridge angle. personally i do this visually depending on the design i'm looking for; some people have huge theories on pickup positioning but personally i don't believe in it.
that should result in everything needed to quickly generate all the necessary construction curves or geometry for the nut/fingerboard/frets/pickups. this is the core of what makes a guitar work; the more precise this dynamic system is, the better the guitar plays and sounds.
i posted another thread trying to understand how i could use datasets from spreadsheets, databases, or CSV files to organize the input parameters. what would make sense for the strings, for example, is to hook into a spreadsheet with the different string sets; i attached one for the D'Addario NYXL string line which basically covers all combos that make sense.
the string tension is an interesting one, and implementing it would surely be overkill, albeit super interesting to try. it should be possible to extrapolate from the scale length of each string what the tension for a given string gauge would be, so that you could say 'i want a fully balanced set' or 'heavy top, light bottom' and it would calculate which SKU from D'Addario would best match the required tension. all the strings listed in the spreadsheet are available to buy as single strings.
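The tension calculation is actually less work than it sounds: D'Addario publishes the formula T = UW x (2 x L x F)^2 / 386.4, where UW is the unit weight from their charts (lb per inch of string), L the vibrating length in inches, and F the pitch in Hz. A sketch (the unit weight in the example is taken from D'Addario's plain-steel chart for a .010" string; verify against the current chart before relying on it):

```python
def string_tension_lb(unit_weight, scale_in, freq_hz):
    """String tension in pounds, per D'Addario's published formula:

        T = UW * (2 * L * F)^2 / 386.4

    unit_weight -- lb per inch of string (from the maker's chart)
    scale_in    -- vibrating length in inches
    freq_hz     -- pitch of the open string in Hz
    """
    return unit_weight * (2.0 * scale_in * freq_hz) ** 2 / 386.4
```

Because tension scales with the square of the scale length, a multiscale board changes each string's tension, which is exactly why matching SKUs to a target tension per string is worth automating.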
i'm trying to reorganize everything, which helps me understand it. i just discovered the 'hidden wires' feature, which is great: once i've understood what a certain block does, or have finished one of my own, i can get the wires out of the way to carry on undistracted. a bit risky to hide so many wires, but it makes it so much easier not to get completely lost :-)
btw, the 'fanned fret' term is trademarked; some guy tried to patent it in the '80s, which is a bit silly since it has been done for centuries. there is a level of sophistication above this as well, check out http://www.truetemperament.com/ and that really is something else. it really is astounding how superior the tuning is on those wigglefrets; the problem is that it's rather awkward for string bending, and also you can't easily recrown or level the frets when they are used. …
me logic produced by running the 2-d voronoi component.
From a given set of polylines we can extract the centers, and these can both drive the voronoi component and provide the XYZ drill points for the cnc. The definition has a variety of different options. You need Lunchbox, Weaverbird, and Starling. I can't tell you how amazing these 3 tools are from a design perspective. They are extremely powerful, so if you don't have them you must install them asap. You can get the tools at http://www.food4rhino.com/
This definition works by first choosing a grid type, then a voronoi type and a subdivision type. From the voronoi type list you can choose basic (just the grid), truncation (uses a truncation calculated via the image sampler), truncation dual (uses the dual of the truncated image-based grid), and subdivision (takes the basic grid type and applies different subdivision schemes). Each of these provides a different pattern of polylines from which we can extract our drilling points. I am rather proud of this definition since the overall idea is one which is so simple it's easy to overlook: the idea that drilling with a ball end mill makes voronoi plots. Now when you combine that with all of these amazing tools it can go off right quick. The nice thing is the pattern you see on screen is the pattern that gets made by drilling: WYSIWYG CNC patterns.
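Outside of Grasshopper, the "polyline centers to drill points" step can be sketched in a few lines. The G-code below is illustrative only: the canned cycle, depths, and feeds are my assumptions, not part of the definition, and would need adapting to your controller and tooling.

```python
def polyline_centroid(points):
    """Average of a closed polyline's vertices -- a simple stand-in
    for the 'extract centers' step in the definition."""
    n = len(points)
    x = sum(p[0] for p in points) / n
    y = sum(p[1] for p in points) / n
    return (x, y)

def drill_gcode(centers, depth=-2.0, safe_z=5.0, feed=120):
    """Emit one canned drill cycle (G81) per cell center.  A ball end
    mill plunged at each center leaves a round dimple; the overlaps
    between neighboring dimples trace the voronoi edges."""
    lines = ["G90", "G0 Z%.3f" % safe_z]
    for x, y in centers:
        lines.append("G81 X%.3f Y%.3f Z%.3f R%.3f F%d"
                     % (x, y, depth, safe_z, feed))
    lines.append("G80")  # cancel the canned cycle
    return "\n".join(lines)
```

Feeding the centers of whichever polyline pattern you chose (basic, truncation, dual, subdivision) straight into `drill_gcode` is what makes the on-screen pattern and the machined pattern match.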
VORONIO_DRILLing.gh
Here are some on screen patterns in process in the following order truncation, basic, subdivision:
here is a video moving over a machined example:
…