I've chosen to dive into Grasshopper. I'm about 6 months in. If some of my comments are completely off, please take that to mean that a feature is too inaccessible to a newish user rather than that it's just missing, as I may have stated.
One of my primary pain points is this: things that can be done in other programs are invariably easier to do in those programs. This is a big enough issue that I doubt there's an easy solution an armchair quarterback like myself can offer up.
The interface:
I've used a lot of 3D programs. I've never encountered one as difficult as Grasshopper. What in other programs is a dialog box is 8 or 10 components strung together in Grasshopper. The wisdom I often hear for this among the Grasshopper community is that it allows for parametric design. Yet PTC (Parametric Technology Corp.) has been doing parametric design software since 1985 and has a far cleaner and more intuitive interface. So do SolidWorks, Inventor, CATIA, NX, and a bunch of others.
In the early 2000s, when parametric design software was all the rage, McNeel stated quite strongly that Rhino would remain a direct modeler and would not become a parametric modeler. Trends come. Trends go. And the industry has been swinging back to direct modeling. So McNeel's decision was probably OK. But I have to wonder whether part of McNeel's reluctance to incorporate some of the tried and proven ideas of other parametric packages has roots in that earlier declaration not to incorporate parametrics.
A Visual Programming Language:
I read a lot about the awesomeness and flexibility of Grasshopper being a visual programming language. Let's be clear: this is DOS-era speak. I believe GH should continue to have the ability to be extended and massaged with code, as most design programs do. But as long as that is front and center, GH will remain out of reach of the average designer.
Context sensitivity:
There is no reason a program in 2014 should allow me to make decisions that will not work. For example, if a component input is in all cases incompatible with another component's output, I shouldn't be able to connect them.
Sliders:
I hate sliders. I understand them, but I hate 'em. I think they should be optional. Yeah, I know I can right-click on the N of a component and set the integer. It's a pain, and it gives no feedback. The "N" should turn into the number once set. And sliders should be context-sensitive. I like that the name of a slider changes when I plug it into something. But if I plug it into something that'll only accept a 1, a 2, or a 3, that slider should set itself accordingly. I shouldn't be able to plug in a "50" and have everything after it turn red.
Components:
Give components a little "+" or a drawer on the bottom or something that, when clicked, opens the component into something akin to a dialog box. This should give access to all of the variables in the component. I shouldn't have to right-click on each thing on a component to do all of the settings.
And this item I'm guessing on. I'm not yet good enough at GH to know whether it may have adverse effects. Reverse, Flatten, Graft, etc.: could these be context-sensitive? Could some of these items disappear if they are contextually inappropriate, or gray out if they're unlikely?
Tighter integration with Rhino:
I'm not entirely certain what this would look like. Currently my workflow entails baking, making a few Rhino edits, and reinserting into GH. I question the whole baking thing, btw. Why isn't it just live geometry? That's how other parametric apps work. Maybe add more Rhino functionality to GH. GH has no 3D offset. I have to bake, OffsetSrf, and reinsert the geometry. I'm currently looking at the "Geometry Cache" and "Geometry Pipeline" components to see if they help. But I haven't been able to figure it out. Which leads me to:
Update all of the documentation:
I'm guessing this is an in-process thing and you're working toward rolling GH from 0.9.00075 to 1.0. GH was being updated nearly weekly earlier this year. Then it suddenly stopped. If we're talking weeks before a full release, so be it. But if we're looking at something longer, a documentation update would help a lot. The help for Geometry Cache and Geometry Pipeline still reads "This is the autogenerated help topic for this object. Developers: override the HtmlHelp_Source() function in the base class to provide custom help." This does not help. And the Grasshopper Primer 2nd Ed. was written for GH 0.60007.
Grasshopper is fundamentally a 2D program:
I know you'll disagree completely, but I'm sticking to this. How else could an omission like offsetsurf happen? Pretty much every 3D program in existence has this. I'm sure I can probably figure out how to deconstruct the breps, join the curves, loft, trim, and so forth. But does writing an algorithm to do what all other 3D programs do with a dialog box seem reasonable? I'm sure if you go command by command you'll find a ton of such things.
If you look at the vast majority of things done in GH, you'll note that they're mostly either flat or a fundamentally 2D pattern on a warped surface.
I've been working on a part that is a 3D voronoi trimmed to a 3D model. I've been trying to turn the trimmed voronoi into legitimate geometry for over a month without success.
http://www.grasshopper3d.com/profiles/blogs/question-voronoi-3d-continued
I’ve researched it enough to have found many others have had the exact same problem and have not solved it. It’s really not that conceptually difficult. But GH lacks the tools.
Make screen organization easier:
I have a touch of OCD, and I like my GH layout to flow neatly. Allow input/output nodes to be re-ordered. This will allow a reduction in crossed wires. Make the wire positions a bit more editable. I sometimes use a geometry component as a wire anchor to clean things up. Being able to grab a wire and pull it out of the way would be kinda nice.
I think GH has some awesome abilities. I also think accessing those abilities could be significantly easier.
~p…
Here are the major changes and enhancements.
HONEYBEE
More Flexible Workflow - Many small modifications were made to support a more flexible workflow, such as the ability to separate a zone created with masses2Zones into editable HBSrfs that can be recombined. For the energy components, it is now possible to plug custom constructions directly into the components that set the zone constructions without writing them first into the library. For the daylighting components it is now possible to change all of the materials of specific surface types at once.
Support for Complex Geometry - Many small bugs for complex geometry have been fixed including the ability to import energy results correctly for curved NURBS surfaces as well as unconventional window configurations. Also, the intersectMasses component now almost always succeeds in splitting all of the surfaces of adjacent zones, no matter how complex the intersection is.
Automatic Download Issues Fixed - Many users who faced issues with not having "gendaymtx.exe" or who had trouble syncing with our github know that we faced an issue with automatic background downloads; this has now been fixed.
Air Walls - Honeybee EnergyPlus models now officially support air walls (or virtual partitions) in a basic implementation. Now, any time that you use the air wall construction or set a surface type to “air wall,” the air between adjacent zones will be automatically mixed. At present, this mixing is just a constant flow based on the surface area between zones connected by air walls multiplied by an adjustable “flow factor.” It is important to stress that this basic air mixing is not with the EnergyPlus Airflow Network, although the groundwork laid in this release will eventually allow for the implementation of the Airflow Network in future releases. As such, this present air mixing is only suitable for multi-zone conditions where there is not significant buoyancy-driven flow between zones.
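Roughly speaking, the mixing described above boils down to something like the following sketch (illustrative only; the flow factor, its units and its default value live inside the Honeybee components):

def air_wall_mixing_flow(shared_air_wall_area_m2, flow_factor):
    # Constant mixing flow between two zones joined by air walls: area times an adjustable factor.
    return flow_factor * shared_air_wall_area_m2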
Natural Ventilation - To go along with the new potential introduced by air walls, there has been a basic implementation of EnergyPlus’s natural ventilation objects in a new component called “Set EP Airflow”. The current setup allows for three possible types of natural ventilation: 1) natural ventilation through windows (with auto-calculated flow based on window area, outdoor wind speed/direction, and stack effects), 2) custom wind and stack objects that can be used to model things such as chimneys off of single zones, and 3) constant, fan-driven natural ventilation.
Additional Thermal Mass - The capability to add additional thermal mass to zones has been added. This is useful for factoring in the mass of indoor furniture or heavy interior objects such as chimneys.
New Utility Components - Abraham has added a couple of useful components to help calculate lighting loads based on bulb types and target lighting levels as well as a converter from ACH to the m3/s-m2 that the other HB components accept. Along this vein, there is also a component for adding in the resistance of Air Films to HB constructions.
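For reference, a minimal sketch of the ACH conversion mentioned above (assuming the target unit is m3/s per m2 of floor area; the function name and arguments are illustrative, not the component's actual inputs):

def ach_to_m3s_per_m2(ach, zone_volume_m3, floor_area_m2):
    # Air changes per hour -> total volume flow in m3/s -> flow per unit floor area.
    flow_m3_per_s = ach * zone_volume_m3 / 3600.0
    return flow_m3_per_s / floor_area_m2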
Improved and Editable Ideal Air Loads System - The EnergyPlus Ideal Air System now goes through an automatic sizing period at the start of the simulation based on the extreme weeks of the weather file. Furthermore, the ability to adjust many of the parameters of the ideal air loads system has been added with a new "Set Ideal Air Loads Parameters" component. The component allows you to add in heat recovery, air-side economizers and demand-controlled ventilation.
OpenStudio Export Update - The OpenStudio workflow is still largely under development, but this release includes a version with a working VAV and PTHP system template for those curious to experiment. Note that not all of the new features available for the basic "Run Energy Simulation" component are available for the OpenStudio component (such as air walls, natural ventilation, or additional thermal mass).
Microclimate/Indoor Comfort Maps - Blossoming from initial experiments with the radiant temperature map, a workflow for looking into sub-zone microclimate and indoor comfort has been initiated. All components for this are presently under the Honeybee WIP tab but, over the next month, they will be completing their development phase and moving into the rest of the tabs. If you are interested in testing when they are ready, please let Chris know. For a teaser video of the intended capabilities, see this video: (https://www.youtube.com/watch?v=fNylb42FPIc&list=UUc6HWbF4UtdKdjbZ2tvwiCQ)
LADYBUG
Monthly Bar Chart - After much demand from multiple parties, a new component to create monthly bar and line charts has been added. The component is particularly useful for plotting the outputs of the "Average Data" component, like monthly EPW data or averaged monthly-per-hour data. It also supports daily data and any type of energy simulation results.
Wind Profile - To go along with the new capabilities of natural ventilation in Honeybee, Ladybug now has a fully fleshed-out Wind Profile component that allows you to visualize how wind speed changes with height in relation to your building geometry. The component is geared to understanding the conditions of prevailing wind and will be useful in the future for setting up CFD models. Credit goes to Djordje Spasic for adding in all of the new capabilities. In a similar vein, the appearance of the wind rose has also been improved thanks to suggestions from Alejandra Menchaca.
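As a rough illustration of what such a profile does, here is a textbook power-law sketch (this is a generic formulation; the component's own equation and terrain coefficients may differ):

def wind_speed_at_height(v_ref, z, z_ref=10.0, alpha=0.14):
    # Wind speed grows with height from a reference measurement (typically the 10 m
    # weather-station anemometer); the exponent alpha depends on terrain roughness.
    return v_ref * (z / z_ref) ** alpha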
Faster Solar Adjusted Temperature - Thanks to the SolarCal method from the Center for the Built Environment at UC Berkeley (http://escholarship.org/uc/item/89m1h2dg), the solar adjusted temperature component now includes an option for a much faster calculation that produces results that are very close to those originally obtained with the genCumSky component. Instead of using the cumulative sky, the component can now accept the direct and diffuse radiation from the ImportEPW component. Over a whole year, this essentially takes a calculation that used to be a half-hour and shrinks it down to 10 seconds. Thanks again to those at UC Berkeley for keeping their work open source!
Instructions - Last but not least [it took me almost two years to understand this, but finally], we have a text file that describes the installation step by step and is way easier to modify than a video. You can find it in the zip file. Credit goes to Chris!
We also want to welcome Anton, Patrick and Sandeep to the team. Anton has kicked off his development by working on a component to import and visualize EPW ground temperature data, and he will be continuing to develop components to bring reliable precipitation data into Ladybug. With this basis, he will continue to implement Honeybee components for ground heat storage, earth tubes, rain collection and hot water systems. Patrick and Sandeep are working on integrating Honeybee with the Energy Performance Calculator.
As always let us know your comments and suggestions.
Enjoy!…
e a fundamental failure on my part. On the other hand, Grasshopper isn't supposed to be on a par with most other 3D programs. It is emphatically not meant for manual/direct modelling. If you would normally tackle a problem by drawing geometry by hand, Grasshopper is not (and should never be advertised as) a good alternative.
I get that. That’s why that 3D shape I’m trying to apply the voronoi to was done in NX. I do wonder where the GUI metaphor GH uses comes from. It reminds me of LabVIEW.
"What in other programs is a dialog box, is 8 or 10 components strung together in grasshopper. The wisdom for this I often hear among the grasshopper community is that this allows for parametric design."
Grasshopper ships with about 1000 components (rounded to the nearest power of ten). I'm adding more all the time, either because new functionality has been exposed in the Rhino SDK or because a certain component makes a lot of sense to a lot of people. Adding pre-canned components that do the same as '8 or 10 components strung together' for the heck of it will balloon the total number of components everyone has to deal with. If you find yourself using the same 8 to 10 components together all the time, then please mention it on this forum. A lot of the currently existing components have been added because someone asked for it.
It’s not the primary components that catalyzed this thought but rather the secondary components. I was toying with a component today (twist from jackalope) that made use of three toggle components. The things they controlled are checkboxes in other apps.
Take a look at this jpg. Ignore differences; I did 'em quickly. GH required 19 components to do what SW did with 4 commands. Note the difference in screen real estate.
As an aside, I really hate SolidWorks (SW). But going forward, I’ll use it as an example because it’s what most people are familiar with.
"[...] has a far cleaner and more intuitive interface. So does SolidWorks, Inventor, CATIA, NX, and a bunch of others."
Again, GH was not designed to be an alternative to these sorts of modellers. I don't like referring to GH as 'parametric' as that term has been co-opted by relational modellers. I prefer to use 'algorithmic' instead. The idea behind parametric seems to be that one models by hand, but every click exists within a context, and when the context changes the software figures out where to move the click to. The idea behind algorithmic is that you don't model by hand.
I agree, and disagree. I believe parametric applies equally to GH AND SW, NX, and so forth, while algorithmic is unique to GH (and GC and Dynamo I think). Thus I understand why you prefer the term. I too tend to not like referring to GH as a parametric modeler for the same reason.
But I think it oversimplifies things to say parametric modelers move the clicks. SW tracks clicks the same way GH does; GH holds that information in geometry components while SW holds it in a feature in the feature tree. In both GH and SW, edits to the base geometry will drive a recalculation, but more commonly it's an edit to input data, be it equations or just plain numbers, that drives a recalculation.
I understand the difference between these programs. What brought me to GH is that it can create a visual dialog that standard modelers can't. But as I've grown more comfortable with it, I've come to realize that the GUI of GH and the GUI of other parametric modelers, while looking completely different, are surprisingly interchangeable. Do not misconstrue this as my suggesting that GH should replace its GUI with SW's. I'm not. I refrain from suggesting anything specific. I only suggest that you allow yourself to think radically.
This is not to say there is no value in the parametric approach. Obviously it is a winning strategy and many people love to use it. We have considered adding some features to GH that would make manual modelling less of a chore and we would still very much like to do so. However this is such a large chunk of work that we have to be very careful about investing the time. Before I start down this road I want to make sure that the choice I'm making is not 'lame-ass algorithmic modeller with some lame-ass parametrics tacked on' vs. 'kick-ass algorithmic modeller with no parametrics tacked on'.
Given a choice, I'd pick kick-ass algorithmic modeller with no parametrics tacked on.
2. Visual Programming.
I'm not exactly sure I understand your grievance here, but I suspect I agree. The visual part is front and centre at the moment and it should remain there. However we need to improve upon it and at the same time give programmers more tools to achieve what they want.
I'll admit, this is a bit tough to explain. As I've re-read my own comment, I think it was partly a precursor to the context sensitivity point and touched upon other stated points.
This now touches upon my own ignorance about GH’s target market. Are you moving toward a highly specialized tool for programmers and/or mathematicians, or is the intent to create a tool that most designers can master? If it’s the former, rock on. You’re doing great. If it’s the latter, I’m one of the more technically sophisticated designers I know and I’m lost most of the time when using GH.
GH allows the same freedom as a command line editor. You can do whatever you like, and it’ll work or not. And you won’t know why it works or doesn't until you start becoming a bit of an expert and can actually decipher the gibberish in a panel component. I often feel GH has the ease of use of DOS with a badass video card in front.
Please indulge my bit of storytelling. Early 3D modelers, CATIA, Unigraphics, and Pro-Engineer, were unbelievably difficult to use. Yet no one ever complained. The pain of entry was immense. But once you made it past the pain threshold, the salary you could command was very well worth it. And the fewer the people who knew how to use it, the more money you could demand. So in a sense, their lack of usability was a desirable feature among those who’d figured it out.
Then SolidWorks came along. It could only do a fraction of what the others did, but it was a fraction of the cost, it did most of what you needed, and anyone could figure it out. There was even a manual on how to use it. (Craziness!) Within a few short years, the big three all had to change their names (V5, NX, and Wildfire (now Creo)) and change the way they do things. All are now significantly easier to use.
I can tell that the amount of development time that's gone into GH is immense and I believe the functionality is genius. I also believe its ease of use could be greatly improved.
Having re-read my original comments, I think it sounded a bit snotty. For that I apologize.
3. Context sensitivity.
"There is no reason a program in 2014 should allow me to make decisions that will not work. For example, if a component input is in all cases incompatible with another component's output, I shouldn't be able to connect them."
Unfortunately it's not as simple as that. Whether or not a conversion between two data types makes sense is often dependent on the actual values. If you plug a list of curves into a Line component, none of them may be convertible. Should I therefore not allow this connection to be made? What if there is a single curve that could be converted to a line? What if you want to make the connection now, but only later plan to add some convertible curves to the data? What if you made the connection back when it was valid, but now it's no longer valid: wouldn't it be weird if there was a connection you couldn't make again?
I've started work on GH2 and one of the first things I'm writing now is the new data-conversion logic. The goal [...] is to not just try and convert type A into type B, but to include information about what sort of conversion was needed (straightforward, exotic, far-fetched, etc.) and information regarding why that type was assigned.
You are right that under some conditions we can be sure that a conversion will always fail, for example connecting a Boolean output to a Curve input. But even there my preferred solution is to tell people why that doesn't make sense rather than not allowing it in the first place.
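Purely as an illustration of the idea (this is not the actual GH2 code; the type names come from RhinoCommon and the grading names are made up), a graded conversion might look something like this:

import Rhino.Geometry as rg

# Conversion "quality" grades, as described above (illustrative names only).
STRAIGHTFORWARD, EXOTIC, FAR_FETCHED = 0, 1, 2

def to_line(value):
    # Already a line: nothing to do.
    if isinstance(value, rg.Line):
        return value, STRAIGHTFORWARD, "value was already a line"
    # A curve that happens to be linear can be rebuilt as a line.
    if isinstance(value, rg.Curve) and value.IsLinear():
        return rg.Line(value.PointAtStart, value.PointAtEnd), EXOTIC, "linear curve rebuilt as a line"
    # Anything else: report why the assignment failed instead of silently refusing the wire.
    return None, FAR_FETCHED, "no sensible conversion from %s to a line" % type(value).__name__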
You bring up both interesting points and limits to my understanding of coding. I’ve reached the point in my learning of GH where I’m just getting into figuring out the sets tab (and so far I’m not doing too well). I often find myself wondering “Is all of this manual conditioning of the data really necessary? Doesn’t most software perform this kind of stuff invisibly?” I’d love to be right and see it go away, but I could easily be wrong. I’ve been wrong before.
5. Components.
"Give components a little “+” or a drawer on the bottom or something that by clicking, opens the component into something akin to a dialog box. This should give access to all of the variables in the component. I shouldn't have to r-click on each thing on a component to do all of the settings."
I was thinking that just zooming in on a component would eventually provide easier ways to access settings and data.
I kinda like this. It’s a continuation of what you’re currently doing with things like the panel component.
"Could some of these items disappear if they are contextually inappropriate or gray out if they're unlikely?"
It's almost impossible for me to know whether these things are 'unlikely' in any given situation. There are probably some cases where a suggestion along the lines of "Hey, this component is about to run 40,524 times. It seems like it would make sense to Graft the 'P' input." would be useful.
6. Integration.
"Why isn't it just live geometry?"
This is an unfortunate side-effect of the way the Rhino SDK was designed. Pumping all my geometry through the Rhino document would severely impact performance and memory usage. It also complicates the matter to an almost impossible degree as any command and plugin running in Rhino now has access to 'my' geometry.
"Maybe add more Rhino functionality to GH. GH has no 3D offset."
That's the plan moving forward. A lot of algorithms in Rhino (Make2D, FilletEdge, Shelling, BlendSrf, the list goes on) are not available as part of the public SDK. The Rhino development team is going to try and rectify this for Rhino6 and beyond. As soon as these functions become available I'll start adding them to GH (provided they make sense of course).
On the whole I agree that integration needs a lot of work, and it's work that has to happen on both sides of the aisle.
You work for McNeel, yet you seem to speak of them as a separate entity. Is this to say that there are technical reasons GH can only access things through the Rhino SDK? I'd think you would have complete access to all Rhino APIs. I hope it's not a fiefdom issue, but it happens.
7. Documentation.
Absolutely. Development for GH1 has slowed because I'm now working on GH2. We decided that GH1 is 'feature complete', basically to avoid feature creep. GH2 is a ground-up rewrite so it will take a long time until something is ready for testing. During this time, minor additions and of course bug fixes will be available for GH1, but on a much lower frequency.
Documentation is woefully inadequate at present. The primer is being updated (and the new version looks great), but for GH2 we're planning a completely new help system. People have been hired to provide the content. With a bit of luck and a lot of work this will be one of the main selling points of GH2.
It begs the question that I have to ask: when is GH 1.0 scheduled to launch? And if you need another person to proofread the current draft of the new primer, I'm available:
patrick@girgen.com
I can’t believe wikipedia has an entry for feature creep. And I can’t believe you included it. It made me giggle. Thanks.
8. 2D-ness.
"I know you'll disagree completely, but I'm sticking to this. How else could an omission like offsetsurf happen?"
I don't fully disagree. A lot of geometry is either flat or happens inside surfaces. The reason there's no shelling (I'm assuming that's what you meant; there are two Offset Surface components in GH) is because (a) it's a very new feature in Rhino and doesn't work too well yet and (b) as a result of that it isn't available to plugins.
I believe it’s been helpful for me to have figured this out. I recently completed a GH course at a local Community College and have done a bunch of online tutorials. The first real project I decided to tackle has turned out to be one of the more difficult things to try. It’s the source of the questions I posted. (Thanks for pointing out that they were posted in the wrong spot. I re-posted to the discussions board.)
I just can't seem to figure out how to turn the voronoi into legitimate geometry. I've seen this exact question posted a few times, but it’s never been successfully answered. What I'm showing here is far more angular than I’m hoping for. The mesh is too fine for weaverbird to have much of an effect. And I haven't cracked re-meshing. Btw, in product design, meshes are to be avoided like the plague. Embracing them remains difficult.
As for offsetsurf: in Rhino, if you do an offsetsurf on a solid body, it executes on all sides, creating another neatly trimmed body that's either larger or smaller than the original. This is how every other app I know of works. GH's offsetsurf creates a bunch of unjoined faces spaced away from the original brep. A common technique for 3D voronois (yes, I hit the voronoi overuse easter egg) is to find the center of each cell and scale the cells about this center. If you think about it, this creates a different distance from the face of the scaled cell to the face of the original cell for every face. As I've mentioned, this project is giving me serious headaches.
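To make that last point concrete, here is a small sketch of the scaling trick (cells as plain xyz vertex tuples, using the vertex average as the "center"; this only illustrates the behaviour, it isn't a fix for it):

def scale_cell_about_center(cell_vertices, scale):
    # Scale every vertex of a voronoi cell towards the cell's center (vertex average).
    n = float(len(cell_vertices))
    cx = sum(v[0] for v in cell_vertices) / n
    cy = sum(v[1] for v in cell_vertices) / n
    cz = sum(v[2] for v in cell_vertices) / n
    return [(cx + scale * (x - cx), cy + scale * (y - cy), cz + scale * (z - cz))
            for (x, y, z) in cell_vertices]

# A face whose plane sits a distance d from the center moves inward by (1 - scale) * d,
# so faces far from the center shift more than nearby ones -- which is exactly why this
# trick gives a different wall thickness on every face of the cell.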
Don't get me wrong, I appreciate the feedback, I really do, but I want to be honest and open about my own plans and where they might conflict with your wishes. Grasshopper is being used far beyond the boundaries of what we expected and it's clear that there are major shortcomings that must be addressed before too long. We didn't get it right with the first version, I don't expect we'll get it completely right with the second version but if we can improve upon the -say- five biggest drawbacks (performance, documentation, organisation, plugin management and no mac version) I'll be a happy puppy.
--
David Rutten
Thank you for taking the time to reply, David. Often it feels like posting such things is sending them into the empty ether. I'm very glad that this was not the case.
And thank you for all of the work you've put into GH. If you found any of my input overly harsh or ill-mannered, I apologise. It was not my intent. I'm generally not the ranting sort. If I hadn't intended to provide possibly useful input, I wouldn't have written.
Cheers
Patrick Girgen
Ps. Any pointers on how to get a bit further on the above project would be greatly appreciated.
…
n, so if you want Strings, be sure to encase them in double quotes:
Concat = " alcohol doesn't affect me"
● Added initialization code to the Cull Nth component.
● Added initialization code to the Cull Index component.
● Added initialization code to the Random Reduce component.
● Added initialization code to the Duplicate component.
● Added initialization code to the List Item component.
● Added initialization code to the Repeat Data component.
● Added initialization code to the Shift List component.
● Added initialization code to the Split List component.
● Added initialization code to the Sequence component.
● Added initialization code to the Constant E component.
● Added initialization code to the Constant Epsilon component.
● Added initialization code to the Factorial component.
● Added initialization code to the Fibonacci component.
● Added initialization code to the Golden Ratio component.
● Added initialization code to the Constant Pi component.
● Added initialization code to the Random component.
● Added initialization code to the Range component.
● Added initialization code to the Series component.
● Added initialization code to the Square component.
● Added initialization code to the Square Root component.
● Added initialization code to the Cube component.
● Added initialization code to the Cube Root component.
● Added initialization code to the Log10 component.
● Added initialization code to the Log component.
● Added initialization code to the Exponent component.
● Added initialization code to the Power of 2 component.
● Added initialization code to the Power of 10 component.
● Added initialization code to the Sine component.
● Added initialization code to the Sinc component.
● Added initialization code to the Cosine component.
● Added initialization code to the Tangent component.
● Added initialization code to the ArcSine component.
● Added initialization code to the ArcCosine component.
● Added initialization code to the ArcTangent component.
● Added initialization code to the Secant component.
● Added initialization code to the Cosecant component.
● Added initialization code to the Cotangent component.
● Added initialization code to the One over X component.
● Added initialization code to the Absolute component.
● Added initialization code to the Sign component.
● Added initialization code to the ToDegrees component.
● Added initialization code to the ToRadians component.
● Added initialization code to the N-Base log component.
● Added initialization code to the Smaller Than component.
● Added initialization code to the Larger Than component.
● Added initialization code to the Equal To component.
● Added initialization code to the Similar To component.
● Added initialization code to the Addition component.
● Added initialization code to the Subtraction component.
● Added initialization code to the Multiplication component.
● Added initialization code to the Division component.
● Added initialization code to the Integer Division component.
● Added initialization code to the Minimum component.
● Added initialization code to the Maximum component.
● Added initialization code to the Modulus component.
● Added initialization code to the Power component.
● Added initialization code to the Concatenate component.
● Added initialization code to the String Split component.
● Added initialization code to the String Join component.
● Added initialization code to the Evaluate Length component.
● Added initialization code to the Circle component.
● Added initialization code to the Circle CNR component.
● Added initialization code to the Arc component.
● Added initialization code to the Curve component.
● Added initialization code to the Interpolated Curve component.
● Added initialization code to the Offset Curve component.
● Added initialization code to the Offset Curve Loose component.
● Added initialization code to the Offset Curve On Surface component.
● Added initialization code to the Extend Curve component.
● Added initialization code to the Catenary component.
● Added initialization code to the Line SDL component.
● Added initialization code to the Fillet component.
● Added initialization code to the Fillet Distance component.
● Added initialization code to the Move component.
● Added initialization code to the Scale component.
● Added initialization code to the Mesh Plane component.
● Added initialization code to the Mesh Box component.
● Added initialization code to the Mesh Sphere component.
● Added initialization code to the Sphere component.
● Added initialization code to the Surface Offset component.
● Added initialization code to the Surface Offset Loose component.
● Added initialization code to the Divide Curve component.
● Added initialization code to the Divide Curve Length component.
● Added initialization code to the Divide Curve Distance component.
● Added initialization code to the Curve Frames component.
● Added initialization code to the Curve Perpendicular Frames component.
● Added initialization code to the Square Grid component.
● Added initialization code to the Rectangular Grid component.
● Added initialization code to the Vector Amplitude component.
--
David Rutten
david@mcneel.com
Poprad, Slovakia
…
o my python component returning null despite running fine in the standalone Python editor (i.e. not through Grasshopper). The original Python script is as follows:
import random
import rhinoscriptsyntax as rs

rs.EnableRedraw(False)

def placeBuildings(curve, distance):
    pts = rs.DivideCurveLength(curve, 5)
    counter = 0
    for myPoint in pts:
        counter = counter + 1
        # get the parameter of the current position
        param = rs.CurveClosestPoint(curve, myPoint)
        # get the tangent at this parameter
        tangent = rs.CurveTangent(curve, param)
        # calculate the angle of the tangent
        angle = rs.Angle((0, 0, 0), tangent)
        randomNumber = random.uniform(1, 5)
        heightOfBuilding = random.uniform(4, 40)
        rect = rs.AddRectangle(rs.WorldXYPlane(), randomNumber, 2)
        rs.MoveObject(rect, (0, randomNumber, 0))
        hull = rs.ExtrudeCurveStraight(rect, (0, 0, 0), (0, 0, heightOfBuilding))
        rs.RotateObject(hull, (0, 0, 0), angle[0])
        rs.MoveObject(hull, myPoint)
        # if counter % 4:
        #     rs.AddCircle(myPoint, 3)

# selection of curve
# curveParameter = rs.GetCurveObject("sel curve")
# curve = curveParameter[0]

curves = rs.GetCurveObject("select streets", 4)
distance = rs.GetInteger("distance?", 4)

for curve in curves:
    placeBuildings(curve, distance)
    rs.ReverseCurve(curve)
    placeBuildings(curve, distance)
When placed in grasshopper it is the following:
import random
import rhinoscriptsyntax as rs

# randomNumber = random.uniform(1, 5)
# rs.AddCircle((0, randomNumber, 0), 2)

def placeBuildings(curve, distance):
    pts = rs.DivideCurveLength(curve, 5)
    counter = 0
    for myPoint in pts:
        counter = counter + 1
        # get the parameter of the current position
        param = rs.CurveClosestPoint(curve, myPoint)
        # get the tangent at this parameter
        tangent = rs.CurveTangent(curve, param)
        # calculate the angle of the tangent
        angle = rs.Angle((0, 0, 0), tangent)
        randomNumber = random.uniform(1, 5)
        heightOfBuilding = random.uniform(4, 40)
        rect = rs.AddRectangle(rs.WorldXYPlane(), randomNumber, 2)
        rs.MoveObject(rect, (0, randomNumber, 0))
        hull = rs.ExtrudeCurveStraight(rect, (0, 0, 0), (0, 0, heightOfBuilding))
        rs.RotateObject(hull, (0, 0, 0), angle[0])
        rs.MoveObject(hull, myPoint)

# selection of curve
# curveParameter = rs.GetCurveObject("sel curve")
# curve = curveParameter[0]

curves = x
distance = y

for curve in curves:
    placeBuildings(curve, distance)
    rs.ReverseCurve(curve)
    placeBuildings(curve, distance)
I am unsure why no error is being returned, yet I cannot achieve any result other than null. Maybe someone could look at the script and tell me what is going wrong? I'm hoping to solve this before next Thursday, so I might be asking for too much.
Much Appreciated.-A…
Added by Adem O'Byrne at 11:45am on October 9, 2014
The languages I'd recommend all use the RhinoCommon SDK and thus all have access to the same functionality.
How long would it take me to understand and write my own code?
If you already know how to program, it probably won't take too long. If you're past the hurdle of what it means to declare and assign variables, how conditionals and loops work and what scope is, you've already rounded the hardest corner.
Is it even worth it?
That really depends. "Learn programming" is clearly not blanket good advice. Most people out there do not have to learn programming to be happy with their lives and successful in their careers. For some people it can make a small difference, and for a few people it can make a huge difference. If you feel you're in the 'some' category then this is indeed a question you have to answer. Note that the investment for learning programming is a continuous process. Unless you keep up your skills and learn about new stuff that becomes available, you'll lose the ability to write successful code over time.
Where do I start?
Step 1 is to answer the previous question. It is unlikely that anyone besides yourself can answer it, but you can start by making a list of things you do manually now that may be programmable. Then make a list of the things you are unable to do now but which you might be able to do with programming. If while looking at these lists your reaction is: "meh", the answer is probably no.
Step 2 is to pick a language. This is again a very personal thing; there's no wrong answer, because there's no right answer.
Step 3 is to start learning this language. My experience is that the best way to learn a programming language is to try and solve a real problem that you understand very well. If the problem statement is nebulous or poorly understood, you'll be learning two things and that's a recipe for unnecessary frustration.
Here are my thoughts on language:
Python: I don't use Python myself; I can sort of read it while moving my lips. I don't particularly like Python though. The indentation sensitivity stresses me out, and I find the lack of type-safety disturbing. However, it is a good language for mathematical/scientific programs. There are lots of additional code libraries you can easily import that will ease the development of mathematically intense algorithms.
C#: I like C# very much, but it does suffer from geekerosis. A lot of the keywords used in the language are not self-explanatory (abstract, sealed, virtual). For me this is no longer a problem as I've memorised what they all mean. C# is designed to be an efficient language to write, rather than an easy one to learn.
The great thing about C# though is that there's a huge amount of material out there for learning it. It is one of the most popular, mature and modern languages you can hope to pick.
VB: I learned VBScript as my first language, and then moved on to VB5, VB6 and VB.NET. It is somewhat more friendly than C#, and functionally it is almost identical. The switch from VB to C# is reasonably low-threshold and there are excellent tools for translating VB code to C# and vice versa.
Since you already know some Python, it probably makes the most sense to continue on that path. If you want to switch, C# is more like Python than VB, so C# would be my next suggestion.
As for where to get information... you have 4 major options when developing code for Rhino.
If it's a question about the language itself, StackOverflow is a great resource. It can be a pretty hostile place for beginner questions, but I find that mostly the questions I'm asking have been asked already and the answers on SO tend to be good. In fact usually when I google my questions, the first few hits are always SO posts.
If it's a question about the Rhino SDK or Grasshopper, you can ask it either on the GH forums (where we are now), or on Discourse. We're not as quick on the draw as SO, but we do know about Rhino.
If you're looking for a basic explanation of what a keyword or a type is for, perhaps with an example, MSDN is the best first choice. In fact if you google the name of a .NET type, the first hit is almost always an MSDN page.…
Added by David Rutten at 2:03pm on December 3, 2014
umbrella of Urban Heat Island (UHI) and I am going to try to separate them out in order to give you a sense of the current capabilities in LB+HB.
1) UHI as defined as a recorded elevated air temperature in an urban area:
If you have access to epw files for both an urban area and a rural area, you can use Ladybug to visualize and deeply explore the differences between the two weather files. Ladybug is primarily a tool for weather file visualization and analysis and it can be very helpful for understanding the consequences of UHI on strategies for buildings or on comfort. This said, if you do not have both rural and urban recorded weather data or you want to generate your own weather files based on criteria about urban areas (as it sounds like you want to do), this definition might not be so helpful.
2) UHI defined by elevated air temperature, but viewed as a phenomenon that can be modeled on a computer, resulting primarily from urban canyon geometry, building materials, and (to a lesser degree) anthropogenic heat:
This definition seems to fit more with the type of thing that you are looking for, but it is unfortunately very difficult and computationally intensive, such that we do not currently have anything within Ladybug to do this. I can say that the state of the art for this type of modeling is an application called Town Energy Budget (TEB), and this is what all of the advanced UHI researchers that I know use (http://www.cnrm.meteo.fr/surfex/spip.php?article7). Unfortunately for those trying to use it in professional practice, it can take a while to get comfortable with it and it currently runs exclusively on Linux (this does mean that it is open source, though, and that you can really get deep into the assumptions of the model). A couple of years ago, a peer of mine translated almost all of TEB into Matlab, making it possible to run it on Windows if you have Matlab. He wrapped everything together into a tool called the Urban Weather Generator (UWG), which can take an epw file of a rural area and warp it to an urban area based on inputs that you give for building height, materials, vegetation, anthropogenic heat, etc. I would recommend looking into this for your project, although bear in mind that it is not open source like the original TEB tool and that you may need to get a (very expensive) copy of MATLAB (http://urbanmicroclimate.scripts.mit.edu/uwg.php).
3) UHI as defined by a thermal satellite image of an urban area depicting an elevated average radiant environment that reaches a maximum at the city center and changes by land use:
This is the definition of UHI that I am most familiar with and it was the basis of much of my past research. I feel that it is also a definition of UHI that is a bit more in line with where a lot of contemporary UHI research is headed, which is away from the notion of UHI as a macro-scale meteorological phenomenon that is averaged as an air temperature over a huge area, and towards one that accepts that different land uses have different microclimates and (importantly) different radiant environments. While the air temperature difference between urban and rural areas usually does not change more than 1-4 C, the radiant environment can be very different (on the order of 10-15 C differences). The best way to understand UHI in this context is with thermal satellite images, for which there is a huge database of publicly available data on NASA's glovis website (http://glovis.usgs.gov/) or their ECHO website (http://reverb.echo.nasa.gov/reverb/#utf8=%E2%9C%93&spatial_map=satellite&spatial_type=rectangle). I tend to use thermal data from LANDSAT 5-8 and ASTER satellites in my research. Unfortunately, there is a lot of bad data with a lot of cloud cover mixed in with the really good stuff, and it can take some time to find good images. Also, there aren't too many programs that read the GeoTIFF file format that you download the data as. I know that ArcGIS will read it, a program called ENVI will read it, and I think that the open source QGIS can also read it. I have plans to write a set of components to bring this type of data into Rhino and GH (I may get to it a few months down the line).
4) UHI as a notion of "Urban Microclimate" that can be modeled on a computer, with consideration of local differences and the local radiant environment:
This is where a lot of my research has led and, thankfully, it is an area that Honeybee can help you out a lot with. EnergyPlus simulations can output information on outside building surface temperatures, and these can be very helpful for getting a sense of the radiant environment around individual buildings. Right now, I am focusing just on using this data to fully model the indoor environments of buildings, as you see in this video:
https://www.youtube.com/watch?v=fNylb42FPIc&list=UUc6HWbF4UtdKdjbZ2tvwiCQ
I have plans to move this methodology to the outdoors once I complete this initial application to the indoors. For now, you can use the "Surface result reader" and the "color surfaces based on EP result" components to get a sense of variation in the outside temperature of your buildings.
I hope that this helped,
-Chris
…
Abraham's question about how shades are accounted for in the simulation/thermal map and Theodore's thought that just accounting for shades in the E+ run was sufficient. I think that it may be clearest to explain what is going on with this infographic:
As the graphic shows, the thermal maps are made from 4 key types of inputs. The radiant temperature map is formed through a consideration of both the temperature of the surfaces surrounding the occupants and the direct solar radiation that might fall onto the occupants through un-shaded windows. The first surface temperature effect is easily computable from your energy simulation results and the HBZone geometry. However, the second is calculated by seeing how sun vectors pass through the windows of the zones and uses the SolarCal method of the CBE team (http://escholarship.org/uc/item/89m1h2dg) to compute an MRT delta resulting from solar radiation. This delta is then added to the initial values computed through the surface temperature view factors. When you do not connect up your shading brep geometry, internal furniture breps, or outdoor context geometry that might block sun to the additionalShading input, the thermal map will assume that sun can pass unobstructed through the window or through indoor furniture to fall onto occupants. It is important to stress that the EnergyPlus simulation does not account for blind geometry or internal furniture as actual geometry, just as numerical abstractions of surface area and material properties. So we need you to plug in the actual geometry of these things when we compute the MRT delta resulting from sun falling directly onto people.
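To make that two-part logic concrete, here is a minimal sketch (not the actual Honeybee code; surface temperatures are averaged linearly here for simplicity, and the names are illustrative):

def mean_radiant_temperature(surface_temps_c, view_factors, solar_mrt_delta_c=0.0):
    # View-factor-weighted average of the surrounding surface temperatures...
    assert abs(sum(view_factors) - 1.0) < 1e-6, "view factors from one occupant location should sum to 1"
    mrt = sum(t * f for t, f in zip(surface_temps_c, view_factors))
    # ...plus the SolarCal-style delta when sun passes through a window onto the occupant.
    return mrt + solar_mrt_delta_c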
Next, to clear up the definition of window transmissivity. The important thing to clarify here is that, whether it refers to the transmittance of glass or to the amount of sun coming through a fine screen of blinds, the value is multiplied by the radiation falling on the occupant and thus has a direct correlation to the MRT delta from sun falling on occupants. So, if you set transmissivity to zero, the sun falling on the occupants will not be considered in the calculation and, if you set the transmissivity to 1, the assumption is that there is no window (or the window glass is 100% clear). So, Abraham, your definition of it as a coefficient is appropriate.
Normally, I would just recommend that you leave this value at the default 0.7, which corresponds to the transmittance of the default glass material in Honeybee. However, there are 4 cases in which you might consider changing it:
1) You are not using the default Honeybee glazing material, in which case, you should change the transmissivity to be equal to this new value.
2) You have a lot of really small blind/shade geometries and you do not want the view factor component to take several minutes to trace sun vectors through the detailed shade geometry, so you are OK with using a simple abstraction instead of plugging shade breps into the additionalShading input. In this case, you might try to estimate the average percentage of radiation coming through the blind geometry (maybe with some simple Ladybug radiation studies or with your intuition about the amount of sun blocked by the shades). You will then multiply this by the transmittance of your glass, and this will be the value that you input to the component (see the sketch after this list).
3) Your blinds for your Honeybee simulation are dynamic, in which case plugging shade breps into additionalShading is not going to work because the component will assume that those shades are always there. In this case, you should be plugging a list of 8760 values into the transmissivity that correspond to when the shades are pulled. When the blinds are completely up, the value should be the transmittance of your window and, when they are down, the value should be the window transmittance multiplied by the fraction of light coming through the shades.
4) You have shades/blinds but they are transparent or are not completely opaque. The additionalShading_ input assumes that all shade geometry is opaque and so you cannot use it to account for such shades. Accordingly, you will need to account for them through the transmissivity.
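A small sketch of the arithmetic behind cases 2) and 3) above (the numbers and the hourly schedule are only placeholders, not recommended values):

glass_transmittance = 0.7        # default Honeybee glass material
blind_open_fraction = 0.4        # estimated fraction of radiation that gets past the blinds

# Case 2: static blinds collapsed into a single multiplier
static_transmissivity = glass_transmittance * blind_open_fraction

# Case 3: dynamic blinds -> one value for every hour of the year (8760 values)
blinds_down = [9 <= hour % 24 < 17 for hour in range(8760)]   # placeholder schedule: down during work hours
hourly_transmissivity = [
    glass_transmittance * blind_open_fraction if down else glass_transmittance
    for down in blinds_down
]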
In the future, I may try to pull more information about blinds and glass properties off of the HBzones inside the view factor component but, for now and for the next few months, the above describes how it works.
Theodore, for curved geometry, I think that your safest bet is going to be planarizing the Rhino geometry before you turn it into an HBZone (so you just divide the curved surface into a few vertical planar panes of glass that approximate the curve well enough). This is essentially what the runSimulation component does for you automatically (it meshes the geometry as you see here: https://www.youtube.com/watch?v=nMQ2Pau4q6c&index=12&list=PLruLh1AdY-SgW4uDtNSMLeiUmA8YXEHT_). If I were to figure out a way to incorporate shades in this automatic meshing workflow, your EnergyPlus simulation would take a very long time to run, and I am not even sure the result would be that accurate given the way E+ abstracts shades. So I don't think it's really worth it over just planarizing the geometry yourself.
Lastly, I won't be able to figure out the problem with your current run, Theodore, unless I get the GH file from you. Make sure that you are using all up-to-date components.
-Chris…
Because the Adaptive methodology is founded upon the notion that there are hundreds of social factors that influence comfort, and that the best we can do to forecast comfort is to find variables with good correlations to these social factors (like outdoor temperature), the premise that this published Adaptive model holds regardless of cultural norms is dangerous. Notably, the founders of the adaptive model have stressed that this particular linear correlation that you cite comes from recent surveys of buildings where people have both the ability to open windows AND a great freedom to dress down. Hypothetically, if occupants were able to open the windows in Abraham's building but the cultural norm was that everyone was expected to wear multi-layered suits or dresses (as in historic Britain), a different correlation between outdoor temperature and comfort temperature would exist. In fact, historical European comfort surveys show that people likely preferred cooler temperatures in buildings (about 1-2 C cooler) than today's occupants. Accordingly, after recognizing this social premise in the Adaptive model, I have built in a few ways to adjust/alter the version in Ladybug based on the literature I have read (even though these alterations are not a part of any official ASHRAE or European standard).
Abraham, you might have to be a bit more specific about how you would like to adjust the Adaptive comfort model for me to help your particular case and this may lead to me adding in new functionality. For the time being, I can tell you that the 'Ladybug_Adaptive Comfort Parameters' component is going to be your friend and I would recommend using the Adaptive Comfort Chart to visualize how you are changing the model. You can plug these 'Adaptive Comfort Parameters' into the 'Adaptive Comfort Recipe' component to have the microclimate analysis run with these parameters. Here are a few examples of how to alter the model:
1) Mixed-mode Building - Humphreys and the European Adaptive comfort team derived two separate correlations.
One for naturally ventilated buildings:
and conditioned buildings:
The dimensionless value between 0 and 1 for _levelOfConditioning allows you to create different correlations depending on whether occupants have complete freedom of dress and window operability (0) or have slight restrictions like in a mixed mode building (0.5, for example):
2) Changing Response Time of Occupants - There has been a bit of a debate in the literature about whether it is better to use the average monthly temperature or a weekly running mean temperature. The avgMonthORRunningMean input allows you to adjust this like so:
Average Month:
Running Mean:
3) Greater Temperature Range Tolerance - While this last one is actually a part of the European and Adaptive standards, you can adjust the range of the comfort band with either the 'eightyOrNintetyComf' input or the comfortClass input like so (a sketch of the underlying formulas follows these examples):
Ninety Percent Comfortable
Eighty Percent Comfortable
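For reference, a hedged sketch of the standard adaptive-comfort arithmetic that sits behind these inputs (ASHRAE-55-style coefficients and an EN 15251-style running mean; the Ladybug components wrap their own implementation, so treat this only as orientation):

def running_mean_outdoor_temp(daily_means_c, alpha=0.8):
    # Exponentially weighted running mean of daily outdoor temperatures.
    trm = daily_means_c[0]
    for t in daily_means_c[1:]:
        trm = (1.0 - alpha) * t + alpha * trm
    return trm

def adaptive_comfort_band(prevailing_outdoor_c, ninety_percent=True):
    # Neutral temperature from the prevailing outdoor temperature, plus the acceptability band.
    t_comf = 0.31 * prevailing_outdoor_c + 17.8       # ASHRAE-55 adaptive correlation
    offset = 2.5 if ninety_percent else 3.5           # 90% vs 80% acceptability
    return t_comf - offset, t_comf, t_comf + offset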
Abraham, let me know if you would like more controls over the model or if this is enough to do what you are thinking of. This example file allows you to construct the images I have above:
http://hydrashare.github.io/hydra/viewer?owner=chriswmackey&fork=hydra_2&id=Adaptive_Comfort_Chart&slide=0&scale=1&offset=0,0
-Chris…
ther math and logic. I can usually conceptualise what I want to do and cobble some semi-working thing together, but I don't know which components to use and how to patch it. So I'm super happy to have someone who knows what he's doing find this interesting.
And I'm glad you mention the fanned frets again; there is one input parameter that's still missing for the multiscale frets to be fully parametric: the angle of the nut, or which fret should be straight. What is more comfortable depends a bit on personal preference and playing posture, so being able to adjust this easily would be cool. Again, I have no idea how the maths for that works, or whether you can just rotate each fret the same amount around its middle point. The input, either as a fret number (for the straight fret) or as a simple slider from bridge to nut, should do as the input setting.
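For what it's worth, here is a minimal sketch of the usual equal-temperament fret arithmetic with the straight fret chosen by number (the nut is treated as fret 0; this is just one way to set it up, not necessarily how the existing patch does it):

def fret_distances(scale_length, fret_count):
    # Distance from the nut to each fret along one string (12-tone equal temperament).
    return [scale_length * (1.0 - 2.0 ** (-n / 12.0)) for n in range(fret_count + 1)]

def multiscale_frets(bass_scale, treble_scale, fret_count, straight_fret):
    bass = fret_distances(bass_scale, fret_count)
    treble = fret_distances(treble_scale, fret_count)
    # Slide the treble side along the string axis so the chosen fret lines up square across the neck.
    shift = bass[straight_fret] - treble[straight_fret]
    treble = [d + shift for d in treble]
    # Each fret is then the line between its bass-side and treble-side endpoints.
    return list(zip(bass, treble))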
Here are the two extremes and the middle ground:
I've been thinking today, while analysing your patches and cleaning up my mess, about what exactly the monster should do.
Here are the input parameters needed; I think it's the complete list:
scale length low E string
scale length high e string
fret angle/straight fret
string width at nut
string width at bridge
number of frets
fretboard overhang at nut (distance from string to fretboard bounds)
fretboard overhang at last fret
string gauges
string tensions
fretboard radius at nut (for a compound-radius fretboard, the radius at the bridge is calculated with the StewMac formula)
fretwire crown width
fretwire crown height
action height at nut (distance between bottom of string and fretwire crown top)
action height at last fret
pickup 1 neck position
pickup 2 middle position
pickup 3 bridge position
nut width
The pickup positions should be used to draw circles for the magnet poles on each string so they are perfectly aligned and can be used for the pickup flatwork construction. Ideally they would need a rotation control aligning the centre line of the pickup so it sits somewhere between the last fret angle and the bridge angle. Personally I do this visually depending on the design I'm looking for; some people have huge theories on pickup positioning, but personally I don't believe in them.
That should result in everything needed to quickly generate all the necessary construction curves or geometry for the nut, fingerboard, frets and pickups. This is the core of what makes a guitar work; the more precise this dynamic system is, the better the guitar plays and sounds.
I posted another thread trying to understand how I could use datasets from spreadsheets, databases or CSV files to organize the input parameters. What would make sense for the strings, for example, is to hook into a spreadsheet with the different string sets; I attached one for the d'Addario NYXL string line which basically covers all combos that make sense.
The string tension is an interesting one, and implementing it would surely be overkill, albeit super interesting to try. It should be possible to extrapolate, from the scale length and gauge of each string, what the tension would be, so that you could say 'I want a fully balanced set' or 'heavy top, light bottom' and it would calculate which SKU from d'Addario would best match the required tension. All the strings listed in the spreadsheet are available to buy as single strings.
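For what it's worth, the tension formula published in d'Addario's string-tension guide could drive that calculation; a hedged sketch (the unit weights would come from the string maker's chart, e.g. the attached spreadsheet):

def string_tension_lb(unit_weight_lb_per_in, scale_length_in, frequency_hz):
    # T (lb) = UW * (2 * L * F)^2 / 386.4, with UW the string's unit weight in lb/inch,
    # L the vibrating (scale) length in inches and F the open-string pitch in Hz.
    return unit_weight_lb_per_in * (2.0 * scale_length_in * frequency_hz) ** 2 / 386.4

# Usage idea: compute the tension of every candidate gauge at that string's own scale length
# (interpolated between the bass and treble scales) and pick the SKU closest to the target tension.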
I'm trying to reorganize everything, which helps me understand it. I just discovered the 'hidden wires' feature, which is great: once I've understood what a certain block does, or have finished one of my own, I can get the wires out of the way and carry on undistracted. It's a bit risky to hide so many wires, but it makes it so much easier not to get completely lost :-)
Btw, the 'fanned fret' term is trademarked; some guy tried to patent it in the 80s, which is a bit silly since it has been done for centuries. There is a level of sophistication above this as well: check out http://www.truetemperament.com/ and that really is something else. It really is astounding how superior the tuning is on those wigglefrets; the problem is that it's rather awkward for string bending, and also you can't easily recrown or level the frets when they are used. …