graph relaxation in 3D and more). There is much more already in our GitHub repos and more to be added. To get an idea of our future direction, check this lecture out. For a better understanding of graphs and graph theory, watch this lecture and this lecture on a gamified spatial configuration process. Stay tuned for more, and do not hesitate to post Python questions in the meantime.
PS: If you are having installation problems, please check the remedy suggested below:
Comment by Iman Sheikhansari on August 26, 2019 at 8:33am
Hi,
If you are encountering a problem with Rhino 6 versions, don't worry. Follow these steps:
1. Download SYNTACTIC from https://sites.google.com/site/pirouznourian/syntactic-design
2. Install it, go to the installation folder, and drag & drop SYNTACTIC (the green one) onto your Grasshopper canvas.
3. Close Rhino and reopen it.
4. Type GrasshopperDeveloperSettings.
5. Tick the "Memory load *.GHA assemblies using COFF byte arrays" option.
6. Run Grasshopper and enjoy the plugin.
…
An incremental release is available for download. It fixes several bugs reported in the 0.9.0005 & 0.9.0006 versions. To wit:
Computer mice with smooth scrolling would not zoom well, this is fixed.
Previewable parameters with a lot of consecutive null items would crash, this is fixed.
Identical GHA files would collide during the loading process, this is handled.
GHA files with identical names would collide during the loading process, this is handled.
Solver Undo setting was not persistent, this is fixed.
Widget ZUI Zoom setting was not persistent, this is fixed.
Markov Widget Corner setting was not persistent, this is fixed.
Markov Widget Suggestion Count setting was not persistent, this is fixed.
Drag and Drop on Document and Template preview materials wasn't recorded, this is fixed.
AssignDataToParameter() COM-Access method was broken, this is fixed.
Geometry and Generic parameters with persistent data would not deserialize correctly, this is fixed.
Operator shortcuts via the Canvas popup instantiation menu no longer assigned data to the second parameter, this is fixed.
Cull Duplicates component did not always show the correct label upon deserialization, this is fixed.
Legacy VB/C# components would not correctly deserialize List access on input parameters, this is fixed.
Cloud Display component would still display old sprites on disconnect, this is fixed.
Minor changes to a document would trigger lengthy preview cache updates, slowing Grasshopper down. This is fixed.
Sphere 4Pt did not work correctly, this is fixed.
Failed data conversions in parameters would result in missing entries, this is fixed.
Text Tag components (2D & 3D) would not bake via the component menu, this is fixed.
There are also some new features:
Added Jump object for quickly navigating across a Canvas (Params.Util dropdown).
Added Relative Differences component, which is basically the inverse of Mass Addition (Math.Operators dropdown); see the sketch after this list.
Added tooltip wiggle controls to the Preferences window, Interface section.
'Draw Full Names' now also attempts to change the display of existing components, but only in the active document.
Drag+Dropping GHA, GHPY and GHUSER files onto the canvas now puts the original file into the bin.
Replaced Set Union component with a new one that has variable input parameters.
Replaced Set Intersection component with a new one that has variable input parameters.
Replaced And and Ternary And components with a single new one that has variable input parameters.
Replaced Or and Ternary Or components with a single new one that has variable input parameters.
Replaced Concatenate component with a new one that has variable input parameters.
Concatenate component now has a segment join option available via the component menu.
Added Digit options to the Transform Matrix Display object.
Integer parameters which represent options now have more informative context menus.
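For anyone unsure how Relative Differences and Mass Addition relate: Mass Addition produces running partial sums, and Relative Differences returns the consecutive steps between items. A minimal plain-Python sketch of the idea (list functions for illustration, not the component API):

def mass_addition(values):
    # running partial sums: [a, a+b, a+b+c, ...]
    total, sums = 0, []
    for v in values:
        total += v
        sums.append(total)
    return sums

def relative_differences(values):
    # consecutive differences: [b-a, c-b, ...]
    return [b - a for a, b in zip(values, values[1:])]

print(relative_differences(mass_addition([3, 1, 4])))  # [1, 4] recovers the input's tail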
--
David Rutten
david@mcneel.com
Poprad, Slovakia
…
Added by David Rutten at 11:06am on September 14, 2012
that since we create a list of materials and assign them to surfaces/volumes, the next step could be to have a Life Cycle Analysis (LCA) and a financial assessment produced.
The most common way to present an LCA in a form that is widely used and easily communicated is an Environmental Product Declaration (EPD) following ISO 14025:2006. Like every form of LCA, EPDs raise a bunch of questions regarding their boundaries and the accuracy of the results, especially if we include the factor of location. In comparison with other LCA practices, though, EPDs have to follow Product Category Rules (which define the boundaries of the study) that can be reviewed by external parties if the EPD is to go public. Apart from that, EPD results reflect each stage of the life cycle of a product, including potential benefits from reuse or recycling. Finally, if you have a system, for example a building, you can add up the EPDs of the different subcomponents forming the building and get a final EPD for the building itself, which is the point where I think HB's functionality is fully aligned.
The financial assessment can easily be carried out if one has the prices of the materials used. Finally, the environmental indicators of the EPDs (LCI, LCIA) can be translated into shadow costs (shadow costs for environmental indicators here) and added to the final financial assessment as an option.
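To make the EPD aggregation and shadow-cost translation concrete, here is a minimal Python sketch; the indicator names, shadow-cost rates and prices below are invented for illustration, not taken from any real database:

# illustrative shadow-cost rates per unit of LCIA indicator (EUR per kg eq.)
SHADOW_COST_RATES = {"GWP_kgCO2eq": 0.057, "AP_kgSO2eq": 4.0}

def building_epd(component_epds):
    # sum the indicator results of all subcomponent EPDs into one building-level EPD
    total = {}
    for epd in component_epds:
        for indicator, amount in epd.items():
            total[indicator] = total.get(indicator, 0.0) + amount
    return total

def shadow_cost(epd):
    # translate indicator results into a monetary shadow cost
    return sum(amount * SHADOW_COST_RATES[ind] for ind, amount in epd.items())

walls = {"GWP_kgCO2eq": 1200.0, "AP_kgSO2eq": 3.5}  # hypothetical subcomponent EPDs
roof = {"GWP_kgCO2eq": 800.0, "AP_kgSO2eq": 2.1}
total_epd = building_epd([walls, roof])
material_price = 85000.0  # hypothetical material cost in EUR
print(material_price + shadow_cost(total_epd))  # financial cost plus environmental shadow cost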
I developed a similar plug-in (in C#) for Grasshopper for my master's thesis last year. The project focused on the comparison between constructing normally and constructing while implementing Design for Deconstruction practices in steel buildings. The idea was to compare the two cases based on their environmental and financial performance. In the process I also included options for transportation of the material, and for shadow cost, embodied energy and carbon assessment, and more. The final outcome can be visualised in Rhino's viewports and exported to Excel sheets. The plug-in is connected to a local database with EPD data for steel profiles. The same scheme, though, can be followed for any type of material if we have the right database to connect it to!
If you are interested, please have a look at the report here, and let me know if you have any questions!
Please note that the report includes 3+ chapters dedicated to design for deconstruction practices etc. that are irrelevant to this topic but maybe interesting to read :)
Also, if someone is interested in the report, I can always send it to you.
(I will upload a video -runthrough of the plug-in later this week)
I would be very interested to have these capabilities in LB and HB and would be happy to help realise them!
Thanks
Tasos
…
se (like in nature). The length of the sticks shall be controlled by the brightness values of a picture, so the bend has to be controlled, too.
Now we have several problems:
1. How can we map a hexgrid onto a curved surface?
2. How can we adapt the grid to the dimensions of the surface (no overlap, no gaps at the boundary)?
3. (Important:) To create the curved sticks, we use points on a line, move some of them, and then want to connect the right points via an interpolated curve to create each curved stick. The problem is that the points have to be filtered in the right way. We know that we have to filter each list of points by the index values of the points. The number of index values is the number of hexgrid rows, so there are a lot of them and we can't use a List Item component for each one; it could be hundreds.
Is there any way to sort a list by the index values (first every index=0, then index=1, ... n)?
Or is there any component which repeats a group of operations n times (n being the flexible number of index values)?
4. How can we control the length and bend of the sticks via the brightness values of a picture?
Please help us. Thanks.
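To make question 3 concrete, here is a minimal Python sketch of the regrouping we are after (plain Python with invented names, not an existing component), assuming the points arrive as one equal-length list per stick:

def group_by_index(sticks):
    # sticks[k] is the ordered list of points on stick k;
    # returns groups[i] = the i-th point of every stick (a transpose)
    return [list(group) for group in zip(*sticks)]

sticks = [[(0, 0, 0), (0, 0, 1), (0, 0, 2)],
          [(1, 0, 0), (1, 0, 1), (1, 0, 2)]]
print(group_by_index(sticks))  # three groups: all index-0 points, all index-1 points, ...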
German version (translated):
In a hexagonal grid, a stick should stand perpendicular to the surface at the center point of each hexagon. From a certain (fixed) point onward, it should bend toward the ground. In addition, the length of the stick is controlled, for example, by the information of an image, so the bending must also be regulated depending on the length.
We generated a hexagonal grid (HexGrid) and created a line perpendicular to the grid at each center point, from which we extract points with CurvePoint. The last point is moved in order to simulate a bend. To connect the points into an interpolated curve, they have to be sorted by index. Is there any way other than filtering out every single index value with a List Item component (since the grid resolution should be flexibly adjustable, n index values arise)? Or can a list be sorted by the index values, i.e. not by the points?
And how can the length of the sticks, and thus also the bending, be controlled via the brightness values of an image (a short stick bends less than a long stick)?
Is it possible to map a hexagonal grid onto a curved surface?
And how do you fit such a grid (HexGrid) into a surface with defined dimensions without the grid protruding beyond the edges or failing to fill the surface completely?
Thank you.…
Added by doro hamann at 7:34am on December 20, 2011
bi-directional link, the link is unidirectional (downflow only), because of the use of proxies.
Matrix transforms and persistent constraints: I don't think this is true. The parts can have mates to other parts that preserve geometric relationships like 'coincident', 'aligned', etc. These are essentially bi-directional. GH's algorithmic approach does not handle relationships in the same flexible way. In GH, the 'relationship' has to be part of the generation method, which depends on the creation sequence, i.e. draw line 2 perpendicular to the end point of line 1. If you are thinking about parts or assemblies sharing or referencing parameters as part of the regen process, this is also possible: iLogic does this and adds scripting. So does Catia. Inventor/iLogic can also access Excel and have all the parameter processing done centrally, if required.
Consequently, scripting the placement of components is irrelevant in GH, unless you decide that each component needs to be contained in its own separate file.
I wouldn't be too hasty here. Yes, you are right about compartmentalisation. I think this needs to happen with GH in order to deal with scalability and everyday interoperability requirements. Confining projects to one script is not sustainable. MCAD apps have been doing this for ages with 'Relational Modeling'.
The Adaptive Components placement example illustrates that it is beneficial to be able to script some 'hints' that can be used on placement of the component. Say, if your component requires points as inputs, then it should be able to find the nearest points to the cursor as it moves around. I think Aish's D# / DesignScript demoed this kind of behaviour a few years ago. Similarly, Modo's Tool Pipe reminds me of how a lot of UI-based transactions can be captured as scripts (macro recorder etc). Allowing this input to be mixed in and/or extended by GH will, I think, yield a lot of 'modeling efficiency' around the edges. This is (mis)using GH as a user-programmable 'jig' for placing/manipulating 'dumb' elements in Rhino. It may even give the 'dumb' elements a bit more 'intelligence' by leaving behind embedded attributes, like links to particular construction planes etc.
Even if we confine ourselves to scripting, GH is a visual or graphic programming interface. A lot of 'insert and connect' tasks can be done more easily using graphic methods. If we need to select certain vertices on a mesh as inputs for, say, a facade panel, it is going to be quicker to do this 'graphically' (like the AC example) than ferreting out the relevant indices in the data tree et al. The 'facade panel' script would then have some coding to filter/prompt the user as to what inputs were acceptable, and so on.
This also brings up the point that generating components and assemblies in MCAD is not as straightforward. In iParts and iAssemblies, each configuration needs to be generated as a "child" (the individual file needs to be created for each child) before those children can be used elsewhere.
Not sure what you mean here. If the iParts are built up using sketches/profiles or other more rudimentary features (like Revit's profile/face etc. family templates), then reuse should be fairly straightforward. I suppose you could make it like GH scripting if you cut and paste or include script snippets that generate the desired Inventor features.
One of the reasons why the distributed file approach makes perfect sense in MCAD, is that in industry you deal with a finite set of objects. Generative tools are usually not a requirement. Most mechanical engineers, product engineers and machinists would never have any use for that.
I don't think this is true. Look at the automotive body design apps, which are mostly Catia based. All of the body parts are pretty much 'generative', generated from splines in a procedural way, using very similar approaches to GH. Or sheet metal design. It's not always about configuring off-the-shelf items like bolts. And the constraints manager is available to arbitrate which bit of script fires first, so your mundane workaday associative dimensions etc. can update without getting run over by the DAG(s) :-)
…
m
- Area of blue line: min. 80% of the rectangle a x b
- Max. height h of the top point: h_max = a
- Min. volume between rectangle a x b and membrane: 500 m³
Can anyone help me?…
complex the models are. If we are running multi-room E+ studies, that will take far longer to calculate.
Rhino/Grasshopper = <1%
Generating Radiance .ill files = 88%
Processing .ill files into DA, etc. = ~2%
E+ = 10%
Parallelizing Grasshopper:
My first instinct is to avoid this problem by running GH on one computer only. Creating the batch files is very fast; the trick will be sending the Radiance and E+ batch files to multiple computers. Perhaps a "round-robin" approach could send each iteration to another node on the network until all iterations are assigned. I have no idea how to do that, but I hope it is something that can be executed within Grasshopper, perhaps in a custom code module. I think GH can set a directory for Radiance and E+ to save all final files to. We can set this to a local server location so all runs output to the same place. It will likely run slower than it would on the C: drive, but those losses are acceptable if we can get parallelization to work.
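For what it's worth, a minimal Python sketch of the round-robin distribution idea (the node paths and batch file names are hypothetical):

import itertools
import os
import shutil

# hypothetical shared drop folders, one per render node on the network
NODES = [r"\\node01\jobs", r"\\node02\jobs", r"\\node03\jobs"]

def distribute(batch_files):
    # hand each iteration's batch file to the next node in turn
    for node, batch in zip(itertools.cycle(NODES), batch_files):
        shutil.copy(batch, os.path.join(node, os.path.basename(batch)))
        print("%s -> %s" % (os.path.basename(batch), node))

distribute([r"C:\temp\iter_%03d.bat" % i for i in range(9)])  # hypothetical batch files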
I’m concerned about post-processing of the Radiance/E+ runs. For starters, Honeybee calculates DA after it runs the .ill files. This doesn’t take very long, but it is a separate process that is not included in the original Radiance batch file. Any other data manipulation we intend to automatically run in GH will be left out of the batch file as well. Consolidating the results into a format that Design Explorer or Pollination can read also takes a bit of post-processing. So, it seems to me that we may want to split up the GH automation as follows:
Initiate
Parametrically generate geometry
Assign input values, material, etc.
Generate radiance/ E+ batch files for all iterations
Calculate
Calc separate runs of Radiance/E+ in parallel via network clusters. Each run will be a unique iteration.
Save all temp files to a single location on the server
Post Processing
Run a GH script from a single computer. Translate .ill files or .idf files into custom metrics or graphics (DA, ASE, % shade down, net solar gain, etc.)
Collect the final data in a single location (an Excel document) to be read by Design Explorer or Pollination.
The above workflow avoids having to parallelize GH. The consequence is that we can’t parallelize any post-processing routines. This may be easier to implement in the short term, but long term we should try to parallelize everything.
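As a concrete example of the DA post-processing step above, here is a minimal Python sketch that computes daylight autonomy per sensor from an annual illuminance matrix (this assumes a plain whitespace-separated file with one row per hour and one column per sensor, a simplification of the real .ill layout):

def daylight_autonomy(ill_path, threshold=300.0):
    # read the matrix: rows = hours, columns = sensor points, values in lux
    with open(ill_path) as f:
        hours = [[float(v) for v in line.split()] for line in f if line.strip()]
    n_hours = float(len(hours))
    # DA per sensor = fraction of hours at or above the illuminance threshold
    return [sum(1 for hour in hours if hour[i] >= threshold) / n_hours
            for i in range(len(hours[0]))]

print(daylight_autonomy(r"C:\results\iter_001.ill"))  # hypothetical path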
Parallelizing EnergyPlus/Radiance:
I agree that the best way to enable large numbers of iterations is to set up multiple unique runs of radiance and E+ on separate computers. I don’t see the incentive to split individual runs between multiple processors because the modular nature of the iterative parametric models does this for us. Multiple unique runs will simplify the post-processing as well.
It seems that the advantages of optimizing matrix-based calculations (3- and 5-phase methods) are most beneficial when iterations are run in series. Is it possible for multiple iterations running on different CPUs to reference the same matrices stored in a common location? Would that enable parallel computation to also benefit from reusing pre-calculated information?
Clustering computers and GPU based calculations:
Clustering unused computers seems like a natural next step for us. Our IT guru told me that we need some kind of software to make this happen, but that he didn't know what that would be. Do you know what Penn State uses? You mentioned it is a text-only Linux-based system. Can you please elaborate so I can explain it to our IT department?
Accelerad is a very exciting development, especially for rpict and annual glare analysis. I'm concerned that the high-quality GPUs required might limit our ability to implement it on a large scale within our office. Does it still work well on standard GPUs? The computer cluster method can tap into resources we already have, which is a big advantage. Our current workflow uses image-based calcs sparingly, because grid-based simulations gather the critical information much faster. The major exception is glare. Accelerad would enable luminance-based glare metrics, especially annual glare metrics, to be more feasible within fast-paced projects. All of that is a good thing.
So, both clusters and GPU-based calcs are great steps forward. Combining both methods would be amazing, especially if it is further optimized by the computational methods you are working on.
Moving forward, I think I need to explore if/how GH can send iterations across a cluster network of some kind and see what it will take to implement Accelerad. I assume some custom scripting will be necessary.…
software connections built from the initial seed of the project. As always, you can download the new release from Food4Rhino. Make sure to remove the older versions of Ladybug and Honeybee and update your scripts.
This release is also special since today marks just about 3 years (3 years and 2 weeks) from the first release of Ladybug. As with any release, there have been a number of bug fixes and improvements, but we also have some major news this time. In no specific order, and to ensure that the biggest developments do not get lost in the extensive list of updates, here are the major ones:
Mostapha is re-writing Ladybug!
Ladybug for DynamoBIM is finally available.
Chris made bakeIt really useful by incorporating an export pathway to PDFs and vector-based programs.
Honeybee is now connected to THERM and the LBNL suite thanks to Chris Mackey.
Sarith has addressed a much-desired wish for Honeybee (Hi Theodore!) by adding components to model electric lighting with Radiance.
Djordje is on his way to making renewable energy deeply integrated with Ladybug by releasing components for modeling solar hot water.
There is a new bug! Check the bottom of the post for Dragonfly!
Last but definitely not least (in case you're still not convinced that this release is a major one), Miguel has started a new project that brings some of Ladybug's features directly to Rhino. We mean Rhino Rhino, a Rhino plugin! Say hi to Icarus! #surprise
Before we forget! Ladybug and Honeybee now have official stickers. Yes! We know about T-shirts and mugs, and they will be next. For now, you can deck out your laptops and powerhouse simulation machines with the symbology of our collaborative software ecosystem.
Now go grab a cup of tea/coffee and read the details below:
Rewriting Ladybug!
Perhaps the most far-reaching development of the last 4 months is an effort on the part of Mostapha to initiate a well structured, well documented, flexible, and extendable version of the Ladybug libraries. While such code is something that few community members will interact with directly, a well-documented library is critical for maintaining the project, adding new features, and for porting Ladybug to other software platforms.
The new Ladybug libraries are still under development across a number of new repositories, and they separate a ladybug-core, which includes epw parsing and all non-geometric functions, from interface-specific geometry libraries. This allows us to easily extend Ladybug to other platforms with a different geometry library for each platform (i.e. ladybug-grasshopper, ladybug-dynamo, ladybug-web, etc.), all of which are developed on top of the ladybug-core.
Without getting too technical, here is an example of a useful outcome of this development. If you want to know the number of hours that relative humidity is more than 90% for a given epw, all that you have to code (in any python interface) is the following:
import ladybug as lb

# path to the epw weather file to analyze
_epwFile = r"C:\EnergyPlusV7-2-0\WeatherData\USA_CO_Golden-NREL.724666_TMY3.epw"
epwfile = lb.epw.EPW(_epwFile)

# keep only the hours where relative humidity exceeds 90%
filteredData = epwfile.relativeHumidity.filterByConditionalStatement('x>90')
print "Number of hours with humidity more than 90 is %d" % len(filteredData.timeStamps)
Compare that to the 500+ lines that you would have had to write previously for this operation, which were usually tied to a single interface! Now let's see what happens if you want to use the geometry-specific libraries. Let's draw a sunpath in Grasshopper:
import ladybuggrasshopper.epw as epw
import ladybuggrasshopper.sunpath as sunpath

# get location data from the epw file
location = epw.EPW(_epwFile).location

# initiate the sunpath based on the location
sp = sunpath.Sunpath.fromLocation(location, northAngle=0, daylightSavingPeriod=None, basePoint=cenPt, scale=scale, sunScale=sunScale)

# draw the sunpath geometry
sp.drawAnnualSunpath()

# assign geometries to outputs
...
Finally we ask: how would this code look if we wanted to make a sunpath for Dynamo? Well, it would be exactly the same! Just change ladybuggrasshopper in the import lines to ladybugdynamo. Here is the code which creates the sunpath below.
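For instance, the import lines become (everything below them stays identical):

import ladybugdynamo.epw as epw
import ladybugdynamo.sunpath as sunpath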
With this ease of scripting, we hope to involve more of our community members in development and make it easy for others to use Ladybug in their preferred applications. By the next release, we will produce API documentation (documentation of all the Ladybug classes, methods and properties that you can script with) and begin making tutorials for those interested in getting deeper into Ladybug development.
LADYBUG
1 - Initial Release of Ladybug for Dynamo:
As is evident from the post above, we are happy to announce the first release of Ladybug for Dynamo! You can download the Ladybug package from the Dynamo package manager. Make sure to download version 0.0.6, which is actually 0.0.1! It took some trial and error to get it up there. Once you have the file downloaded, you can watch these videos to get started:
The source code can be found under the ladybug-dynamo repository and (as you can already guess) it uses the new code base. It includes a very small toolkit of essential Ladybug components/nodes, but it has enough to get you started. You can import weather files, draw sunpaths and run sunlight-hours or radiation analyses.
There are two known issues in this release, but neither of them is critical. You need to have Dynamo 0.9.1 or higher installed, which you can download from here (http://dynamobuilds.com/). It is recommended that you run the scripts with 'Manual' run (as opposed to 'Automatic'), since the more intense calculations can make Dynamo crash in automatic mode.
To put things in perspective, here is how we would map Ladybug for Dynamo vs. Ladybug and Honeybee for Grasshopper on the classic 'hype graph'. The good news is that we learned a lot from the last three years, making development of the Dynamo version easier and getting us to the plateau of productivity faster.
We should also note that the current development of the Dynamo interface is behind that of the Ladybug-Core, which means there are a number of features that are developed in the code but haven’t made their way to the nodes yet. They will be added gradually over the next month or two.
If you're interested in getting involved in the development process or have ideas for it, follow Ladybug on Facebook, Twitter and GitHub. We will only post major release news here; Facebook, GitHub and Twitter will be the main channels for following the development process. There will also be a release of a new Ladybug for Grasshopper soon that will use the same Ladybug-Core libraries as the Dynamo interface [trying hard not to name it Ladybug 2].
2 - New Project “Icarus” Provides Ladybug Capabilities Directly in Rhino
Speaking of expanded cross-platform capabilities, the talented Miguel Rus has produced a standalone Rhino plugin based on the original Ladybug code, which is included in this release. Having written his own core C# libraries, Miguel has built a plugin that enables users to produce sunpaths and run sunlight-hours analyses in the Rhino scene without needing to open Grasshopper or engage in the (sometimes daunting) act of visual scripting.
This release includes his initial RHP plugin file. It is hoped that Miguel’s efforts will extend some of the capabilities of environmental design to individuals who are unfamiliar with visual scripting, casting the network of our community into new territory. We need your help spreading the word about Icarus since the people who will benefit the most from it have probably not read this far into the release notes. Also, as the project is in the early stages, your feedback can make a great difference. You can download the current release from this link.
Once you download the zip file, right-click it and unblock it. Then extract the files under the C:\Program Files\Rhinoceros 5 (64-bit)\Plug-ins\ folder. Drag and drop the RHP file into Rhino and you should be ready to go. You can either type Icarus in the command line or open it via the panels. Here is a short video that shows how to run a sunlight-hours analysis study in Rhino.
3 - BakeIt Input Now Supports a Pathway to PDF + Vector Programs
As promised in the previous release, the BakeIt_ option available on Ladybug's visual components has been enhanced to provide a full pathway to vector-based programs (like Illustrator and Inkscape) and to ease the export to vector formats like PDF.
This means that the BakeIt_ operation now places all text in the Rhino scene as actual editable text (not meshes), and any colored meshes are output as groups of colored hatches (so that they appear as color-filled polygons in vector-based programs). There is still an option to bake the colored geometries as light meshes (which require less memory and computation time), but the new hatch capability should make it easier to incorporate Ladybug graphics into architectural drawings and documents, like this vector psychrometric chart.
4 - Physiological Equivalent Temperature (PET) Now Available
Thanks to the efforts of Djordje Spasic, it is now possible to compute the common outdoor comfort metric 'Physiological Equivalent Temperature' (PET) with Ladybug. The capability has been included in this release's "Thermal Comfort Indices" component and is supported by a "Body Characteristics" component in the Extra tab. PET is particularly helpful for evaluating outdoor comfort at a high spatial resolution, so the next Honeybee release will include an option for PET in the microclimate map workflow.
5 - Solar Hot Water Components Available in WIP
Chengchu Yan and Djordje Spasic have built a set of components that perform detailed estimates of solar hot water production. The components are currently in the final stages of testing and are available in the WIP tab of this release. You can read the full release notes for the components here.
6 - New Ladybug Graphic Standards
With the parallel efforts of so many developers, we have made an effort in this release to standardize the means by which you interact with the components. This includes warnings for missing inputs and the ability to make either icons or text appear on the components as you wish (Hi Andres!). A full list of all graphic standards can be found here. If you have any thoughts or comments on the new standards, feel free to voice them here.
7 - Wet Bulb Temperature Now Available
Thanks to Antonello Di Nunzio, the newest member of the Ladybug development team, it is now possible to calculate wet bulb temperature with Ladybug. Antonello's component can be found under the WIP tab and takes inputs of dry bulb temperature, relative humidity, and barometric pressure.
8 - New View Analysis Types
The view analysis component now allows for several different view studies in addition to the previous 'view to test points'. These include sky view (which is helpful for studies of outdoor microclimate), as well as spherical view and 'cone of vision' view, which are helpful for indoor studies evaluating the overall visual connection to the outdoors.
HONEYBEE
1 - Connection to THERM and LBNL Programs
With this release, many of you will notice that a new tab has been added to Honeybee. The tab "11 | THERM" includes 7 new components that enable you to export ready-to-simulate Lawrence Berkeley National Lab (LBNL) THERM files from Rhino/Grasshopper. THERM is a 2D finite element heat flow engine used to evaluate the performance of wall/window construction details by simulating thermal bridging behavior. The new Honeybee tab represents the first ever CAD plugin interface for THERM, which has been in demand since the first release of LBNL THERM several years ago.
The export workflow involves drawing window/wall construction details in Rhino and assigning materials and boundary conditions in Grasshopper to produce ready-to-simulate THERM files, allowing you to bypass the limited drawing interface of THERM completely. Additional components in the "11 | THERM" tab allow you to import the results of THERM simulations back into Grasshopper and assist with incorporating THERM results into Honeybee EnergyPlus simulations. Finally, two components assist with a connection to LBNL WINDOW for advanced modeling of glazing constructions. Example files illustrating many of the capabilities of the new components can be found at these links:
THERM_Export_Workflow, THERM_Comparison_of_Stud_Wall_Constructions
Analyze_THERM_Results, Thermal_Bridging_with_THERM_and_EnergyPlus
Import_Glazing_System_from_LBNL_WINDOW, Import_LBNL_WINDOW_Glazing_Assembly_for_EnergyPlus
It is recommended that those who are using these THERM components for the first time begin by exploring this example file.
Tutorial videos on how to use the components will be posted soon. A great deal of thanks is due to the LBNL team, which was responsive to questions at the start of development, and special thanks goes to Payette Architects, who allowed Chris Mackey (the author of the components) a significant amount of paid time to develop them.
2 - Electrical Lighting Components with Enhanced Capabilities for Importing and Manipulating IES Files
Thanks to the efforts of Sarith Subramaniam, it is now much easier and more flexible to include electric lighting in Honeybee Radiance simulations. A series of very exciting images and videos can be found in his release post.
You can find the components under the WIP tab. Sarith is looking for feedback and wishes, so please give them a try and let him know your thoughts. Several example files showing how to use the components can be found here: 1, 2, 3.
3 - Expanded Dynamic Shade Capabilities
By popular demand, it is now possible to assign several different types of control strategies to interior blinds and shades in EnergyPlus simulations. Control thresholds range from zone temperature, to zone cooling load, to radiation on windows, to many combinations of these variables. The new component also features the ability to run EnergyPlus simulations with electrochromic glazing. An example file showing many of the new capabilities can be found here.
Dragonfly Beta
In order to link the capabilities of Ladybug + Honeybee to a wider range of climatic data sets and analytical tools, a new insect has been initiated under the name of Dragonfly. While the Dragonfly components are not included with the download of this release, the most recent version can be downloaded here. An example file showing how to use Dragonfly to warp EPW data to account for the urban heat island effect can also be found here. By the next release, the capabilities of Dragonfly should be robust enough for it to fly on its own. Additional features to be implemented in the next few months include importing thermal satellite image data into Rhino/GH as well as the ability to warp EPW files to account for climate change projections. Anyone interested in testing the new insect should feel free to contact Chris Mackey.
And finally, it is with great pleasure that we welcome Sarith and Antonello to the team. As mentioned in the above release notes, Sarith has added a robust implementation for electric light modeling with Honeybee and Antonello has added a component to calculate wet bulb temperature while providing stellar support to a number of people here on the GH forum.
As always let us know your comments and suggestions.
Enjoy!
Ladybug+Honeybee development team
PS: Special thanks to Chris for writing most of the release notes!…
ne diverse digital design methodologies and the use of different tools such as Autodesk Maya, Rhinoceros and Grasshopper.
Building up technical skills will provide the attendees with a solid platform from which to start rethinking and exploring innovative architectural ideas in collaboration with the team and the tutors.
URBAN FIELDS
Phase I
In the first part of the workshop attendees will be looking at field conditions and how to generate and design such fields that can help structure a possible urban condition in Florence.
We will be exploring dynamic systems, geometric systems and network theories to generate and design an abstract field condition that extends the urban experience of the city into the vertical dimension of towers. Simple operations that span variations from an initial state will give rise to high levels of complexity.
The goal of this exercise is to create a rich and diversified, intelligible urban space that can later be subjected to local interventions, zooming in to enhance each design locally.
AGENT-BODIES POLYMORPHISM
Phase II
The second part of the workshop will build upon the first phase; participants will select one archetype (a high-rise tower) as a study model for further development.
Besides engaging with multi-agent algorithmic design strategies, attendees will address the strategic utilisation of structurally and environmentally generated morphologies to design coherent and highly differentiated tower exoskeletons.
Tutors will introduce agent-bodies polymorphism in order to explore the generation of structurally aware and capable geometries through agent-based formation of non-linear hierarchies and emergent patterns. These agent-bodies will operate in a complex spatial manner to form structure, partitions or enclosure, and will operate across scales, creating a poly-scalar level of detail.
Attendees will speculate on how autonomous systems can create new structures and intelligently distribute structural elements, on new collaborative strategies of construction, and on the performativity they will evoke (performance, effects, responsiveness, interaction).
Fees
Early registration (before 1st June)
Students 390€ - Professionals 440€
Late registration (after 1st June)
Students 490€ - Professionals 540€
More info and Applications
https://www.ax-om.com/edu/polymorphism/
…