I don't agree with the practice of using site EUI as a metric to evaluate the thermodynamic performance, environmental impact, or monetary value of a building. I disagree with this practice for the same reason that there are no "totalThermalLoad" and "thermalLoadBalance" outputs for simulations run with full HVAC. I can summarize these reasons in the following way:
When we run a simulation with ideal air loads, the heating/cooling values we get are THERMAL ENERGY that is directly added to or removed from the zone. In this way, we can draw a rough parallel between these two types of energy, since they are generally of a similar type and quality. As such, I am ok with adding them together to get a total thermal load or subtracting them to get a sense of thermal load balance.
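In other words (a minimal sketch; the two load values are hypothetical numbers standing in for ideal-air-loads results):

# With ideal air loads, heating and cooling are both zone thermal energy,
# so combining them is at least dimensionally meaningful.
heating_load = 120.0  # kWh of thermal energy added to the zone (hypothetical)
cooling_load = 80.0   # kWh of thermal energy removed from the zone (hypothetical)

total_thermal_load = heating_load + cooling_load    # 200 kWh of total demand
thermal_load_balance = heating_load - cooling_load  # +40 kWh: heating-dominated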
However, when we run a simulation with full HVAC, the heating/cooling values that we get are usually HEATING FUEL ENERGY and ELECTRICITY respectively. Fuel energy and electricity are fundamentally two different types and qualities of energy. To cite the second law of thermodynamics, the exergy (or the capacity to do work) of electricity is much greater than that of fuel. This is evident in the fact that, to produce a given unit of electricity, I often have to burn at least 3 units of fuel energy (and this can be much more for inefficient plants); with each step in a power plant - making steam, turning a turbine, turning a generator - there are significant energy losses. This difference in exergy is also evident in the fact that there are so many more things that I can do directly with a unit of electricity than I can do with the same unit of fuel energy. I can use electricity to directly refrigerate, produce light, or power a motor just as easily as I can use it to cook, produce hot water, or heat a space. While I can cook, make hot water, or heat a space directly with fuel energy, refrigeration and lighting are much more difficult. For this reason, I do not feel comfortable adding electricity and fuel together, either in a totalThermalLoad output or in a site EUI metric.
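To make the site-versus-source distinction concrete, here is a minimal Python sketch. The conversion factors and consumption numbers are assumptions for illustration only; actual site-to-source factors depend on your grid:

# Assumed site-to-source conversion factors (illustrative only).
ELEC_SITE_TO_SOURCE = 3.0   # ~3 units of fuel burned per unit of delivered electricity
FUEL_SITE_TO_SOURCE = 1.05  # small extraction/distribution losses for on-site fuel

electricity_kwh = 50000.0   # hypothetical annual electricity use
fuel_kwh = 80000.0          # hypothetical annual fuel use
floor_area_m2 = 1000.0

# Site EUI treats a kWh of electricity and a kWh of fuel as equivalent.
site_eui = (electricity_kwh + fuel_kwh) / floor_area_m2            # 130.0 kWh/m2
# Source EUI weights electricity by the fuel it took to generate it.
source_eui = (electricity_kwh * ELEC_SITE_TO_SOURCE
              + fuel_kwh * FUEL_SITE_TO_SOURCE) / floor_area_m2    # 234.0 kWh/m2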
Still, the use of site EUI has become so ingrained in the industry that I have to acknowledge it and at least show users how it's calculated. In my view, it's an ad-hoc metric that was invented to deal with a previously limited amount of information on energy sources.
Instead of using site EUI, I would recommend using the following metrics depending on what you are trying to evaluate:
Utility Cost / Square Meter - to measure the monetary value of a building to an owner or user
Kg CO2 / Square Meter - to measure the environmental and climatic impact of a building
Emergy / Square Meter - to measure the overall thermodynamic performance of a building
The first two are actually fairly easy to calculate these days, just by researching your site's utility rates or grid energy mixture and multiplying the building electricity and fuel by their respective rates. I will add some capabilities to Honeybee soon to make it even easier for you to get these values from your EPW file and from databases of utility rates/grid mixtures. Emergy is much harder to calculate, as you have to trace all of your energy sources all the way back to the sun, but there are a number of experts at work to make this calculation possible (probably within the next few years we will have much easier ways to calculate it).
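For example, the first two metrics boil down to a couple of multiplications (a hedged sketch; all rates and emission factors below are hypothetical placeholders, so substitute your local utility rates and grid mixture):

ELEC_RATE = 0.12  # $/kWh (hypothetical)
FUEL_RATE = 0.04  # $/kWh (hypothetical)
ELEC_CO2 = 0.40   # kg CO2 per kWh for the local grid mix (hypothetical)
FUEL_CO2 = 0.18   # kg CO2 per kWh for natural gas (hypothetical)

electricity_kwh = 50000.0
fuel_kwh = 80000.0
floor_area_m2 = 1000.0

cost_per_m2 = (electricity_kwh * ELEC_RATE + fuel_kwh * FUEL_RATE) / floor_area_m2
kg_co2_per_m2 = (electricity_kwh * ELEC_CO2 + fuel_kwh * FUEL_CO2) / floor_area_m2
print(cost_per_m2)    # annual utility cost per square meter
print(kg_co2_per_m2)  # annual emissions per square meter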
Hope this helps explain the current setup.
-Chris…
Take a network of lines (i.e. a graph) and make a Plankton Mesh, from which you can use Cytoskeleton to make a solid mesh (and then smooth it with Weaverbird).
Works with ngons (polygons with 3 or more sides). Other examples I found only worked with tris and quads.
Works on open or closed surfaces
While these examples start with a surface, you could start with a network of lines and make a patch surface
This is meant for 2D networks/surfaces. I haven't attempted filling a 3D volume. My guess is this wouldn't work as it would require a non-manifold mesh that Plankton wouldn't handle.
Note similar results could be achieved with the following:
TSplines
MeshDual (dual of a tri mesh, not as much freedom/control)
Working backwards, here is the GhPython script from Will Pearson that builds a Plankton Mesh from vertices and faces. The vertices are a list of 3D coordinates; the faces are a tree of lists, with each list containing the indices of the vertices that form a closed loop. From Will: "Plankton only handles manifold meshes, i.e. meshes which have a front and a back. This orientation is determined by the "right-hand rule", i.e. if the vertices of a face are ordered counter-clockwise then the face normal will be out of the page/screen."
# V: list of Point3d
# F: tree of int

import Grasshopper
appdata = Grasshopper.Folders.DefaultAssemblyFolder

import clr
clr.AddReferenceToFileAndPath(appdata + "Plankton.dll")

import Plankton

pmesh = Plankton.PlanktonMesh()

# Add each input point as a Plankton vertex.
for pt in V:
    pmesh.Vertices.Add(pt.X, pt.Y, pt.Z)

# Add each branch of F as a face loop of vertex indices. The last index
# is dropped because a closed loop repeats its first vertex.
for face in F.Branches:
    face = list(face)[:-1]
    pmesh.Faces.AddFace(face)
These vertices and faces are precisely the output from Starling. Starling takes in a list of Polylines which form the (properly oriented) face loops.
The polyline face loops can be generated...
Directly from Panels on a surface using LunchBox
Using any network of lines/curves on a surface (curves will need to be converted to polylines before Starling)
The latter was achieved using the Surface Split command, then converting the resulting face edges into polyline loops to represent the faces.
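If your split edges come out as generic curves, a small GhPython snippet along these lines can do the polyline conversion (a sketch only; face_loops is an assumed input of closed edge curves, and the four-argument ToPolyline overload is from newer RhinoCommon builds, with illustrative tolerances):

import Rhino.Geometry as rg

polylines = []
for crv in face_loops:
    # If the curve is already polyline-like, just extract the polyline.
    ok, pline = crv.TryGetPolyline()
    if ok:
        polylines.append(pline)
    else:
        # Otherwise approximate it (tolerance and angle tolerance are illustrative).
        plc = crv.ToPolyline(0.01, 0.1, 0.0, 0.0)
        polylines.append(plc.ToPolyline())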
…
Let me tell you what I had to do and how I did it.
I have the following situation: an urban context with a square plot, 40m x 40m, surrounded by buildings.
If I extrude the plot I get 4 surfaces, and I need to calculate the minimum daily quantity of direct sunlight hours that each test point receives in the period from the 22nd of April to the 22nd of August. For example, for the test point at index 21 of the surface with index 1 (I am just inventing these numbers), the minimum is on the 27th of April, when the test point receives 8 hours of direct sunlight (also invented for the sake of the example); on all the other days it receives more. So the values I have to find are these minimums for all the test points. How to calculate these minimum quantities is a different issue from the topic of this post, and I actually managed it.
Continuing with the explanation of what I had to do: I have only the initial plot, which generates 4 surfaces, and then I want to test smaller plots generated by offsetting the original one in steps of 4 m, with the corresponding 4 surfaces for each smaller plot.
So in this case I think I cannot use your suggestion, because the objects don't exist yet.
I managed it by creating a loop with Anemone. The loop generates an offset factor starting from 0 and going up to 4, which I then multiply by 4 to obtain offsets of 0, 4, 8, 12 and 16 m. Then, as you also suggested, I record each result with the Data Recorder, and I create a different branch for each result, with the index coming from the loop (0, 1, 2, 3 and 4), using the Flatten component.
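For reference, the loop is roughly equivalent to this GhPython sketch (assumed inputs: plot as a closed planar curve and height as a number; the offset direction and tolerance are illustrative, and this does not replace the Anemone/Data Recorder setup, it just shows the intent):

import Rhino.Geometry as rg
from Grasshopper import DataTree
from Grasshopper.Kernel.Data import GH_Path

walls = DataTree[object]()
for i in range(5):              # offset factors 0..4
    d = 4.0 * i                 # offsets of 0, 4, 8, 12 and 16 m
    if d == 0:
        crvs = [plot]
    else:
        # Negative distance assumed to offset inward; depends on curve orientation.
        crvs = plot.Offset(rg.Plane.WorldXY, -d, 0.01,
                           rg.CurveOffsetCornerStyle.Sharp)
    for crv in (crvs or []):
        ext = rg.Extrusion.Create(crv, height, False)  # the 4 wall surfaces
        walls.Add(ext, GH_Path(i))                     # one branch per offset step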
In this image you can see all the surfaces, saved in the way described above: in white, the test points that receive a minimum of 2.5 hours or more per day of direct sunlight in the period from the 22nd of April to the 22nd of August, and in dark gray the test points that receive less.
The main point of this discussion is just this: instead of using the tricky method I used (or the one you suggest) to analyze 20 geometries separately (because they shade each other; in this case 20, but there could be many more), it would be good if it were possible to input all the geometries at the same time, have them not shade each other, and so get all the results directly in one run, in a simpler way.
Francesco
…
The "Thermal Comfort Indices" component requires four weather data inputs: air temperature (_dryBulbTemperature), relative humidity (relativeHumidity_), wind speed at 1.1 meters above the ground (windSpeed_) and mean radiant temperature (meanRadiantTemperature_). You can supply the first three inputs from the Ladybug "Import Epw" component. The last one (meanRadiantTemperature_) you can supply from Ladybug's "Outdoor Solar Adjusted Temperature Calculator" component, or let the "Thermal Comfort Indices" component calculate it. The two use different methods to calculate the final values.
I attached an example file below with the second option. For more precise calculations you can use Honeybee and Chris' microclimate maps. As icing on the cake: one of the Ladybug developers yesterday released a set of Ladybug components for modelling in the ENVI-met application. ENVI-met is cutting-edge microclimate software which can be downloaded for free. It opens up a number of advanced new analyses in the outdoor domain which couldn't have been done with the current Ladybug+Honeybee tools. So you can perform the simulation in the free ENVI-met 4 software, and then add the mean radiant temperature values from the ENVI-met simulation to the "Thermal Comfort Indices" component. Here is an example file. If you would like to go with this last approach, then it would be best to post a question about it in that topic.
1) You can make a polygonized tree. I haven't subtracted the trunk from the crown, but I guess it makes sense that it can be done.

2) In most solar-related simulations, a default albedo value of 0.2 is used. This corresponds to the average albedo of materials surrounding an urban or countryside location (concrete, grass, gravel, sand, asphalt...). However, the presence of snow can magnify the average albedo value several times. The "Sunpath Shading" component's albedo_ input has the ability to calculate albedo due to the presence of snow if nothing is added to it (to the albedo_ input). As you are performing the analysis of PET in a horizontal plane, this will not affect your calculations.

3) Most thermal comfort indices require performing the analysis at 1.1 meters above the ground, which is considered to be the height of a standing person's center of gravity. The same goes for the PET index. So you are correct: you should place the analysis grid at 1.1 meters above the ground before adding it to the "Sunpath Shading" component (see the small sketch at the end of this reply).

It is worth mentioning that the "Thermal Comfort Indices" component used in this topic's PET_on_Grid2.gh and PET_on_Grid3.gh files is from last year, and is much slower than the newest one (VER 0.0.64 MAR 18 2017) used in the example attached below. Just a reminder, in case you have been using the older version of this component. Let me know if I misunderstood some of your questions, or failed to answer some of them.
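As a tiny illustration of point 3), lifting a ground-level analysis grid to the 1.1 meter body-center height is a one-liner in GhPython (grid_pts is an assumed input of points lying on the ground):

import Rhino.Geometry as rg
# Lift each ground-level test point to 1.1 m, the assumed center-of-gravity height.
lifted_pts = [rg.Point3d(p.X, p.Y, p.Z + 1.1) for p in grid_pts]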
EDIT: sorry for posting a double reply. When I posted it the first time, only the links were visible, with no text. Something has been wrong with the grasshopper ning forum for the last couple of months.…
I searched for the residential type and surprisingly there are none. This can be, but I'm surprised.
The location in the example is the Financial District of Manhattan. I assume there might not be too many purely residential buildings there; if you increase the radius to 300 meters it will find one. The "Residential building" OSMobject will look for mostly purely residential buildings, for example those in Chinatown or the Lower East Side. However, most of the time a building might be multi-purpose: shops on the ground floor, offices above, and above them residential apartments. Users sometimes avoid tagging these kinds of buildings with the type of the building, and may just tag them with "building"="yes" (instead of, for example, "building"="multiuse"). This may be why you are not getting too many residential buildings. I guess the only solution to this issue is to add these tags yourself; then Gismo will instantly make use of them. I mentioned previously that I would create a couple of video tutorials, but I never seem to find enough time. I apologize for that. The process is actually quite simple.
Here is a small step-by-step tutorial on how to do that. It may take you about 2 minutes to tag your building and use that tag in Gismo.
Also office buildings. I imagine this is not up to you, but it can be kind of disappointing. I wanted, for example, to do some Ladybug analysis only on residential or office buildings... pity.
"Office building" has not been added to "OSMobjects" dropdown list. I have just added it.However, whenever some sort of object is not defined in "OSMobjects" dropdown list, one can use the _requiredKey and requiredValues_ inputs of the "OSM tag" component:
I just tried looking for office buildings at the same location we have in the create_legend_example.gh file, and it found 3 of them. There are probably more, but it may be that nobody tagged them with "building"="office".
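Conceptually, the _requiredKey/requiredValues_ filtering amounts to something like this (a hypothetical Python sketch; the data structure is an illustrative stand-in, not Gismo's internal one):

# Hypothetical OSM data: each building carries a tag dictionary.
buildings = [
    {"id": 1, "tags": {"building": "office"}},
    {"id": 2, "tags": {"building": "yes"}},          # untyped: will be missed
    {"id": 3, "tags": {"building": "residential"}},
]

required_key = "building"       # _requiredKey input
required_values = {"office"}    # requiredValues_ input

matches = [b for b in buildings
           if b["tags"].get(required_key) in required_values]
print(matches)  # only building 1 matches; untyped buildings slip through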
The legend is nice, though I think it is not completely synchronized with the LegendBakeParameters: you need to provide a point for the legendPlane input and another for the titleOriginPt output of CreateLegend.
Unlike Ladybug, Gismo treats the title and the legend separately, so the legend's color bar has its own starting point (plane) while the title has its own. I sometimes found myself puzzled in Ladybug as to why this wasn't possible. Or did I misunderstand you?…
Added by djordje to Gismo at 12:33pm on May 8, 2017
I don't have a direct answer for you, in part because that statement suggests a complicated data tree indeed! And in part because I can't quite shake the question "Why?".
It's my nature to "think out of the box" and as you may have noticed, my answers in the other two threads related to this topic weren't quite what you asked for. The first question that comes to mind in this case is why look for the two closest trunks? Why not just the closest or why not "N" (all or all those within a given radius)?
The next question is why use a plane intersection at arbitrary height to get a point on each trunk for measuring distances between them?
So please bear with me as I explain how I've explored this problem so far, knowing I don't have an answer yet and, in fact, am not even sure that the question makes sense to me. ;)
First, I got tired of looking at these upside-down "trees", so I flipped them right side up. I used 'Mirror' instead of 'Rotate', which might cause problems? But let's move on. I changed your preview colors so they wouldn't conflict with my 'Tree/List Viewer' defaults and to increase contrast a bit.
Then I skipped your methods for finding "Cluster 'B' and 'C'" and used 'Curve Proximity' between the trunks instead.
This is hard to convey with a static image but might make more sense interactively. There are two copies of 'Tree/List Viewer'; the second one (the "slave" group) is driven by the set of sliders in the first one. As you move the 'path idx' slider, one of the trunks will be cyan in color, as set above. The others will be blue, except for one that is yellow. As you move the 'list idx' slider, the yellow highlight will move among the blue trunks, showing the closest trunk at 'list idx' = 0 and the furthest at 'list idx' = 3 (five trunks total: one selected by 'path idx' and the other four by 'list idx'). The result is that for each trunk, we have all the other trunks sorted by distance.
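For the record, the 'Curve Proximity' sorting described above can be reproduced in a few lines of GhPython (a sketch, not the actual definition; trunks is an assumed flat list of trunk curves):

import Rhino.Geometry as rg
from Grasshopper import DataTree
from Grasshopper.Kernel.Data import GH_Path

# For each trunk, sort all the other trunks by closest curve-to-curve distance.
sorted_trunks = DataTree[rg.Curve]()
for i, trunk in enumerate(trunks):
    others = []
    for j, other in enumerate(trunks):
        if j == i:
            continue
        found, pt_a, pt_b = trunk.ClosestPoints(other)
        if found:
            others.append((pt_a.DistanceTo(pt_b), other))
    others.sort(key=lambda pair: pair[0])     # nearest first
    for dist, other in others:
        sorted_trunks.Add(other, GH_Path(i))  # one branch per trunk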
That's all for now! There is a very simple way to connect the 'Twigs' instead of the 'Trunks' to 'Crv A', with interesting results, but it requires flattening the 'Twigs', so it isn't as useful as we want.
The big question for me remains: what is the data/tree structure of the results you seek? From the statement I quoted above, it sounds like:
One branch per trunk.
For each trunk, one branch per twig? (or...?)
For each twig branch...? A list of distances to each of the other trunks? (or a list of the other trunks sorted by their distances from this twig?)
It sounds like a complicated mess, frankly. And again it begs the question: why? What's the underlying goal beyond the objectives you have outlined so far?…
Added by Joseph Oster at 7:59pm on November 14, 2017
). It deals with the possibility of porting GH into AEC fields (real-life AEC fields, nothing to do with academic thinking). The bad news is that the smart AEC sector is occupied solely by Bentley/GenComp – expect Revit/Dynamo soon as well (not to mention CATIA). The good news is that there are millions of designers/engineers/industrial designers out there who could be interested in a 3rd alternative.
Intro: Well, in the old days (when men had mustaches and muttonchops) AEC design proceeded in a nice top-to-bottom sequence (kinda like a vector): the Big Man (aka The Brain) did some sketches (with crayons) and the rest (known as the "others") struggled to make The Idea a reality. Today things are different, mind. Or they should be different. Or may be different. Or whatever.

The big easy: For a zillion reasons (AEC matures, PLM, cost, outsourcing, sustainable engineering… add several more) the vector-like process of the past is more like Brownian motion these days: right at the moment that you (or your team) "sketch" The Big Idea… another team designs, simultaneously (i.e. in parallel), the components (parts) that compose the whole. This is the so-called bottom-to-top design mentality. So the whole and the parts meet at some "middle point" instead of the latter being dictated by the former. On quite a few occasions the parts dictate the whole (cost, cost and cost being the main reasons). The more contemporary a design is, the more this bottom-to-top thing plays a critical role. Ignore it and have a very big time (sooner or later).

The bad news: If you accept the above… well, GH – at its present phase – is not ready for contemporary AEC work. At. All. Three main reasons for that:

1. You can't use parametric parts (i.e. nested blocks, to speak Rhino language) in a given definition (in the attached case: truss nodes, connection flanges, mount plates, cable tensioners, planar glazing components, roof skin components… etc etc). This is obviously a Rhino domain.

2. You can't bake a given solution in such a way that the Rhino file is structured (i.e. as assemblies of nested blocks). Or you could do it theoretically by writing some VB/C# code – but the core of the matter is that the corresponding components are MIA. That means that you can't actually export anything useful into established AEC-oriented apps and/or established MCAD apps (for doing/calculating the parts for real-life production).

3. The GH process can't be interrupted. Imagine defining, say, a building "envelope" in GH and then… er… using Evolute tools in order to optimize things (say quad planarization and the like). Then… continuing in GH for more detailed work. Then designing the parts as in 1 above. Then back to Evolute. Then back to GH.

So… if anyone is interested, I would be glad to start the mother of all debates and/or some kind of crusade (GH for President, that is).

PS: This definition is a WIP thing – more refined stuff to follow (in particular a complex canopy tube pre-stress system).
PS: Tree8 components are used sporadically.
PS: Use Saved Views
May the Dark Force be with us. Best, Peter …
al surface, but this is not happening with the edge mesh list!
In fact, when I want to create the mesh faces (the side mesh faces that close the boundary surfaces), I can see (and I have tested/double-checked within GH) that the corresponding top edge and bottom edge of a given face are not coincident.
How can I assign a consistent index list to the top and bottom mesh edge lists?
How is the edge topology assigned?
I post the code here, FYI, even though I don't think it will help to find the problem.
And 2 screenshots to show you the problem.
double thickness = new double();
if (!DA.GetData(0, ref thickness)) return;

Mesh baseMesh = new Mesh();
if (!DA.GetData(1, ref baseMesh)) return;

List<Plane> localCSPlanes = new List<Plane>();
if (!DA.GetDataList<Plane>(2, localCSPlanes)) return;

List<Mesh> solidMesh = new List<Mesh>();
List<Point3d> meshFaceCenterPlus = new List<Point3d>();
List<Point3d> meshFaceCenterMinus = new List<Point3d>();
List<Plane> localCSPlanesPlus = new List<Plane>();
List<Plane> localCSPlanesMinus = new List<Plane>();
List<Line> edgeMeshPlusLine = new List<Line>();
List<Line> edgeMeshMinusLine = new List<Line>();

Mesh offsetPlus = baseMesh.Offset(0.5 * thickness, false);
Mesh offsetMinus = baseMesh.Offset(-0.5 * thickness, false);
Mesh recMesh = new Mesh();
solidMesh.Add(offsetPlus);
solidMesh.Add(offsetMinus);

// This for-cycle creates an index-consistent list of plus-offset mesh face
// center points and local coordinate system mesh planes.
int nFcsPl = offsetPlus.Faces.Count;
int nEdg = offsetPlus.TopologyEdges.Count;
for (int i = 0; i <= nFcsPl - 1; i++)
{
    meshFaceCenterPlus.Add(new Point3d(offsetPlus.Faces.GetFaceCenter(i)));
    localCSPlanesPlus.Add(new Plane(offsetPlus.Faces.GetFaceCenter(i), localCSPlanes[i].XAxis, localCSPlanes[i].YAxis));
}

// This for-cycle creates an index-consistent list of minus-offset mesh face
// center points and local coordinate system mesh planes.
int nFcsMn = offsetMinus.Faces.Count;
int nEdgMn = offsetMinus.TopologyEdges.Count;
for (int i = 0; i <= nFcsMn - 1; i++)
{
    meshFaceCenterMinus.Add(new Point3d(offsetMinus.Faces.GetFaceCenter(i)));
    localCSPlanesMinus.Add(new Plane(offsetMinus.Faces.GetFaceCenter(i), localCSPlanes[i].XAxis, localCSPlanes[i].YAxis));
}

// For each topology edge, build a side quad between the (assumed)
// corresponding plus- and minus-offset edge lines.
for (int i = 0; i <= nEdg - 1; i++)
{
    Line lineRecPlus = offsetPlus.TopologyEdges.EdgeLine(i);
    edgeMeshPlusLine.Add(lineRecPlus);
    Point3d recPointA = lineRecPlus.From;
    recMesh.Vertices.Add(recPointA);
    Point3d recPointB = lineRecPlus.To;
    recMesh.Vertices.Add(recPointB);

    Line lineRecMinus = offsetMinus.TopologyEdges.EdgeLine(i);
    edgeMeshMinusLine.Add(lineRecMinus);
    Point3d recPointC = lineRecMinus.To;
    recMesh.Vertices.Add(recPointC);
    Point3d recPointD = lineRecMinus.From;
    recMesh.Vertices.Add(recPointD);

    recMesh.Faces.AddFace(0 + 4 * i, 1 + 4 * i, 2 + 4 * i, 3 + 4 * i);
}
NOTE: I know how to create a solid from a single mesh surface, but this is not what I want here, because I have to sort the solid mesh into top, bottom and side faces (for further purposes).
Thanks a lot for your help!
cheers,
matteo…
1. From the Thermal Comfort Indices component, Comfort Index 11 (TCI-11): MRT = f(Ta, Tground, Rprim, e)
with:
- Ta = DryBulbTemperature coming from the ImportEPW component
- Tground = f(Ta, N), where N comes from the totalSkyCover input. Tground influences the long-wave radiation emitted by the ground in the MRT calculation.
- Rprim, defined as the solar radiation absorbed by a nude man = f(Kglob, hS1, ac)
- ac is the clothingAlbedo in % (bodyCharacteristics input)
- I can't find any definition of Kglob and hS1 in the code. Could you please tell me what those values refer to? --> probably the globalHorizontalRadiation, but how?
- e = vapour pressure calculated from Ta and the relative humidity input
Do you agree that in this case the MRT does not depend on these inputs: location, meanRadiantTemperature, dewPointTemperature and wind speed? And that it does not depend on the other bodyCharacteristics either, like bodyPosture, age, sex, met, activityDuration...?
Is the MRT calculated by the TCI-11 method the mean radiant temperature for a vector pointing vertically, with a sky view factor of 100%?
For the ParisOrly epw:
2. From the SolarAdjustedTemperature component (which seems to be used more than TCI-11 in the UTCI calculation examples on Hydra):
In contrast to TCI-11, this component distinguishes diffuse and direct radiation and contextualizes the calculation thanks to the _ContextShading input, right? It can also be applied to a mannequin thanks to the CumSkyMatrix, and can thus evaluate the inhomogeneity of radiation exposure. This component seems not to consider the influence of vapour pressure on the result --> is it then more precise to feed the MRT output (from the TCI) into the meanRadTemperature input of SolarAdjustedTemperature? The default groundReflectivity is set to 0.25 --> is the ground reflectivity taken into account in the Tground or MRT calculation in the TCI component? If yes, what is the hypothesised groundReflectivity? Does the default clothing albedo of 37% (TCI-11 bodyCharacteristics) correspond to a clothing absorptivity of 63%?
If the CumSkyMatrix input is not supplied, I get 9 results for the mannequin --> where are those points/results coming from?
If the CumSkyMatrix input is supplied, I suppose the calculation of the 482 results corresponds to a calculation method similar to the radiation analysis component, averaged over the analysis period. Right? But I don't understand why the mannequin is composed of 481 faces while meshFaceResult gives 482 results.
Finally, what is the link between the mesh results, the solarAdjustedMRT and the Effective Radiant Field? Is there a paper with a detailed explanation of the method?
3. Here are some results for the ParisOrly EnergyPlus weather data. You can find the Grasshopper definition attached. There is no shading in this simulation, and the MRT result coming from the Thermal Comfort Indices component is very different from the solar-adjusted MRT. Why such a big difference, and which of the results should be plugged into the UTCI calculation component?
Results for ParisOrly.epw, M,D,H: 1,1,12

Ta: 6.5°C
rh: 100%
globalHorizontalRadiation: 54 Wh/m2
totalSkyCover: 10
MRT (TCI-11): 1.2°C

_CumSkyMtxOrDirNormRad = directNormalRadiation: 0 Wh/m2
diffuseHorizontalRad: 54 Wh/m2
_meanRadTemp = Ta
solarAdjustedMRT: 10.64°C
MRTDelta: 4.14°C

_CumSkyMtxOrDirNormRad = CumulativeSkyMtx
diffuseHorizontalRad: 54 Wh/m2
_meanRadTemp = Ta
solarAdjustedMRT: 10.47°C
MRTDelta: 3.97°C

_CumSkyMtxOrDirNormRad = CumulativeSkyMtx
diffuseHorizontalRad: 54 Wh/m2
_meanRadTemp = MRT (TCI-11)
solarAdjustedMRT: 5.17°C
MRTDelta: 3.97°C
Thanks a lot for your help.
Regards,
Aymeric
…
I hope this number will grow in the future. The currently available features are:
1) Creation of 2D or 3D context for any kind of building-related analysis: automatically generate the 2D/3D surrounding buildings for the location where you would like to perform visibility, solar radiation, CFD or any other type of analysis (you need some other plugin, like Ladybug, for the last three; Gismo only creates the context, i.e. the surroundings!). The "automatic generation" process also includes creation of the local topography (terrain) along with the buildings.
2) Identification of certain 2D or 3D elements in the created context. For example: selection of all hotels, parks, hospitals, restaurants, residential buildings etc.
3) Performing direct terrain analysis (hillshading, slope, ruggedness, roughness, water flow...)
4) Creation of terrain shading masks and horizon files for further solar and photovoltaics analysis.
Gismo would be very grateful if he could get any suggestions, improvements, bug reports and testing in the following period. In case you are willing to provide any of these, the requirements, installation steps and .gh example files can be found here, here and here.
Thank you in advance !!…
Added by djordje to Gismo at 9:10am on January 29, 2017