. BIM and Parametric.
Posts and files over at Design By Many:
http://www.designbymany.com/content/model-pattern-american-cement-building
I am equally comfortable on both of these platforms, and built the same parameters into each model. My modeling experience was very similar to that of Santiago. The Revit model took 4 hours to build, while the GH definition took 16 hours to build. Time invested is certainly not the only metric worth comparing; however, it is a good demonstration of the immediacy with which modifications can be made to the component system when parameter adjustment alone is not satisfactory.
With credit to Andrew Kudless for his process work on Manifold, I have adapted a similar workflow tracing diagram to the two models:
My general observation is that both tool sets approach the same problem, namely providing a structured relationship between components and wholes, but from opposing directions. BIM excels at compartmentalizing individual components, while parametric modelers like GH excel at global, system-wide manipulations.
In the case of the American Cement Building, modeling the cast component best fit the box of 'the whole being reducible to its parts'. Although I anticipated Revit having more trouble with the surface generation, I found it to be more flexible on all accounts. Building up the component in a Pattern Based Curtain System family, the direct interaction with the rig (specifying control point work planes and offsets) kept the network of interactions accessible and editable throughout the build process. This family was then applied to a curtain panel grid, which itself could be flexed in proportion and cell count.
With the GH build I originally intended to use data trees for parallel component construction, so that changes to the base grid would affect offset normals and the like. However, after I had spent three hours constructing one parametric rail curve, I was unable to keep track of the parallel data structure any longer, and reverted to building a singular component. While GH certainly has the capacity to handle this task, I have found that, personally, the user does not.…
between the two. A simple example would be if you plug Integer data into a Text parameter. It's perfectly possible to create a piece of text which represents the integer. I.e. the value 18 becomes the text "18".
It's also possible to convert a floating point number to text, although in that case the conversion is not lossless, as the text only shows a limited number of decimals, thus rounding the actual numeric value.
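The lossless/lossy distinction is easy to demonstrate outside Grasshopper. A small Python sketch (the six-decimal display format here is an assumption for illustration, not Grasshopper's actual setting):

```python
# Integer -> text is lossless: the text fully encodes the value.
i = 18
s = str(i)
assert int(s) == i  # round-trips exactly: "18" -> 18

# Float -> text with a limited number of decimals is lossy:
x = 0.123456789
t = "{:.6f}".format(x)  # display with 6 decimals (illustrative assumption)
print(t)                # '0.123457' -- the tail is rounded away
assert float(t) != x    # the round-trip no longer recovers the value
```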
In your specific case here, you have connected a Curve parameter output with the Loft Options input. Loft options are about the type of loft, whether or not to rebuild/refit the resulting loft surface and -if so- what sort of tolerance to use.
If you look at the tooltips of the input parameter for the Loft component, you'll see that the first one takes all the section curves and the second one takes the options to be used to make the loft. You'll have to put all your curves into the first input:
This can be accomplished by holding SHIFT while making the second connection.
However, this will generate a new problem. Loft operates on a list of curves, and for each list of curves you provide it will try to create a single loft. But if you merge the two curve streams, you'll sometimes get lists of 4 curves, which is probably not what you want.
At any rate, Loft is probably not what you want in the first place, as an offset curve (especially one with kinks) will result in incredibly messy lofts. I'd recommend Boundary Surface as an alternative, but that will generate trimmed surfaces, which may not be acceptable for you.
Now then, on to the Offset failure. Curve offsetting is a planar operation. By default, the plane in which Offset works is the world XY plane. Your curves are all perpendicular to the world XY plane, so that is already problematic. The fix would be easy (plug the curves also into the Offset P input), were it not that one of your section curves is wonky. This is probably either due to a bug in the Rhino Brep|Plane intersector or it's a problem with the input Brep. Either way, I could not get one of the curves to offset correctly, no matter what I tried.
In the end I solved it by using Loose Offset, which also means that the loft works much better because both the interior and the exterior curve have identical topology (see attached). Do note that Loose Offset does not guarantee an offset accurate to within document tolerance, it only moves the control-points.
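The control-point idea behind a loose offset can be sketched for a simple closed polyline: move each control point along the averaged normals of its adjacent edges. This is an illustrative approximation of the concept, not the actual Loose Offset implementation:

```python
import math

def loose_offset(points, distance):
    """Offset a closed counter-clockwise polyline by moving each control
    point along the averaged outward normals of its two adjacent edges.
    Topology is preserved, but the result is not a true offset within
    tolerance -- the same trade-off the Loose Offset component makes."""
    n = len(points)
    out = []
    for i in range(n):
        ax, ay = points[i - 1]
        bx, by = points[i]
        cx, cy = points[(i + 1) % n]
        # Outward edge normals (dy, -dx) for a CCW polygon.
        n1 = (by - ay, ax - bx)
        n2 = (cy - by, bx - cx)
        vx, vy = n1[0] + n2[0], n1[1] + n2[1]
        length = math.hypot(vx, vy)
        out.append((bx + distance * vx / length, by + distance * vy / length))
    return out

# Unit square, counter-clockwise; each corner moves diagonally outward.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(loose_offset(square, 0.1))
```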
--
David Rutten
david@mcneel.com…
ly planes instead of lines, so there is no equivalently elegant and orderly branching structure in there made from lines. You only get the mostly triangulated truss which is much tighter, shown here in blue in the 2D version:
If you only sparsely populate those truss points, you don't have as much triangulation and you do get more of a natural bone look, but you lose the orderly branching that I was so excited about in 2D. Also, since hexagons pack 2D space perfectly, the 2D case does create a lot of good areas of hexagons, but in 3D there is no similarly symmetrical space-filling object except a cube, and cubes are not what Voronoi emulates at all. If the 2D case branches with three lines per vertex, then the 3D case could ideally branch with four lines per vertex, just like the atomic structure of diamond. I was hoping for that, naively, but am now discouraged. A surface-adaptive diamondoid lattice is a long way off, it seems. Without the Voronoi relaxation cycles, merely distorting an existing lattice merged to the surface won't even out well near the surface.
Diamond is also a very specific structure, not amenable to fractal-like branching, so I'm not even sure what the 3D equivalent of such branching is, or whether there is an orderly system. "Branching" is the wrong concept anyway, since the struts both branch and join together again, forming cells. Pure branching that ends at the surface is not coming out of Voronoi.
http://www.grasshopper3d.com/photo/stochastic-fractal
Here I have created a superior surface-adaptive 3D Voronoi by using my 2D strategy of strongly perturbing only the vertices already near the surface while leaving the deeper ones mostly alone, so I no longer get a blank hole in the interior but still get lots of surface density:
…
Added by Nik Willmore at 2:01am on August 16, 2015
node geometry from line structure inputs.
In terms of trying to make all your panels regular hexagons... this topic comes up frequently on the GH forum, whether it be using only equilateral triangles, hexagons, pentagons, etc.:
http://www.grasshopper3d.com/forum/topics/folded-plane-subdivided-into-equilateral-triangles?id=2985220%3ATopic%3A1007963&page=2#comments
http://www.grasshopper3d.com/forum/topics/triangulation-using-only-equilateral-triangles
http://www.grasshopper3d.com/forum/topics/polygon-composition-with-hinges-1
In general, if you want a curved facade surface your hexagons cannot all be identical. There was a post on this forum about exactly this. I was convinced you could not have anything other than a flat surface with fixed, equilateral triangles, but it turns out (and was shown by Daniel Piker and Kangaroo) that you can indeed have a non-planar surface panelled with equilateral triangles; it tends to be a kinked surface, though, and it wasn't straightforward to control.
To try and reduce the variety of components in building structures like this, people have tried this sort of thing...
http://www.solidsmack.com/fabrication/you-can-now-build-your-own-geodesic-dome-at-home-in-under-an-hour-with-this-handy-kit/
...but notice the lack of panels!
Perhaps your best route is to use something like what Bradley ended up with in the first link I posted, then work on ID tagging each panel and node (and their orientations) so you have a construction procedure to follow.
One other thing to bear in mind... the simple construction above was really awkward to assemble. On a larger scale it could be a nightmare! Once you have 2 nodes connected you can't fit the third without loosening the 2 that are already connected and shuffling them together bit by bit. Hard with 4 pieces, a disaster with many more, so always think about how you intend to construct the pieces!…
I guess I'd try creating a mesh from those points. Tetgen only accepts a mesh.
However, there are advanced flags that could be changed by editing the Python code, which is fairly straightforward as far as Python goes.
http://wias-berlin.de/software/tetgen/1.5/doc/manual/manual.pdf
There's a way to add new points (the -i flag), indeed, but that doesn't override the existing ones, and it adds tetrahedron points anywhere within the volume. It requires its own separate .node file, it seems?
There's also a way to specify region attributes, (-A and -a) that I don't yet understand, as to whether it requires its own file or is somehow part of a full mesh input file alternative to the normal STL file that Tetgen reads. I'm creating an STL file from Python to make the script work and that's the only file I'm creating for Tetgen, so far.
5.2.2 .poly files
A .poly file is a B-Rep description of a piecewise linear complex (PLC) containing some additional information. It consists of four parts.
Part 4 - region attributes list
The optional fourth section lists regional attributes (to be assigned to all tetrahedra in a region) and regional constraints on the maximum tetrahedron volume. TetGen will read this section only if the -A switch is used or the -a switch without a number is invoked. Regional attributes and volume constraints are propagated in the same manner as holes.
One line: <# of regions>
Following lines list # of regions:
<region #> <x> <y> <z> <region attribute> <region volume constraint>
...
If two values are written on a line after the x, y and z coordinates, the former is assumed to be a regional attribute (but will only be applied if the -A switch is selected), and the latter is assumed to be a regional volume constraint (but will only be applied if the -a switch is selected). It is possible to specify just one value after the coordinates; it can serve as both an attribute and a volume constraint, depending on the choice of switches. A negative maximum volume constraint allows one to use the -A and -a switches without imposing a volume constraint in this specific region.
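Going by the manual excerpt above, the Part 4 lines could be generated with a few lines of Python. The field layout here is my reading of that excerpt, not a validated TetGen writer, and the seed point, attribute and volume values are made-up examples:

```python
def region_section(regions):
    """Build the Part 4 (region attributes) lines of a TetGen .poly file.

    Each region is (x, y, z, attribute, max_volume): a seed point inside
    the region, the regional attribute applied with -A, and the maximum
    tetrahedron volume applied with -a. Layout follows the manual's
    region-attributes description; treat this as a sketch."""
    lines = ["%d" % len(regions)]
    for i, (x, y, z, attr, max_vol) in enumerate(regions, start=1):
        lines.append("%d  %g %g %g  %g %g" % (i, x, y, z, attr, max_vol))
    return "\n".join(lines)

# One region seeded at (0.5, 0.5, 0.5): attribute 1, max tet volume 0.001.
print(region_section([(0.5, 0.5, 0.5, 1, 0.001)]))
```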
Yeah, the manual sucks. I'm confused about even what the workflow is, and which files are outputs versus extra inputs that Tetgen reads.
I basically have no idea what any of this means. What's the workflow for specifying a region's target tetrahedron maximum volume, and is that even possible?…
lts.
In the visualization, 'points' is an interesting option. It's a matter of aesthetics, I guess; I go with surfaces :) You can also try selecting Filters -> Slice (you can also find it in the icons above the pipeline viewer); in the Slice options below the pipeline, press Z Normal, and for the Z coordinate enter some height relevant to the buildings (e.g. 1.75 m, a typical human scale). That will show you the flow around the buildings at that height. Experiment with selecting other normals and values. Keep playing with the filters; there are some cool things in there. You can also check out the mailing list and the extensive ParaView documentation.
Concerning the errors I apologize because I just downloaded your case.
It appears that the decomposeParDict is not included in the system folder. I am not sure if this is because BF has not gone through the whole workflow yet or an omission on our side. Please feel free to add it on Github. I will also note it down and pass it to Mostapha to check. In the meantime, please find attached a VERY detailed decomposeParDict file. I took the liberty of setting it to 4 processors (the numberOfSubdomains value) and also selected (that is, uncommented) the scotch decomposition method. It's the easiest method to use since it is automatic and doesn't require any more input on how the domain is decomposed in the x, y, z directions (which would require you to change values in the attached file).
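For reference, a minimal decomposeParDict along the lines described above might look like this (4 subdomains, scotch method). This is a sketch, not the exact attached file:

```
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  4;

method              scotch;
```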
Now, the different folders created are simply snapshots of the current solution at the specific timestep. To control how often the solver saves, change the writeInterval number in the controlDict file. You can also change almost all of these values on the fly, while OF is running.
Finally, concerning the other errors from paraFoam: it seems paraFoam is somehow reading the initial condition names instead of the actual results from the solution files, and it doesn't like it.
Does this happen only when you open the case (i.e. at time 0) or does it also happen when you move to another timestep?
Also, are you using paraFoam, paraview or the paraFoam -builtin method?
The extension of the paraFoam file seems to be .foam, which means you are probably using the built-in viewer. That might be the issue, but I'm not sure.
Can you try running paraview, navigate to your case folder, open the .foam file and see if there is still an error?
Also, if it isn't much trouble can you zip one of the time folders and attach it here? I'd like to take a look at what's inside to check against what the error report says.
Once again thanks for testing!
Kind regards,
Theodore.…
bit:
Unable to load grasshopper.dll plug-in: Rhino version not specified.
I've also tried the current WIP grasshopper (0.7 rev 57) and I receive a slightly different error message:
Unable to load grasshopper.rhp plug-in: Rhino version not specified.
A similar thread: http://www.grasshopper3d.com/forum/topics/plugin-eror
…
Added by Koabi Brooks at 1:30pm on October 2, 2010
or of the rectangle
Here is a sketch.
I also found a script in the RhinoScript help, but I don't see how to turn it into a component for use in Grasshopper.
Could you possibly help me?
Thank you.
Marc
GetRectangle
Pauses for user input of a rectangle.
Syntax
Rhino.GetRectangle ([intMode [, arrPoint [, strPrompt1 [, strPrompt2 [, strPrompt3]]]]])
Parameters
intMode
Optional. Number. The rectangle selection mode. If not specified, all modes (0) are available. The rectangle selection modes are as follows:
Value  Description
0      All modes.
1      Corner. A rectangle is created by picking two corner points.
2      3-Point. A rectangle is created by picking three points.
3      Vertical. A vertical rectangle is created by picking three points.
4      Center. A rectangle is created by picking a center point and a corner point.
arrPoint
Optional. Array. A 3-D base point.
strPrompt1
Optional. String. The first prompt or message.
strPrompt2
Optional. String. The second prompt or message.
strPrompt3
Optional. String. The third prompt or message. The third prompt is used only with the 3-Point and Vertical modes.
Returns
Array
An array of four 3-D points that define the corners of the rectangle if successful. Points are returned in counter-clockwise order. See the image below for details.
Null
If not successful, or on error.
Example
Dim arrRect
arrRect = Rhino.GetRectangle
If IsArray(arrRect) Then
    Rhino.AddTextDot "0", arrRect(0)
    Rhino.AddTextDot "1", arrRect(1)
    Rhino.AddTextDot "2", arrRect(2)
    Rhino.AddTextDot "3", arrRect(3)
End If
t defined from the discussion of radiation exchange between urban surfaces and the sky in urban heat island research (See Oke's literature list below). It will be affected by the proportion of sky visible from a given calculation point on a surface (vertical or horizontal) as a result of the obstruction of urban geometry, but it is not entirely associated with the solid angle subtended by the visible sky patch/patches.
So, I think using the "geometry way" to approximate the Sky View Factor is not correct. The Sky View Factor calculation should be based on the first principle defining the concept: radiation exchange between the urban surface and the sky hemisphere:
(image extracted from Johnson, G. T., & Watson, 1984)
Therefore, I always refer to the following "theoretical" Sky View Factors calculated at the centre of an infinitely long street canyon with different Height-to-width ratios in Oke's original paper (1981) as the ultimate benchmark to validate different methods to calculate SVF:
So, I agree with Compagnon (2004) on the method he used to calculate SVF: a simple radiation (or illuminance) simulation using a uniform sky.
The following images are the results of the workflow I built in the procedural modeling software Houdini (using its python library) according to this principle by calling Radiance to do the simulation and calculation, and the SVF values calculated for different canyon H/W ratios (shown at the bottom of each image) are very close to the values shown in Oke's paper.
H/W=0.25, SVF=0.895
H/W=1, SVF=0.447
H/W=2, SVF=0.246
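These simulated values track the analytic view factor for the floor centre of an infinitely long canyon, SVF = cos(arctan(2H/W)), which follows from the line-element view-factor result Oke (1981) uses. A quick Python check (my own restatement of that formula, not the Houdini/Radiance workflow):

```python
import math

def canyon_svf(h_over_w):
    """Sky view factor at the centre of the floor of an infinitely long
    street canyon: each wall top sits at elevation angle beta with
    tan(beta) = H / (W/2), and the cosine-weighted view factor of the
    remaining sky band is cos(beta)."""
    return math.cos(math.atan(2.0 * h_over_w))

for hw in (0.25, 1.0, 2.0):
    print("H/W=%.2f  SVF=%.3f" % (hw, canyon_svf(hw)))
# H/W=0.25 gives 0.894, H/W=1 gives 0.447, H/W=2 gives 0.243 --
# all close to the simulated 0.895 / 0.447 / 0.246 above.
```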
It seems that the Sky View Factor calculated by the viewAnalysis component in Ladybug is not aligned with Oke's result for a given H/W ratio (GH file attached):
According to the definition shown in this component, I assume the value calculated is the percentage of visible sky, which is a geometric calculation (shooting evenly distributed rays from the sensor point towards the sky and calculating the ratio of rays not blocked by urban geometry?), i.e. the solid angle subtended by the visible sky patches, and it is not aligned with the original radiation-exchange definition of the Sky View Factor.
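The difference between the two definitions can be made concrete with a toy Monte Carlo sketch for the infinite-canyon case: weighting rays uniformly over solid angle gives the visible-sky fraction, while cosine-weighting them (as radiation exchange with a uniform sky does) reproduces Oke's view factor. The geometry and sampling below are my own simplification, not Ladybug's implementation:

```python
import math
import random

def svf_estimates(h_over_w, samples=200000, seed=1):
    """Two ray-based SVF estimates at the centre of an infinite canyon
    floor (canyon axis along y, walls at x = +/-W/2, height H).
    A unit direction (x, y, z) escapes the canyon iff z > (2H/W)*|x|.

    Returns (uniform, cosine): 'uniform' weights every solid-angle patch
    equally (visible-sky fraction); 'cosine' weights patches by
    cos(zenith), as radiation exchange with a uniform sky does."""
    k = 2.0 * h_over_w
    rng = random.Random(seed)
    hits_uniform = hits_cosine = 0
    for _ in range(samples):
        # Uniform over the hemisphere's solid angle: z ~ U(0, 1).
        phi = rng.uniform(0.0, 2.0 * math.pi)
        z = rng.random()
        x = math.sqrt(max(0.0, 1.0 - z * z)) * math.cos(phi)
        if z > k * abs(x):
            hits_uniform += 1
        # Cosine-weighted hemisphere sampling: z = sqrt(U).
        phi = rng.uniform(0.0, 2.0 * math.pi)
        z = math.sqrt(rng.random())
        x = math.sqrt(max(0.0, 1.0 - z * z)) * math.cos(phi)
        if z > k * abs(x):
            hits_cosine += 1
    return hits_uniform / samples, hits_cosine / samples

visible_fraction, view_factor = svf_estimates(1.0)
print("H/W=1: visible-sky fraction ~ %.3f, view factor ~ %.3f"
      % (visible_fraction, view_factor))
```

For H/W = 1 the cosine-weighted estimate lands near Oke's 0.447, while the plain visible-sky fraction comes out noticeably lower (about 0.30, matching the closed form 1 - (2/pi)·arctan(2H/W)), which is exactly why the two quantities should not share one name.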
I'd suggest calling this geometrically calculated ratio of visible sky the "Sky Exposure Factor", which is "true" to its definition and way of calculation (see the paper on the Sky Exposure Factor below), so as to avoid confusion with the Sky View Factor based on radiation exchange as discussed in the urban climate literature.
Appreciate your comments and advice!
References:
SVF: definition based on first principle
Oke, T. R. (1981). Canyon geometry and the nocturnal urban heat island: comparison of scale model and field observations. Journal of Climatology, 1(3), 237-254.
Oke, T. R. (1987). Boundary layer climates (2nd ed.). London ; New York: Methuen.
Johnson, G. T., & Watson, I. D. (1984). The Determination of View-Factors in Urban Canyons. Journal of American Meteorological Society, 23, 329-335.
Watson, I. D., & Johnson, G. T. (1987). Graphical estimation of sky view-factors in urban environments. International Journal of Climatology, 7(2), 193-197. doi: 10.1002/joc.3370070210
Papers on SVF calculation:
Brown, M. J., Grimmond, S., & Ratti, C. (2001). Comparison of Methodologies for Computing Sky View Factor in Urban Environments. Los Alamos, New Mexico, USA: Los Alamos National Laboratory.
SVF calculation based on first principle:
Compagnon, R. (2004). Solar and daylight availability in the urban fabric. Energy and Buildings, 36(4), 321-328.
paper on Sky Exposure Factor:
Zhang, J., Heng, C. K., Malone-Lee, L. C., Hii, D. J. C., Janssen, P., Leung, K. S., & Tan, B. K. (2012). Evaluating environmental implications of density: A comparative case study on the relationship between density, urban block typology and sky exposure. Automation in Construction, 22, 90-101. doi: 10.1016/j.autcon.2011.06.011
…
he example file to this file so you can give it a try with any version of Honeybee that you're already using. The only requirement is to have OpenStudio installed, as the component uses OpenStudio libraries to parse gbXML files. If you're using the latest version available on GitHub, the component is also available under the WIP tab.
Why?
The main purpose of developing this component is to save time and effort when importing Revit models for energy and daylight analysis. It bothers me to see a lot of smart people spend a lot of time coming up with solutions just to get the geometry from Revit into Honeybee for analysis. This component does not solve all the issues, but it is a first step forward. In an ideal world, the future version of Honeybee, which works under both DynamoBIM and Grasshopper, should address this issue, but that can take some time to be fully ready!
How?
To use this component you need to export your Revit model as gbXML and then use the file path to load the file into Grasshopper. There are several resources available online on how to prepare the analytical model in Revit and export the gbXML file. Here is an image of importing the Revit 2017 sample model using the default settings. As you can see, the imported model will only be as good as your original gbXML file from Revit.
What can be improved?
Well, there are several items that can be improved and they are mostly not on us. To get it started I add what I think are the 3 main shortcomings and my thoughts on how they can be addressed in the future. Feel free to add what you think needs to be added to this list in the comments section.
1. Revit analytical models, and as a result gbXML files, are by design not intended to be clean. Watch this presentation from Autodesk University to see the logic behind this approach, which, in short, is that cleanliness doesn't matter for a large-scale, early-stage energy model. Well, this is quite a problem for the studies you can do with Honeybee, including but not limited to daylight and comfort analysis.
The best solution that I can think of, until Autodesk fixes their exporter, is to use Revit Rooms and Spaces and generate a clean model from scratch. We have already tried this approach in Revit, but since the Revit API doesn't provide access to Room openings we had a very hard time getting it to work.
That's why I opened an idea on Revit Ideas to get over this issue. With your support we already have 81 votes, but it hasn't been enough to make them consider the idea for an official review. If you haven't voted already and you think this will be a helpful feature, take a moment and vote so we can have it implemented at some point in the future.
2. There is no way (that I know of) to export only part of the model. The way gbXML export is set up in Revit is to export the whole model at once. As a result, if you have a huge model with 100 rooms and you want to get one of them into Honeybee using this component, you have to export the whole model, which can take some time, and then import it all back into Grasshopper. To partially address this issue, I added an input to the component that accepts a list of names for the rooms you want loaded into Grasshopper. You can use the name of the room/space in Revit as an input for the component.
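The room-name filter can be illustrated with plain XML parsing. The fragment below is a toy stand-in (real gbXML is namespaced and far richer), so the element names and ids here are simplified assumptions, not the component's actual parsing code:

```python
import xml.etree.ElementTree as ET

# A toy stand-in for a gbXML fragment; the filtering idea is the same.
GBXML = """<gbXML>
  <Campus>
    <Building>
      <Space id="sp1"><Name>Room 101</Name></Space>
      <Space id="sp2"><Name>Room 102</Name></Space>
      <Space id="sp3"><Name>Lobby</Name></Space>
    </Building>
  </Campus>
</gbXML>"""

def load_spaces(xml_text, wanted_names=None):
    """Return (id, name) for every Space, or only those whose Name is in
    wanted_names, mirroring the component's room-name filter input."""
    root = ET.fromstring(xml_text)
    spaces = []
    for space in root.iter("Space"):
        name = space.findtext("Name")
        if wanted_names is None or name in wanted_names:
            spaces.append((space.get("id"), name))
    return spaces

print(load_spaces(GBXML, ["Room 101", "Lobby"]))
```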
3. The component doesn't import adjacencies, loads, schedules or HVAC systems. I wasn't able to export a gbXML file from Revit with any of this data except the adjacencies, but even if you can do that, the component currently only imports geometries and constructions. I hope we get the API access mentioned in #1 so we don't have to use the gbXML approach at all, but if that takes a very long time then we will add these features to the component.
Happy 2017!
Mostapha…