r "virtual partitions" as follows:
What I mean by "air walls" here is derived from the description in the E+ documentation under the header "Air wall, Open air connection between zones" (page 17, http://apps1.eere.energy.gov/buildings/energyplus/pdfs/tips_and_tricks_using_energyplus.pdf).
As I understand it, the term "air wall" used in E+ here refers to something like a "boundary condition" between adjacent interzone heat transfer surfaces, not a kind of "construction or material" (like the air space resistance or air gaps within a wall or a double-glazed window).
The main purpose of introducing the "air wall" is to simulate or approximate the airflow/convection/natural ventilation effect between multiple thermal zones that are connected by a large opening.
In my previous tests using HBzones and GB, I managed to create a gbXML file that can be successfully imported into DB (without assigning any constructions within HB). The adjacency condition is recognized automatically by DB even when I did not use the "Solve adjacencies" component in HB: shared surfaces between multiple thermal zones are automatically recognized by DB as "internal - partition" (which are standard partitions, not virtual partitions).
In order to create/approximate a "virtual partition", I need to manually draw a "hole" in the standard partition surface (fig. 1 & 2). Again, the reason we want to use "virtual partitions" (or "air walls") is that they allow airflow between multiple thermal zones connected by large openings, so we can get a different temperature for each of the subdivided thermal zones that compose a larger thermal zone.
My question is: is there a possible way to simulate/approximate this kind of "virtual partition" (or "air wall") in HBzones or in GB? If so, I would like to test whether DB recognizes it or not. Actually, we expect that no manual operations (like drawing a "hole" in the standard partition surface) should be needed in DB, because of an automatic optimization loop.
Thank you!
Best,
Ding
fig.1
fig.2
…
ers and researchers, programmers and artists, professionals and academics who come together for 4 days of intense collaboration, development, and design.
The sg2012 Workshop will be organised around Clusters. Clusters are hubs of expertise. They comprise people, knowledge, tools, materials and machines. The Clusters provide a focus for workshop participants working together within a common framework.
Clusters provide a forum for the exchange of ideas, processes and techniques, and act as a catalyst for design resolution. The Workshop is made up of ten Clusters that respond in diverse ways to the sg2012 Challenge, Material Intensities.
Applicants to the sg2012 Workshop will select their preferred cluster from the following:
Beyond Mechanics
Micro Synergetics
Composite Territories
Ceramics 2.0
Material Conflicts
Transgranular Perspiration
Reactive Acoustic Environments
Form Follows Flow
Bioresponsive Building Envelopes
Gridshell Digital Tectonics
More information about the Workshop and Clusters can be found here:
http://smartgeometry.org/index.php?option=com_content&view=article&id=116&Itemid=131
The application process will close on January 15th, 2012.
Full Fee $1500
Reduced Fee $750
Scholarship Fee $350
Fees include attendance to both the workshop and conference from March 19th-24th.
Reduced Fees and Scholarships are available only for Academics, Students and Young Practitioners, and are awarded through a competitive peer review process.
sg2012 takes place from 19-24 March 2012 at EMPAC (http://empac.rpi.edu/) and is hosted by Rensselaer Polytechnic Institute in Troy, upstate New York USA. The Workshop and Conference will be a gathering of the global community of innovators and pioneers in the fields of architecture, design and engineering.
The event will be in two parts: a four day Workshop 19-22 March, and a public conference beginning with Talkshop 23 March, followed by a Symposium 24 March. The event follows the format of the highly successful preceding events sg2010 Barcelona and sg2011 Copenhagen.
sg2012 Challenge Material Intensities
Simulation, Energy, Environment
Imagine the design space of architecture was no longer at the scale of rooms, walls and atria, but that of cells, grains and vapour droplets. Rather than the flow of people, services, or construction schedules, the focus becomes the flow of light, vapour, molecular vibrations and growth schedules: design from the inside out.
The sg2012 challenge, Material Intensities, is intended to dissolve our notion of the built environment as inert constructions enclosing physically sealed spaces. Spaces and boundaries are abundant with vibration, fluctuating intensities, shifting gradients and flows. The materials that define them are in a continual state of becoming: a dance of energy and information.
Material potential is defined by multiple properties: acoustical, chemical, electrical, environmental, magnetic, manufacturing, mechanical, optical, radiological, sensorial, and thermal. The challenge for sg2012 Material Intensities is to consider material economy when creating environments, micro-climates and contexts congenial for social interaction, activities and organisation. This challenge calls for design innovation and dialogue between disciplines and responsibilities.
sg2010 Working Prototypes strove to emancipate digital design from the hard drive by moving from the virtual to the actual in wrestling with the tangible world of physical fabrication. sg2011 Building the Invisible focused on informing digital design with real world data. sg2012 Material Intensities strives to energise our digital prototypes and infuse them with material behaviour. They have the potential to become rich simulations informed by the material dynamics, chemical composition, energy flows, force fields and environmental conditions that feed back into the design process.
More information can be found at http://www.smartgeometry.org
Follow us on Twitter at http://twitter.com/smartgeometry…
Added by Shane Burger at 12:29pm on December 13, 2011
string may contain any number of curly bracket pairs with non-negative integers in them:
"When {0} brings back {1} days and {2}"
The number inside the brackets refers to the data to insert at that location. In effect, {x} is a placeholder for actual data. The data inserted into a specific bracket pair is the data supplied in the arguments that follow the format string. {0} refers to the first item, {1} to the second, {2} to the third, and so on ad infinitum.
If I supply some data the entire expression may look like this:
Format("When {0} brings back {1} days and {2}", "Spring", "blue", "fair")
which will result in the string "When Spring brings back blue days and fair".
If the data you're inserting is a number (or a date) then you have additional formatting flags that you can use. These additional flags appear after the placeholder index integer, separated by a colon.
Format("Pi = {0:0.00} ({0:0.000000})", Pi)
The :0.00 means the number will be formatted with two decimal places. The other flag enforces six decimal places, resulting in: "Pi = 3.14 (3.141593)"
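For readers more at home in Python, the same indexed-placeholder idea can be sketched with Python's str.format, which uses the same {0}/{1} indices. Note this is a Python analogue, not the .NET Format function itself; the numeric flags differ slightly (Python's ".2f" plays the role of .NET's "0.00"):

```python
import math

# Indexed placeholders: {0} is the first argument, {1} the second, and so on.
s = "When {0} brings back {1} days and {2}".format("Spring", "blue", "fair")
print(s)  # When Spring brings back blue days and fair

# Format flags follow the index after a colon; the same index may be reused.
pi_s = "Pi = {0:.2f} ({0:.6f})".format(math.pi)
print(pi_s)  # Pi = 3.14 (3.141593)
```

Reusing the index {0} twice, as in the Pi example, works the same way in both environments: the same argument is inserted at each placeholder with its own formatting.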
--
David Rutten
david@mcneel.com
Poprad, Slovakia…
Added by David Rutten at 3:00pm on February 3, 2013
n the inability to be a real-life member within a parametric workflow (same kind of issue with Evolute Tools Pro).
As regards strictly AEC matters, the main problem with GH is Rhino itself (not feature/constraint driven, not a solid modeler, not AEC oriented by any means, and not biased towards assembly/component modeling). Other than that, and given the known GH inability to handle/manage blocks/nested blocks at bake time ... well ... I can hardly see how "to set up work flows between different tools such as ..."
I'll post soon 5 - rather "trivial" - AEC cases that are totally undoable (shop drawing level) with anything other than CATIA (or NX).
BTW: since international practices grow and grow in numbers these days (and individuals are dead) I can't see any realistic limitation for creating dedicated teams (kinda like Frank Gehry did) that can easily deal with the "extremely heavy" nature of the beast.
BTW: this is a job ad (Project Architect role) from one of the biggest US AEC practices (rather a corporation, he he)
How things change these days ... don't you agree?
best, Peter
…
nette for years.. but without the nice GUI. It also allows combining constraints solving to be part of the DAG.
What is parametrics? Or "parametric associative", as GC has been described. Can't remember. History or procedural modeling? Even constraints solving or rules-based solving all use parameters. Is it generative or merely parametric? I guess the difference is that a parametric door does not generate other parametric doors?
BIM has opened the door to a more data-centric view and manipulation of the design model. To old skoolers a wall is a linear construct that can be abstracted into parameters... beginning and end points of the wall in plan + height and thickness. But start adding other stuff and the need to interoperate with others, and things get problematic.
Pretty soon, all those abstractions (parametric or otherwise) need to be structured, and you end up talking about schemas etc. to control the format of the parameters, using rules as checks or constraints... so that your parameters can interface with parameters from others without causing data quality issues. It all starts to feel very much like database thinking.
So, I would say parametrics as GH does it is more free-form and ad hoc, and at some point, if it goes BIM, the parametrics will need to be (re)structured.
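As a toy illustration of the wall example above (all names are invented for this sketch, not any real BIM schema or IFC entity), the "wall as parameters" abstraction plus a schema-style check might look like this:

```python
from dataclasses import dataclass

# The "old skool" abstraction: a wall reduced to a handful of parameters.
@dataclass
class Wall:
    start: tuple      # (x, y) of wall start in plan
    end: tuple        # (x, y) of wall end in plan
    height: float
    thickness: float

# A schema-like rule set acting as checks/constraints, so the parameters can
# be exchanged with other parties without causing data quality issues.
def validate(wall: Wall) -> list:
    errors = []
    if wall.start == wall.end:
        errors.append("zero-length wall")
    if wall.height <= 0:
        errors.append("height must be positive")
    if wall.thickness <= 0:
        errors.append("thickness must be positive")
    return errors

w = Wall(start=(0.0, 0.0), end=(5.0, 0.0), height=3.0, thickness=0.2)
print(validate(w))  # [] -> the wall passes all checks
```

The point of the sketch is the shift it illustrates: once "other stuff" is added and the data must interoperate, the free-form parameters acquire a structure and a rule layer, which is exactly the database-like thinking described above.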
BIM is dependent on IFC development which is not very fast. IFC4 is only beginning to think about parametrics and 'Design Transfer'.
…
humacher (Zaha Hadid) and in fact most issues of AD (Architecture Design)
The Politics of Parametricism: Digital Technologies in Architecture by Matthew Poole, which is kind of a follow up
In my opinion, learning Grasshopper will be enough; there is no need to learn Python to use it successfully. It is better to have a deep understanding of Grasshopper and what it can do than to try to learn too many things at once. That will help you apply the principles to other code later, rather than the other way round (i.e. learning the concepts first and then going into Grasshopper). The best way to learn the concepts is by applying and trying them in a tool like Grasshopper.
I absolutely recommend that you visit a Grasshopper workshop, as that will teach you a lot more than YouTube videos. If you can't visit a workshop, then I recommend the rese.arch video series on Grasshopper. They're really in-depth and go from a simple introduction to very advanced material. Ideally, you should buy and complete all of them.
Also, there is of course Dynamo and its integration with Revit and BIM, which is something to look at, although Grasshopper covers all of that as well, at least with its integration with ArchiCad. Autodesk products are more common around the world, though.
Be aware that a lot of the power of Grasshopper is also in the plugins you can get for it, like Kangaroo (physics simulation), Ladybug & Honeybee (environmental analysis), Karamba (finite element analysis), Hoopsnake or Anemone (looping) and many, many more. You can find them at food4rhino.com.
Good luck!…
d the fact that one pipe goes out and one goes in, that the surface normal direction is opposite for the two surfaces? Based on an earlier thread, you should know why by now. The two curves have opposite directions (again!); see the white arrows using Rhino 'Analyze | Direction'?
As before, you can fix that by flipping one curve to match the other. HOWEVER, you connected your curves directly to the 'Divide' components instead of using 'Crv' geometry params - bad form. And as before, you "fixed it" by reversing the list of starting points ('S' input to 'BiArc'). Better like this - 'Crv' params are internalized, no need for Rhino file:
Well, well! That didn't fix the opposite surface normals after all! Trust me, though, using geometry params and being conscious about matching curve directions is "best practice". But I haven't lofted 'BiArc' curves for a while, it's late, and I want to move on. OH! I just noticed that you reversed the 'Z' direction for one half of the 'BiArc' - that explains it:
Moving on... You've basically got it, though I would do it differently - same result, like this:
I haven't really explained surface normal vectors - can you figure it out from here? One more little wrinkle (Normal_2017Mar17b.gh):
…
Added by Joseph Oster at 12:03am on March 18, 2017
ported to Rhino and "set" in Grasshopper, I trim both surfaces from their rectangular bases so that when sDivide is used it creates and distributes the same number of points on each surface. But here are the problems:
a) If I use the "trimmed" surfaces with SrfGrid, it errors with the warning: "A point in the grid is null. fitting operation aborted". I'd learned this was caused by "nulls" replacing position data items when the rectangular grid (surface base) was trimmed away. So I used Clean Tree, which worked, removing all nulls, then Shift Paths / Flip Matrix to create line-endpoint pairs for Polyline / Evaluate Curve. I flattened the last Flip Matrix, placing all data items in one source for SrfGrid, like in the working Untrim/CopyTrim definition. This time,
b) SrfGrid errored with: "The UCount value is not valid for this amount of points". So I substituted a numeric slider (value 356) in the Addition B param and tested its range until a valid UCount was found. Then SrfGrid fitted a surface through the points, BUT
d) those SrfGrid surfaces are extremely deformed, even though the points preceding it from Evaluate Curve are accurate. SEE def: "3b-RGH_SurfaceBlend.gh". AND
a2) if I use Untrim with CopyTrim, then SrfGrid works, but since the Joker's limbs WILL be in different surface positions, the blends between the Arm (for example) will rise from its relatively FLAT position on the untrimmed Source surface to the Arm on the Target surface, rather than morphing from the corresponding Arm position on the Source surface. See def: "4-RGH_SurfaceBlend.gh".
So please let me know:
1) how to produce accurate surfaces from SrfGrid in def "3b-RGH_SurfaceBlend.gh" (NOTE: BOTH these defs contain 2 identical, "internalized" surfaces, but if def 3b can be made to work, it will also work with dissimilar surfaces);
2) which component to use, or how else to determine the correct UCount value for a specified amount of points (i.e. 155), re: the SrfGrid error "The UCount value is not valid for this amount of points";
3) how else to force SrfGrid to work with trimmed surfaces? AND
4) how to force the inter-surface, point-blend correspondence polylines (PLine) to be connected between correctly corresponding positions (limbs) on the surfaces?
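On question 2: assuming SrfGrid requires the total point count to equal UCount × VCount (which is what the "UCount value is not valid for this amount of points" error suggests), the valid UCount values are simply the divisors of the point count. A small sketch (a hypothetical helper, not an actual Grasshopper component) can enumerate them:

```python
def valid_ucounts(point_count: int) -> list:
    # If the grid must satisfy point_count == UCount * VCount with both
    # counts at least 2, then any valid UCount is a proper divisor of the
    # total number of points.
    return [u for u in range(2, point_count) if point_count % u == 0]

print(valid_ucounts(155))  # [5, 31] -> 155 = 5 * 31, so only 5 and 31 work
```

This also explains why hunting with a slider is so hit-and-miss: a count like 155 has only two workable UCounts, and a prime point count would have none at all.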
Really appreciate all help, definitions and the kind generosity common to this knowledgeable membership,
Cheers!,
Jeff…