tions or components.
Participants will learn the concepts of object-oriented programming and the essential syntax of C# so they can begin extending CAD toolsets themselves. The workshop will focus on introducing the .NET language C# and the RhinoCommon Software Development Kit (SDK).
Topics
- use of the Script Component within Grasshopper
- explanation of the .NET Framework
- introduction to the RhinoCommon SDK
- basics of imperative / object-oriented programming
- data types, operators, properties
- variables, arrays, lists, enumerations
- methods
- objects, classes
- control structures: conditional statements (if, else, switch)
- control structures: loops (for, foreach, while, do)
- walk-through of iterative and recursive code samples
- use of the RhinoCommon geometry class library: creation, sorting and editing of geometry (points, vectors, curves, surfaces)
- adding (baking) geometry to the active Rhino 3DM Document, including attributes (Name, Layer, Colors etc.)
- introduction to the Integrated Development Environment MS Visual Studio Express Edition
- compiling code to dll/gha files (plug-ins) / making your own Grasshopper custom components
Grasshopper is developed on the .NET software platform and, like the CAD program Rhinoceros, can be extended with "RhinoCommon", a Software Development Kit.
This course is aimed at designers, architects, engineers and technicians who are already familiar with the graphical algorithm editor "Grasshopper3d" and the CAD program "Rhinoceros", and who want an introduction to programming geometry.
The Grasshopper II course covers the following fundamentals:
Getting to know the Script component
Explanation of the .NET Framework
Introduction to the RhinoCommon SDK
Basics of imperative / object-oriented programming
Data types, operators, properties
Variables, arrays, lists, enumerations
Methods
Objects and classes
Control structures: conditional execution, loops
Practical iterative and recursive code examples for generative design using the RhinoCommon geometry class library: creating, sorting and editing point and vector geometry; creating surfaces and meshes; baking geometry into the Rhino 3DM document, including attributes (Name, Layer, Color)
Introduction to the MS Visual Studio Express Edition development environment
Compiling program extensions (plug-ins) as custom components
Details, registration:
www.vhs-stuttgart.de
Trainer: Peter Mehrtens
Course duration: 3 days × 8 h
Friday, 21.02.2014, 9:00-17:00; Saturday, 22.02.2014, 9:00-17:00; Sunday, 23.02.2014, 9:00-17:00. Location: VHS Stuttgart, Fritz-Elsas-Str. 46/48
Course fee: €510.00…
ssible and easy to use. The course starts from the basics of Arduino programming and builds up to the interaction between a physical object and an information input. Tutor: Gianpiero Picerno Ceraso
Program: Day 1: introduction to physical computing, digital and analog inputs, the basics of the programming language, applied examples: LEDs, buttons, photoresistors, a servo motor, temperature, flex and motion sensors, potentiometers.
Day 2: Arduino Ethernet, using a relay for high loads, an accelerometer, introduction to Processing, interaction between Arduino and Processing, introduction to Grasshopper and Firefly and their interaction with Arduino.
Course hours: 10:00-13:00 and 14:00-17:00 (lunch break 13:00-14:00). Cost: €150 + VAT. Deadline: 13 March. Minimum number of participants: 3.
To register, write to info@medaarch.com specifying your first name, surname, email, phone number and the name of the course you are interested in. After sending the pre-registration form, participants will receive an email with all the payment details.
To follow the Arduino cluster you need to install the Arduino 1.0.5 software from the following link: http://arduino.cc/en/Main/Software#.Ux3hQj95MYE, taking care to download the version for your operating system (Windows 32 or 64 bit, or Mac OS).
Software needed only for part of the course: Processing 2.1.1 https://processing.org/download/?processing
Rhino 5 http://www.rhino3d.com/it/download Grasshopper for Rhino 5 http://www.grasshopper3d.com/page/download-1 Firefly http://fireflyexperiments.com/
The cluster is part of a full calendar of training activities organized by Medaarch for the 2013-2014 year.…
you will need to deal with all of the curves that intersect the boundary curve, but you will also need to sort through all of the circles inside, because the planar surface algorithm won't sort those out for you. The good news is that because you are using circles and linear segments, you can use "pure" geometry equations for some of these intersections instead of relying on NURBS curve "physical" intersections. In the end this means faster and also more reliable intersections (especially with the circles).
Method 1: Dealing with everything as a physical curve...
First things first, I guess the "easiest" way to do this would be to translate everything into an OnCurve derived class, and then use the IntersectCurve method to find the intersections. You will need to sort through the resulting ArrayON_XEVENT to find the parameter of each intersection. There should always be 2 intersections, and you're always going to be interested in the intersections on the circle, not the boundary curve.
To trim the curves, you'll want to use the Split method along with one of the curve parameters that you retrieved from the intersection. The only issue is that the Split method gets a bit complicated on closed curves. You could either split at both parameters from the intersection results and then sort through the 3 resulting curves to join the two that you need, or move the start point of the circle to one of the intersection points, translate the other intersection point to the new curve parameter (i.e. the parameter will be a different number, but it will be physically in the same place), and then split with that new parameter.
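The parameter bookkeeping for splitting a closed curve at two points can be sketched in plain Python, using a circle's angular parameter as a stand-in for a real curve domain (the function names are illustrative, not part of any Rhino SDK):

```python
import math

def split_closed_params(t0, t1, period=2 * math.pi):
    """Split a closed curve's parameter domain at t0 and t1.

    Returns the two resulting arcs as (start, end) parameter intervals,
    with end > start (the arc crossing the seam wraps past one period).
    """
    t0, t1 = sorted((t0 % period, t1 % period))
    arc_a = (t0, t1)             # arc between the two cut parameters
    arc_b = (t1, t0 + period)    # complementary arc, crossing the seam
    return arc_a, arc_b

def arc_length_on_circle(interval, radius):
    """Arc length of a parameter interval on a circle of given radius."""
    return (interval[1] - interval[0]) * radius
```

For example, splitting a unit circle at pi/2 and 3*pi/2 yields two half arcs, each of length pi; the second interval runs past 2*pi, which is the "move the seam" trick described above expressed in parameters.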
Method 2: Try and work with the circles as circles
Because you can tell if a circle intersects something by seeing if the distance to its center point is less than the radius of the circle, this might be a quicker way to go. If you have the boundary curve as an OnCurve derived class, then you can use the GetClosestPoint method with the center points of all the circles. The nice thing is that besides the 3D point you pass in and the curve parameter you get out, you have the option of supplying a maximum distance. If you do supply that value (use the radius of the circles), then you'll only get a result when the distance is less than or equal to that value, in which case there is an intersection.
To go even further, you can treat each segment of the boundary curve as a line and find the closest point/distance to that. That's maybe more complex than you're looking to go, but speed-wise it might just be worth it. Take a look at the following link for more code/discussion on the subject.
http://www.codeguru.com/forum/showthread.php?t=194400
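As a rough illustration of the closest-point test described above, here is a pure-geometry sketch in plain Python (the function names are made up for the example; a real implementation would use the SDK's closest-point methods):

```python
import math

def point_segment_distance(p, a, b):
    """Distance from 2D point p to the segment from a to b (all tuples)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:  # degenerate segment: distance to the point a
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment's line, clamping to stay on the segment
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def circle_hits_segment(center, radius, a, b):
    """A circle crosses or touches the segment iff the closest distance
    from its center to the segment is no more than its radius."""
    return point_segment_distance(center, a, b) <= radius
```

This is exactly the "maximum distance" shortcut: no curve-curve intersection is ever computed, only one distance per segment per circle.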
Part 2: Circle-Circle intersections
If you're going to want to make a planar surface out of those circles and the boundary curve, then you'll need to resolve all of the intersections that you have there. Again, this is probably best taken care of by doing distance tests between the center points of all the circles and seeing if that distance is less than the radius you're using. After you've found circles that intersect, you can try intersecting the curves using the same method mentioned above, or even manually generating the intersection with some trig, but ultimately creating a final result might take a bit of work, especially where more than two circles intersect. The "lazy" way out of this is what's used by the curve boolean command: take each individual curve, make a planar surface from it, and use standard Rhino booleans to get the result. Luckily you're looking for the union of all those areas, which is the easiest case to create and deal with. After you create the planar surface of each one (RhUtil.RhinoMakePlanarBreps), you can use either RhUtil.RhinoBooleanUnion or the more specialized version, RhUtil.RhPlanarRegionUnion. Note that RhPlanarRegionUnion only takes 2 breps at a time and needs the plane of the intersection.…
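For the "manually generating the intersection with some trig" route mentioned above, a self-contained sketch of circle-circle intersection looks like this (plain Python with a hypothetical function name, no SDK calls):

```python
import math

def circle_circle_intersections(c0, r0, c1, r1, tol=1e-9):
    """Intersection points of two circles in the plane.

    Centers are (x, y) tuples; returns a list of 0, 1, or 2 points.
    """
    x0, y0 = c0
    x1, y1 = c1
    d = math.hypot(x1 - x0, y1 - y0)
    # Separate, nested, or coincident circles: no discrete intersections
    if d > r0 + r1 + tol or d < abs(r0 - r1) - tol or d < tol:
        return []
    # Distance from c0 to the chord connecting the intersection points
    a = (d * d + r0 * r0 - r1 * r1) / (2 * d)
    h = math.sqrt(max(r0 * r0 - a * a, 0.0))
    # Midpoint of the chord, along the line between the centers
    mx = x0 + a * (x1 - x0) / d
    my = y0 + a * (y1 - y0) / d
    if h < tol:  # tangent circles: a single touch point
        return [(mx, my)]
    ox = h * (y1 - y0) / d
    oy = h * (x1 - x0) / d
    return [(mx + ox, my - oy), (mx - ox, my + oy)]
```

The same distance `d` computed here doubles as the cheap overlap pre-test: if `d` exceeds the sum of the radii, there is nothing to intersect.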
as the design table? I think this could be 'drawn' and constrained in Inventor in a lot less time. I know the GH model would have a lot of flexibility, but in this case, what can you do with it that wasn't provided by an Inventor model?
Only the 27 lines mentioned were modeled in Rhino, the rest is modeled with GH.
The 5 hrs involved thinking about the approach, defining vertical lines, tilts, elevations, pitch of the roof, intersections.
Once I had decided what my approach would be, and tested the logic with those first lines, points and data path arrangements, it only took one more hour to get to this:
Which is actually quite fast, compared to MCAD workflows.
If you already have components (columns, beams, etc.) modeled and ready to drop into a project, of course it is lightning fast to model simple projects like this example.
I am not as much interested in those situations, because improving efficiency is straightforward and obvious.
I'm more interested in situations where there are no pre-defined families of objects, in which case you need to start from scratch.
The GH model I'm showing is modeled from scratch, except for the 27 lines in Rhino.
Here's one obvious advantage to modeling with GH: once the definition is set up, it's virtually effortless to change inputs and alter the overall design. Here's an example: let's say we wanted to extend the roof 3 more units, curling away from the original direction.
Plan view before:
And after:
An MCAD app will also allow you to do this, as long as the location of additional elements follows the existing geometric method of definition. What happens if you want to completely change the way you locate columns, roof slope, or intersection points?
In MCAD, you'll need to re-model the underlying geometry, which will take the same effort as the first round. In GH, this process is not only much faster, it's open to algorithmic approaches, Galapagos, etc., and it just takes some simple re-wiring to have all downstream elements associate themselves with this new geometric definition.
For instance, here's the same definition applied to two curves, which are divided in GH; the resulting points are used as starting points for lines directed along the curve normals.
This is not so easy to do in MCAD.…
Added by Santiago Diaz at 7:55pm on February 24, 2011
e. (C1 or C0)
In the case of C0: I have four other cases
In the case of C1 I have three other cases
If C0:
1 - Pt1 = Pt2
2 - Pt1 = PTC1
3 - Pt2 = PTC2
4 - No equal
If C1:
1 - Pt1 = PTC1
2 - Pt2 = PTC2
3 - No equal
It's becoming difficult to manage these conditions..
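If I read the cases correctly, one way to tame conditions like these is to fold them into a single classification function with an explicit priority order. A hypothetical sketch (the names and exact subcases are my reconstruction, not code from the post):

```python
def classify(case, pt1, pt2, ptc1, ptc2):
    """Map the nested point-equality conditions to a single label.

    Checks run in priority order, so each input falls into exactly
    one subcase per continuity case (C0 or C1).
    """
    if case == "C0":
        if pt1 == pt2:
            return "C0: endpoints coincide"
        if pt1 == ptc1:
            return "C0: Pt1 matches PTC1"
        if pt2 == ptc2:
            return "C0: Pt2 matches PTC2"
        return "C0: no match"
    if case == "C1":
        if pt1 == ptc1:
            return "C1: Pt1 matches PTC1"
        if pt2 == ptc2:
            return "C1: Pt2 matches PTC2"
        return "C1: no match"
    raise ValueError("unknown continuity case: %r" % case)
```

Collapsing the branches into one function like this makes each subcase testable on its own, instead of scattering the logic across several components.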
Thanks…
Added by Rémy Maurcot at 10:38am on April 18, 2012
n fact) according to a vast variety of "modes" PLUS the required clash detection (ALWAYS via trigonometry). In plain English: outline any collection of Breps and "apply" a truss that is topologically sound (planarization in the case of quads etc. is an added constraint). PLUS outline/solve what comes "next" after that truss (like the planar glazing "add-on" brackets of yours [the ones that need redesign, he he], or some roofing/facade skin system [secondary supports, corrugated sheet metal, insulation, final cladding, dogs and cats])
2. Imagine doing this in real life (nothing to do with "abstract" formations of "lines" or "shapes" or whatever). This means primarily adopting a BIM umbrella: in plain English AECOSim, Revit or Allplan (I'm a Bentley man so I use AECOSim + Generative Components). This also means using in parallel a top MCAD app for 1:1 details, FEA/FIM and the vast paraphernalia required for real-life studies destined for real-life projects (made with real-life money by real-life people). My choice: CATIA/Siemens NX.
3. What to send to Microstation (if not using Generative Components, that is) and/or CATIA? In what "state"? To do what exactly? For instance even if you could design this feature driven tensile membrane anchor custom node in Rhino (you can't) it could be 100% useless in CATIA:
4. Imagine masterminding ways to send them nested instance definitions of ... er ... a coordinate system (all you need). In plain English: since it is utterly pointless to send them nested blocks that can't be parametrically controlled (variations/modifications/PLM management/BOM/specs etc. etc.)... send them simply the "instructions" to place coordinate systems of components that ARE parametrically designed within Microstation and/or CATIA (the classic feature-driven design approach, blah blah). So GH solves topology and all (working on data imported via, say, Excel sheets related to sizes of components etc.) and sends Microstation simply this (a myriad of "this", actually):
I do hope that the gist of the "method" (the ONLY way to invite GH to the party) is clear.
best, Peter…
use I don't agree with the practice of using site EUI as a metric to evaluate the thermodynamic performance, environmental impact, or monetary value of a building. I disagree with this practice for the same reason that there are no "totalThermalLoad" and "thermalLoadBalance" for simulations run with full HVAC. I can summarize these reasons in the following way:
When we run a simulation with ideal air loads, the heating/cooling values we get are THERMAL ENERGY that is directly added to or removed from the zone. In this way, we can draw a rough parallel between these two types of energy since they are generally of a similar type and quality. As such, I am ok with adding them together to get total thermal load or subtracting them to get a sense of thermal load balance.
However, when we run a simulation with full HVAC, the heating/cooling values that we get are usually HEATING FUEL ENERGY and ELECTRICITY respectively. Fuel energy and electricity are fundamentally two different types and qualities of energy. To cite the second law of thermodynamics, the exergy (or the capacity to do work) of electricity is much greater than that of fuel. This is evident in the fact that, to produce a given unit of electricity, I often have to burn at least 3 units of fuel energy (though this can be much more for inefficient plants). With each step in a power plant - making steam, turning a turbine, turning a generator - there are significant energy losses. This difference in exergy is also evident in the fact that there are so many more things that I can do directly with a unit of electricity than I can do with the same unit of fuel energy. I can use electricity to directly refrigerate, produce light, or power a motor just as easily as I can use it to cook, produce hot water, or heat a space. While I can cook, make hot water, or heat a space directly with fuel energy, refrigeration and lighting are much more difficult. For this reason, I do not feel comfortable adding electricity and fuel together, either in the totalThermalLoad output or in a site EUI metric.
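To make the site-versus-source distinction concrete, here is a toy calculation (the conversion factors are illustrative averages, not authoritative values; the ~3:1 factor for electricity is the one cited above):

```python
def source_energy(site_electricity, site_fuel,
                  elec_factor=3.0, fuel_factor=1.0):
    """Convert site energy use to source (primary) energy.

    Illustrative factors: roughly 3 units of fuel burned at the plant
    per unit of delivered electricity; fuel taken as 1:1 on site,
    ignoring extraction and transport losses.
    """
    return site_electricity * elec_factor + site_fuel * fuel_factor

# A building drawing 50 kWh/m2 of electricity and 100 kWh/m2 of gas:
site_eui = 50 + 100                   # 150 kWh/m2 measured at the site
source_eui = source_energy(50, 100)   # 250 kWh/m2 of primary energy
```

The two buildings a site EUI calls "equal" can thus differ by a large margin in primary energy, which is the heart of the objection.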
Still, the use of site EUI has become so ingrained in the industry that I have to acknowledge it and at least show users how it's calculated. In my view, it's an ad-hoc metric that was invented to deal with a previously limited amount of information on energy sources.
Instead of using site EUI, I would recommend using the following metrics depending on what you are trying to evaluate:
Utility Cost / Square Meter - to measure the monetary value of a building to an owner or user
Kg CO2 / Square Meter - to measure the environmental and climatic impact of a building
Emergy / Square Meter - to measure the overall thermodynamic performance of a building
The first two are actually fairly easy to calculate these days just by researching your site's utility rates or grid energy mixture and multiplying the building electricity or fuel by the respective rates. I will add some capabilities to Honeybee soon to make it even easier to get these values from your EPW file and databases of utility rates/grid mixtures. Emergy is much harder to calculate, as you have to trace all your energy sources all the way back to the sun, but a number of experts are at work to make this calculation possible (probably within the next few years we may have much easier ways to calculate it).
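As a sketch of the first two metrics, assuming placeholder utility rates and emission factors (look up real values for your site's tariffs and grid mix):

```python
def cost_per_m2(elec_kwh, fuel_kwh, floor_area_m2,
                elec_rate=0.20, fuel_rate=0.07):
    """Utility cost per square meter.

    The rates (currency per kWh) are placeholders for illustration;
    substitute your local electricity and fuel tariffs.
    """
    return (elec_kwh * elec_rate + fuel_kwh * fuel_rate) / floor_area_m2

def co2_per_m2(elec_kwh, fuel_kwh, floor_area_m2,
               elec_ef=0.4, fuel_ef=0.2):
    """Kg CO2 per square meter.

    Emission factors (kg CO2 per kWh) are assumed values; the real
    electricity factor depends heavily on the grid energy mixture.
    """
    return (elec_kwh * elec_ef + fuel_kwh * fuel_ef) / floor_area_m2
```

For a 100 m2 building using 10,000 kWh of electricity and 20,000 kWh of fuel per year, these placeholders give 34 currency units/m2 and 80 kg CO2/m2; the structure of the calculation is the point, not the numbers.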
Hope this helps explain the current setup.
-Chris…
ake a network of lines (i.e. a graph) and make a Plankton Mesh, from which you can use Cytoskeleton to make a solid mesh (and then smooth it with Weaverbird).
Works with ngons (polygons with 3 or more sides). Other examples I found only worked with tris and quads.
Works on open or closed surfaces
While these examples start with a surface, you could start with a network of lines and make a patch surface
This is meant for 2D networks/surfaces. I haven't attempted filling a 3D volume. My guess is this wouldn't work as it would require a non-manifold mesh that Plankton wouldn't handle.
Note similar results could be achieved with the following:
TSplines
MeshDual (dual of a tri mesh, not as much freedom/control)
Working backwards, here is the GhPython script from Will Pearson that builds a Plankton mesh from vertices and faces. The vertices are a list of 3D coordinates; the faces are a tree of lists, with each list containing the indices of vertices that form a closed loop. From Will: "Plankton only handles manifold meshes, i.e. meshes which have a front and a back. This orientation is determined by the "right-hand rule" i.e. if the vertices of a face are ordered counter-clockwise then the face normal will be out of the page/screen."
# V: list of Point3d
# F: tree of int

import Grasshopper
appdata = Grasshopper.Folders.DefaultAssemblyFolder

import clr
clr.AddReferenceToFileAndPath(appdata + "Plankton.dll")

import Plankton

pmesh = Plankton.PlanktonMesh()

for pt in V:
    pmesh.Vertices.Add(pt.X, pt.Y, pt.Z)

for face in F.Branches:
    face = list(face)[:-1]  # drop the repeated closing vertex index
    pmesh.Faces.AddFace(face)
These vertices and faces are precisely the output from Starling. Starling takes in a list of Polylines which form the (properly oriented) face loops.
The polyline face loops can be generated...
Directly from Panels on a surface using LunchBox
Using any network of lines/curves on a surface (curves will need to be converted to polylines before Starling)
The latter was achieved using the Surface Split command, then converting the face edges (as curves) into polyline loops to represent the faces.
…
requires four weather data inputs: air temperature (_dryBulbTemperature), relative humidity (relativeHumidity_), wind speed at 1.1 meters from the ground (windSpeed_) and mean radiant temperature (meanRadiantTemperature_). You can add values to the first three inputs from the Ladybug "Import Epw" component. For the last (meanRadiantTemperature_), you can add it from Ladybug's "Outdoor Solar Adjusted Temperature Calculator" component, or let the "Thermal Comfort Index" component calculate it. The two use different methods to calculate the final values.
I attached an example file below with the second option. For more precise calculations you can use Honeybee and Chris' microclimate maps. As icing on the cake: one of the Ladybug developers yesterday released a set of Ladybug components for modelling in the ENVI-met application. ENVI-met is cutting-edge microclimate software which can be downloaded for free. It opens up a number of advanced new analyses in the outdoor domain which couldn't be done with the current Ladybug+Honeybee tools. So you can perform the simulation in the free ENVI-met 4 software, and then add the mean radiant temperature values from the ENVI-met simulation to the "Thermal Comfort Indices" component. Here is an example file. If you would like to go with this last approach, it would be best to post a question about it in this topic.
1) You can make a polygonized tree. I haven't subtracted the trunk from the crown, but I guess it makes sense that it can be done.
2) In most solar-related simulations, a default albedo value of 0.2 is used. This corresponds to the average albedo of materials surrounding an urban or countryside location (concrete, grass, gravel, sand, asphalt...). However, the presence of snow can magnify the average albedo value several times. The "Sunpath shading" component's albedo_ input can account for snow when calculating albedo, if nothing is added to it (to the albedo_ input). As you are performing the analysis of PET in a horizontal plane, this will not affect your calculations.
3) Most thermal comfort indices require performing the analysis at 1.1 meters above the ground. This is considered to be the height of a standing person's center of gravity. The same goes for the PET index. So you are correct: you should place the analysis grid at 1.1 meters above the ground before adding it to the "Sunpath Shading" component.
It is worth mentioning that the "Thermal Comfort Indices" component used in this topic's PET_on_Grid2.gh and PET_on_Grid3.gh files is from last year, and is much slower than the newest one (VER 0.0.64 MAR 18 2017) used in the example attached below. Just a reminder in case you have been using the older version of this component.
Let me know if I misunderstood some of your questions, or if I missed answering some of them.
EDIT: sorry for posting a double reply. When I posted it the first time, only the links were visible, with no text. Something has been wrong with the Grasshopper Ning forum for the last couple of months.…