ate 2 lists (nList, fList) as reference, and produce this fnList <List of List of On3dPoints> - when I output the resulting object, it gives me the object type, not the value, and is therefore unusable. Please see the image for the output. In the code below, I think the only lines one would need to evaluate are the bold lines.
Am I getting confused with referencing or something?
Thanks in advance,
kat
Private Sub RunScript(ByVal wPt As List(Of On3dPoint), ByRef NodeList As Object, ByRef FacetList As Object, ByRef FacetNodeList As Object)
  Dim xlApp As Object
  Dim i As Integer

  ' Override language settings
  Dim oldCI As System.Globalization.CultureInfo = System.Threading.Thread.CurrentThread.CurrentCulture
  System.Threading.Thread.CurrentThread.CurrentCulture = New System.Globalization.CultureInfo("en-US")

  ' Grab a running instance of Excel
  xlApp = System.Runtime.InteropServices.Marshal.GetActiveObject("Excel.Application")
  Dim wb As Object = xlApp.ActiveWorkbook
  'Dim sheet As Object = wb.ActiveSheet
  Dim xlshNode As Object = wb.Sheets("Nodes")
  Dim xlshFacet As Object = wb.Sheets("Facet Table")

  Dim nList As New List(Of On3dPoint)
  Dim fList As New List(Of On3dPoint)
  Dim fnList As New List(Of List(Of On3dPoint))

  Dim rowBegin As Integer = 3
  Dim col_name As Integer = 1
  Dim col_1 As Integer = 2
  Dim col_2 As Integer = 3
  Dim col_3 As Integer = 4

  If Not IsNothing(xlshNode) Then
    nList = readIntoPoints(xlshNode, rowBegin, col_1, col_2, col_3)
    NodeList = nList
  End If

  If Not IsNothing(xlshFacet) Then
    fList = readIntoPoints(xlshFacet, rowBegin, col_1, col_2, col_3)
    FacetList = fList
  End If

  If Not IsNothing(nList) And Not IsNothing(fList) Then
    For i = 0 To fList.Count - 1
      Dim pt1Name As Double = fList(i).x
      Dim pt2Name As Double = fList(i).y
      Dim pt3Name As Double = fList(i).z
      Print("pt1Name : " + CStr(pt1Name))
      Print("pt2Name : " + CStr(pt2Name))
      Print("pt3Name : " + CStr(pt3Name))

      Dim ptList As New List(Of On3dPoint)
      Dim ptOne As On3dPoint = nList(CInt(pt1Name) - 1)
      Dim ptTwo As On3dPoint = nList(CInt(pt2Name) - 1)
      Dim ptThree As On3dPoint = nList(CInt(pt3Name) - 1)
      ptList.Add(ptOne)
      ptList.Add(ptTwo)
      ptList.Add(ptThree)
      Print(" ptList : " + ptList.ToString)

      fnList.Add(ptList)
    Next
    FacetNodeList = fnList
  End If
End Sub…
lly it should not make much of a difference - random number generation is not affected, and neither is mutation. Crossover is a bit trickier: I use Simulated Binary Crossover (SBX-20), which was introduced back in 1994:
Deb K., Agrawal R. B.: Simulated Binary Crossover for Continuous Search Space, IITK/ME/SMD-94027, Convenor, Technical Reports, Indian Institute of Technology, Kanpur, India, November 1994
Abstract. The success of binary-coded genetic algorithms (GAs) in problems having discrete search space largely depends on the coding used to represent the problem variables and on the crossover operator that propagates building blocks from parent strings to children strings. In solving optimization problems having continuous search space, binary-coded GAs discretize the search space by using a coding of the problem variables in binary strings. However, the coding of real-valued variables in finite-length strings causes a number of difficulties: inability to achieve arbitrary precision in the obtained solution, fixed mapping of problem variables, the inherent Hamming cliff problem associated with binary coding, and processing of Holland's schemata in continuous search space. Although a number of real-coded GAs have been developed to solve optimization problems having a continuous search space, the search powers of these crossover operators are not adequate. In this paper, the search power of a crossover operator is defined in terms of the probability of creating an arbitrary child solution from a given pair of parent solutions. Motivated by the success of binary-coded GAs in discrete search space problems, we develop a real-coded crossover (which we call the simulated binary crossover, or SBX) operator whose search power is similar to that of the single-point crossover used in binary-coded GAs.
Simulation results on a number of real-valued test problems of varying difficulty and dimensionality suggest that real-coded GAs with the SBX operator are able to perform as well as or better than binary-coded GAs with the single-point crossover. SBX is found to be particularly useful in problems having multiple optimal solutions with a narrow global basin and in problems where the lower and upper bounds of the global optimum are not known a priori. Further, a simulation on a two-variable blocked function shows that the real-coded GA with SBX works as suggested by Goldberg, and in most cases the performance of real-coded GA with SBX is similar to that of binary GAs with a single-point crossover. Based on these encouraging results, this paper suggests a number of extensions to the present study.
7. Conclusions
In this paper, a real-coded crossover operator has been developed based on the search characteristics of a single-point crossover used in binary-coded GAs. In order to define the search power of a crossover operator, a spread factor has been introduced as the ratio of the absolute differences of the children points to that of the parent points. Thereafter, the probability of creating a child point for two given parent points has been derived for the single-point crossover. Motivated by the success of binary-coded GAs in problems with discrete search space, a simulated binary crossover (SBX) operator has been developed to solve problems having continuous search space. The SBX operator has search power similar to that of the single-point crossover.
On a number of test functions, including De Jong's five test functions, it has been found that real-coded GAs with the SBX operator can overcome a number of difficulties inherent with binary-coded GAs in solving continuous search space problems: the Hamming cliff problem, the arbitrary precision problem, and the fixed mapped coding problem. In the comparison of real-coded GAs with the SBX operator and binary-coded GAs with a single-point crossover operator, it has been observed that the performance of the former is better than the latter on continuous functions, and the performance of the former is similar to the latter in solving discrete and difficult functions. In comparison with another real-coded crossover operator (i.e., BLX-0.5) suggested elsewhere, SBX performs better in difficult test functions. It has also been observed that SBX is particularly useful in problems where the bounds of the optimum point are not known a priori and where there are multiple optima, of which one is global.
Real-coded GAs with the SBX operator have also been tried in solving a two-variable blocked function (the concept of blocked functions was introduced in [10]). Blocked functions are difficult for real-coded GAs, because local optimal points block the progress of search to continue towards the global optimal point. The simulation results on the two-variable blocked function have shown that on most occasions the search proceeds the way predicted in [10]. Most importantly, it has been observed that real-coded GAs with SBX work similarly to binary-coded GAs with single-point crossover in overcoming the barrier of the local peaks and converging to the global basin. However, it is premature to conclude whether real-coded GAs with the SBX operator can overcome the local barriers in higher-dimensional blocked functions.
These results are encouraging and suggest avenues for further research. Because the SBX operator uses a probability distribution for choosing a child point, real-coded GAs with SBX are one step ahead of binary-coded GAs in terms of achieving a convergence proof for GAs. With a direct probabilistic relationship between children and parent points used in this paper, cues from the classical stochastic optimization methods can be borrowed to achieve a convergence proof of GAs, or a much closer tie between the classical optimization methods and GAs is on the horizon.
In short, according to the authors, the SBX operator I use with real gene values is as good as older operators specially designed for discrete searches, and better in continuous searches. As far as I know, SBX has meanwhile become a standard general-purpose crossover operator.
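For readers who want the mechanics, here is a minimal Python sketch of the per-gene SBX recombination described in the paper. The function name and the list-of-floats representation are my own; this is not Octopus's actual code.

```python
import random

def sbx_crossover(p1, p2, eta=20.0):
    """Simulated Binary Crossover (SBX) for two real-valued parent vectors.

    eta is the distribution index; eta=20 gives the 'SBX-20' mentioned above.
    The spread factor beta is drawn so that children cluster near the parents
    (large eta) and their mean always equals the parents' mean, mimicking the
    spread behaviour of single-point binary crossover.
    """
    c1, c2 = [], []
    for x1, x2 in zip(p1, p2):
        u = random.random()
        if u <= 0.5:
            beta = (2.0 * u) ** (1.0 / (eta + 1.0))
        else:
            beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
        c1.append(0.5 * ((1 + beta) * x1 + (1 - beta) * x2))
        c2.append(0.5 * ((1 - beta) * x1 + (1 + beta) * x2))
    return c1, c2
```

Note the invariant: for every gene, the two children average to the two parents' average, which is the defining property of the operator.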
But:
- there might be better ones out there that I just haven't seen yet. Please tell me if you know of any.
- besides tournament selection and mutation, crossover is just one part of the breeding pipeline. There is also the elite management for MOEA, which is AT LEAST as important as the breeding itself.
- depending on the problem, there are almost always better problem-specific ways to code the mutation and crossover operators. But Octopus is meant to stay general for the moment - maybe there's a way to provide an interface so you can code those things yourself..!?
2) elite size = SPEA-2 archive size, yes. The rate depends on your convergence behaviour, I would say. I usually start off with at least half the size of the population, but mostly the same size as the population (which is hard-coded in the new version, I just realize) is big enough.
4) the non-dominated front is always put into the archive first. If the archive size is exceeded, the least important individuals (per the truncation strategy in SPEA-2) are removed one by one until the target size is reached. If the front is smaller than the archive, the fittest dominated individuals are put into the elite. The latter happens at the beginning of the run, when the front hasn't been discovered well yet.
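As a rough illustration of that fill-then-truncate logic, here is a simplified Python sketch (not Octopus's code; real SPEA-2 truncates using k-th-nearest-neighbour densities, which I have reduced here to the single nearest neighbour):

```python
import math

def fill_archive(population, archive_size, objectives):
    """Sketch of the archive-filling step described above (minimization).

    1. Copy all non-dominated individuals into the archive.
    2. If the archive is too large, repeatedly drop the most crowded
       individual (smallest distance to its nearest neighbour) - a
       simplification of the SPEA-2 truncation.
    3. If it is too small, top it up with the 'best' dominated individuals
       (here: those dominated by the fewest others).
    """
    def dominates(a, b):
        fa, fb = objectives(a), objectives(b)
        return all(x <= y for x, y in zip(fa, fb)) and \
               any(x < y for x, y in zip(fa, fb))

    nondom = [p for p in population
              if not any(dominates(q, p) for q in population)]
    archive = list(nondom)

    while len(archive) > archive_size:
        def nn_dist(p):
            return min(math.dist(objectives(p), objectives(q))
                       for q in archive if q is not p)
        archive.remove(min(archive, key=nn_dist))

    if len(archive) < archive_size:
        dominated = [p for p in population if p not in nondom]
        dominated.sort(key=lambda p: sum(dominates(q, p) for q in population))
        archive.extend(dominated[:archive_size - len(archive)])
    return archive
```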
3) yes it is. This is a custom implementation I figured out myself. However, I'm close to having the HypE algorithm working in the new version, which natively has the ability to articulate preference relations on sets of solutions.
…
is our fault that it fell through the cracks.
To answer your questions:
1. With the current capabilities in Honeybee, there are three possible ways by which air enters a zone and each has its own separate set of inputs. These three are:
1) Infiltration
2) Ventilation through the HVAC System
3) Natural Ventilation
The reason for having these three separate sets of inputs is that each describes a different reason why air enters the zone:
1) Infiltration - This is air flow into the zone through cracks in the walls that you cannot control.
2) Ventilation through the HVAC System - This is additional ventilation that you do to ensure that occupants have enough fresh air to breathe and that smells do not accumulate. Note that ventilation through the mechanical system can only happen if the zone is conditioned, so if you wanted adequate minimum ventilation in a completely passive zone, you have to use the third option below (or boost your infiltration to an acceptable level).
3) Natural Ventilation - This is ventilation, usually at high volumes, that you are doing to cool down the zone in place of using mechanical cooling.
You can set the first two (infiltration and ventilation through the mechanical system) with the 'Set EnergyPlus Zone Loads' component. For the case that you describe, you should not add the two together but input them like so:
I am assuming that the minimum ventilation to ensure occupants have enough fresh air is 0.5, in which case you don't need to add the two but can subtract the infiltration from the mechanical ventilation. I have included Abraham's awesome converter components from ACH to m3/s-m2, which should make things easier in your case. For natural ventilation, you have to use the "Set EP Airflow" component.
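For reference, the ACH-to-m3/s-m2 conversion that those components perform boils down to one line. The function name and the example zone dimensions below are hypothetical, not taken from the discussion:

```python
def ach_to_m3s_per_m2(ach, zone_volume_m3, floor_area_m2):
    """Convert air changes per hour to m3/s per m2 of floor area.

    ACH times zone volume gives the flow in m3/h; divide by 3600
    for m3/s, then by the floor area for the per-area flow rate
    that EnergyPlus-style inputs expect.
    """
    return ach * zone_volume_m3 / 3600.0 / floor_area_m2
```

For example, 0.5 ACH in a 100 m2 zone with a 3 m ceiling (300 m3) is about 0.00042 m3/s-m2.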
2. Oh gosh, I did not realize that I had said that in the videos. As you have stated, you are absolutely right that you want to leave a bit of room between your heating setpoint and your minTemperatureForNatVent. I know that there currently is not a video on Set EP Airflow, and I will make this clear when I put up a video on it soon. I should also probably take the example of an infiltration schedule out of the videos, since I know that was the best the components could offer in terms of air flow at the time. I usually leave at least 2 C between my heating setpoint and the minimum temperature for natural ventilation (usually my heating setpoint is 20 C and my nat vent setpoint is somewhere between 22-26 C, depending on how tightly controlled the temperature needs to be in the space). This is the case unless I am crafting some special type of summertime night-flushing scheme, where I will use the HVACAvailability input on the 'Set EnergyPlus Zone Schedules' component to shut down the heating system for part of the year. To clarify again what happened in your case: setting the minTempForNatVent to the heating setpoint means that windows open immediately once the heating setpoint is reached, causing the heating system to turn back on right after it has just been turned off. Over time, you get a rapid oscillation between heating and opening windows that just blows through a ton of energy.
3. You cannot use the 'Set Ideal Air Loads Parameters' component to account for a COP. By definition, an ideal air system does not include a COP (ideal air is the type of HVAC system that the 'Run Energy Simulation' component uses). Ideal air systems can only tell you the heat removed from or added to the zone by the system - NOT the amount of electricity or fuel that it might take to add or remove this heat. If you want to obtain a rough estimate of your heating and cooling with those COPs, you can post-process the results using the native Grasshopper division component like so:
We did not include a formal Honeybee component to perform this division operation because we want you to be aware of what is going on. This division gives you a rough estimate of the energy but it is not as accurate as modelling a complete HVAC system. We are currently building out the capability to do this with the OpenStudio component. I have attached a file with the native GH division for you.
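A sketch of that post-processing in Python, in case it helps to see the arithmetic spelled out; the COP values are placeholders, not recommendations:

```python
def estimate_energy(heating_loads, cooling_loads, heating_cop=0.9, cooling_cop=3.0):
    """Divide ideal-air loads (e.g. kWh per timestep) by COPs to roughly
    approximate purchased energy.

    The default COPs here are hypothetical (0.9 might represent a gas
    boiler's efficiency, 3.0 a chiller); substitute values for your
    actual equipment. This is exactly the division the native
    Grasshopper component performs - a rough estimate, not a
    substitute for modelling a complete HVAC system.
    """
    heating = [q / heating_cop for q in heating_loads]
    cooling = [q / cooling_cop for q in cooling_loads]
    return heating, cooling
```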
4. Looking at your file, it makes sense that the constructions in general would not change the simulation much, since only a single side of your box is not adiabatic. However, I imagine the bigger reason the simulation is not changing much is that the constructions you are using have poor R-values for the heating-dominated climate you are working in. Try making a no-mass material and boosting the R-value to a high amount (maybe something crazy like 15), and you will see the maximum reduction in heating energy you can get from a thick envelope.
Sorry again that no one got back to you in a timely fashion, but I also realize this discussion required a long response. In the future, it might be better to break this up into a few discussions with more focused topics. That way, the people who know the individual topics you bring up can get back to you individually much faster.
-Chris…
Meeting Agenda:
1) Discuss what the group would like to learn this term through our regular scheduled meetings. Topics include the priority and sequence of Grasshopper exercises we would like to explore during the winter term from http://www.digitaltoolbox.info/grasshopper_basic.html and Processing tutorials from the Processing Handbook I received from MIT.
2) Watch the Matt Storus Church Machine video and have a discussion about parametric and generative tools in design.
If you have a chance, please read the following article by Tim Love called Between Mission Statement and Parametric Model at:
http://places.designobserver.com/entry.html?entry=10757
3) Discuss a possible design build project over the following winter and spring terms using the skill set this group is developing. Conversation led by Chris Nielson (please see comments below for a brief backstory)
4) Discuss possible applied research and design work for the National Conference on the Beginning Design Student paper, Machine Craft and the Contemporary Designer: exploring parameters and variables through making physical artifacts. I wrote the attached abstract and submitted it for the conference this past fall, and it was accepted. To continue with the research I need to assemble a team of students who will help explore the principles I set forth by making physical objects with the CNC router. In exchange for helping with the research, I will show participants how to use the CNC router and how to author machine code, and provide you with the CNC controller interface software necessary to simulate machine movements. Not to mention, your work will be cited in the research paper I present at the conference at UNC Charlotte in March. More tomorrow night, of course.
Thank you for your interest and I hope to see you there.
Sincerely,
Erik Hegre
Chris Nielson Reply by Eugene Parametric Society on January 7, 2010 at 12:02pm
All,
In response to Erik, who requested that I describe my intentions in a design-build project and to the article posted (definitely required reading for this group) I propose that we begin development of a project that spans the realm of "sustainable social" architecture and parametric design. The particulars of such a design do need to be made concrete, and it will be important to define the goals of such a project.
Therefore, I would suggest that this serve as a forum for the next few weeks for those interested in producing a built project. I agree with Nico that it may not be feasible to create the built piece, whatever it may be, this term; however we should have the groundwork and a plan in place by the end of the next 10 weeks.
Either way, I would ask everyone who is interested to please post as many concepts as you can to this forum to begin a discussion. If you are indeed interested, please submit goals that this project could achieve (energy-related, social, aesthetic, economic) and perhaps what you envision the project physically being (shading device, public bench, water catchment, interactive thermal contraption, etc.).
I look forward to hearing your thoughts!
Cheers,
Christopher…
cess informing the user the network is incomplete.
I've been thinking for a while about reading in these blobs of incomprehensible data in an attempt to maintain them through an open/save cycle, but I'll never be able to get this process watertight.
2) When you release components, you should try to make sure that they are backwards compatible with previous releases. For example, if you decide to change the number of inputs/outputs or the type of inputs/outputs, this might well break file IO. What you should do in those cases is:
- Copy-paste the old component source code and change the ComponentGuid property. In essence, you make a different component which will have the changes.
- Change the Exposure property on the old component to be GH_Exposure.hidden. This will hide the component from the interface.
This basically means that when people open a file that uses the old style component, they'll get the old-style component. If people instantiate the component anew, they'll get the new component.
Grasshopper and its default gha assemblies feature dozens upon dozens of these hidden components; sometimes there are as many as 4 old-style versions of a single component out there.
3) If you want to store additional data in the ghx file for a specific component, you'll need to override the Read() and Write() methods. Something like this:
Public Overrides Function Write(ByVal writer As GH_IO.Serialization.GH_IWriter) As Boolean
writer.SetBoolean("MySpecialBooleanValue", m_myBoolean)
writer.SetString("MySpecialStringData", m_myString)
Return MyBase.Write(writer)
End Function
and
Public Overrides Function Read(ByVal reader As GH_IO.Serialization.GH_IReader) As Boolean
m_myBoolean = False 'Default state
m_myString = String.Empty 'Default state
reader.TryGetBoolean("MySpecialBooleanValue", m_myBoolean)
reader.TryGetString("MySpecialStringData", m_myString)
Return MyBase.Read(reader)
End Function
It is usually possible to make the Reading process smart enough to handle backwards compatibility. You can ask the reader object whether or not a certain value exists and you can then decide whether you can safely use old or new reading logic. So any changes to this part probably don't require you to create a duplicate component and hide the old one.
--
David Rutten
david@mcneel.com
Poprad, Slovakia…
Added by David Rutten at 2:34am on February 26, 2011
--
Did you try the package Camilo posted earlier, and did you get any input from what he set up for Processing? (If not, I can give you a short guide, but won't if it's not necessary; you need to install a few libraries along with Processing and set your LPD's device number in the Processing sketch.)
Anyway - he got an input from what he did into GH via a UDP signal read by Firefly or gHowl.
His only problem was, he had the following input when he turned a knob... for instance ... :
Knob 1, turn fully right position: (1;129)
Knob 2, turned to middle: (2; 64)
Knob 2, turned to left: (3;1)
But when he turned knob 2, the first input was "gone" as soon as the second signal arrived via UDP, and so was his parametric value. Therefore he couldn't use more than one knob to assign different values to his parametric model.
Therefore, in GH, the UDP Listener gets connected to a split and fed into 2 Item lists.
The item lists get inserted into a c# script (maybe you can do this in GH differently, but I have no idea how....):
Now, you create a C# script component in GH. It comes with 2 inputs; I named them "WAHL" for the channel number (which knob) and "wert" for its value. They get connected to the 2 lists. Both inputs should be set to integers.
Then zoom in and add more outputs to the GH C# component, as many as you have knobs; I named them after the LPD's numbering, K1-K8. For 2 knobs, the code (when you double-click the component) is:
if (WAHL == 1) { a = wert; }
if (WAHL == 2) { b = wert; }
K1 = a;
K2 = b;
// and under the <Custom additional code> section I set the following:
int a = 1;
int b = 1;
Just add more lines for more buttons.
That's it. It just assigns the input value, depending on which knob was touched, to a member variable in the C# script, so the value won't get lost.
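The same latching idea, sketched in Python for clarity; the function name and the assumption that messages arrive as (channel, value) pairs are mine, not from Camilo's setup:

```python
def latch_values(messages, state=None):
    """Keep the last value seen per channel, so an earlier knob reading
    is not lost when a new (channel, value) pair arrives over UDP.

    `messages` is an iterable of (channel, value) tuples, e.g. from a
    MIDI controller; `state` maps channel -> last value and plays the
    role of the member variables in the C# script above.
    """
    state = dict(state or {})
    for channel, value in messages:
        state[channel] = value  # overwrite only this channel's slot
    return state
```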
----
Now my problem is still that through Camilo's Processing script I don't get any signal from the buttons in the first place, only from the knobs. With the MidiBus library I did get them, but his script uses the proMIDI library... something to ask around about in the Processing forums...…
s mostly related with panelization. Panelization means many things, for instance (1.1) designing an aluminum facade system (most common case: "hinged" extrusion profiles that contain opaque or transparent materials - the "facets"), (1.2) designing insulation and final "coating" in roofs, (1.3) ... (1.n) continue ad infinitum.
2. Let's stick to the least understood (and less glamorous) part: topic (1.2). The best core material for the job is FOAMGLAS:
http://www.foamglas.co.uk/building/applications/
3. Most ignorants in our trade believe that the main point/task of thermal insulation is the U-value thing. But in fact it is Dew Point (DP) management that is the most important of them all (DP = the critical temperature at which the relative humidity reaches saturation). Thus we arrive at the compact "roof" (or some compact "part" of the AEC thing) matter: (3.1) dew point INSIDE the thermal insulation, (3.2) no thermal bridges, (3.3) no air from the application medium (say plywood, corrugated/flat sheets, special FOAMGLAS Px panels, etc.) up to the waterproofing membrane(s) (say 2 layers of SBS bituminous membranes). Here's the most typical case of them all (special tapered inserts not shown - notice the cladding fixing method without perforating the sheets; no other insulating material can do that):
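As an aside, the dew-point temperature defined above can be estimated from air temperature and relative humidity with the standard Magnus approximation; this sketch is mine, not from the post:

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point in deg C via the Magnus formula.

    a and b are the commonly used Magnus parameters for roughly
    -45..60 deg C; accuracy is about +/-0.35 deg C in that range.
    The dew point is where the air would reach 100% relative
    humidity - which is why it matters where it falls inside a
    roof build-up.
    """
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)
```

For example, room air at 20 C and 50% RH condenses on any surface at about 9.3 C, so the insulation must keep every air-contacting layer above that.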
4. The above image brings us directly to Kangaroo matters (if we add the "liquid" thing meaning no linear geometry around). By "liquid" I mean that our working surface is no more "flat":
In particular we must: (4.1) test if the corrugated sheets can follow the curvature (they can, up to a point), (4.2) test if the FOAMGLAS panels (straight "boxes") can safely AND FULLY adhere to the medium without spending the GNP of Nigeria to do it (*), (4.3) test if the VM Zinc (or Kalzip) cladding systems can cut the mustard - they are more flexible than the corrugated sheets (and can be tapered on the fly; Germans are very innovative on that matter) ... but... well... you understand where the issue is, I do hope.
(*) you can use 85/25 bitumen (cheap and nightmare to put it) or PC500 (very expensive and easy to apply). Obviously some mechanical fixing is required as well.
And what is the most important test of them all? Well ... the 4.2 thing, what else?
more soon.
…
Defines enumerated values for all implemented corner styles in curve offsets.
Namespace: Rhino.Geometry
Assembly: RhinoCommon (in RhinoCommon.dll) Version: 5.1.30000.12 (5.0.20693.0)
Syntax
C#
public enum CurveOffsetCornerStyle
Visual Basic
Public Enumeration CurveOffsetCornerStyle
Members

Member name   Value   Description
None          0       The default value.
Sharp         1       Offsets and extends curves with a straight line until they intersect.
Round         2       Offsets and fillets curves with an arc of radius equal to the offset distance.
Smooth        3       Offsets and connects curves with a smooth (G1 continuity) curve.
Chamfer       4       Offsets and connects curves with a straight line between their endpoints.
…
s, Mesh Pleated Inflation". I am not an expert in this way of modelling (first time today), but it is called funicular.
There are roughly 2 ways:
1) Kangaroo from Daniel Piker, see example
2) http://www.grasshopper3d.com/profiles/blogs/finding-funicular-forms-using-the-dynamic-mass-method
I propose a script, far from the real one, but it could help you to build a surface like you want, a smooth one. The real one is not like that; it has a lot of V shapes.
1) draw on XY plane the main lines of the structure
2) draw surfaces with Rhino using corner points, always in the same direction so that U and V are aligned correctly.
3) extract fixed edges with Rhino (yellow here) and put them on a specific layer
4) F10 => select all control points except the edges and move them upward (z > 0)
I gave you an example, far from perfect at this time. It uses Kangaroo. Open Rhino first, followed by the GH script.
Ways to improve:
In reality the shapes begin in zigzag.
The surface must be added to Kangaroo, surely through a mesh... play with the goals of Kangaroo... …
ach object has a "Source" property (layer, parent, object) - my fix causes it to look at this source property in order to determine where to draw the plot width value from. I was already doing this for color and material, but had neglected to do it for plot width.
2. The "Print Preview" viewport display option is calling the "PrintDisplay" command in Rhino, which you will notice takes a "Thickness" value - this is the conversion factor between plot weights/print widths (in mm) and the number of pixels in absolute screen width. As you note, this is a relative and not an absolute width in model units, so it does not change when you zoom. In most design applications it would be quite strange to specify the print widths of your geometry in absolute units - e.g. setting your lines to be 50 ft thick. In illustrator you are always working in "Paper Space" whereas in Rhino you have to be aware of the differences between Model Space and Paper Space (or Layout Space in Rhino terminology.)
My lineweight preview component operates on the basis of pixels - if you tell it "2" it will display a 2px-wide line irrespective of your zoom. The 4x conversion ratio you note is purely a function of the setting of your PrintDisplay command in Rhino.
3. The good news is my custom preview component ALSO supports "Absolute" lineweights in world-space units - so they create a line that gets fatter when you zoom in and thinner when you zoom out (though it can't get thinner than a pixel, naturally). Set the "Absolute" toggle (the 4th option) on the component - I think it will create the "Illustrator-like" behavior you're looking for, without having to create surfaces from your lines.
4. The dynamic pipeline component updates when the by-object plot weight changes. It does not update when the layer-level plot weight changes. In the end I have had to make some judgment calls about what kinds of changes should trigger a component refresh: too sensitive, and a definition could be forced to recompute unnecessarily on every little change; too insensitive, and you require too many forced refreshes.
In general I have focused on triggering updates from object-level attribute changes (Where they conceptually represent data about THIS OBJECT) and NOT from layer-level attribute changes (Where they conceptually represent data about a category). The Layer Table is the component that is designed to report changes to layer-level settings - and with "Auto Update" enabled on this component, it will in fact trigger an update on layer-level attribute changes.
With this approach, you may have to match up your geometry to the layers it belongs to, and then use the layer table component to retrieve the plot weight settings. The definition shown below is an example of how to do this. It assumes you are using layer-level plot weights.
…