still quite rough.
I went through your attached log, but it seems to be a successful run; perhaps the error log wasn't attached. In any case, I believe we have identified the issue. The goal of the update fvSchemes component was to apply schemes to finalized meshes automatically. While this is useful for new users, it is also a dangerous thing to do in CFD studies.
The component works by relating mesh quality to mesh non-orthogonality, which the checkMesh component reports. While non-orthogonality is one of the important criteria of mesh quality, it presents difficulties on some kinds of meshes, especially simple cases like the ones BF has been meshing so far.
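For reference (this is the standard definition, not anything Butterfly-specific): the non-orthogonality of a face is the angle between the vector joining the two adjacent cell centres and the face area vector, so it is 0 on a perfectly orthogonal box mesh. A tiny sketch:

```python
# Tiny sketch of that definition: angle (degrees) between the cell-centre
# vector d and the face area vector s_f.
import math

def face_non_orthogonality(d, s_f):
    dot = sum(a * b for a, b in zip(d, s_f))
    mag = math.sqrt(sum(a * a for a in d)) * math.sqrt(sum(b * b for b in s_f))
    return math.degrees(math.acos(dot / mag))

print(face_non_orthogonality((1.0, 0.0, 0.0), (1.0, 0.0, 0.0)))  # 0.0: orthogonal
```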
The example case above of simple box buildings in a wind tunnel, for instance, will appear as a good-quality case even for the lowest of cell counts, simply because the geometry is orthogonal. That means checkMesh will probably report low values (an empty blockMesh of 10 m blocks, for example, has a non-orthogonality of 0), which in turn means that higher-order schemes might be paired with meshes of actually low quality. This, I believe, is what is causing the problems.
I posted a possible solution here: https://github.com/mostaphaRoudsari/Butterfly/issues/57. The idea is that Butterfly provides additional options to users, letting them choose between first-order schemes (faster and more robust, but less accurate) and second-order schemes (slower and less robust, but more accurate) depending on mesh quality, stage of assessment, etc. In a case with mesh quality like the above, a first-order scheme might be the better option. To test this I am attaching an fvSchemes file you can use by replacing yours in the /system folder of the case; an illustrative sketch of such a file follows.
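For illustration only, a minimal first-order sketch (a steady, incompressible RAS run with k-epsilon is assumed; this is not necessarily byte-for-byte the attached file):

```
ddtSchemes      { default steadyState; }
gradSchemes     { default Gauss linear; }
divSchemes
{
    default                       none;
    div(phi,U)                    bounded Gauss upwind;
    div(phi,k)                    bounded Gauss upwind;
    div(phi,epsilon)              bounded Gauss upwind;
    div((nuEff*dev2(T(grad(U))))) Gauss linear;
}
laplacianSchemes     { default Gauss linear limited 0.333; }
interpolationSchemes { default linear; }
snGradSchemes        { default limited 0.333; }
```

The upwind divergence schemes are what make it first-order; the limited laplacian/snGrad entries add tolerance for non-orthogonality.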
As a note, however, I would like to stress that there is only so much a tool like Butterfly can provide in this area. Meshing is a quite complicated and demanding part of the process, involving a lot of trial and error. Sometimes the problem is just the mesh and not the solution options (GIGO holds true in CFD as well). It does, however, get easier with experience. The safe advice is the simplest one: when changing solution options doesn't help, refine the mesh and run again.
Kind regards,
Theodore.…
Grasshopper. So, I once made an attempt to bind MS SQL Server in order to get frozen definitions at certain states, to avoid managing baked objects in Rhino, and also to be able to retain whole results without using the GH state manager, which rebuilds everything.
But at that time GH's VB.Net component didn't properly read referenced DLLs, and I had forgotten about it since.
At first, I was surprised by Slingshot's extensive interface: I still had in mind my own old project, a tool that would have acted at the level of Rhino's geometry objects and auto-created the needed tables.
The db would have consisted of a main table holding the objects' IDs and names, plus related tables containing the necessary information about those main objects.
For example, a Brep is made of such-and-such underlying objects, each passed to its respective table according to the GH object definition layout (just the way they are written in the XML schema).
Then, on the db, you could query an object by name and retrieve the whole object or its underlying objects (e.g. at the bounding-curve level, or the point level, for a Brep).
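A minimal sketch of that layout (illustrative only, not Slingshot's actual schema; SQLite is used for brevity):

```python
import sqlite3

conn = sqlite3.connect("gh_objects.db")
cur = conn.cursor()
# main table: object IDs and names
cur.execute("""CREATE TABLE IF NOT EXISTS objects (
                   id   INTEGER PRIMARY KEY,
                   name TEXT NOT NULL)""")
# related table: underlying data for each main object (points, in this case)
cur.execute("""CREATE TABLE IF NOT EXISTS points (
                   object_id INTEGER REFERENCES objects(id),
                   x REAL, y REAL, z REAL)""")
# query an object by name and retrieve its underlying points
cur.execute("""SELECT p.x, p.y, p.z
               FROM points p JOIN objects o ON o.id = p.object_id
               WHERE o.name = ?""", ("MyBrep",))
rows = cur.fetchall()
conn.close()
```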
With Slingshot, I made a few attempts to cheat GH with BLOB data fields, but there was no way to get a whole object. It seems that GH simply provides an object.ToString() ... and GH is definitely not conceived to produce persistence outside of Rhino. If I have some spare time, I will try to extract
About points and colors, I am now simply using a single field of CHAR(asLargeAsNeeded...), since GH parses a string into a Point (or Vector, or Color) for any component input.
I do so because it needs less to display on the canvas...
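Sketched in Python, that single-CHAR-field trick looks like this (GH parses the text back into a Point when it hits a Point input):

```python
# A point serialized as "x,y,z" text, which GH parses back into a Point
# when the string is fed to a Point (or Vector, or Color) input.
pt = (3.5, 0.0, 12.25)
as_text = "{0},{1},{2}".format(*pt)               # value stored in the CHAR field
x, y, z = (float(v) for v in as_text.split(","))  # round-trip on the way back
```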
Whatever I wrote before, I really like your design, open as it is to relational interactions between ... whatever you need or dream of!
One last thing: GH can't open the definition file "Genome_DB_Template.gh" that I downloaded from your site: http://slingshot-dev.wikidot.com/database-genome. I was expecting to learn a lot from your very smart stuff! (I am running GH 08.00.13 and Slingshot 0.7.2.0)
Slingshot is running great, open to any use... Thanks again.
Best,
Stan
…
Refinement component at first, possibly using MeshMachine instead, which is slow but actually gives many fewer triangles, plus adaptive meshing for tight curves. Neither is easy to adjust on a deadline!
Then you have to sneak up on workable settings using only a few lines, or Grasshopper will freeze, perhaps indefinitely, for 200 lines with extreme settings, especially with the CS (Cube Size) setting, which can blow up into a huge number if your scale is big.
Cocoon gives lots of nearly flat split quad faces so I quadrangulated those for fun:
Or MeshMachine can refine the mesh to make it efficient:
Whereas the Cocoon Refine component will merely return an equally fine mesh with more equilateral triangles, but no serious remeshing to get rid of the many tiny triangles where they are not needed? Actually, it does seem to remesh too:
David said he used some of Daniel's MeshMachine code in there.…
they provide all the means for what I am trying to achieve.
What I need is a fast (as possible) evaluation of passive heat/solar gain through a given facade. I know my building can cool to a certain degree (let's say 80 W/m2; let's forget other internal gains for now), and I want to be sure my facade is not letting excessive amounts of heat into the room/building. Normally I would run a full-blown simulation to count my overheating hours and thereby evaluate my facade. To speed up the process, the idea is to evaluate overheating hours in a faster way. So what I am thinking is that excessive gains might be estimated by counting high-intensity irradiation patches in a critical sky component (or whatever such a thing would be called) that surpass my sensible cooling load. My hope is that this count, for any facade visible to those sky patches, would track the number of overheating hours closely if properly calibrated against a simulated model. However, I have no idea right now whether this can be done.
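To make the idea concrete, a minimal sketch of the counting step (all names and numbers except the 80 W/m2 are stand-ins; the hourly irradiance would come from an LB radiation study):

```python
# "hourly_irradiance" stands in for hourly incident irradiance on the
# facade in W/m2; 80 W/m2 is the sensible cooling capacity above.
hourly_irradiance = [0.0, 12.5, 140.0, 95.0, 60.0]  # stand-in data

cooling_capacity = 80.0  # W/m2
proxy_overheating_hours = sum(
    1 for q in hourly_irradiance if q > cooling_capacity
)
print(proxy_overheating_hours)  # hours where solar gain alone exceeds capacity
```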
Why do this? Speed, convenience, whole building thermal analyses.
@Chris and @Abraham: The critical sky component is made with LB's radiation component and by filtering the beam components with the highest effects from a yearly EPW file.
@Chris: Conductive heat gains are also important, especially if the facade is badly insulated, so the next step is to filter the outdoor temperature in parallel with that critical sky component, then do a static heat-transfer analysis and combine it with the effect of the direct sun. Again, no idea if it works; a back-of-envelope sketch of what I mean follows.
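```python
# Back-of-envelope combination of static conduction and transmitted solar
# gain. The U-value, g-value and indoor temperature are illustrative
# assumptions, not calibrated numbers.
def facade_gain(u_value, g_value, t_out, t_in, irradiance):
    """Static gain in W/m2: conduction plus transmitted solar."""
    return u_value * (t_out - t_in) + g_value * irradiance

# stand-in hourly pairs: outdoor temperature (C), facade irradiance (W/m2)
hourly = [(28.0, 140.0), (22.0, 60.0), (31.0, 95.0)]
hours_over = sum(
    1 for t_out, q in hourly
    if facade_gain(1.2, 0.4, t_out, 22.0, q) > 80.0  # cooling capacity, W/m2
)
```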
Hope it makes sense. I am a little embarrassed that I drew you into this little experiment; it was not at all the point of the discussion. But now that we are into it, I would like to know what you think. If it works, it's kind of neat; at least I think it is.
/K…
points organized in a data tree without losing the data structure, in order to create a folding surface as per the attached image.
1. Replace items (to create a gradient), like the weight-culling example:
Path {0}: replace all indices with a new value (a)
Path {1}: replace 90% of the indices with a new value (a)
Path {2}: replace 80% of the indices ...
2. Decrease the value (a) in relation to the path number.
3. After the replacement above: for even path numbers {0,2,...}, replace the items with a negative number.
I did not find an easy way to create a data tree that would achieve the above inside GH.
Points 2 & 3 are easy, but I could not find a simple solution for point 1.
At the moment the only way I have found is to create the list manually in Excel and import/export it, or to create a list of indices for each path (a rough GHPython sketch of points 1-3 follows below).
Any hint appreciated.
Maybe we need to wait for the number slider or the path mapper to accept input or notation?
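Here is the rough GHPython sketch of the behaviour I am after (assuming a DataTree of numbers in "tree" with Tree Access and the value in "a"; both input names are just placeholders):

```python
# Rough GHPython sketch of points 1-3 above.
import random
from Grasshopper import DataTree

result = DataTree[float]()
for i, path in enumerate(tree.Paths):
    branch = list(tree.Branch(path))
    ratio = max(0.0, 1.0 - 0.1 * i)      # point 1: {0} all, {1} 90%, {2} 80% ...
    value = a * (1.0 - 0.1 * i)          # point 2: decrease (a) with path number
    if i % 2 == 0:                       # point 3: negative on even paths
        value = -value
    count = int(round(ratio * len(branch)))
    for j in random.sample(range(len(branch)), count):
        branch[j] = value
    result.AddRange(branch, path)
```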
best
Stephane
…
On 2: I think the reason to draw a fitness landscape is to highlight graphically the presence of local minima, even in a simple optimisation problem. In architectural terms, this means getting an idea of how many sub-optimal solutions a problem has, which helps when exploring conceptual design proposals.
Have a look at this very basic example (which I published with two colleagues in "Shell Structures for Architecture", chapter 18): a shell footbridge (24m x 4m footprint) generated by two parabolic section curves, the two apex heights being the two design variables. The maximum displacement of the structure under gravity load and self-weight is the objective function. A simple example, but with several local minima and interesting shell forms (image below).
@AB,
The expression used by David in the Number of Samples input is a simple "x+1". By grafting the Divide Curve output, he got 81*81 lengths (6,561 values). You have to make sure that number is divisible by the number of samples. The second expression, used on the Length output, is only a scaling factor (my guess) to control the height of the fitness-landscape drawing.
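To make that bookkeeping concrete, a little Python sketch ("values" stands in for the 6,561 grafted lengths, and 0.01 is a guessed scale factor):

```python
values = [float(i) for i in range(81 * 81)]          # stand-in for the lengths
n = 81                                  # samples per axis, i.e. "x+1" with x = 80
assert len(values) % n == 0             # total must divide by the no. of samples
rows = [values[k * n:(k + 1) * n] for k in range(n)]
heights = [[v * 0.01 for v in row] for row in rows]  # scaled landscape heights
```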
Cheers…
n account of the position of the sun and weather cannot be expressed in terms of a single set of luminous intensity values (which is what IES files do).
With regard to your example files, I agree with Chris. The primary reason for the low illuminance levels is that the light bounces are getting lost in the tube. Have you checked with the manufacturer/distributor whether the IES file's location should be inside the tube rather than flush with the ceiling? Physically modelling such tubes in lighting software like Radiance (which is what HB uses) or AGI32 is a fairly expensive proposition. This is one of the reasons why manufacturers provide photometric data for such devices (however simplistic that data might be).
The candela multiplier increases or decreases the luminous intensity values, so it has a direct impact on the calculation. The primary reason for having that input was to let users do some testing with different lamp types and environmental factors such as dirt depreciation. You need not change it for your simulation. Assuming that the IES file is inside the tube, to make this calculation work inside HB you'd have to crank up the calculation settings to a very high level (start with -ab 10 -ad 4096).
Finally, due to shortcomings in the annual simulation software (Daysim), IES files will not work directly with annual calculations. However, there is a fairly easy workaround for that issue. In case you are planning to run annual calculations with IES files, please let us know here.
Sarith…
limiting illuminance and limiting exposure (lux hours). Hours with direct solar irradiance are likely to exceed the limiting illuminance thresholds, which range from 200 to 50 lux (as per Table 3.4 in CIE 157:2004). It makes sense to consider direct illuminance (an ab=0 simulation in Honeybee) separately from a normal illuminance calculation.
Assuming that the museum exhibits have low to high responsivity to light, an ideal solution would minimize direct sunlight. For daylight from the sky and reflected light, it might be enough to keep the illuminance levels below the recommended thresholds and then sum up lux-hours.
Daysim, the annual daylighting engine used by Honeybee and DIVA, is not very accurate for direct-sun calculations. You will get more accurate results if you run your analysis with Radiance directly.
Instead of considering horizontal illuminance grids, one can create grids that correspond to the dimensions of the exhibit and then average those values. I think single points, as shown in your gh file, might not suffice. Calculating lux-hours is by far the simplest part of such a simulation: it only requires averaging these points per time step, collecting the averages into an array, and then summing up that array.…
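For what it's worth, a minimal sketch of that last bit of bookkeeping ("hourly_grids" is a hypothetical input: one list per hour, one lux value per sensor point on the exhibit-sized grid):

```python
hourly_grids = [[220.0, 180.0, 205.0], [90.0, 110.0, 95.0]]  # stand-in data

hourly_averages = [sum(g) / len(g) for g in hourly_grids]  # average each hour
lux_hours = sum(hourly_averages)  # each hour contributes (avg lux) * 1 h
print(lux_hours)
```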