input orientation of the objects. I can see that you've already done this with Vec2pt. Doing it with a sun vector is a little easier, because you are working with one vector, not a bunch of different vectors. You probably know a lot of this already, but I wanted to write a comment that is helpful to anyone coming across the discussion, because it is a common design task.
To orient a bunch of objects towards a sun vector:
1. You need a vector to represent the sun's rays. You can either use an existing definition from the web (definitely look at Ted Ngai's amazing work on this), or just make a single adjustable vector as a stand-in. I've often simply made a vector using azimuth and altitude angles as inputs, since those are common ways of describing the location of the sun, which makes it easy to look up a sun angle and put it into your definition.
2. Assuming you have some vector to represent the sun's rays, make a plane that is perpendicular to this vector. You're already familiar with some of the quirks of making a plane perpendicular to a vector; just keep those quirks in mind.
3. Next, create reference planes for your panels. If your panels are flat (i.e. planar) this is really simple: just make a list of their planes, using whatever method you like (check planarity, evaluate surface, whatever). If your panels are not planar, then you need to decide on a plane you can make from each one that you would like to use as a reference plane. Plane From 3 Points might be a good method here.
4. Take your single plane that is perpendicular to your single sun vector, and place it at the origins of all of your reference planes. Now you have a sun-oriented plane for each panel.
5. Using the orient component, input your reference planes as the reference planes, and input your sun-oriented planes as the target planes, and input your panels as the geometry to transform. You should now have a bunch of panels oriented to the sun vector.
6. In this method, I've assumed that you want your panels perpendicular to the solar vector, facing it, but if you want a different relationship to the sun vector, you just need to change the relationship of that single first sun-oriented plane to whatever you would like.
One thing to keep in mind when designing for sun angles is that at any given point in time, for any given point on the earth's surface, the rays of the sun are basically parallel. The angle of these rays changes over time, but at any one time the rays are still parallel to each other, and can therefore be described by a single vector for each moment in time.…
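As a minimal sketch of step 1, here is how such a vector could be built from azimuth and altitude angles in plain Python. The convention assumed here (my own choice, not from the post) is azimuth measured clockwise from north = +Y, altitude measured up from the horizon, and the result pointing the way the rays travel, from the sun toward the ground:

```python
import math

def sun_ray_vector(azimuth_deg, altitude_deg):
    """Unit vector for the direction the sun's rays travel.

    Assumes azimuth is measured clockwise from north (+Y) and
    altitude up from the horizon; the result points from the sun
    toward the ground, so it is ready to use as the single
    'sun vector' described above.
    """
    az = math.radians(azimuth_deg)
    alt = math.radians(altitude_deg)
    # Position of the sun on the unit sky dome, negated so the
    # vector describes the incoming rays rather than the sun itself.
    return (-math.cos(alt) * math.sin(az),
            -math.cos(alt) * math.cos(az),
            -math.sin(alt))
```

In Grasshopper the same result can be built with a couple of rotation components, and the vector then feeds the perpendicular plane of step 2.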
rather far more "taxing" than the equivalent mesh.
2. Of course we can "decompose" any polyhedron into its BrepFaces and create a mesh.
3. Now the big thing ... see this? It's a sardinizer (C)(tm)(US Patent pending): a collection of paranoid C# thingies that create random planes on surfaces and then apply random transformations on ... er ... hmm ... sardine instance definitions (obviously of the finest quality).
4. Here are some sardines (of the finest quality) according to some (user-defined) paranoid transformations (i.e. arbitrary), including scale (response is, as always, "real-time"):
5. So what, you may ask, is exactly the big thing with these ^$@^$ sardines? Well, the big thing is that these are meshes of a certain "complexity" (i.e. millions of mesh faces):
6. And despite the millions of faces ... GH emerges victorious with regard to that "real-time" challenge:
Moral: use only Da Morgada sardines (in pure olive oil > yummy) and have faith.
…
way everything is consolidated and people can share their thoughts.
1. Cluster rollover tips
I'd like to be able to create rollover help text for each cluster input. This could be a right-click thing once the cluster is created, or something specified before creating the cluster (a string input for the cluster input arrows?).
2. Disconnect all outgoing
I'd like to be able to right click on an output of any component and disconnect all wires coming out of it.
3. Disconnect all selected
It would be cool if you could disconnect all incoming or outgoing wires from all currently selected components (instead of just one at a time).
4. List item dynamic slider
There has been a lot of discussion about dynamic range sliders and the issues that they would cause. I'd like one specifically for list item selection. This would be an integer slider with a range of 0 to the list length minus 1. If the range remaps and the previous value is no longer available, I think it's best to have the current value stay as close to the previous value as possible.
5. Cluster slider/toggle inputs
I think it could be valuable to cluster a series of variable inputs (like sliders or toggles) to make a sort of options cluster. In this case you could just have a list of sliders and toggles each connected directly to a cluster output arrow, select all and create your cluster. This would be great with the value list component as well.
6. Have a component that could output the x and y location of actual components on the canvas relative to the top left corner... Not sure exactly what you could do with this but I think someone could do something interesting with it.
7. 3d MD slider. I see the option is greyed out and don't see a way to activate it... Still under development? Seems like it could be just like the color picker. Would be cool.
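The remapping rule proposed in item 4 could be as simple as a clamp. A hypothetical sketch (my own naming, just to pin down the behaviour) in Python:

```python
def remap_slider(previous_value, list_length):
    """Clamp a list-item slider to the range 0 .. list_length - 1,
    keeping the previous value whenever it is still available."""
    if list_length < 1:
        raise ValueError("list must not be empty")
    return max(0, min(previous_value, list_length - 1))
```

So if a slider sat at 7 and the list shrinks to 5 items, the slider would land on index 4, the closest still-valid value.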
That's all I can think of right now. I'm sure there will be more to come in the future.
Feel free to comment.
-Brian…
Added by Brian Harms at 2:22am on December 15, 2011
plug-ins can be found at this website:
http://www.food4rhino.com/?ufh
The plug-in we were going to focus on yesterday was Kangaroo, a form-finding and mesh-editing plug-in. The instructions for downloading it can be found here, although all computers in the lab should already have it loaded.
http://www.food4rhino.com/project/kangaroo?ufh
Your assignment for next week is found below.
Assignment #6
1. You are to model a membrane shading structure for the WAAC outdoor courtyard adjacent to 1001 Prince Street (seen below).
This is an abstract model; no measurements are required. A basic model would be the walls of the various buildings and the ground. Spend a few minutes in the space and get a feeling for what a shading membrane would look like in the space.
2. Use the Kangaroo plug-in to make the mesh. I would look at the tutorials below (do part 2 of the Designalyze tutorial as a minimum).
http://www.designalyze.com/tutorial/columbia-meshing-spring-2013-class-04
https://www.youtube.com/watch?v=9i8dGtcQxMk
https://www.youtube.com/watch?v=yY5WU_8L4S8
There is also a manual that comes with the download. It should be on the share file at the WAAC; if not just download it for free at the Food 4Rhino website.
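To get a feel for the kind of form-finding Kangaroo performs before opening the tutorials, here is a toy mass-spring relaxation in plain Python. This is NOT Kangaroo's solver; every name and constant below is made up for illustration:

```python
# Toy mass-spring relaxation: anchored points stay put, free points
# settle under spring forces plus a small downward "gravity" step.
def relax(points, springs, anchors, gravity=-0.01, stiffness=0.1, iterations=2000):
    """points: list of (x, y, z); springs: list of (i, j, rest_length);
    anchors: set of point indices that stay fixed."""
    pts = [list(p) for p in points]
    for _ in range(iterations):
        forces = [[0.0, 0.0, 0.0] for _ in pts]
        for i, j, rest in springs:
            dx = [pts[j][k] - pts[i][k] for k in range(3)]
            length = sum(d * d for d in dx) ** 0.5 or 1e-9
            f = stiffness * (length - rest) / length  # Hooke-style pull/push
            for k in range(3):
                forces[i][k] += f * dx[k]
                forces[j][k] -= f * dx[k]
        for i, p in enumerate(pts):
            if i in anchors:
                continue  # anchored points never move
            p[0] += forces[i][0]
            p[1] += forces[i][1]
            p[2] += forces[i][2] + gravity  # gravity sags the free points
    return pts
```

Anchoring the boundary of a mesh, loading the interior, and letting the springs settle is essentially what the Kangaroo demo files do, at a much larger scale and with a proper physics solver.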
3. It does not matter what your membrane structure looks like, but try to use your eye as a designer. Think about things such as sun angle, the experience both within the courtyard and from the windows above, support locations on the walls, etc.
Take a screenshot of the completed image. Do not save to dropbox, but bring to class next week - March 23rd.
Post here if you have any questions. See you all next week!
…
t ''Morph'' turns Red saying ''Cannot morph from a degenerate box'' (image 2),
that's because every curve generates a box (image 3).
After that I check the ''Union'' option box to make only one box for all the curves (image 4).
However, the result is random and not accurate at all ... :/ (see image 6).
I know you are developing Pufferfish and not the ''Morph'' component, but you recently published a video on Instagram where I believe you managed to morph and twist a collection of curves successfully (please see images 7 and 8). If you could give me a hint how that can be achieved, it would be awesome. (Piping/meshing the curves with a very small diameter will perhaps work and help for visualisation purposes, but I actually just need to morph the raw curves for fabrication purposes.)
Hope to read you very soon...
Ghali,…
putational Planning Group (CPlan) and is a result of a long-term collaboration between academic institutions and practice partners across the globe, with the common goal of increasing the efficiency and quality of architecture and urban planning.
For additional information, updates, examples and tutorials please visit DeCodingSpaces-Toolbox.org
Authors
Abdulmalik Abdulmawla [1]
Martin Bielik [1,6]
Peter Buš [2]
Chang Mei-Chih [2]
Ekaterina Fuchkina [1]
Yufan Miao [4]
Katja Knecht [4]
Reinhard König [1,4,5]
Sven Schneider [1,3,6]
Partners
Member institutions of the Computational Planning Group (CPlan):
[1] Bauhaus-University Weimar (Chair Computer Science in Architecture, Chair Computational Architecture)
[2] ETH Zürich (Chair Information Architecture)
[3] Emerging City Lab - Addis Ababa
[4] Future Cities Lab Singapore
[5] Austrian Institute of Technology Vienna
[6] DecodingSpaces GbR
Gallery
…
Added by Martin Bielik at 10:13am on September 28, 2017
"meshed" i assume that meant converting Surfaces with MeshUV\DeMesh?, and from your screenshots thats a substantial number of vertices and therefore lines to draw, well worth it though from the results!, i agree with your answer to 3) that a more automatic solution is required,.
1) By mesh, I should have said: produce a surface, then convert the surface to a mesh, followed by de-mesh to get access to the vertices etc.
You can reduce the resolution of points if you need to, depending on your hardware. The more points you use, the harder it is to compute a solution; however, the more points you use, the more accurate your interpolated surface. You need to find your own balance between speed and accuracy.
- That's great news; equalizing vertex numbers is exactly what I need to do, since my Blend surface "keyframes" will by nature likely have unequal point counts. However: a) when using default Rhino surfaces, your intriguing definition started working for me only after I replaced your "custom" Domain (VB/Python? let me know) with Deconstruct Domain. It then connected each surface's vertices but did not produce an intermediate surface or points. b) when using my IDENTICAL Blend surfaces in your definition with the Deconstruct Domain and Merge components, it then produced intermediate vertices. See the definition screenshots, or I can send the definitions if you like. I'll also produce the second, non-identical Blend surface keyframe to test in your definition.
2) I am not sure what you mean by my 'custom domain'. Are you referring to the definition in my second post, or the post I sent for David to look at? Perhaps you can circle the component and upload a screenshot so I know what you are referring to? Your second screenshot appears to have worked OK.
- Agreed. 6) Does, or will, your latest definition contain more automated vertex correspondence and line creation?
3) No, I moved away from morphing surfaces and moved my solution to generating surfaces based on point data. This cut out the requirement for me to generate the surface to begin with and allows very automatic production of surfaces from data out of excel. Perhaps this would also be a good solution for you? You could:
Move your point data to excel, by exporting the x, y, z of your vertices for each surface.
Use excel as your information repository then write a definition to interpolate between your start and end points from excel.
This is basically what I have done now, as I have 1700 different ‘surface’ snap shots from the data I am working with.
- Perhaps I missed something, but after using Brep Join on my polysurface, SDivide still saw it as subsurfaces instead of a single surface.
4) Sorry, perhaps I should have tried that – I didn’t get as far as trying to subdivide. There should be a way to then re-create as one surface if it is necessary… I will try and find out when I have time.
How many sets of surfaces are you trying to merge through? It is also possible to morph from 1 to 2, 2 to 3, 3 to 4 ... x-1 to x by using a slider which calculates the range and picks the correct two surfaces to morph. If you need more info let me know and I will write something.
- That sounds perfect, especially since the sets of surfaces will be as nearly unlimited as the feature film they're modeled from. Yes, I'd love to learn more info/definitions on this subject, thanks.
Sounds to me like you might be better taking the excel read, interpolate route? If you have nearly unlimited surfaces, then they must be generated from some other data source yes?
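The "pick the correct two surfaces from one slider" idea can be sketched outside Grasshopper. In this toy version (my own names, not Lyndon's definition) a "surface" is just its list of vertex coordinates, exactly as in the Excel route:

```python
def interpolate_snapshots(snapshots, t):
    """Given >= 2 point-set snapshots (all with the same point count),
    pick the two adjacent to slider value t (0 .. len-1) and lerp
    between them; t=0.5 is halfway between the first two snapshots."""
    n = len(snapshots)
    t = max(0.0, min(float(t), n - 1.0))   # clamp the slider
    i = min(int(t), n - 2)                 # index of the lower snapshot
    u = t - i                              # local blend factor in [0, 1]
    return [tuple((1 - u) * a + u * b for a, b in zip(p, q))
            for p, q in zip(snapshots[i], snapshots[i + 1])]
```

The resulting point set can then be rebuilt into a surface each frame, which is what makes the looping "snapshots in time" video possible.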
Let me know your thoughts, if you would like to discuss anything I am happy to make myself available on skype at some stage to talk you through some of this stuff.
Cheers
Lyndon
EDIT: I have uploaded a video, which shows a surface generated using excel data - which basically loops between 'snapshots in time' to give you an idea of whether this would suit your needs.
https://www.youtube.com/watch?v=f9XAne9byQc&feature=youtu.be
…
ow the steps of the successful run when step 1.2 is bypassed (note that an OpenFOAM session is open in the background while running the Butterfly demo file):
1. create the wind tunnel, using parameters of (4,4) for _globalRefLevel_ as suggested by Theodoro in this post
2. run blockMesh:
3. run snappyHexMesh:
4. run checkMesh:
5. connect the case from checkMesh to simpleFOAM and run the simulation:
6. the simulation converged at iteration 1865, but the results visualization part had a problem:
7. so I revised this part according to suggestions from Hagit:
8. and the results can be visualized for P and U values:
The GH file used for the successful run shown above is attached here.
Now, the following is the error I got when the case from the update fvScheme component is used for simpleFOAM simulation:
the warning message on the simpleFOAM component is:
1. Solution exception: --> OpenFOAM command Failed!
#0 Foam::error::printStack(Foam::Ostream&) in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
#1 Foam::sigFpe::sigHandler(int) in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
#2 ? in "/lib64/libc.so.6"
#3 double Foam::sumProd<double>(Foam::UList<double> const&, Foam::UList<double> const&) in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
#4 Foam::PCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
#5 Foam::GAMGSolver::solveCoarsestLevel(Foam::Field<double>&, Foam::Field<double> const&) const in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
#6 Foam::GAMGSolver::Vcycle(Foam::PtrList<Foam::lduMatrix::smoother> const&, Foam::Field<double>&, Foam::Field<double> const&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::Field<double>&, Foam::PtrList<Foam::Field<double> >&, Foam::PtrList<Foam::Field<double> >&, unsigned char) const in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
#7 Foam::GAMGSolver::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libOpenFOAM.so"
#8 Foam::fvMatrix<double>::solveSegregated(Foam::dictionary const&) in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so"
#9 Foam::fvMatrix<double>::solve(Foam::dictionary const&) in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/bin/simpleFoam"
#10 Foam::fvMatrix<double>::solve() in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/bin/simpleFoam"
#11 ? in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/bin/simpleFoam"
#12 __libc_start_main in "/lib64/libc.so.6"
#13 ? in "/opt/OpenFOAM/OpenFOAM-v1606+/platforms/linux64GccDPInt32Opt/bin/simpleFoam"
The error message from the readMe! output node is attached below as a text file.
Hope you can kindly advise on what important steps or parameters I might have missed here. I assume it might be related to OpenFOAM rather than to the Butterfly workflow...
Thank you very much!
- Ji
…
nd helpful for me, and I always wanted to know and explore it. I used Galapagos for solving a task, and now I'm writing an article about what I'm doing. I have several questions regarding the algorithm's steps you mentioned (I hope you can answer):
In your explanation you described several options for some parts of the algorithm (how to do coupling, mutation, etc.). Can you please explain in more detail the parameters, or at least the methods, you used for Galapagos?
To be precise:
What is the population size at the beginning?
5.a) Did you use isotropic, exclusive, or biased coupling? If exclusive, what percentage? If biased, what is the 'vector of weights', or however you implemented that?
5.b) For the implementation, do you have some Gaussian with a peak at the 'Inbreeding factor' (which is some number in [0,1], where 0 represents 'incestuous' and 1 represents 'zoophilic', or the opposite)?
5.c) Did you interpolate the values by averaging (i.e. equal weights) or using preference weights according to the fitness?
5.d) I saw what you said about the number of sliders; I want to be sure I understand: the mutation here just picks some percentage of the genes (what percentage did you use?) and changes the child's value to a random number in the range of the slider?
Can I change the percentage of individuals from G[n] that are allowed (you said the default is 10%)?
What is the default for this? Is it the first one reached?
Can I specify the max number of iterations? Can I specify the number of generations? Can I specify a fitness value to stop at?
Maybe I missed some parameters, but I saw Galapagos as a "black box". Maybe I missed that it can be adjusted (in the latter case, I would like to know the default values).
I guess it is not open-source code (right?) and maybe you don't want to share it publicly. I would be glad to know a bit more, at least about the methods, so I can describe them when writing my article, please :) You can also answer me here: naama.glauber@gmail.com…
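Galapagos itself is closed, but the two operators asked about in 5.c and 5.d can be sketched generically. Everything here, names, rates, and the blend rule, is an illustrative assumption, not Galapagos's actual code:

```python
import random

def blend_mate(parent_a, parent_b, weight=0.5):
    """Coalescence by interpolation: weight=0.5 is plain averaging;
    a fitness-derived weight would bias the child toward the fitter parent."""
    return [(1 - weight) * a + weight * b for a, b in zip(parent_a, parent_b)]

def mutate(genome, slider_ranges, rate=0.1, rng=None):
    """Replace each gene, with probability `rate`, by a uniform random
    value inside its slider's range (the mutation described in 5.d)."""
    rng = rng or random.Random()
    return [rng.uniform(lo, hi) if rng.random() < rate else g
            for g, (lo, hi) in zip(genome, slider_ranges)]
```

With equal weights the child lands exactly between its parents in genome space; a fitness-weighted blend would pull it toward the stronger parent instead.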
Added by Naama Glauber at 10:08am on November 14, 2018
e existing wires.
2) The capsule display is very similar to the first graph, but instead of drawing a line connecting relative y-values for each slider, each slider gets assigned a colour (from dark red to yellow) based on its relative position. It allows you to see whether two genomes are similar or not without taking up too many y pixels.
3) This is a tricky one to explain. Every genome in a single species has the same 'dimensionality'. For example, if there are only two sliders you can say that the entire genome space for the species is 2-dimensional. For every possible combination of these two sliders, there is a fitness value (or a height) on this two dimensional plane. If your genome consists of 6 sliders, then we're talking about a 6-dimensional space.
As you probably know, distances between points are computed with the same formula regardless of the dimensionality of those points: Pythagoras' method works for any pair of points with the same number of dimensions. So even though I cannot display a 6-dimensional genome space on a two-dimensional computer screen, I can compute the distances between all the genomes in a species/generation. This gives me a matrix with the distances from every genome to every other genome. I translate this distance matrix into a node-spring particle system and solve that system in two dimensions, which ultimately results in the point-scatter graph you see on the screen.
The axes of this 2D representation of the N-D distances are meaningless. The absolute positions of the points inside this grid are governed partly by chance. However, the relative positions are meaningful in that they convey which genomes are similar and which ones are different. Points which appear close together represent similar genomes; points which appear far apart represent different genomes.
Basically it becomes very simple to see the entire collection of genomes and get a feel for how varied the set is. You can often even see sub-species appear as distinct clusters of points.
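The distance-matrix-to-2D step can be reproduced outside Grasshopper. This is a crude stress-relaxation stand-in for the node-spring solver described above (my own toy version, not Galapagos's implementation):

```python
import math
import random

def distance_matrix(genomes):
    """Pairwise Euclidean distances between equal-length genomes,
    valid in any number of dimensions."""
    def d(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return [[d(p, q) for q in genomes] for p in genomes]

def embed_2d(dist, iterations=2000, step=0.05, seed=1):
    """Scatter points in 2D so their pairwise distances approximate
    the given N-dimensional distance matrix (toy spring relaxation)."""
    rng = random.Random(seed)
    n = len(dist)
    pos = [[rng.random(), rng.random()] for _ in range(n)]
    for _ in range(iterations):
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                dx = pos[j][0] - pos[i][0]
                dy = pos[j][1] - pos[i][1]
                d = math.hypot(dx, dy) or 1e-9
                # move point i toward/away from j by a fraction of the error
                f = step * (d - dist[i][j]) / d
                pos[i][0] += f * dx
                pos[i][1] += f * dy
    return pos
```

High-dimensional distances cannot all be preserved exactly in 2D in general, which is why the scatter conveys only relative similarity, exactly as described above.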
4) For every generation, I display the fittest genome (upper boundary of yellow area), the worst genome (lower boundary of yellow area), average genome fitness (the thick red line) and the standard deviation of the fitness distribution in both directions (the orange area). Everything below the average is hatched.
Have you seen the Blog entry about galapagos?
--
David Rutten
david@mcneel.com
Seattle, WA…
Added by David Rutten at 1:37pm on November 26, 2010