taTree.
2. Since GH is acyclic by design, we can't pick our iceberg "picks" individually (without code, that is) ... thus we need a global policy applied to ALL grid points at once.
3. This is what the next part does: it randomly picks some points for the iceberg and modifies their Z by a random value. Whether the Z always ends up "above" the grid depends upon the domain of values in use. Seed means "roll the bones again" (meaning another random collection).
4. So we have the modified points DataTree (these stay steady, acting as the tips of the iceberg). Let's call them Anchors.
5. If we subtract set 4 from 1 we have the points prone to vary according to some manipulation. Kangaroo does that manipulation (this is the best add-on that GH has to offer by a million miles, made by a very clever fella).
6. But if we instruct Kangaroo to do the job as-is ... it makes chaos, since the points in 4 are not sufficient: we need steady perimeter points that act as Anchors as well. So we put together some logic to pick a variable set of perimeter points, we "merge" 4 and 6, and we have the final set of Anchors on hand - whilst all the rest are points willing to change.
7. Kangaroo is a physics engine, meaning that the only things it understands are ... er ... points and their relations (the "lines" connecting them, that is). Kinda like a CPU that understands 0 and 1 and nothing else.
8. So we provide Kangaroo info about all the lines involved: how "stiff" they are and what the expected/desired final length is.
9. By double-clicking the Kangaroo component ... the "simulation" starts running (in some kind of "loops") and moves towards an "equilibrium" where all our desires are satisfied - or the solution's entropy is the minimum possible (well, up to some level, he he). Kangaroo displays a small control dialog that allows you to halt the process or reset it (meaning: start again).
10. If the instructions are "good"/"proper", the "loops" (iterations) are relatively few: if K does 1M "loops" ... this means that your instructions are silly or not well thought out.
After stopping Kangaroo ... we have (hopefully) a "well" distorted collection of points (and their equivalent mesh) to proceed further via components usually found in the WB add-on.
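To get a feel for what Kangaroo does under the hood, here is a toy relaxation sketch in plain Python — not Kangaroo itself, just the anchors-plus-springs idea from the steps above; all names and the damping factor are my own assumptions:

```python
# Toy sketch of the anchors + springs idea (NOT Kangaroo's actual solver):
# anchors stay fixed, free points relax along springs toward rest lengths.
def relax(points, springs, anchors, rest, stiffness=0.5, iterations=1000, tol=1e-9):
    """points: list of [x, y, z]; springs: list of (i, j) index pairs;
    anchors: set of indices that never move; rest: desired length per spring."""
    pts = [list(p) for p in points]
    for _ in range(iterations):
        moved = 0.0
        for (i, j), L in zip(springs, rest):
            d = [pts[j][k] - pts[i][k] for k in range(3)]
            length = sum(c * c for c in d) ** 0.5 or 1e-12
            # positive when the spring is too long, negative when too short
            f = stiffness * (length - L) / length
            for k in range(3):
                step = f * d[k] * 0.5
                if i not in anchors:
                    pts[i][k] += step
                    moved += abs(step)
                if j not in anchors:
                    pts[j][k] -= step
                    moved += abs(step)
        if moved < tol:  # "equilibrium": nothing moves any more
            break
    return pts
```

For example, a 3-point chain with the two ends anchored and two springs of rest length 1 pulls the middle point to the midpoint: few iterations, because the instructions are "good"/"proper".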
PS: If all the above sound Greek to you ... it's because I'm Greek, he he.
Moral: Get the gist of Kangaroo ASAP - worth spending some time, I reckon. If you do that and you need examples (other than the ones available at download time) ... well, I have more than 300 (from simple to ultra paranoid).…
st variety of papers (mostly related to LIDAR airborne-sampled clouds) ... but ... hmm ... no code (other than some "abstract" algos that may (or may not) work). Reason? That one is a very hot cake these days: from reverse engineering to DARPA-funded future defense systems and up to cruise missile pattern-recognition algos.
The solution (obviously doable only via code) is the so-called flat hard clustering ... where points are sampled into clusters based on the coPlanarity "rule". For large amounts, recursive octTree subdivisions are used (an oriented box divided into 8 "partitions"), then pts are processed in parallel (and then clusters are re-evaluated in order to "absorb" other clusters with the same plane A,B,C,D vars, etc etc).
See what's happening in a very carefully made test point collection:
3.7 ms and the "ideal" clustering (7 search loops VS the max 42M theoretical threshold):
Depending on the pts "preparation" ... considerably more time/search loops are required ... and ... well ... also the "valid" clusters (4 points and up) made:
So "ideally" speaking in your case:
1. Mesh face center points (or alternatively: mesh vertices) are sampled into a pts collection.
2. Hard flat coPlanarity clustering is attempted yielding pts/planes in equivalent DataTrees.
3. Planar Breps are made with respect to the planes (like the black things captured above) and sampled, say, into a breps List.
4. The method Brep[] solids = Brep.CreateSolid(breps); is used to attempt to create your desired "engulfing" brep. Mind you, this method is very slow (other, waaaay faster approaches are also available).
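The coPlanarity "rule" from step 2 can be sketched in plain Python — a naive greedy version with no octree acceleration or parallelism; the function names and tolerance handling are my own assumptions:

```python
# Hedged sketch of hard flat coPlanarity clustering: seed a plane from three
# points, absorb every remaining point within tolerance of that plane, repeat.
def coplanar_clusters(points, tol=1e-6):
    def plane(p0, p1, p2):
        # unit normal = (p1 - p0) x (p2 - p0); plane returned as [A, B, C, D]
        u = [p1[k] - p0[k] for k in range(3)]
        v = [p2[k] - p0[k] for k in range(3)]
        n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
        m = sum(c * c for c in n) ** 0.5
        if m < tol:
            return None  # degenerate (collinear) triple
        n = [c / m for c in n]
        return n + [-sum(n[k] * p0[k] for k in range(3))]

    remaining = list(range(len(points)))
    clusters = []
    while len(remaining) >= 3:
        i, j, k = remaining[0], remaining[1], remaining[2]
        pl = plane(points[i], points[j], points[k])
        if pl is None:
            remaining.pop(1)  # skip a collinear point and retry
            continue
        A, B, C, D = pl
        member = [idx for idx in remaining
                  if abs(A*points[idx][0] + B*points[idx][1] + C*points[idx][2] + D) < tol]
        clusters.append((pl, member))
        remaining = [idx for idx in remaining if idx not in member]
    return clusters
```

The re-evaluation/absorption pass (merging clusters with the same A,B,C,D) and the octree partitioning would sit on top of this core test.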
…
on excel (leaving the 0,0 cell blank and also making sure there are no commas in the names). Also, let's call the names "ID".
2 - For the weight, use numbers ranging from 1 to 10, where 10 is the highest dependency.
3 - Save the file as a Unicode CSV from Excel.
4 - Create another file in Excel that has the attributes of your spaces, with the names of your spaces under the header ID (let's start with simple "area" and "SNo" attributes, but you could add more features for sorting and manipulating your data).
5 - Open Gephi and open your matrix CSV file.
6 - Import it as "," (comma-delimited) and make sure you check "matrix" for the data type.
7 - Ensure the import is non-directional as well (or Gephi adds silly arrows).
8 - Not gonna go into the Gephi bit too much, but select a Force Atlas layout and set the force to something high (1000 or 10000 depending on the size of the data) and the attraction to a 1000th of that (1 or 10). Go to the Data Laboratory, import your Excel file with the attributes and append it to your existing datasheet.
9 - Set the node attributes to use the area for the node size and the color scheme to SNo.
10 - Play around with all the layout options and finally go to your preview. Once you're happy with it, export it to a GDF graph file.
The GDF now has the coordinates of the circles and their diameters, as well as the edge connections.
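Reading the node and edge sections back out of the GDF can be sketched like this — a rough parser, not the author's script; the column names in the sample below depend on your Gephi export settings, so treat them as assumptions:

```python
# Hedged sketch of a GDF reader: GDF files have a "nodedef>" header line,
# node rows, then an "edgedef>" header line and edge rows, all comma-separated.
def parse_gdf(text):
    nodes, edges = [], []
    node_cols, edge_cols, section = [], [], None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.lower().startswith("nodedef>"):
            section = "node"
            # each column spec looks like "x DOUBLE"; keep only the name
            node_cols = [c.split()[0] for c in line[len("nodedef>"):].split(",")]
            continue
        if line.lower().startswith("edgedef>"):
            section = "edge"
            edge_cols = [c.split()[0] for c in line[len("edgedef>"):].split(",")]
            continue
        cells = [c.strip() for c in line.split(",")]
        if section == "node":
            nodes.append(dict(zip(node_cols, cells)))
        elif section == "edge":
            edges.append(dict(zip(edge_cols, cells)))
    return nodes, edges
```

From the node dictionaries you can then build circles in GH (x, y for the centre, width for the diameter) and from the edge rows the connection lines.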
I've written a very amateur script that converts this to GH geometry (below).
Hope this helps someone out. I'm still figuring out the Gephi streaming API, but I've only started with Python about a month ago, so it might take a while to get there.
You can use the second half of the GDF files to also create dependency chord diagrams online as shown in the third image.
https://flourish.studio/2018/07/25/how-to-make-a-chord-diagram/
Cheers,
Sanjay
…
e actual method.
Below, I describe how they work:
1) drag "scheduleDay" onto the canvas
2) drag some Gene Pool lists onto the canvas and connect a number slider - from 0 to 3.
3) connect the Gene Pool list to the _genePool input. The component changes some important features of the Gene Pool list automatically. Now you have an LB_GenePool!!
4) choose the template that is suitable for you.
5) disconnect the LB_GenePool; if the templates are not good, you can change them manually
6) drag "Ladybug annual schedule" onto the canvas
7) connect the LB_GenePools to the inputs for the days of the week, the EPW file and, if you want, "_holiday" (this way holidays are taken into account). Now you have your simple schedule.
8) a small workflow to visualize it in Rhino.
9) connect "Ladybug annual schedule" to "Honeybee_Create CSV Schedule" to make your CSV schedule
You could make a schedule more complex than the one in the example above.
You can do that with _analysisPeriod input.
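The annual-schedule idea behind these components can be sketched in plain Python — note this is just the hour-assembly logic, not the actual EnergyPlus/Honeybee CSV format (the real "Honeybee_Create CSV Schedule" component handles that), and the function name and arguments are my own:

```python
# Generic sketch: build an annual (8760-hour) schedule from two 24-hour
# daily profiles, one for weekdays and one for weekends. The real Ladybug/
# Honeybee components support a profile per day of the week plus holidays.
def annual_schedule(weekday, weekend, first_day=0):
    """weekday/weekend: lists of 24 fractional values; first_day: 0 = Monday."""
    assert len(weekday) == 24 and len(weekend) == 24
    hours = []
    for day in range(365):
        dow = (first_day + day) % 7
        hours.extend(weekend if dow >= 5 else weekday)
    return hours  # 365 * 24 = 8760 values
```

An _analysisPeriod-style refinement would simply swap in a different daily profile for the days falling inside the chosen period.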
Best,
Antonello…
Tetrahedron: 24 Symmetries
Pyramid: 8 Symmetries
Design space = 24 X 8 = 192 permutations
So I decided to write a simple orientation script to iterate over all permutations. And this is the result. Below are some technical notes.
I used the vertices of the shapes to create a 3-point plane, and used it for orientation.
I used compound transform to combine multiple steps of transformation.
The Cross Reference component is very handy, generating all the possible combinations without worrying too much about data trees.
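The cross-reference idea maps directly to a Cartesian product; in plain Python the 192-permutation design space can be enumerated like this (the symmetry "indices" are placeholders for the actual transforms, not real orientation data):

```python
# Sketch of the Cross Reference idea in plain Python: pair every tetrahedron
# symmetry with every pyramid symmetry to span the whole design space.
from itertools import product

tet_symmetries = range(24)  # placeholder indices for the 24 orientations
pyr_symmetries = range(8)   # placeholder indices for the 8 orientations

design_space = list(product(tet_symmetries, pyr_symmetries))
print(len(design_space))  # 24 * 8 = 192 permutations
```

Each pair would then drive one compound transform (orient tetrahedron by its symmetry, orient pyramid by its symmetry, combine).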
The spatial relationship and the basic grammar A -> A + B and B -> A + B
The basic grammar and possible marker positions.
All results in 6 iteration steps
All results in 6 iteration steps (Top View)…
minativo (July 15, 16 and 17)
- MODULE 3. Scripting course with Grasshopper, Processing and Arduino: iterative modeling, interaction and sensing (July 22, 23 and 24)
More information and seat reservations.
Ask about the discounts for undergraduate and postgraduate students and for more than one member of the same studio!
The courses will be taught in Madrid by two Authorized Rhino Trainers.…
diverse group of design participants from afar for a full schedule of exchanges with leading practitioners, practices, fabrication labs… all while exposed to European transit infrastructure... trains planes & even a few mountain roads. LaN FLIGHT EUROPE marks LaN's fifth initiative on-the-fly & our first in EUROPE. JOIN us for the full trip or the leg that suits your interests. LaN is looking to attract a geographically diverse group of students & professionals with various design backgrounds. LaN FLIGHT 2012 EU is co-piloted by LaN Monika Wittig & Co-de-iT Andrea Graziano. LaN FLIGHT is looking for highly ambitious-adaptable-endurance oriented participants to fully embrace the nature of this curated experience. Please take a look at our 3 previous editions to best judge if this type of experience suits you. If you are willing to allot 8 days of your life to this pursuit and have no allergies to extreme mobility & group dynamics… welcome to LaN FLIGHT.…
mport the geometry again.
Right?
How about this? I add an extra object called something like "Geometry Cache". You have to give it a unique name. If you plug geometry data into the left side of this component, it will bake all that geometry and attach UserStrings to all those objects like "<name>: {0;0;3}(8)" where <name> would be your name and the rest is the exact location of that piece of geometry in a DataTree. It should probably also delete any objects already in the 3dm file that have that custom name/data assigned to them.
If you don't plug any wires into the left side, it will instead search the 3dm file for all geometry with the appropriate user data, load them into a correct DataTree and supply that data to whoever plugs into the right side.
If you plug wires in both ends, it will just function as a generic Geometry Parameter.
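The proposed "&lt;name&gt;: {0;0;3}(8)" tag is easy to round-trip; here is a hedged sketch of encoding and decoding it (function names and the exact string layout are just my reading of the example above):

```python
# Sketch of round-tripping the proposed user-string tag: a cache name, a
# DataTree branch path like {0;0;3}, and the item index within that branch.
import re

def encode_tag(name, path, index):
    return "%s: {%s}(%d)" % (name, ";".join(str(p) for p in path), index)

def decode_tag(tag):
    # name, then "{a;b;c}", then "(i)"; returns None for non-matching strings
    m = re.match(r"^(.*): \{([\d;]+)\}\((\d+)\)$", tag)
    if not m:
        return None
    name, path, index = m.group(1), m.group(2), m.group(3)
    return name, tuple(int(p) for p in path.split(";")), int(index)
```

The decode side is what the "harvest" direction would use to rebuild the DataTree from tagged objects in the 3dm file.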
It might be tricky to write a good event handler for this thing, maybe I'll just restrict myself to an UPDATE NOW! button on the object itself, so you can trigger an update manually.
ps. benefit of this approach is that everyone can create and harvest geometry with such user text, whether they use Grasshopper or not.
--
David Rutten
david@mcneel.com
Poprad, Slovakia…
me)
And got the same result as you did. Suddenly the definition started working. Although I got this error message when I opened the compression tension null.gh file:
Message log start (chronological):
--------------------------------------------------------------------------------
Plugin version: 0.8.0066
Input parameter chunk is missing. Archive is corrupt.
Output parameter chunk is missing. Archive is corrupt.
Output parameter chunk is missing. Archive is corrupt.
Output parameter chunk is missing. Archive is corrupt.
Why is that?
Can I dare to ask you a few more questions?
2) I want all of my members to be made of solid (not hollow) circular cross-sections.
Does that mean that my diameter and thickness need to have the same values? Like this:
?
3) I have wind load from 8 directions. Is there a way in Karamba to create load groups and choose the one with the most extreme values (group that will be used as the most relevant one for dimensioning)?
Thank you.…
are just the 8 cases, so you're actually doing it right here (scroll down on this page and you'll see a separate subset all about marching tetrahedrons: http://paulbourke.net/geometry/polygonise/). The benefit of using marching tetrahedrons is exactly this: the number of possible "cuts" through the tetrahedron is dramatically smaller than the number through a cube.
However, I have also found, as you're seeing, that the linear interpolation creates some odd distortions (which is why I went ahead and later did the marching cubes implementation). Some of this comes from the density of the sampling grid: the denser it is, the fewer the distortions.
What I would suggest, if you want a (relatively) quick way to improve this outcome:
1) build up a full mesh rather than a bunch of surfaces, and use RhinoCommon to combine identical vertices and rebuild the vertex normals
2) run a couple of rounds of Laplacian smoothing on the mesh to better distribute your vertices (for each vertex, make its location equal to the average of its neighbours)
3) create a line normal to each vertex, roughly the length of your sampling grid, test its endpoints against your scalar field formula, and then do one final linear interpolation between those two points for your vertex.
This should give you a smoother mesh for sure.
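Step 2 above can be sketched in plain Python — neighbour lists would come from the mesh topology (e.g. RhinoCommon's vertex connectivity); all names here are mine:

```python
# Sketch of Laplacian smoothing: each vertex with neighbours moves to the
# average position of those neighbours; vertices with none stay put.
def laplacian_smooth(vertices, neighbours, rounds=1):
    """vertices: list of (x, y, z); neighbours: per-vertex lists of indices."""
    pts = [tuple(v) for v in vertices]
    for _ in range(rounds):
        new = []
        for i, nbrs in enumerate(neighbours):
            if not nbrs:
                new.append(pts[i])  # no neighbours recorded: leave fixed
                continue
            new.append(tuple(sum(pts[j][k] for j in nbrs) / len(nbrs)
                             for k in range(3)))
        pts = new
    return pts
```

In practice you would keep boundary (naked-edge) vertices fixed, or the mesh shrinks at its rim; too many rounds also shrinks the whole mesh.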
But good work getting this far! …
Added by David Stasiuk at 1:37am on February 6, 2015