decided to expose the real thing (something that is under construction these days, a "bit" bigger than the "demo" examples provided up to this point - around 25K m2). This means a more complex definition (a "bit"). The real thing is not made with GH.
4. Membranes are treated in 2 ways: Kangaroo (bad news: you receive a mesh that is anathema for shop-drawing level studies), or MinSurf + trim (NURBS on hand, but with the obvious disadvantages as regards "relaxing" the other components - more code, less time for windsurfing)
5. Puzzle: do you know that big "umbrella" thing made recently in Saudi Arabia? (colossally big membranes that ...er... hmm..."fold" - 1M times the size of an umbrella). Never heard about that? Shame I must say, he he.
best, Ron (actually I mean Stan).…
Say you have a mesh with 100 vertices (points). The first one is at index 0, the second one at index 1, then 2, 3, etc., all the way to 99. A face might connect vertices 0, 1, 22 and 23.
You typically don't use this kind of low level method to create a mesh, though of course there's nothing stopping you. Most meshes are either the result of some operation on existing meshes, the approximate mesh of a surface/brep or based on one of the mesh primitives such as Plane, Box or Sphere.
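A minimal sketch of this index-based layout in plain Python (not Rhino's actual mesh API - the vertex grid and the `is_valid` helper are purely illustrative):

```python
# A mesh is just a list of vertices plus faces that reference them by index.
vertices = [(x, y, 0.0) for y in range(10) for x in range(10)]  # 100 points, indices 0..99

# A quad face is four vertex indices, e.g. connecting vertices 0, 1, 22 and 23.
faces = [(0, 1, 22, 23)]

def is_valid(faces, vertex_count):
    """Every face index must point at an existing vertex."""
    return all(0 <= i < vertex_count for face in faces for i in face)

print(is_valid(faces, len(vertices)))               # True
print(is_valid([(0, 1, 100, 23)], len(vertices)))   # False: index 100 is out of range for 100 vertices
```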
--
David Rutten
david@mcneel.com
Poprad, Slovakia…
understand the way that GH manages collections (Lists and DataTrees) and then do whatever. 99% of issues that users have are due to poor knowledge of how to "sample" (and "manage") things in collections.
2. Read 1.
3. Next: At least 1Z definitions are available in this Noble Forum ... so get some and do something.
best…
you see around you, they lack any zen: that allows the 1% to shine.
2. Each thing that you do/design never dies inside you: let's call it an "event". The sum of them formulates you as an Engineer. If some "events" are faulty (or worse: shallow), your kaleidoscopic sampling is disturbed proportionally to the sum of the bad "events".
3. "Events" MUST withstand Time: there's nothing worse than fading reasons for doing this or that.
4. "I want" means nothing (anyone wants this or that, so what?). "I did it because" means - maybe - something.
5. Cost (and modesty) should always be your FIRST concern: forget form exploitation(s), new frontier(s) and other similar nonsense > they just yield another pointless waste of resources and guide you gradually into an amoral state of the worst kind. Remember > 1 dollar per day can save the life of one kid in Africa > in this context it's rather hard to justify ... well ... pretty much anything.
6. If you can do anything imaginable, this means that you should do LESS; otherwise you'll find yourself trapped in a rabbit hole that has many entries but not a single #^%^# exit.…
all the other rules.
2. No Flattening! Use Path Shift / Trim Tree instead of flattening.
3. No Path Mapper! I have never met a data operation with the path mapper that could not be achieved through relative means.
4. No Simplify! It makes things *look* nicer, but believe it or not, those zeros are meaningful and shouldn't just be eliminated. If you are OCD about the way your paths look, then use Path Shift after every operation that introduces a new branch level (a new "0" at the end) - IF AND ONLY IF you are sure that, in the case of your definition, the component will always function "1 to 1", that is, for every single input there is only one output.
5. If you absolutely must flatten (to take a global bounds, or generate random values for every item, or whatever) be sure to Unflatten before continuing.
6. Design for the worst case - start with primary inputs in the most complex data structure your definition is likely to need to be able to handle (a tree for instance) rather than a single item.
If you follow the above rules, 99% of the time your definitions will respond appropriately to any change in upstream data structure. If you want an example of how this works in practice, post your definition and I can help find "relative" approaches to the "absolute" things you are currently doing. …
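Rule 5 (and the Path Shift idea from rule 2) can be sketched with a toy stand-in for Grasshopper's data trees - paths mapped to lists of items. The dict layout and helper names here are assumptions for illustration only, not GH's internals:

```python
# A toy "data tree": branch paths (tuples of indices) -> lists of items.
tree = {(0, 0): ['a', 'b'], (0, 1): ['c'], (1, 0): ['d', 'e', 'f']}

def flatten(tree):
    """Flatten to one list, but record the branch structure so it can be restored."""
    record = [(path, len(items)) for path, items in sorted(tree.items())]
    flat = [item for _, items in sorted(tree.items()) for item in items]
    return flat, record

def unflatten(flat, record):
    """Rebuild the original branches from the record (rule 5: always unflatten)."""
    out, i = {}, 0
    for path, n in record:
        out[path] = flat[i:i + n]
        i += n
    return out

def shift_path(tree):
    """Like Path Shift with offset -1: trim the last index off every path,
    merging branches whose trimmed paths collide."""
    out = {}
    for path, items in sorted(tree.items()):
        out.setdefault(path[:-1], []).extend(items)
    return out

flat, record = flatten(tree)
assert unflatten(flat, record) == tree   # flatten/unflatten round-trips
print(shift_path(tree))                  # {(0,): ['a', 'b', 'c'], (1,): ['d', 'e', 'f']}
```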
whole mesh. So if you have a mesh with 100 vertices, face 1 might point to vertices (0,1,2,3) and face two can look like (99,1,2,98) - remember, with 100 vertices the valid indices run from 0 to 99. This only means that vertices 99 and 98 are connected with vertices 1 and 2 to form a face.
The face normal (inside/outside) is determined by vertex order. So if the mesh wraps around, you are bound to find some vertices traversed in reverse order. This is necessary to have all faces point outward.…
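A quick sketch of why winding order determines the normal (plain Python with illustrative names): the normal comes from the cross product of two edge vectors, so reversing the vertex order flips its sign.

```python
def sub(a, b):
    """Component-wise vector difference a - b."""
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(u, v):
    """3D cross product."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def face_normal(p0, p1, p2):
    """Unnormalized normal of a triangle, taken in winding order."""
    return cross(sub(p1, p0), sub(p2, p0))

quad = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
print(face_normal(quad[0], quad[1], quad[2]))   # (0, 0, 1)  -> points "up"
print(face_normal(quad[2], quad[1], quad[0]))   # (0, 0, -1) -> reversed winding flips it
```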
ten, he he > meaning that they deliver a (stunning, if things are going well) product that, when it fails in some method, reports: "unable to do it".
2. NEVER take the short route when coding: try to "report" everything that's happening; it's a bit boring, but you'll see the actual reason later, when you do real-life complex things.
More patch freaky stuff soon (but given the opportunity: where are my goats? Nothing has arrived so far).
May the Force (the pink option) be with you (I'll take the black option).…
combination is nearly 0 (of course with 1 try). You have about 100 (?) dimensions... it's just impossible to do it well, even with billions of random genotypes for the 1st generation.
It's like 1:googol (10^100) to succeed. If you try to run it on your PC, you'll probably consume all the energy in the universe, and it will take longer than the universe will exist.
Sorry :(
EDIT: As David wrote in his post - every added dimension roughly halves the "success ratio". So if with one slider you have e.g. a 1:2 ratio of success, with 100 sliders you have:
1:1267650600228229401496703205376 (2^100)
To somebody more familiar with math -> correct me if I am wrong :)…
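The scaling argument can be checked in a couple of lines (assuming, as above, that each slider independently halves the odds of a random try succeeding - a rough model for intuition, not how a real evolutionary solver searches):

```python
def odds_denominator(sliders, per_dim=2):
    """Denominator of the success odds for one random genotype,
    if each of `sliders` dimensions independently halves the odds."""
    return per_dim ** sliders

print(odds_denominator(1))    # 2
print(odds_denominator(100))  # 1267650600228229401496703205376  (2**100)
```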
data in Grasshopper. So we learned a lot about data sources, environmental data, etc., and how to use them in GH. It was a great experience and was my start with GH. I knew it a little bit, but I learned so many things about it there. Not sure if they will do it again, but it was quite successful, so maybe they will.
I have not used ELK before, so I can't comment on it, but I can confirm that it took a really long time. But for the most part, even for a very large architectural project, you won't need OSM data for a whole city, especially not one as big as Paris. So if you want to use ELK, use it with normal-size maps, like a few hundred meters square, because 99% of projects won't be bigger than that. If they are, then you will have the proper resources ;)…
Added by Armin Seltz at 8:16am on November 6, 2015