h is attached below. It's an arch, let's say made of bricks, with loads represented as point loads taken from user-defined surfaces (to represent self-weight, for example).
The goal of the study is to find a supporting arch that lies inside the actual arch; if one exists, then (provided the strength of the bricks is high enough) the arch is OK. Otherwise one has to change the geometry of the arch (make it thicker, or change the rise of the crown).
Therefore I used Kangaroo in combination with Galapagos to find a catenary that fulfils the boundary conditions. It works very well, and the solution Galapagos found is very satisfying.
It's simple to check the resulting forces in the arch if one knows its rise.
According to the formula N ≈ q*l^2/(8*f), with q = 21 kN/m, l = 10.00 m and f = 0.965 m, one gets N = 272 kN, which is very close to the solution Kangaroo finds in the middle of the arch (271 kN). Due to the point loads this force has to increase slightly the nearer one gets to the anchor points, and this works perfectly too.
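For anyone who wants to reproduce the check, a minimal Python sketch of that formula, with the values from this example:

# sanity check of the thrust estimate N ~ q*l^2/(8*f)
q = 21.0   # distributed load, kN/m
l = 10.0   # span, m
f = 0.965  # rise, m
N = q * l**2 / (8 * f)
print(round(N), "kN")  # -> 272 kN, close to Kangaroo's 271 kN at the crown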
But there is one irritating thing.
At the end of the catenary, near the anchor points, Kangaroo reports two very different arch forces: 281 kN in the next-to-last segment and 581 kN in the very last one. This is not possible, and I am sure the value of 581 kN is wrong. I calculated the example with a commercial FEA program too; it validates the Kangaroo results except for the first and the last value.
I think there is a problem with the calculated length of the first (and last) element; they come out twice as long as they should. Or am I doing something wrong?
Thank you for any reply, and again for your work.
Best, Peter
…
Added by pb to Kangaroo at 10:25am on October 22, 2011
ion and fabrication in a single process.
For this workshop, a set of techniques and strategies has been selected for solving problems that arise today in the digital design and fabrication of complex and Euclidean forms.
Working in two environments, between interactive techniques and algorithmic solutions, the workshop examines concepts and case studies that will let participants decide how and when these technologies can be used as allies in design and fabrication processes. Taking Rhino as the basic platform, the design and fabrication of complex topologies is explored and optimized in the Grasshopper, RhinoNest and RhinoCam environments.
The D.O.F (Design-Optimization-Fabrication) workshop will take place in February 2010 (February 23 to 26) at McNeel Argentina.
It is open to everyone, and participants will receive a Rhino 4.0 license.
Basic knowledge of Rhino 3.0 or 4.0 is required for the workshop.
Contents:
1. Advanced modeling and its techniques. Flattening and development of surfaces. Nesting and distribution.
2. Introduction to parametric design. Advanced Grasshopper definitions, possibilities and limitations. Scale adjustments for printing and cutting.
3. Introduction to CNC manufacturing - RhinoCAM 2.0. Visit to the CAM laboratory.
4. Step-by-step guide to producing a rendering with Brazil 2.0. Digital presentation of projects.
The workshop runs 32 hours (4 days x 8 hours per day; schedule 9 am to 1 pm and 3 pm to 7 pm).
Instructors
Andres Gonzalez Posada - McNeel Miami - Grasshopper - RhinoCAM - RhinoNest
Facundo Miri - McNeel Argentina - Brazil for Rhino
It will be held at McNeel Argentina,
Ciudad de la Paz 2719 3A - Belgrano - Capital Federal.
Course fee
US$250 + VAT: D-O-F course WITHOUT a Rhino 4 license
US$350 + VAT: D-O-F course with an Educational Rhino 4 license (teachers and students only). License alone: US$195
US$995 + VAT: D-O-F course with a Commercial Rhino 4 license (professionals and companies). License alone: US$995
Contacts:
Facundo Miri (54-011) 4547-3458
facundo@mcneel.com
McNeel Argentina
Robert McNeel & Associates
McNeel Seattle - Miami - Buenos Aires
Ciudad de la Paz 2719 3A
www.rhino3d.TV - www.rhinofablab.com
Those interested can call 4547-3458 or email facundo@mcneel.com.
Those outside the city can make a bank deposit (request the account details by email) and email the deposit receipt with the following details:
Full name - DNI - Date of birth - Landline phone - Mobile - Email address.
Many thanks
You can find the prices at http://www.rhino3d.com/sales/order-la.htm; just click on the "Commercial" or "Student" tab.…
Added by Facundo Miri at 1:10pm on December 10, 2009
TB of RAM. I think I'm going to start a GoFundMe campaign to buy one for myself :)
2- The server costs about $13 an hour. I get free access to supercomputers through my university and xsede.org because I earned an NSF honorable mention last March; however, the supercomputers available through both resources are a little complicated for me to use, as opposed to the Amazon one, which comes with Microsoft Server 2012 already installed.
3- I wanted to run 400 annual glare simulations for 400 different views.
4- I tried to perform an annual glare simulation for one view on my Dell XPS, which has an Intel Core i7-6700HQ processor and 16 GB of system memory. The simulation took 2 hours to complete; the Radiance parameter ab was set to 6.
5- I wanted to obtain the batch file for each view so I could run them on the server. So I used the fly component to start all 400 simulations and closed the cmd windows. That wasn't bad (for me at least) because I asked my son to do this job for me; he was just glad to help me :)
6- I created one batch file using this cmd command:
dir /s /b *.bat > runall.bat
This created a file with the path to each .bat file. I edited this file in Notepad++ to include the word "start" at the beginning of each line, using the "find and replace" dialog.
7- I split my newly created batch file into 3 batch files, each with about 130 file names prefixed with "start" (a scripted version of steps 6-7 is sketched after this list).
8- Installed Radiance on my server.
9- Ran the first batch file on the server. This started 130 cmd windows performing my simulations; CPU usage was anywhere between 90% and 100%, and about 105 GB of RAM was used.
10. It took about 5 hours to complete all 130 simulations. I expected them all to run in 2 hours, but I can't complain, because this would have taken about 260 hours on my laptop. After those simulations were done I ran the second and then the third batch file (about 15 hours in total).
11. I got 400 valid dgb files. Couldn't be happier!
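For what it's worth, steps 6-7 can be scripted instead of done by hand. A minimal Python sketch, assuming the generated .bat files live under a "simulations" folder (the folder name and chunk count are made-up placeholders):

import glob, os

# collect every generated .bat file and prefix it with "start" so each
# simulation launches in its own cmd window, in parallel
bats = glob.glob(os.path.join("simulations", "**", "*.bat"), recursive=True)
lines = ['start "" "%s"' % b for b in bats]

# split into 3 runner files of roughly equal size
chunks = 3
size = -(-len(lines) // chunks)  # ceiling division
for i in range(chunks):
    with open("runall_%d.bat" % (i + 1), "w") as out:
        out.write("\n".join(lines[i * size:(i + 1) * size]))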
…
ns about them.
It's a direction for Kangaroo I very much intend to continue developing - and I am still getting to grips with the possibilities and experimenting with how different optimization and fairing forces work in combination with one another, so I would value your input and experience.
For those interested in some background reading material -
[1] http://www.cs.caltech.edu/~mmeyer/Research/FairMesh/implicitFairing.pdf
[2] http://mesh.brown.edu/taubin/pdfs/taubin-eg00star.pdf
[3] http://www.pmp-book.org/download/slides/Smoothing.pdf
[4] http://graphics.stanford.edu/courses/cs468-05-fall/slides/daniel_willmore_flow_fall_05.pdf
[5] http://www.evolute.at/technology/scientific-publications.html
[6] http://www.math.tu-berlin.de/~bobenko/recentpapers.html
[7] http://spacesymmetrystructure.wordpress.com/2011/05/18/pseudo-physical-materials/
[8] http://www.evolute.at/technology/scientific-publications/34.html
[9] http://www.evolute.at/software/forum/topic.html?id=18
At the moment the Laplacian smoothing is uniformly weighted, which tends to even out the edge lengths as well as smoothing the form, which is sometimes desirable, and sometimes not. It also tends to significantly shrink meshes when the edges are not fixed.
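For those who haven't implemented it before, here is a minimal sketch (not Kangaroo's actual code) of uniform-weight Laplacian smoothing in Python, on a mesh given as vertex positions plus an adjacency list. It shows both effects mentioned above, since every free vertex drifts toward the centroid of its neighbours:

def laplacian_smooth(points, neighbours, fixed, strength=0.5, steps=10):
    # points: list of [x, y, z]; neighbours[i]: indices of vertex i's 1-ring;
    # fixed: set of anchored vertex indices
    pts = [list(p) for p in points]
    for _ in range(steps):
        moved = []
        for i, p in enumerate(pts):
            if i in fixed or not neighbours[i]:
                moved.append(p)
                continue
            n = len(neighbours[i])
            centroid = [sum(pts[j][k] for j in neighbours[i]) / n for k in range(3)]
            # uniform weights: step toward the 1-ring centroid
            moved.append([p[k] + strength * (centroid[k] - p[k]) for k in range(3)])
        pts = moved
    return pts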
I plan to try some of the other weighting possibilities, such as Fujiwara or cotangent weighting (see [1] and [3]), as well as other fairing approaches, such as Taubin smoothing [2], Willmore flow [4], and so on. This also has applications in the simulation of bending of thin shells.
Planar quad panels are often desirable, but I'm finding that planarization forces alone are sometimes unstable, or cause undesirable crumpling, so need to be combined with some sort of fairing/smoothing, but the different types have quite different effects, and the balance is sometimes tricky.
There's also the whole issue of meshes which are circular (I posted a demo of circularization on the examples page), or conical (this one still isn't working quite right yet), and their relationship with principal curvature grids and placement of irregular vertices, all of which is rather different when the whole form is up for change, rather than having a fixed target surface [7].
I'm also trying to get to grips with ways of making surfaces of planar hexagons, which need to become concave in regions of negative Gaussian curvature (see this discussion),
and I hope to soon release a component for calculating CP meshes, as described in [8], which I think could have many exciting construction implications.
While there are a number of well developed smoothing algorithms, their main area of application so far seems to be in processing and improving 3D scan data, so using them in design in this way is somewhat new territory. There can be structural, fabrication or performance reasons for certain types of smoothness, but of course the aesthetic reasons are also often important, and I think there are some interesting discussions to be had here about the aesthetics of smoothness.
Anyway, that's enough rambling from me, hopefully something there triggers some discussion - I'm really keen to hear about how all of you envision these tools might be used and developed.
…
st work on lists? There may be a good reason for this, I just couldn't work it out while skimming the code.
2) I'd recommend declaring variables at the last possible moment, not all at the top of the file. Declaring everything up front makes it very difficult to see which variable is used where. Also, if you change the code, it's a lot of work figuring out which variables have just become obsolete.
3) In VB.NET you can declare for loop iteration variables inside the loop, cleaning up the code: For t As Integer = 0 To X
4) If statements with conditionals should not be written like this: If (value = False) Then. There's nothing technically wrong with it, but the general rule is to write If (Not value) Then or If (value) Then.
5. Things like k = k+1 can be written shorter in VB.NET, namely k += 1. I just think that looks cooler :)
6. In VB.NET, Exit Sub is still legal (for legacy purposes) but the Return keyword is to be preferred.
7. I'm happy to see you're using sensible variable names and casing.
8. For a program like Grasshopper, one would expect to get the same results when the same setup is run at a later time. That means creating Random instances with a fixed seed value, not DateTime.Now.Millisecond. If your result depends in any way on the seed value, it should be kept constant.
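The point about seeds is language-agnostic; a quick Python illustration (not from the code under review) of why a fixed seed gives reproducible results:

import random

rng = random.Random(42)   # fixed seed: identical sequence on every run
print([rng.randint(0, 9) for _ in range(5)])

rng = random.Random()     # time-seeded: a different sequence each run
print([rng.randint(0, 9) for _ in range(5)])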
On the whole, pretty good work; the code is quite self-documenting, properly commented and fast. Hats off.
--
David Rutten
david@mcneel.com
Poprad, Slovakia…
2. See this? It's an abstract (for the moment) layout of some WIP thing: imagine a region where "evenly" random points are placed (and then a random zNoise is added), then a ball-pivot/Delaunay triangulation is applied ... then ... :
2. I use my own method to create "even" points (I suspect that David's is way better/faster/cooler, but anyway): after a random point is found inside the region, an additional check is performed. Think of the point as a "candidate" that must pass a second constraint: if its minimum distance to all the already-found (random) points is smaller than a user-defined value, reject it and try again (the number of "try-again" attempts [call them min-distance "loops"] is also user-controllable). Thus the C# shown in the capture attempts to place 122 points, but due to (a) the min-distance constraint and (b) the low number of "try-again" loops (about 8 in this case), it finishes with "only" 59 (not a big deal for this case). The interesting part is that it needed 1573 attempts (~30 times the number of random points returned). Of course a lot of factors affect this 1573 (variable) figure ... but don't stick to that.
3. So if David uses a "similar" culling method (add some " " more, he he) ... for 80K points ... well, we are talking about a BIG number of attempts.
I can provide you with a "non-even" random points C# that (I assume/guess/hope) can speed things up a bit (after all, who's going to notice an "even" random distribution of 80K points within a micro cube?).
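For reference, a minimal sketch of that min-distance culling in Python (the original is C#; the region size, point count and retry limit below are made-up values):

import random

def even_points(count, size, min_dist, retries=8):
    # reject a candidate if it lands closer than min_dist to any
    # accepted point; give up on it after `retries` attempts
    pts, attempts = [], 0
    for _ in range(count):
        for _ in range(retries):
            attempts += 1
            p = (random.uniform(0, size), random.uniform(0, size))
            if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= min_dist ** 2 for q in pts):
                pts.append(p)
                break
    return pts, attempts

pts, attempts = even_points(122, 10.0, 1.0)
print(len(pts), "points accepted after", attempts, "attempts")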
best, Peter
…
s the "Surface Populating" definition: I manage to populate my geometry over the surface, but after I bake it, I have to delete the boxes that define my components limits as well! Is there any way of populating and baking only the chosen component, without having to delete the boxes afterwards?
Secondly:
Basically: I am trying to cover a surface with two types of components [ an open one and a closed one] , which will be proliferated over my tubular surface according to the main sunlight direction.
1. I introduce the surface component.
2. I use "Divide Interval2" in order to have division into U and V.
3. i generate the target boxes [ "surfaceBox"] .
4. I use "Isotrim" ( same intervals) and "BRepArea" to find centroid of each area.
5. My "Curve" component introduces sun angle, with its "End Points".
6. I use "Vector 2Pt" to specify sun-light direction.
7. I want to measure the angle between sun-light and the surface normals, at the position of each component; after generating the centre points, I need the normals of each centre point to get the surface's points' UV, and "Evaluate" the srf at points.
8."Angle" and "Vector" components: I use them in order to evaluate the angle between the sun direction and the srf.
9. I convert this angle to degree by using a "Function" [ to see if the angle is bigger from the max.angle or not...]
10. Function "x,y" gives me boolean data.
11. Data become "Dispatch"ed...
12. Two "Morph" components , each one linked to one part of the "Dispatch" data, generate "closed" and "open" components over the srf.
The result should have been different types of components, based on the surface's curvature and direction and on the sunlight direction...
I do not understand where the mistake is in this definition...
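For clarity, here is a rough Python sketch (not the GH definition itself) of what steps 6-11 are meant to compute; the sun vector, the normals and max_angle are made-up stand-ins:

import math

def angle_deg(a, b):
    # angle between two 3D vectors, in degrees
    dot = sum(x * y for x, y in zip(a, b))
    mag = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

sun = (0.3, -0.5, -0.8)                           # assumed sun direction
normals = [(0, 0, 1), (0.7, 0, 0.7), (1, 0, 0)]   # assumed surface normals
max_angle = 90.0                                  # assumed threshold

pattern = [angle_deg(sun, n) > max_angle for n in normals]          # steps 9-10
open_cells = [n for n, keep in zip(normals, pattern) if keep]       # step 11
closed_cells = [n for n, keep in zip(normals, pattern) if not keep]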
Thanks in advance!
Spyros K.…
Component that manages a CSV like a database. I found that the native Python csv module doesn't work in IronPython because it's compiled in C (or something like that), so I found a .NET module that can be imported into IronPython to work with CSV files.
So here is pseudocode of my thoughts (a minimal sketch follows the list):
1 - Import the .NET module.
2 - Read the CSV file, from a path assigned by interactive input.
3 - Identify the header of the CSV file.
4 - Create a list of the fields found in the CSV header for further use (sorting, retrieving specific records, creating relations with other files through common fields, etc. - whatever you can imagine).
5 - Read the data of the CSV file.
6 - Output the attributes of the header (a simple list of the fields, to identify what we are dealing with).
7 - Output a list of all the attributes of the header (a complete list of the fields by rows found in the CSV file).
8 - Output a list of all the data inside the CSV file in relation to point 7.
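A minimal sketch of steps 2-8 using plain string operations, so it runs under IronPython without the csv module (the path "data.csv" and the comma delimiter are placeholder assumptions; quoted fields containing the delimiter are not handled):

def read_csv(path, delimiter=","):
    with open(path) as f:
        lines = [ln.rstrip("\r\n") for ln in f if ln.strip()]
    header = lines[0].split(delimiter)                 # step 3: the header
    rows = [ln.split(delimiter) for ln in lines[1:]]   # step 5: the data
    return header, rows

header, rows = read_csv("data.csv")
print(header)   # step 6: simple list of the field names
print(rows)     # step 8: all the data, row by row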
Those are my thoughts. Any comments or suggestions for the component?
I will be posting a GitHub repository for this, for all who want to collaborate.
I'm doing this because I'm dealing with a large amount of data in my thesis project, which focuses on managing shapefiles and CSV files of data that are not in the same dataset, so I need to relate a lot of things just to develop my analysis techniques for the city.
…
nded from the centerline at a specified thickness, which may vary along the stent. Two parameters, t_mid and t_end, control thickness variation along the segment's longitudinal direction according to the kinematically admissible Hermitian curve:

t(d) = t_end + Δt*(3*d^2 − 2*d^3), 0 ≤ d ≤ 1
Δt = t_mid − t_end

where d is the normalized distance along the segment's NURBS curve between its endpoint and midpoint. This form ensures there are no discontinuities in thickness at the segment midpoint or at the interface of segments in the overall stent.
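A minimal numeric sketch of that thickness rule in Python (the t_mid and t_end values below are illustrative only):

def thickness(d, t_mid, t_end):
    # d = 0 gives t_end, d = 1 gives t_mid; the derivative of
    # 3d^2 - 2d^3 vanishes at both ends, which is what makes the
    # thickness join smoothly at midpoints and segment interfaces
    dt = t_mid - t_end
    return t_end + dt * (3 * d**2 - 2 * d**3)

print(thickness(0.0, 0.12, 0.08))  # 0.08 at the endpoint
print(thickness(1.0, 0.12, 0.08))  # 0.12 at the midpoint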
2) These normals are then checked and subjected to a filleting procedure to remove re-entrant corners, resulting in two external sets of boundary coordinates defining the external edges of the segment's mid-plane.
3) This plane of nodes is then extruded at a specified angle (see Section 2.3) to generate a 3-D set of nodes defining the finite element mesh.
4) These nodes are then mapped to cylindrical coordinates.
5) Finally, 20-node brick elements are generated for finite element analysis.
I have generated the centerline using 8 control points with degree 7.
It would be great if anyone could help me with drawing the normals from the centerline, with a specified length governed by the Hermitian equation, so as to generate the 2D model of a stent in Grasshopper (please refer to the attached figures of a 2D stent and the extruded version).
The centerline of a single stent segment is represented as a NURBS curve.…
o it would cause trouble with unfolding and fabrication... that's why I used the Extrude Point component - it will give you a similar result, but all surfaces are planar... you can control the extrusion direction with a tip point in Rhino...
2) I changed the tagging so every tube has 8 points from list A and 8 points from list B... the first number of the tag is the number of the point within one tube... the last number of the tag is the order of the tubes (I drew a little picture in GH, hope you'll understand)... I think the original way of tagging wasn't really useful... but you can change the tagging yourself...
3) the definition is really messy, sorry about that, but it's just quite a complicated task...
4) if you find some incorrect order of tagging, use the slider that controls the Shift List component... it will shift the tagging...
5) if you won't be using this definition or find some better way, pleeeease don't tell me - I'll jump out the window :D ... it took me a whole day to make it work :D
6) I can't guarantee anything - I hope it works, but if not, at least I tried... so check everything (especially the order of tags and points) twice before you fabricate it... or print a few tubes and make them in paper first...
7) there is a part of the original definition that is not useful anymore... I left it there, but you can delete it (I called it "UNUSED PARTS OF ORIGINAL FILE")
..good luck
Dimitri…