h. For a while I have been thinking about the possibility of GPU computing for leveraging computation, but I am still mostly scratching the surface. I know it's something of a hot topic in scientific computing, and I know there is a community of parallel programming with Python... but that's further down the line of my development...
Anyway, I ended up just sorting all the points by their z value.
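In Python, that z-sort is a one-liner; a minimal sketch assuming the points are plain (x, y, z) tuples:

```python
# a handful of (x, y, z) points, e.g. from a point cloud
points = [(0.0, 0.0, 3.0), (1.0, 1.0, 1.0), (2.0, 2.0, 2.0)]

# sort by the z coordinate (ascending); pass reverse=True for descending
points_by_z = sorted(points, key=lambda p: p[2])
```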
So I have been able to use the vertices to extract a color gradient and draw lines aligned to the vertex normals. However, at the edges of the slabs I get diagonal normals, and they seem to be somewhat inconsistent. I would need to either round them to a certain angle or cull them out... not sure how to do this right now.
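One way to cull the inconsistent edge normals is to keep only those that lie within a fixed cone around a reference direction. A minimal NumPy sketch; the vertical reference direction and the 15-degree threshold are illustrative assumptions, not values from the model:

```python
import numpy as np

def cull_normals(normals, reference=(0.0, 0.0, 1.0), max_angle_deg=15.0):
    """Keep only normals within max_angle_deg of a reference direction;
    stray diagonal normals at slab edges fall outside the cone and are
    dropped. Reference and threshold are assumptions for illustration."""
    ref = np.asarray(reference, dtype=float)
    ref = ref / np.linalg.norm(ref)
    N = np.asarray(normals, dtype=float)
    N = N / np.linalg.norm(N, axis=1, keepdims=True)
    cos_t = np.cos(np.radians(max_angle_deg))
    keep = np.abs(N @ ref) >= cos_t   # abs(): treat flipped normals alike
    return N[keep]
```

Rounding instead of culling would work similarly: snap each normal to the nearest of a small set of allowed directions rather than discarding it.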
And I feel there should be a convenient way to organize and branch points by which face, mesh, or closed curve they belong to. I found a thread where points can be tested for surface inclusion. I have never used the "D" output of Surface Evaluate, but this might work.
Well anyhow the fun begins - facade design!
I want to detail rainscreen panels and offset them where there is more radiation falling over the course of the year.
I also want the southern facade to work like a solar chimney (in the lower half, like a climbing solar chimney) that channels hot air building up the facade from the sun all the way to the roof, and pulls air across the interior out of vents. So I need to further divide the facade surface breps into fixed rainscreens and operable shades.
My goal is to move the information over to detail and specify in Revit, and this would make a good occasion to test out the brand-new Mantis Shrimp. I hope I can run Revit + Dynamo and this analysis together :)
So I think I might be at the end of this particular thread - the answer to the RAM bottleneck is - DOWNSIZE YOUR SAMPLES!
…
rendo makes possible job positions that until recently were unthinkable. This new approach has the characteristic of coming close to computer programming, but in an easier way thanks to its visual components. Need one more reason to use Grasshopper? Here it is! Since the tool is still in testing (though perfectly functional), the application is completely free. Download your copy and start using it right away! Certified courses: the lessons are taught by Antonino Marsala, a McNeel-certified instructor with over 5 years of experience teaching Rhinoceros. In recent years we have followed the evolution of this plugin closely and have decided to invest in its potential. In February 2011, thanks to Antonino Marsala, Algoritmi Generativi was published, the Italian edition of Zubin Khabazi's book Generative Algorithms with Grasshopper. Both are free to download and are valuable tools for understanding the world of Grasshopper. For several months now, il Mandarino BLU has also been collaborating with La Bottega di Galileo in Pisa, a workshop for the free exchange of ideas, offering post-university training projects for those who want to enter the world of next-generation design. The collaboration with Multiverso has given rise to a broader training program developed in Florence at via Campo d'Arrigo 40r. Read our teaching program or download the PDF version…
2:
- Developing the winning design into a working application
- Testing
- Beers and BBQ
Details:
- Tutors: Gregory Epps, RoboFold founder, and Florent Michel, RoboFold software developer.
- See previous workshops here.
- Download Poster here.
- Please install Rhino 5, Grasshopper and Godzilla before this event.
- No previous experience with Grasshopper necessary.
- Hours: 10am-6pm.
- Location details: here.
***COMPETITION: THE BEST USE OF GODZILLA GETS A FREE PLACE***
Judged on creativity and practicality. Submit your name, affiliation and a link to your video to robots@robofold.com. We will add an additional place for the winner. Flights, accommodation etc. are not included...
Join us for the first Godzilla robot workshop - experiment with the easiest robot software on the Grasshopper platform.
More details and resources on: http://www.grasshopper3d.com/group/godzilla
Workshop Fee:
Student: £ 399
Professional: £ 599…
16-20 / PUEBLA JULY 23-27
This workshop is intended primarily for architects and designers interested in learning parametric and generative design applied to the generation and rationalization of complex geometries for implementation in different design processes. The course covers basic concepts and a methodology for addressing a range of design issues through the development of algorithmic tools in a visual programming language, together with digital fabrication schemes. Rhinoceros 3D and Grasshopper will be used as our modeling tools and V-Ray as our rendering engine. Monday to Friday, 10am-2pm and 4pm-8pm (40 hrs).
No previous knowledge of Rhinoceros 3D or programming required, CAD background desirable.
Students: 4,000 MXN / Professionals: 5,000 MXN. Info: workshop@3dmetrica.com, 044 55 28790084, www.3dmetrica.com
www.facebook.com/3dmetrica
SUMMER WORKSHOP: PARAMETRIC ARCHITECTURE AND GENERATIVE DESIGN - RHINO + GRASSHOPPER + V-RAY
MEXICO TOUR 2012
MEXICALI JUNE 25-29 / MEXICO CITY JULY 2-6 / MORELIA JULY 9-13 / GUADALAJARA JULY 16-20 / PUEBLA JULY 23-27
…
t file** - a PLY file with just x, y, z locations, from a 3D scanner. The first few lines of the file look like this:

```
ply
format ascii 1.0
comment VCGLIB generated
element vertex 6183
property float x
property float y
property float z
end_header
-32.3271 -43.9859 11.5124
-32.0631 -43.983 11.4945
12.9266 -44.4913 28.2031
13.1701 -44.4918 28.2568
13.4138 -44.4892 28.2531
13.6581 -44.4834 28.1941
13.9012 -44.4851 28.2684
...
```

In case you need the data - please email me on **nisha.m234@gmail.com**.

**Algorithm:** I am trying to find principal curvatures in order to extract the ridges and valleys. The steps I am following are:

1. Take a point x.
2. Find its k nearest neighbors. I used k from 3 to 20.
3. Average the k nearest neighbors => gives (_x, _y, _z).
4. Compute the covariance matrix.
5. Take the eigenvalues and eigenvectors of this covariance matrix.
6. From the eigenvectors I get u, v and n: u corresponds to the largest eigenvalue, v to the second largest, and n to the smallest.
7. To transform the point (x, y, z) into the local frame I compute

        [ui]   [u]   [x - _x]
        [vi] = [v] x [y - _y]
        [ni]   [n]   [z - _z]

8. For each i of the k nearest neighbors, set up

        [n1]    [u1*u1  u1*v1  v1*v1] [a]
        [n2]  = [u2*u2  u2*v2  v2*v2] [b]
        [...]   [ ...    ...    ... ] [c]
        [nk]    [uk*uk  uk*vk  vk*vk]

   and solve for a, b and c with least squares.
9. This gives me a, b and c.
10. Now I compute the eigenvalues of the matrix

        [a  b]
        [b  c]

11. This gives me 2 eigenvalues: one is Kmin and the other Kmax.

**My Problem:** The output is nowhere close to finding the correct ridges and valleys. I am totally stuck and frustrated. I am not sure where exactly I am getting it wrong. I think the normals are not computed correctly, but I am not sure. I am very new to graphics programming, so this maths (normals, shaders) goes way over my head. Any help will be appreciated.
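For reference, the steps above can be sketched in a few lines of NumPy. This is not the original C++ code, and it makes one deliberate change: the quadric fit is centered on the query point and includes constant and linear terms, which avoids a bias that appears when pure quadratics are fitted about the neighborhood centroid (the centroid generally does not lie on the surface):

```python
import numpy as np

def principal_curvatures(query, neighbors):
    """Estimate (kmin, kmax) at `query` from its k nearest neighbors
    via a PCA frame plus a least-squares quadric fit. A sketch of the
    pipeline described above, not the poster's implementation."""
    P = np.asarray(neighbors, dtype=float)
    q = np.asarray(query, dtype=float)
    # steps 3-5: covariance of the centered neighborhood, eigendecomposition
    C = np.cov((P - P.mean(axis=0)).T)
    evals, evecs = np.linalg.eigh(C)            # ascending eigenvalues
    # step 6: eigenvectors are the COLUMNS; smallest eigenvalue -> normal n
    n, v, u = evecs[:, 0], evecs[:, 1], evecs[:, 2]
    # step 7: express neighbors in the local (u, v, n) frame at the query
    d = P - q
    ui, vi, ni = d @ u, d @ v, d @ n
    # steps 8-9: fit ni ~ d0 + d1*u + d2*v + a*u^2 + b*u*v + c*v^2
    M = np.column_stack([np.ones(len(P)), ui, vi, ui * ui, ui * vi, vi * vi])
    d0, d1, d2, a, b, c = np.linalg.lstsq(M, ni, rcond=None)[0]
    # step 10: principal curvatures from the Hessian of the height
    # function, [[2a, b], [b, 2c]] (note the factors of 2), assuming n
    # is close to the true surface normal
    kmin, kmax = np.linalg.eigvalsh(np.array([[2 * a, b], [b, 2 * c]]))
    return kmin, kmax
```

On points sampled from a sphere of radius R, both values come out near 1/R in magnitude (the sign depends on which way PCA orients the normal).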
**PLEASE PLEASE HELP!!**

**Resources:** I am using Visual Studio 2010 + the Eigen library + the ANN library.

**Other options tried:** I tried MeshLab. Ball-pivoting remeshing in MeshLab followed by the polkadot3d shader correctly identifies the ridges and valleys, but I am not able to code that myself.

**My function:**

```cpp
// the function outputs to a ply file
void getEigen()
{
    int nPts;                  // actual number of data points
    ANNpointArray dataPts;     // data points
    ANNpoint queryPt;          // query point
    ANNidxArray nnIdx;         // near-neighbor indices
    ANNdistArray dists;        // near-neighbor distances
    ANNkd_tree* kdTree;        // search structure

    // for k = 25 and eps = 2, seems to get a few ridges
    queryPt = annAllocPt(dim);             // allocate query point
    dataPts = annAllocPts(maxPts, dim);    // allocate data points
    nnIdx = new ANNidx[k];                 // allocate near-neighbor indices
    dists = new ANNdist[k];                // allocate near-neighbor dists
    nPts = 0;

    ifstream dataStream;
    dataStream.open(inputFile, ios::in);   // open data file
    dataIn = &dataStream;
    ifstream queryStream;
    queryStream.open("input/query.pts", ios::in);  // open query file
    queryIn = &queryStream;

    while (nPts < maxPts && readPt(*dataIn, dataPts[nPts])) nPts++;

    kdTree = new ANNkd_tree(   // build search structure
        dataPts,               // the data points
        nPts,                  // number of points
        dim);                  // dimension of space

    while (readPt(*queryIn, queryPt))      // read query points
    {
        kdTree->annkSearch(    // search
            queryPt,           // query point
            k,                 // number of near neighbors
            nnIdx,             // nearest neighbors (returned)
            dists,             // distances (returned)
            eps);              // error bound

        double x = queryPt[0];
        double y = queryPt[1];
        double z = queryPt[2];
        double _x = 0.0, _y = 0.0, _z = 0.0;

        #pragma region Compute covariance matrix
        for (int i = 0; i < k; i++) {
            _x += dataPts[nnIdx[i]][0];
            _y += dataPts[nnIdx[i]][1];
            _z += dataPts[nnIdx[i]][2];
        }
        _x /= k; _y /= k; _z /= k;

        double A[3][3] = {0,0,0,0,0,0,0,0,0};
        for (int i = 0; i < k; i++) {
            double X = dataPts[nnIdx[i]][0];
            double Y = dataPts[nnIdx[i]][1];
            double Z = dataPts[nnIdx[i]][2];
            A[0][0] += (X-_x) * (X-_x); A[0][1] += (X-_x) * (Y-_y); A[0][2] += (X-_x) * (Z-_z);
            A[1][0] += (Y-_y) * (X-_x); A[1][1] += (Y-_y) * (Y-_y); A[1][2] += (Y-_y) * (Z-_z);
            A[2][0] += (Z-_z) * (X-_x); A[2][1] += (Z-_z) * (Y-_y); A[2][2] += (Z-_z) * (Z-_z);
        }
        MatrixXd C(3,3);
        C << A[0][0]/k, A[0][1]/k, A[0][2]/k,
             A[1][0]/k, A[1][1]/k, A[1][2]/k,
             A[2][0]/k, A[2][1]/k, A[2][2]/k;
        #pragma endregion

        EigenSolver<MatrixXd> es(C);
        MatrixXd Eval = es.eigenvalues().real().asDiagonal();
        MatrixXd Evec = es.eigenvectors().real();
        MatrixXd u, v, n;
        double a = Eval.row(0).col(0).value();
        double b = Eval.row(1).col(1).value();
        double c = Eval.row(2).col(2).value();

        #pragma region SET U V N
        // NOTE: several branches below read rows of Eval (the diagonal
        // eigenvalue matrix) where rows of Evec (the eigenvectors) are
        // almost certainly intended
        if (a > b && a > c) {
            u = Evec.row(0);
            if (b > c) { v = Eval.row(1); n = Eval.row(2); }
            else       { v = Eval.row(2); n = Eval.row(1); }
        } else if (b > a && b > c) {
            u = Evec.row(1);
            if (a > c) { v = Eval.row(0); n = Eval.row(2); }
            else       { v = Eval.row(2); n = Eval.row(0); }
        } else {
            u = Eval.row(2);
            if (a > b) { v = Eval.row(0); n = Eval.row(1); }
            else       { v = Eval.row(1); n = Eval.row(0); }
        }
        #pragma endregion

        MatrixXd O(3,3);
        O << u, v, n;
        MatrixXd UV(k,3);
        VectorXd N(k,1);
        for (int i = 0; i < k; i++) {
            double x = dataPts[nnIdx[i]][0];
            double y = dataPts[nnIdx[i]][1];
            double z = dataPts[nnIdx[i]][2];
            MatrixXd X(3,1);
            X << x-_x, y-_y, z-_z;
            MatrixXd T = O * X;
            double ui = T.row(0).col(0).value();
            double vi = T.row(1).col(0).value();
            double ni = T.row(2).col(0).value();
            UV.row(i) << ui * ui, ui * vi, vi * vi;
            N.row(i) << ni;
        }
        Vector3d S = UV.colPivHouseholderQr().solve(N);
        MatrixXd II(2,2);
        II << S.row(0).value(), S.row(1).value(),
              S.row(1).value(), S.row(2).value();
        EigenSolver<MatrixXd> es2(II);
        MatrixXd Eval2 = es2.eigenvalues().real().asDiagonal();
        MatrixXd Evec2 = es2.eigenvectors().real();  // (unused)
        double kmin, kmax;
        if (Eval2.row(0).col(0).value() < Eval2.row(1).col(1).value()) {
            kmin = Eval2.row(0).col(0).value();
            kmax = Eval2.row(1).col(1).value();
        } else {
            kmax = Eval2.row(0).col(0).value();
            kmin = Eval2.row(1).col(1).value();
        }
        double thresh = 0.0020078;
        if (kmin < thresh && kmax > thresh)
            cout << x << " " << y << " " << z << " " << 255 << " " << 0 << " " << 0 << endl;
        else
            cout << x << " " << y << " " << z << " " << 255 << " " << 255 << " " << 255 << endl;
    }

    delete [] nnIdx;
    delete [] dists;
    delete kdTree;
    annClose();
}
```

Thanks,
NISHA…
as one element.
Thank you
Comment by karamba on October 7, 2014 at 11:27pm
Hello Patricio, divide the beams in such a way that each boundary vertex of the shell becomes an endpoint of a beam segment.
Best, Clemens
Comment by Llordella Patricio on October 8, 2014 at 8:30am
Hi Clemens,
I did what you suggested, but now the Assemble element doesn't work properly. Could you please tell me how to fix it? Thanks in advance, Patricio
8-10-14losa%20cadena.gh
Comment by karamba on October 8, 2014 at 11:59am
Hi Patricio, if you flatten the 'Elem'-input at the 'Assemble'-component the definition works. The triangular shell elements have linear displacement interpolations whereas the beam deflections are exact. In order to get correct results you should refine the shell mesh.
Best, Clemens
Comment by Llordella Patricio on October 9, 2014 at 8:35am
Hello, I succeeded in creating the mesh for the slab and built the beam segments, but when I look at the deformations they are not what I expected, because the beam deforms together with the slab.
Thanks for the help
PS: maybe I'm using the program for a type of structure it is not best suited to, judging by the examples of other structures. But this is the type of structure that is taught to students.
best regards
Patricio
9-10-14%20Example%201.gh
Comment by karamba on October 9, 2014 at 10:46am
You could use the 'Mesh Edges'-component to retrieve the naked edges and turn them into beams - see attached file:91014Example1_cp.gh
Best regards,
Clemens
Comment by Llordella Patricio on October 15, 2014 at 3:41pm
Dear clemens
I was doing a rough estimate of the deformation, and I cannot reach the same result with Karamba. When I make a rough estimate for the beams alone, Karamba's result and mine are very similar; I think the problem is when I connect the shell, because there the results do not match.
I sent the GH file, and an image of the calculation
The structure is concrete. The result I get is 0.58 cm.
Thank you, Patricio
15-10-14%20Example.gh
Comment by karamba yesterday
Dear Patricio,
try to increase the number of shell elements. As mentioned in the manual, they are linear elements. A mesh that is too coarse leads to a response which is stiffer than that of the real structure.
Best,
Clemens
…
the principles of basic and organic modeling in Rhinoceros are taught. In Grasshopper, the principles of parametrization, paneling and analysis are covered, as well as the digital fabrication workflow for laser-cutting and CNC machinery.
Single advance payment: $5,000.00
Deferred payments: $5,500.00*
*reserve your place with 50%
Monday to Friday, 10 am to 6 pm
July 23-27, 2012
DURATION: 40 HOURS
SESSIONS: 5 OF 8 HOURS
or info@dimensiontallerdigital.com
Information: 55 (50 16 0634) with Mayri Gallegos (or cell 55 28 85 24 73)
Digital cutting material included.…
nt should stand up to reasonable, Socratic interrogation with logical and descriptive rigor. For example, I find entirely credible an architect who says he placed his buildings 20 meters apart because he thought it would make people more comfortable, in light of his reading of the space relative to its environment, materiality, expected time of habitation/circulation, etc. His "thinking" such things is, for the most part, intuitive, and backed by deductive logic. (Of course, integration of wind analysis and other harder readings is obviously desirable.) But I interpret the active denial of intuition's crucial role in design as being at the heart of the field's current deplorable trend toward misuse of terminology, application of pseudo-science and intellectual over-reach. Architects wade out of their waters precisely when they invoke such things as human psychology or perception.
Furthermore, I believe that architects - student and professionals alike - regularly make formal decisions according to their aesthetic judgement. To suggest that students aren't qualified to make a design decision during their studies because they think it's formally successful seems exceedingly stingy; likewise, suggesting that a professional architect shouldn't rely on it is puzzling to me. I find architects' attempts to justify what are obviously decisions based on formal taste using other means often taking the same form of obfuscation that makes architects appear to be intellectual charlatans to specialists in other fields. Taste is taste. I would agree that it can't be taught. But good architectural design certainly remains at least somewhat grounded in artistic sensibility.
3) I'm by no means advocating that all architects must master every detail in their work. Rather, that architects have at least a generalist's working knowledge of materials and construction systems. Floors don't levitate, and windows require depth; rules of thumb count as vital knowledge.
4) I would say that consideration of performance-driven properties falls under a basic understanding of how a building will operate in its given environment. For example, if you've designed a glass house in Arizona, you're doing it wrong. The more simulation and science you have, the better. Indeed, I think such elements - wind analysis, solar-gain analysis, structural performance - represent the most solid opportunities today for architects to assert the harder lines of defense in their design decision-making: say, being able to demonstrate with basic geometry that your shade keeps the sun out in summer but lets it in when it's cold.…
Horticulture and landscape at the same time.
The most common plastic materials used as agricultural films are low-density polyethylene (LDPE, with a density less than 0.93 g cm−3) and the copolymer of ethylene and vinyl-acetate (EVA).
Here you can also find the characteristics of flexible materials for greenhouse covers (adapted from CPA, 1992 and Tesi, 2001), as far as I could gather.
UV-PE Film (UV-PE: long-life or UV-stabilized polyethylene)
Thickness (mm) = 0.18
Direct PAR transmissivity (%) = 90
Diffuse PAR transmissivity (%)= 86
Long-wave IR transmissivity (%)= 65
EVA Film ( EVA~Ethylene vinyl-acetate copolymer)
Thickness (mm) = 0.18
Direct PAR transmissivity (%) = 90
Diffuse PAR transmissivity (%)= 76
Long-wave IR transmissivity (%)= 27
And here you will find the global heat transfer coefficient (K, in W m−2 °C−1) for the above greenhouse covering materials, measured under normalized conditions (temperatures: exterior −10°C, interior +20°C; wind 4 m s−1). (Source: Nisen and Deltour, 1986.)
Cover        Clear sky   Overcast sky
Single PE    8.8-9.0     7.1-7.2
Single EVA   7.8         6.6
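These K values translate directly into a steady-state heat loss per square meter of cover, Q = K × (T_inside − T_outside). A minimal sketch under the normalized test conditions quoted above (interior +20°C, exterior −10°C):

```python
def cover_heat_loss(k_value, t_inside=20.0, t_outside=-10.0):
    """Steady-state heat flux through a greenhouse cover in W/m^2,
    Q = K * (T_in - T_out), using the normalized test conditions
    (interior +20 C, exterior -10 C) under which K was measured."""
    return k_value * (t_inside - t_outside)

# single PE under clear sky, K = 8.8 W m^-2 C^-1: about 264 W per m^2
pe_clear_sky = cover_heat_loss(8.8)
```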
Note: PAR (photosynthetically active, or photoactive, radiation) amounts to 45-50% of the global radiation (Berninger, 1989).
The name PAR is used to designate the radiation with wavelengths useful for plant photosynthesis. It is accepted that the PAR radiation ranges from 400 to 700 nm (McCree, 1972), although some authors consider the PAR from 350 to 850 nm.
The composition of the radiation changes with time, as a function of the Sun’s elevation and the cloudiness. When the Sun is low over the horizon, the short wavelengths are reduced (less UV and more red). The clouds reduce the amount of energy, greatly decreasing the NIR.
The PAR proportion in relation to the global radiation increases with scattering (diffusion). It is lower with clear sky and in the summer (45–48%).
kind regards
rafat …
ocessed once Grasshopper is done with whatever it's doing now.
3) Grasshopper tells the Slider object that the mouse moved and the slider works out the new value as implied by the new cursor position.
4) The slider then expires itself and its dependencies ([VB Step 1] in this case, but there can be any number of dependent objects).
5) When [VB Step 1] is expired by the slider, it will in turn expire its dependencies (VB Step 2), and so on, recursively until all indirect dependencies of the slider have been expired.
6) When the expiration shockwave has subsided, runtime control is returned to the slider object, which tells the parent document that stuff has changed and that a new solution is much sought after.
7) The Document class then iterates over all its objects (they are stored in View order, not from left to right), solving each one in turn. (Assuming the object needs solving, but since in your example ALL objects will be expired by a slider change, I shall assume that here).
8) It's hard to tell which object will get triggered first. You'd have to superimpose them in order to see which one is visually the bottom-most object, but let's assume for purposes of completeness that it's the [VB Step 1] object which is solved first.
9) [VB Step 1] is triggered by the document, which causes it to collect all the input data.
10) The input parameter [x] is asked to collect all its data, which in turn will trigger the Slider to solve itself (it got expired in step 4 remember?). This is not a tricky operation, it merely copies the slider value into the slider data structure and shouts "DONE!".
11) [x] then collects the number, stores it into its own data structure and returns priority to the [VB Step 1] object.
12) [VB Step 1] now has sufficient data to get started, so it will trigger the script inside of it. When the script completes, the component is all ready and it will tell the parent document it can move on to the next object (the iteration loop from step 7).
13) Let us assume that the slider object is next on the list, but since it has already been solved (it was solved because [VB Step 1] needed the value) it can be skipped right away, which leaves us with the last object in the document which is still unsolved.
14) [VB Step 2] will be triggered by the document in very much the same way as [VB Step 1] was triggered in step 9. It will also start by collecting all input data.
15) Since all the input data for [VB Step 2] is either defined locally or provided by an object which has already been solved, this process is now swift and simple.
16) Upon collecting all data and running the user script, the component will surrender priority and the document becomes active again.
17) The document triggers a redraw of the Grasshopper Canvas and the Rhino viewports and then surrenders priority again and so on and so forth all the way up the hierarchy until Grasshopper becomes idle again.
[end boring]
Pretty involved for a small 3-component setup, but there you have it.
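The expire/solve recursion in the numbered steps above can be sketched as a toy dependency graph. The class and method names here are illustrative only, not the actual Grasshopper SDK API:

```python
class Component:
    """Toy model of the expire/solve cycle described above."""
    def __init__(self, name, compute, sources=()):
        self.name = name
        self.compute = compute        # function of the source values
        self.sources = list(sources)  # upstream components
        self.targets = []             # downstream dependents
        self.value = None
        self.expired = True
        for s in self.sources:
            s.targets.append(self)

    def expire(self):
        # steps 4-5: expire self, then recursively all dependents
        if not self.expired:
            self.expired = True
            for t in self.targets:
                t.expire()

    def solve(self):
        # step 10: collecting inputs forces expired sources to solve first
        if self.expired:
            inputs = [s.solve() for s in self.sources]
            self.value = self.compute(*inputs)
            self.expired = False
        return self.value  # already-solved objects are skipped (step 13)

# slider -> [VB Step 1] -> [VB Step 2], mirroring the 3-component example
slider = Component("slider", lambda: 5)
step1 = Component("VB Step 1", lambda x: x * 2, [slider])
step2 = Component("VB Step 2", lambda x: x + 1, [step1])

# document pass (step 7): iterate in arbitrary order, solving what is expired
for obj in (step2, slider, step1):
    obj.solve()
```

Note that solving `step2` first transparently pulls `slider` and `step1` along with it, so the document's iteration order does not change the result, exactly as described above.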
To answer somewhat more directly your questions:
- The order in which objects are solved is the same as the order in which they are drawn. This is only the case at present; this behaviour may change in the future.
- Adding a delay will not solve anything, since the execution of all components is serial, not parallel. Adding a delay simply means putting everything on hold for N milliseconds.
- [VB Step 1] MUST be solved prior to [VB Step 2] because otherwise there'd be no data to travel from [GO] to [Activate]. The only tricky part here is that sometimes [VB Step 1] will be solved as part of the process of [VB Step 2], while at other times it may be solved purely on its own merits. This should not make a difference to you as it does not affect the order in which your scripts are called.
--
The Man from Scene 24…
Added by David Rutten at 4:43pm on December 10, 2009