ino Mc Neel, author of "Architettura Parametrica - Introduzione a Grasshopper", the first manual on Grasshopper. The PLUG IT courses were created to promote new digital technologies supporting design and to share the know-how gained through research, collaboration with leading architecture firms, and international publications. The course introduces the basics of Grasshopper, exploring parametric design methodologies and algorithmic modeling techniques for generating complex forms. It is aimed at students and professionals with minimal 3D modeling experience and consists of theoretical lessons and hands-on exercises. Topics covered: - Introduction to parametric design: theory, examples, case studies - Grasshopper: basic concepts, algorithmic logic, graphical interface - Fundamentals: components, connections, data flow - Mathematical and logical functions, series, data management - Analysis and definition of curves and surfaces - Definition of grids and complex patterns - Geometric transformations, paneling - Attractors, image sampler - Data trees: managing complex data - Digital fabrication: theory and examples - Nesting: decomposing three-dimensional objects into planar sections for CNC machines. A certificate will be issued at the end of the course. INFO AND BOOKING: http://www.arturotedeschi.com/wordpress/?p=2888…
y in English.

Presenter
Robert (Bob) McNeel (McNeel & Associates founder)
Robert (Bob) McNeel is the founder and president of Robert McNeel & Associates (RMA). Founded in 1978, RMA originally focused on developing accounting software for accounting, architecture, engineering, and other personal services firms. Within a few years, RMA expanded its services to include selling and supporting microprocessor-based engineering and design software, including AutoCAD. By 1985, the main focus of the business had shifted to AutoCAD sales, service, training, and software development. Bob McNeel grew up in the mountains of southern Washington State on a subsistence dairy farm. To pay for college, he worked in construction as a carpenter, welder, and cement finisher. Bob has a BA in Accounting from Washington State University. Prior to founding McNeel & Associates, he was a practicing Certified Public Accountant and the comptroller for a large construction company in Spokane.

Andrés González (Rhino FabLab director)
Andrés has been a software trainer and developer since the 1980s. He has developed applications for diverse design markets, as well as training materials for different CAD and design software, including the training community www.Rhino3D.TV. Andrés has been working with the Rhino team since the very early stages. He is now the head of the McNeel Southeast US & Latin American Division. He is the worldwide director of the digital fabrication community RhinoFabLab (www.RhinoFabLab.com), as well as the Generative Jewelry & Fashion Design community GJD3D (www.GJD3d.com) and the Generative Furniture Design community GFD3D (www.GFD3d.com).

1981-1985: University of North Carolina at Charlotte, N.C., USA - B.S., Bachelor of Science in Engineering
…
Added by Yusuke Oono at 9:28pm on October 16, 2013
Refinement component at first; possibly use MeshMachine instead, which is slow but actually gives many fewer triangles, plus adaptive meshing for tight curves. Neither is easy to adjust on a deadline!
Then you have to sneak up on workable settings using only a few lines, or Grasshopper may freeze, perhaps indefinitely, when given 200 lines with extreme settings. This is especially true of the CS (cube size) setting, which can blow up into a huge number if your scale is big.
Cocoon gives lots of nearly flat split quad faces so I quadrangulated those for fun:
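The quadrangulation idea (pairing nearly flat triangles into quads) can be sketched in plain Python. This is my own illustrative pairing rule and tolerance, not Grasshopper's or Cocoon's actual algorithm:

```python
# Hypothetical sketch: merge pairs of coplanar triangles sharing an edge into
# quads - the basic idea behind quadrangulating "split quad" faces.
# Plain-Python stand-in; the real Quadrangulate component may differ.

def _vec(a, b):
    return tuple(bi - ai for ai, bi in zip(a, b))

def _cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def _unit(v):
    n = sum(c*c for c in v) ** 0.5
    return tuple(c / n for c in v)

def _normal(pts, tri):
    a, b, c = (pts[i] for i in tri)
    return _unit(_cross(_vec(a, b), _vec(a, c)))

def quadrangulate(pts, tris, tol=1e-6):
    """Greedily merge triangle pairs that share an edge and are coplanar."""
    quads, used = [], set()
    for i, t1 in enumerate(tris):
        if i in used:
            continue
        for j in range(i + 1, len(tris)):
            if j in used:
                continue
            t2 = tris[j]
            shared = set(t1) & set(t2)
            if len(shared) != 2:
                continue
            n1, n2 = _normal(pts, t1), _normal(pts, t2)
            # dot product close to 1 -> same plane, same orientation
            if abs(sum(a*b for a, b in zip(n1, n2)) - 1.0) < tol:
                s0, s1 = shared
                u1 = next(v for v in t1 if v not in shared)
                u2 = next(v for v in t2 if v not in shared)
                quads.append((u1, s0, u2, s1))  # cycle around the shared edge
                used.update((i, j))
                break
    rest = [t for k, t in enumerate(tris) if k not in used]
    return quads, rest

# Two triangles forming a unit square collapse into one quad:
pts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
quads, rest = quadrangulate(pts, [(0, 1, 2), (0, 2, 3)])
```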
Or MeshMachine can refine the mesh to make it efficient:
Whereas the Cocoon Refine component will merely return an equally fine mesh with more equilateral triangles, but no serious remeshing to get rid of the many tiny triangles where they are not needed? Actually, it does seem to remesh as well:
David said he used some of Daniel's MeshMachine code in there.…
in the Grasshopper window, at the top, together with the other component sets such as 'Params', 'Maths', etc.
This is an experiment aimed at somehow broadening the scope of use of Grasshopper.
As we know, Grasshopper was born to allow the parametric use of Rhino. Grasshopper definitions let you record the steps needed to build objects, as well as vary the data used by the definition, such as geometric objects, lengths, angles, etc.
When we modify the values used by the definition, Grasshopper automatically recalculates everything and shows us a preview of the result.
At that point, if the result is satisfactory, we can tell Grasshopper to insert the objects in question into the Rhino document, so that we will see them appear in the viewports as real Rhino objects.
This way of working has been a great success among Rhino users, making it much easier to build objects when you need to proceed by trial and error, checking the result before settling on the final shape.
The success of Grasshopper, however, has also shown how convenient it is to define construction procedures graphically and, in general, to drive Rhino through components such as sliders, which I suppose all of us would like to have available even when using Rhino the classic way, through buttons and commands.
So, over time, more and more add-ons for Grasshopper have appeared that perform particular operations, or that use Grasshopper in contexts other than the original concept of 'programmable History'. Following this trend, edoc tries to build components that operate directly on Rhino objects, that is, the curves, surfaces, layers, etc. belonging to the Rhino document we are working on. The idea is to let you use Grasshopper's convenient user interface even for operations that are usually performed the traditional way with buttons and commands, or via scripts.
As already said, it is an experiment. Components are born, die, and change very often, in an attempt to understand what may be useful and what may or may not work.
Bug reports, suggestions, observations, etc. are welcome.
If some kind soul were willing to translate this presentation, we would build them an equestrian monument!
Thanks, and sorry
gg
…
ey provide all the means for what I am trying to achieve.
What I need is to get an evaluation of passive heat/solar gain from a certain facade that is as fast as possible. I know my building can cool to a certain degree (let's say 80 W/m2; for now let's forget other internal gains) and I want to be sure my facade is not letting excessive amounts of heat into the room/building. Normally I would run a full-blown simulation to count my overheating hours and thereby evaluate my facade. To speed up the process, the idea is to evaluate overheating hours in a faster way. So what I am thinking is that excessive gains may be estimated by counting high-intensity irradiation patches in a critical sky component (or whatever such a thing would be called) that surpass my sensible cooling load. My hope is that, for any facade visible to the sky patches, this count would correlate closely with the number of overheating hours if properly calibrated against a simulated model. However, I have no idea right now whether this can be done.
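A rough illustration of the counting idea. All numbers (the 80 W/m2 capacity is from the post; the SHGC and the hourly profile are made up), and this is a sketch of the estimate, not Ladybug's actual calculation:

```python
# Hypothetical sketch: estimate overheating hours by counting hours in which
# the solar gain through the facade exceeds the sensible cooling capacity.
# SHGC and the irradiance profile below are made up for illustration.

COOLING_CAPACITY = 80.0   # W/m2, sensible cooling the building can provide
SHGC = 0.4                # assumed solar heat gain coefficient of the glazing

def estimated_overheating_hours(hourly_irradiance):
    """hourly_irradiance: W/m2 incident on the facade, one value per hour."""
    return sum(1 for g in hourly_irradiance if g * SHGC > COOLING_CAPACITY)

# Toy 5-hour profile: only the 500 and 800 W/m2 hours transmit more than
# 80 W/m2 (i.e. exceed 80 / 0.4 = 200 W/m2 incident).
profile = [50.0, 150.0, 500.0, 800.0, 180.0]
hours = estimated_overheating_hours(profile)  # -> 2
```

Calibrating SHGC (or the threshold itself) against a full simulation, as described above, would be the interesting part.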
Why do this? Speed, convenience, whole building thermal analyses.
@Chris and @Abraham The critical sky component is made with LB's Radiance radiation component, filtering the beam components with the highest effect from a yearly EPW file.
@Chris Conductive heat gains are also important, especially if the facade is badly insulated, so the next step is to filter the outdoor temperature in parallel with that critical sky component, then do a static heat transfer analysis and combine it with the effect of direct sun. Again, no idea if it works.
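The static conduction term mentioned above is just steady-state heat transfer. A minimal sketch, with a made-up U-value and temperatures:

```python
# Hypothetical sketch of the static (steady-state) conduction term:
# Q = U * (T_out - T_in), in W per m2 of facade. The U-value and the
# temperatures are made-up example numbers.

U_VALUE = 2.5      # W/(m2*K), a poorly insulated facade
T_INDOOR = 22.0    # degrees C

def conductive_gain(t_outdoor):
    """Steady-state conductive heat gain per m2 of facade (W/m2)."""
    return U_VALUE * (t_outdoor - T_INDOOR)

def total_gain(solar_gain, t_outdoor):
    """Combine direct solar gain with the conduction term, as described."""
    return solar_gain + conductive_gain(t_outdoor)

# At 30 C outdoors with 200 W/m2 of solar gain:
q = total_gain(200.0, 30.0)  # 200 + 2.5 * 8 = 220 W/m2
```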
Hope it makes sense. I am a little embarrassed that I drew you into this little experiment; this was not at all the point of the discussion. But now that we are into it, I would like to know what you think. If it works, it is kind of neat, at least I think it is.
/K…
I've used the distance from the centers of the faces to a fixed point to determine the offset value for each face.

Try this solution for now:
- enable the component that is disabled at the start;
- the curve offset component doesn't work well; tomorrow I'll see if I can create a better one;
- bake the resulting brep and convert it into a mesh in Rhino;
- for the thickness, do a solid offset of the mesh in Rhino as the last step; it works better.…
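The distance-based offset described above can be sketched in plain Python. The remapping range and the sample face centers are my own assumptions, not values from the original definition:

```python
# Hypothetical sketch: derive a per-face offset from the distance between
# each face center and a fixed attractor point, remapped to an offset range.
# The range and the sample centers are made up for illustration.

def distance(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def remap(x, lo, hi, new_lo, new_hi):
    if hi == lo:
        return new_lo
    return new_lo + (x - lo) * (new_hi - new_lo) / (hi - lo)

def face_offsets(face_centers, attractor, off_min=0.1, off_max=1.0):
    """Closest face gets off_min, farthest gets off_max."""
    d = [distance(c, attractor) for c in face_centers]
    lo, hi = min(d), max(d)
    return [remap(x, lo, hi, off_min, off_max) for x in d]

centers = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
offs = face_offsets(centers, attractor=(0.0, 0.0, 0.0))
# closest center -> 0.1, farthest -> 1.0
```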
n account of the position of the sun and weather cannot be expressed in terms of a single set of luminous intensity values (which is what IES files do).
With regards to your example files, I agree with Chris. The primary reason for the low illuminance levels is that the light bounces are getting lost in the tube. Have you checked with the manufacturer/distributor if the location of the IES file should be inside the tube and not flush with the ceiling? Physically modelling such tubes in lighting software like Radiance (which is what HB uses) or AGI32 is a fairly expensive proposition. This is one of the reasons why manufacturers provide photometric data for such devices (however simplistic that data might be).
The candela multiplier increases or decreases the luminous intensity values, so it has a direct impact on the calculation. The primary reason for having that input was to enable users to do some testing with different lamp types and environmental factors such as dirt depreciation. You need not change it for your simulation. Assuming that the IES file is inside the tube, in order to make this calculation work inside HB you'd have to crank up the calculation settings to a very high level (start with -ab 10 -ad 4096).
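The effect of such a multiplier can be shown with a toy example (the candela values and the 0.8 light loss factor are made up; Honeybee's internal handling of the IES data is not shown):

```python
# Hypothetical sketch: a candela multiplier uniformly scales the luminous
# intensity values of a photometric distribution. The 0.8 factor below is a
# made-up combined light loss factor (e.g. lamp lumen and dirt depreciation).

def apply_candela_multiplier(candela_values, multiplier):
    return [cd * multiplier for cd in candela_values]

raw = [1000.0, 800.0, 400.0, 0.0]             # cd, a toy distribution
derated = apply_candela_multiplier(raw, 0.8)  # -> [800.0, 640.0, 320.0, 0.0]
```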
Finally, due to shortcomings in the annual simulation software (Daysim), IES files will not work directly with annual calculations. However, there is a fairly easy workaround for that issue. In case you are planning to run annual calculations with IES files, please let us know here.
Sarith…
me research involving shades and solar radiation, and I need the sun's path through the entire year to fully optimize the design. So far I've been able to simulate what I want by having my shaders follow a mock solar orbit around them. What I need now is to use a model that simulates solar paths, use it as an attractor point, and have my shading surfaces follow it, pretty much like what I am doing right now (or so I think).
Here's where my questions come around:
I remember finding somewhere on the internet a definition that simulates the sun's path through the year; I think I can find it again and use it for my purposes. I could just run the GH definition, bake the geometry, and then upload it to Ecotect and have it run so I can get the data and keep working from that, then feed the geometry to Ecotect again, ad nauseam. However, I think that is a very slow process.
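For reference, a simplified sun-position calculation (declination plus hour angle) can be sketched as below. This is the textbook approximation, not what any particular GH definition or Ecotect uses:

```python
# Simplified solar altitude from day of year, latitude, and solar time,
# using the common declination approximation (Cooper's equation). A sketch
# only; real sun-path tools also account for longitude, time zone, the
# equation of time, and atmospheric refraction.
import math

def declination(day_of_year):
    """Solar declination in degrees (Cooper's approximation)."""
    return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

def solar_altitude(day_of_year, latitude_deg, solar_hour):
    """Sun altitude above the horizon, in degrees."""
    dec = math.radians(declination(day_of_year))
    lat = math.radians(latitude_deg)
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    sin_alt = (math.sin(lat) * math.sin(dec) +
               math.cos(lat) * math.cos(dec) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_alt))

# At the equator on the March equinox (day 81), solar noon puts the sun
# essentially directly overhead (altitude ~ 90 degrees).
alt = solar_altitude(81, 0.0, 12.0)
```

Evaluating this over the year gives the sun positions to drive the attractor point described above.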
Is there a way that I can run an Ecotect plug-in of sorts within GH, so that I can get my data IN Grasshopper and model accordingly?
Does that make sense?
Thanks a lot for any input.…
Added by Antonio Tamez at 3:40am on October 24, 2011