e systems?
Architecture can engage technology to make buildings tectonically transformative and aware of their active surroundings, investigating data and responding to environmental change.
The 10-day workshop investigates the design of computational kinetic structural systems that interact with the behaviour inherent in the city, its environment and its population.
The aim of the workshop is to investigate parametric kinetic strategies that transform according to an ever-changing data system. Like architectural cybernetic machines embedded in a smart city, the projects interact with the population and environment of Rome, proposing another layer of urban strategy. They operate both by continually detecting physical and non-physical data via sensors and by transforming their own forms. This complex dynamic interaction leads us to discard the imposition of a fixed form and instead create and use a computational kinetic artifice.
Initially, students attend lectures on current mainstream and academic research, as well as tutorials on parametric modelling software, digital fabrication prototyping and robotic assembly. Then, applying these skills in teams, they pursue computational design research that couples physical analogue experiments with the programming of computer-controlled kinetic prototypes. The proposals are built by assembling digitally fabricated parts and electronic devices such as Arduino boards, sensors and servos, which exchange data and regulate form.
For additional information, go to the AA website: http://www.aaschool.ac.uk/STUDY/VISITING/rome
_
…
is an emerging research area that exploits the principles and logic of natural systems in the design of the built environment. Sandworks* investigates the development of a computational design system informed by the self-forming behaviour of sand. The objective is to understand the process of coding the physical behaviour of a material that follows the geometrical constraints of the developable, ruled and hyperbolic surfaces formed by sand.

Through physical and digital form-finding exercises, the workshop will explore the relationships between a material and its shaping processes in the generation of form, in parallel with theoretical lectures and discussion of cutting-edge approaches to computational thinking, design and fabrication in architecture and design. Participants will be introduced to rule-based design thinking through digital design and fabrication systems as well as physical modelling and prototyping techniques. The aim of the workshop is to provide a framework for understanding how such processes occur in nature and find their translation in the design of the artificial.

*This research started at the Architectural Association School of Architecture in London, exploring the design and fabrication techniques of a sand tectonic system, and is now pursued at ENSA Paris-Malaquais, focusing on the development of the machinic morphogenetic possibilities of sand self-formation with robotic distribution of material through addition, subtraction and deposition techniques. www.sandworks.org

/// Application
To apply, please fill in the application form at this link: https://docs.google.com/forms/d/1n_LN2svFTT79kCqndxznKRj4PX8J8__FVNKvjiHTeJU/viewform

/// Fees*
1700 EGP for students / 2000 EGP for graduates and young professionals
*20% discount for early registration and payment before 22nd of August 2014

More info on the workshop webpage: http://www.encodestudio.net/#!sandworks/c1nn1…
as 2 ECTS - Schedule: Thursdays and Fridays, 16:00 to 18:00 - Start: end of February 2016 (exact date to be confirmed) - Registration: send your details (first name, surname, ID number, email, phone) indicating your preference to iamadrid.arquitectura@upm.es
- Learning a visual programming environment for generating dynamic prototypes of complete projects. Node-based programming platforms for managing them.
- Designing interrelated algorithms. Planning and making processes explicit. Translating processes into programming languages. Basic syntax common to all programming languages.
- Exploring both open-ended drifts and concrete goals. Programming project tools as part of the project itself. Exploring process as the essence of the project.
- Incorporating data external to the design into it. Learning to program and automate decisions, and then to refine them. Generating adaptive, reactive projects under continuous feedback.
- Exploring the limits of the coded: producing code as an assistant, not an imposition.
- Interrelating team decisions. Generating frameworks and routines for collaborative design.
- Exploring topologies and prototypes, environments of uncertainty and possibility. Handling databases and flows of data inheritance and transport.
- Dynamic, evolutionary and modifiable generation. Producing open-source tools.
http://dpa-etsam.com/iam/iam-cursos
https://www.facebook.com/iamadridETSAM?fref=ts
+34 91 336 6537 / 6589…
, at the Manens-Tifs offices, on 26, 27 and 28 May 2016.
Visual comfort and the management of natural lighting in relation to energy savings are becoming increasingly relevant to innovative building design. For example, the new LEED 4 protocol awards credits for daylighting simulations and confirms the importance of design aspects that "connect occupants with the outdoors, reinforce circadian rhythms, and reduce electrical energy consumption for artificial lighting by introducing natural light into spaces". Without simulation software it is not possible to obtain quality lighting results. Radiance is a validated piece of software, used both in research and by practitioners, and is among the most accurate tools for the professional simulation of natural and artificial light. It has no limits on geometric complexity and is well suited to integration with other calculation software and graphical interfaces, which simplify its scripting procedures. The main and most versatile of these will be covered in the course (DIVA4Rhino and Ladybug + Honeybee, plug-ins for Grasshopper and Rhinoceros 3D).
The course is aimed at designers and researchers who want to acquire practical tools for simulation with Radiance, in order to develop and verify the solutions best suited to their needs. It includes theory and hands-on lessons, with examples and exercises that cover the topics in a demonstrative and interactive way.
Applications must be submitted by 12 May 2016.
The brochure with the course contents and all further information is available at this link.
The course is sponsored by Pellinindustrie.…
hreads where Thread I solves object A1 and Thread II solves object A2. As soon as A1 is completed, Thread I can move on to object B1 and as soon as A2 completes, Thread II can move on to object B3 (whichever comes first). When both A1 and A2 are complete, we can spawn a new thread (III) to take care of object B2.
If B2 completes before B3, then Thread III will terminate. If B3 completes before B2, then Thread II terminates. Whichever thread is last will pick up execution of object C3. And so on and so forth.
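The schedule described above can be sketched as a small dependency-driven scheduler: tasks are submitted as soon as their prerequisites finish, so idle workers "pick up" the next solvable object. The `run_dag` helper and the task names are illustrative only, not Grasshopper's actual solver; note also that in CPython, threads only overlap I/O or native calls, not pure-Python math.

```python
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

def run_dag(tasks, deps, workers=3):
    # tasks: name -> callable; deps: name -> set of prerequisite names.
    # Submit every task whose prerequisites are done; each time a future
    # completes, newly unblocked tasks are submitted immediately.
    done, running, results = set(), {}, {}
    with ThreadPoolExecutor(max_workers=workers) as ex:
        def submit_ready():
            for name, fn in tasks.items():
                if name not in done and name not in running and deps.get(name, set()) <= done:
                    running[name] = ex.submit(fn)
        submit_ready()
        while running:
            finished, _ = wait(list(running.values()), return_when=FIRST_COMPLETED)
            for name in [n for n, f in running.items() if f in finished]:
                results[name] = running.pop(name).result()
                done.add(name)
            submit_ready()
    return results
```

With deps `{"B1": {"A1"}, "B2": {"A1", "A2"}, "B3": {"A2"}, "C3": {"B2", "B3"}}` this reproduces the A1/A2 → B1/B2/B3 → C3 ordering from the example.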
This sort of threading is actually not guaranteed to help much, though, as it is likely that the bottleneck components in the network will still need to be handled by a single thread.
A more efficient solution would be to divvy up the execution per component to multiple threads. If you're trying to compute the Curve Closest Point for 10,000 points and your machine contains 4 cores, then we can assign 2,500 points to the first core, 2,500 points to the second core etc.
This approach will actually work when there's only a few bottleneck components and it also means the order in which components are solved is no longer important.
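The per-component splitting can be sketched as follows. Here `closest_param` is a brute-force stand-in for the real Curve Closest Point call (both names and the nearest-sample search are assumptions for illustration); in CPython a thread pool only gains real speed for native, GIL-releasing work, so a process pool would be needed for pure-Python math.

```python
from concurrent.futures import ThreadPoolExecutor

def closest_param(curve_pts, p):
    # Stand-in for the Curve Closest Point SDK call: return the sampled
    # curve point nearest to p (the real SDK does an iterative search).
    return min(curve_pts, key=lambda c: (c[0] - p[0]) ** 2 + (c[1] - p[1]) ** 2)

def closest_points(curve_pts, test_pts, workers=4):
    # Split the test points into one chunk per worker, solve each chunk
    # on its own thread, then stitch results back in the original order.
    size = (len(test_pts) + workers - 1) // workers
    chunks = [test_pts[i:i + size] for i in range(0, len(test_pts), size)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        parts = ex.map(lambda pts: [closest_param(curve_pts, p) for p in pts], chunks)
    return [r for part in parts for r in part]
```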
An even more fine-grained approach to threading would be to make the Curve Closest Point function in the Rhino SDK threaded. There's a lot of looping going on in any given Curve CP computation so the curve could be broken up into loose spans where each span is solved by a different core. Then the partial results get consolidated once all threads finish.
The benefit here is that it would be multi-core for everyone, not just Grasshopper components.
The bad news: some functions in Rhino are not thread-safe, meaning that data structures such as NurbsCurves cannot be modified from multiple threads at once without compromising their validity. You might well end up with invalid curves and quite possibly weird crashes. In very bad cases a specific function in our SDK may only be allowed to execute once at a time, so even duplicating the curve would not help.
Until our SDK is thread-safe there can be no global threading in Grasshopper. I don't know where we're headed with this, but I do know that we've started using some threaded algorithms in the display as of Rhino5, so it seems we're at least getting our feet wet.
--
David Rutten
david@mcneel.com
Seattle, WA…
Added by David Rutten at 5:47pm on November 17, 2010
r your need. Bravo, my friend!
2. Hahaha, career first, woman later.
3. Maybe it is because my country has yet to use the software you mentioned. I am pretty sure it is used by engineers; since my practice is only architectural work, we always collaborate with the engineers to get the job done (by collaborate, I mean passing the job on to the engineers and letting them handle all the stuff; the sad news is that architects and designers don't get paid much to deal with it).
In contrast with your software, my team and I use Rhino, GH, 3ds Max and ZBrush; these are the standard package you get after graduating from any architecture college. The level of detail I produce in my modelling is not as "heavy" as yours, because the clients I face every week only worry about the appearance of the building, as well as convenience in terms of planning, efficiency and cost control over the design.
I am surprised you don't use Revit or ArchiCAD; they are the standard for BIM documentation.
4. Well, subdiv modelling is one of my favourites. (That is why T-Splines is my favourite tool in Rhino, to sketch out the model that I want; however, it depends on the type of geometry you are after. For "solid modelling" I prefer subdiv modelling rather than standard NURBS.)
5. No comment on that, hahaha.
6. I disagree; I think Rhino needs a top rendering engine as part of the Rhino system itself, to make it at least on par with 3ds Max. Rhino isn't built for restrictive use like CATIA, so its purpose is "MODELLING and VISUALIZATION"; at the moment we only see the "MODELLING" part.
7. +1 for this.
8. I have to agree with that. There is no way we can incorporate BIM in Rhino, and it is not widely accepted to somehow have a BIM file in Rhino format. But...
9. In the future it might be, because if you compare Rhino 4 to Rhino 5 there has been a major breakthrough; who knows what will happen in Rhino 6 or 7.
10. I saw a post the other day where David said he intended GH to be algorithmic modelling instead of parametric, and that GH is like a Swiss Army knife: it has thousands of tools for general jobs, but it requires third-party add-ons to extend its uses. I'd prefer GH to have a much broader set of functions to encourage people to spread its wings.
PS: are you talking about augmented reality? :)
Peace!…
since only starting with Grasshopper and Kangaroo two weeks ago, I've had a lot of fun making this and it clearly works very well! I'm very impressed and can't wait to experiment some more.
The speed of the simulation clearly suffers at this level of complexity, but I suppose you could set up a timer-stepped routine to control the speed of iteration?
For the sake of carrying out further experiments, I will need to be able to calibrate the performance of materials with real mechanical properties, and I look forward to seeing to what extent this is possible. The Kangaroo documentation explains what units and inputs are used, so I think some level of calibration should be feasible. I imagine the word "strength" was selected in Kangaroo because it is more intuitive to non-engineers, but the engineer in me would prefer to see the word "stiffness" used where appropriate. Furthermore, I think the units of "I" for bending stiffness for the angle component described in the documentation should be [m^4], not [m]. I wouldn't mention this if I didn't get the impression that Kangaroo wants to embrace engineering analyses and projects.
Thank you again, Daniel for your great support!! I look forward to carrying out more engineering experiments in Kangaroo. This really feels like it could and should be the future of structural engineering simulations.
Regards,
Greg
P.S. I would like to know how the Volume component works. Is it modifying pressure (force) on particles according to Boyle's law of proportionality? It certainly seems to behave as such. What, then, is the "strength" input here?
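To make the question concrete, a Boyle-style volume goal might compute its restoring pressure like this. This is purely a guess consistent with the hypothesis above: the formula and the role of `strength` are assumptions, not Kangaroo's actual code.

```python
def boyle_pressure(rest_volume, current_volume, strength):
    # Boyle's law: p * V = constant, so pressure scales inversely with volume.
    # The outward push grows as the mesh is compressed below its rest volume
    # and vanishes when current_volume equals rest_volume. 'strength' acts
    # as a stiffness-like multiplier on the deviation.
    return strength * (rest_volume / current_volume - 1.0)
```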
GIF animation - click to open:
…
pecialx")
I'm sure everything else follows the directives in the SDK help file and the bin path is added to the Developer settings path list. I'm sure I'm missing something basic.
Any help to get me going would be appreciated.
(Rhino 4 SR8, VS 2010, WIN7)
…
eded to calculate many waterplane areas and the GH Area component was bogging things down. I looked to Basic Ship Theory and the use of Simpson's Rule, which in this case mirrors an intersection between a half hull and a waterline and then divides the enclosed waterplane into an even number of equally spaced segments to calculate the area. The result is 99.997% of the Rhino and GH area and about a thousand times quicker (more, actually).

But when checking my method I lofted the simple section curves, fed this into an Area component, and got a result a hundred times quicker than the original. This got me thinking that the complexity of the surface was the problem, so I rebuilt the curve with the same number of points as used in the Simpson's Rule calculation… This was even worse, now taking 4 minutes as opposed to 2.8. Wondering why, I realised that the original surface and my Simpson's surface were created at 90° to each other: one lofted from one side of the vessel to the other, whereas the quicker method lofted along the length. So I swapped the UV of the original and, lo and behold, 4.3 s….
The methods, results and images of the different area calculations are shown below, with Simpson's Rule at the top, followed by Simpson's Surface, Original, Swapped UV, and Simplified at the bottom. I also attach the definition AreaQuestion.gh.
It’s also interesting to note that Rhino itself does not take anywhere near as long to calculate.
All return as fast as I can select a surface and right-click.
I know the Area component does a lot more than Simpson's Rule can achieve (i.e. 3D surfaces with complex shapes), but it would appear that some evaluation of the surface's UV direction might speed things up, or a check for planar surfaces could trigger a numerically faster approach such as Simpson's Rule.
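For reference, Simpson's first rule over an even number of equally spaced segments looks like this. It is a generic sketch of the textbook formula, not the actual GH definition attached above; the ordinates would be the half-breadths measured at each station along the waterline.

```python
def simpson_area(ordinates, spacing):
    # Simpson's first rule: integrate ordinates y0..yn sampled at equal
    # spacing h, where n (the segment count) must be even.
    # Area ~ h/3 * (y0 + 4*y1 + 2*y2 + ... + 4*y(n-1) + yn)
    n = len(ordinates) - 1
    if n % 2 != 0:
        raise ValueError("Simpson's rule needs an even number of segments")
    total = ordinates[0] + ordinates[-1]
    for i in range(1, n):
        total += (4 if i % 2 else 2) * ordinates[i]
    return spacing / 3.0 * total
```

Because the rule is exact for cubic (and hence parabolic) ordinate curves, it agrees with exact areas to very high precision on smooth hull sections, which matches the 99.997% figure reported above.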
I hope this was all of some use.
Slaynt vie!
Danny
…
symmetrical compared with the lower horizontal building. But the diagram of the sunlight hours on the south facade of the lower horizontal building is somewhat strange. I would expect the point of the facade receiving the least sunlight hours to be where the red arrow is pointing, in the bottom centre of the facade. But it is not. Is this possible?
The analysis period is all the hours of the days from 22.04 to 22.08
Lat.59.4 Lon.24.7 timezone+2
The same analysis including the ground produces the same (unexpected) results.
To double-check, I also ran a Radiation Analysis, and this looks more appropriate: the area of the facade receiving the least radiation is in the bottom centre of the lower horizontal building (although, also in this case, the point receiving the least radiation is not exactly in the bottom centre of the facade as I would expect; this small deviation could depend on the low accuracy settings).
This is the definition.
2 - using the same model, I realised that if the timestep input of the Sunlight Hours Analysis is changed, for example from the default value (1) to 4, then the totalSunlightHours result also changes! It becomes 1/4 of the original result obtained with the default timestep of 1, which is quite confusing.
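The expected bookkeeping can be stated as a small rule: with `timestep` samples per hour, each sunlit sample should count as 1/timestep of an hour, so the total comes out independent of the sampling density. This is a sketch of the behaviour Francesco expects, not Ladybug's actual code.

```python
def total_sunlight_hours(sunlit_flags, timestep):
    # sunlit_flags: one 0/1 flag per analysis sample, with `timestep`
    # samples per hour. Each sunlit sample contributes 1/timestep hours,
    # so refining the timestep should not change the total for the same
    # sun pattern.
    return sum(sunlit_flags) / timestep
```

If instead the raw sample count is reported without dividing by the timestep (or divided twice), the total would scale with the timestep exactly as observed.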
Thank you in advance.
Francesco
…