in the practice of new design and fabrication methods using digital tools. These emerging procedures are radically changing the way we approach the design process, in terms of both conception and production. Participants will be introduced to 2D and 3D modelling software for generating geometries that will afterwards be machined on site on a 3-axis CNC (computer numerical control) machine.
AT THE END OF THE COURSE, YOU TAKE YOUR LAMP HOME!
Instructors: MEDIODESIGN* + TOOLINGROUP* team
*Official Rhino Trainers. Accreditation granted by McNeel, developers of the Rhinoceros software.
Venue: Mediodesign. Pallars 85-91 5-2 BCN
Duration: 16 / 20 hours
Dates: Saturday 9 / Sunday 10 July 2011
Schedule: 10:00-14:00 / 16:00-20:00
Places: 20 participants
REQUIREMENTS
< Aimed at students and professionals in architecture, design and related fields.
< A laptop computer.
< Software installed. Upon registration, participants will receive instructions for downloading and installing free trial versions of the software.
CONTENTS
< Introduction to advanced design and digital fabrication.
< The Rhinoceros environment and its plug-ins.
< CNC tools and working strategies.
< Materials and their characteristics.
< The exercise brief: design of a light fixture.
< Development of the RhinoCAM file for CNC machining.
< Machining and post-production.
< Submission of proposals: digital presentation of the design and fabrication process (PDF, PowerPoint, etc.) and of the finished luminaire prototype.
REGISTRATION
Price: €199, materials included.
Payment method: bank transfer.
Registration deadline: Monday 4 July 2011
A certificate of attendance will be awarded. …
ding is not for the faint of heart and is quite a significant undertaking. However, I don't know what you're dealing with, so that may be the way to go about it.
Your component, if it's "finished", has to supply some sort of results that are then used downstream. AFAIK there isn't a way to "prevent" downstream components from calculating until you're finished. They have to get some sort of information or else they'll just be waiting. Since the results of those components are likely to be invalid until the information gets calculated, you may be better off supplying them with nulls until you have some actual information to give them.
Anyway, I think that you should think very closely about the structure of your routine, and specifically how it will interact and update itself. The way I'm thinking about it now is that there really isn't much done in the "solve instance" function, if you will. Essentially the "solve instance" function would either A) start the reading of the file if no data is found, or B) output some data if it is found. This is an extreme oversimplification, but the simpler you keep this the more likely it will work. Here are a few more "details", I guess, of how I could see this potentially working...
Thread A - Initial call to Solve Instance function
+ Check and see if there are any results that exist from reading your file - at this point there shouldn't be. These results should be stored in some sort of class variable that is accessible to both threads. It might also be a good idea to have a boolean flag, also accessible to both threads, that represents whether you're reading/writing those variables.
+ Fire a function in another thread that begins the read process. Note that you'll likely have to do this through a delegate and an invoke call, but I'm not 100% sure
+ Fill in some null values for the variables you must supply
+ Output the nulls, thus finishing the Solve Instance function
Thread B - File Read Function running in separate thread
+ Open up the file. Note that it's probably a good idea just to pass the file path (as a string) between the different threads. Leave the creation of the file/text stream to the one thread that's using it.
+ Perform all the necessary reading from the file
+ Copy all your data to the variables that are accessible to both threads.
+ Expire the solution, either on the component in question or (as a last resort) on the whole canvas. I know expiring the whole canvas is definitely possible, but it should be possible to just expire the one component that's doing the reading.
Thread A - "Second" call to Solve Instance after being manually expired
+ Check and see if there are any results that exist from reading your file, which there now should be.
+ Output those shared results
+ Clear the last results (or cache them in some way) so that the next time the Solve Instance function is fired, you don't find any results and reread the file.
I think there are a few variations on this that could happen too, including having a separate function for reading and writing the shared data that's called via its own delegate/invoke call to make sure it's extra safe.
If you haven't already, you should really look into event driven programming, delegates, and asynchronous messaging. These are going to be the 3 things that you'll need to have a decent hold on to make sure this thing works. Just to let you know, debugging these things can be a bitch.…
ght on why this is, and some ideas I have for how to improve things going forward.
MeshMachine grew out of some scripts I started developing over 3 years ago (described here), originally just with the aim of achieving approximately equal edge lengths on a smooth closed triangulated mesh.
As time went on, I kept adding things, such as ways of keeping boundaries and sharp edges fixed, different ways of controlling edge lengths that vary across the surface, and different ways of pulling to surfaces.
I was also still experimenting with different rules for the core remeshing operations, such as valence driven vs angle driven edge flips.
All of these things meant many variables in the script. I wanted to share the work so others could play with it, but not really knowing what people might use it for made it difficult to simplify the interface, so I just exposed most of the variables I was using (actually there were originally even more, but I felt a component with 20+ inputs was excessive, so I combined some of them and fixed others to default values).
I've never been happy with that component, but some people want a component that you can just feed a surface and get a mesh with 'nice' triangles, without too much fuss or needing to know anything about how it works, while other people want to be able to vary the density based on proximity to the border, and curvature, and attractor points and see the intermediate results, and model minimal surfaces without pulling to any underlying surface, and...
Since then I did the rewrite from Kangaroo to Kangaroo2, and through that process, and associated conversations with Steve Baer, David Rutten and Will Pearson, my ideas about how to structure libraries and make cleaner, more flexible Grasshopper components changed. Much of this centres around using interfaces (in the specific programming sense, not to be confused with UI), because they allow separating code into multiple components, while still allowing parts of it to be edited within Grasshopper and other parts in a proper IDE (because I find the GH code editor is not conducive to writing large amounts of well structured object oriented code).
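As a loose illustration of that idea (in Python, with entirely hypothetical names; this is not the actual Kangaroo2 or MeshMachine API): once the core engine depends only on a small interface, each rule can live in its own component or script and be swapped without touching the engine.

```python
from abc import ABC, abstractmethod

class FlipRule(ABC):
    """Interface for deciding whether to flip a mesh edge (hypothetical)."""
    @abstractmethod
    def should_flip(self, valence_change, angle_gain):
        ...

class ValenceDriven(FlipRule):
    def should_flip(self, valence_change, angle_gain):
        return valence_change < 0   # flip if it moves vertex valences toward the ideal

class AngleDriven(FlipRule):
    def should_flip(self, valence_change, angle_gain):
        return angle_gain > 0.0     # flip if it improves the minimum angle

def remesh_pass(edges, rule: FlipRule):
    """The engine only sees the FlipRule interface, so rules stay swappable."""
    return [e for e in edges if rule.should_flip(*e)]
```

Because remesh_pass only knows about FlipRule, a valence-driven or angle-driven rule (or one written live in a scripting component) can be plugged in without recompiling the engine.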
Towards the end of last year, Dave Stasiuk and Anders Deleuran invited me and Will Pearson over to CITA for a few days of mesh and physics coding and beer drinking. During this time I made the first steps to restructuring MeshMachine to be more modular and interface based like Kangaroo2, instead of one giant script. One of the main motivations for doing this was to make it easier to combine the K2 physics library with the remeshing. However, at the time I hadn't yet released K2, so it didn't make sense to post examples that used those libraries. After the launch of K2, this restructured MeshMachine development has been a bit on the back-burner, but this discussion and Dave Stasiuk's work with Cocoon is inspiring me to pick it up again.
Seeing how you are combining the Cocoon and MeshMachine, and how Dave is also using interfaces in his recent work suggests to me it might be possible to integrate them more smoothly...
…
The Thermal Comfort Indices component's incoming long-wave radiation uses the Ångström clear-sky emissivity coefficient, with the Maykut and Church cloudiness factor so that it accounts for cloudy-sky conditions. So basically this correction turns the clear-sky coefficient into an all-sky emissivity coefficient.
I was not aware that .epw files ship with horizontalInfraredRadiation. Thank you for posting this information! I took a look at the EnergyPlus horizontalInfraredRadiation page. There they state: "If it is missing, it is calculated". I am pretty much certain that in Serbia, government weather stations do not record long-wave radiation data, so I thought this data is probably recorded in more developed countries, definitely in the USA. I replicated the formula provided by the mentioned bigladdersoftware.com page for the case when "Horizontal Infrared Radiation Intensity is missing". This is the same Clark and Allen (1978) formula mentioned in the attached Survey of Sky Effective Temperature Models.pdf, Table 3. When I checked the differences between the Clark and Allen formula and the horizontalInfraredRadiation from the .epw file, they were not merely close; they were identical. I checked a couple of locations in the USA, and also outside of it, for different climates. The results were again the same:
I attached the file below. I must admit there are also hours when a -1/+1 Wh/m2 difference exists, but my guess is that this is due to the rounding of values performed by EnergyPlus, as shown in the example on the bigladdersoftware.com page (0.036, 0.815, 340.6).
So either the Clark and Allen formula is so precise that it predicts the long-wave radiation to be exactly the same as the physically measured value, or .epw weather files are in fact not using physically measured values but calculated ones. My assumption is that it is the second case.
It seems that the only major difference between the two is that the Maykut and Church model overestimates the long-wave radiation loss to the sky in very cold conditions. Otherwise the models show fairly good agreement.
Based on the above assumption, I do not think we can distinguish whether either of these two incoming long-wave radiation methods is more precise, or whether one of them overestimates the La values. Thank you too for the knowledge and the shared information!!…
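For reference, a minimal sketch of the calculation being compared above, following the formula documented on the bigladdersoftware.com EnergyPlus page: the Clark and Allen (1978) clear-sky emissivity computed from the dew-point temperature, multiplied by a polynomial opaque-sky-cover correction (the clear-sky to all-sky step discussed above) and by sigma*T^4. Function and variable names are mine; verify the constants against the EnergyPlus Engineering Reference before relying on this.

```python
import math

SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W/m2K4

def horizontal_ir(dry_bulb_c, dew_point_c, opaque_sky_cover_tenths):
    """Horizontal infrared radiation intensity (W/m2), as EnergyPlus
    computes it when the .epw field is missing (Clark & Allen 1978)."""
    t_db = dry_bulb_c + 273.15   # dry-bulb temperature, K
    t_dp = dew_point_c + 273.15  # dew-point temperature, K
    n = opaque_sky_cover_tenths  # opaque sky cover, tenths (0-10)
    # Clark & Allen clear-sky emissivity from the dew point
    eps_clear = 0.787 + 0.764 * math.log(t_dp / 273.0)
    # cloud-cover polynomial turns it into an all-sky emissivity
    eps_sky = eps_clear * (1 + 0.0224 * n - 0.0035 * n**2 + 0.00028 * n**3)
    return eps_sky * SIGMA * t_db**4
```

With a 20 C dry bulb and 10 C dew point under a clear sky, this lands close to the 340.6 W/m2 worked example on the bigladdersoftware.com page, and adding cloud cover raises the value, consistent with the all-sky correction described above.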
ly 26-27-28-29 (digital fabrication)
The third edition of the digitalMed Workshop is structured as a design laboratory. Participants will engage in the challenging process of producing ideas, projects and research analyses, developed through specific software and through concepts that emerge from the use of mapping, parametric design and digital fabrication.
The workshop will take place in the city of Salerno (Italy) and it will last 11 days structured into 3 intensive weekends: July 13-14-15 (mapping); July 19-20-21-22 (parametric design); July 26-27-28-29 (digital fabrication).
Goals and Objectives:
We aim to convey the theoretical and technical knowledge needed to approach parametric and generative design and digital fabrication: from data collection and management, to the way these data inform the geometries, to the fabrication of prototypes.
Participants will also have the opportunity to practice the new knowledge gained in the design laboratory through project work.
Project Theme:
"Urban Field": identify, study and analyze the system of public spaces in the urban area of the city of Salerno.
Connection, mutation, generation and evolution are the themes to be followed in project work.
Brief Description of Topics:
- Mapping. Our reality, in all its forms, is studied through concepts from the theory of Complex Systems. The techniques used to study real events and places will serve for the management, manipulation and visualization of data and information. These will form the basis for the project and the data-driven geometry developed during the second phase of the workshop.
- Parametric Design. Introduction to Rhino* and Grasshopper. Specifically, we will explain the concepts behind parametric design software and how to work with it. Through these tools, we will arrive at systems of mathematical and/or geometrical relationships able to generate and govern the patterns, shapes and objects that will inform the final design.
- Digital Fabrication. In this phase, workshop participants are organized into working groups. Participants will have access to the materials and conceptual apparatus that will take them directly to the fabrication of the project geometries, using CAD/CAM software and digital fabrication machines.
The DigitalMed workshop is organized by Nomad AREA (Academy of Research & Training in topics of Contemporary Architecture), in collaboration with the City of Salerno, the Order of Architects of the Province of Salerno and the National Institute of Architecture In/Arch - Campania.
Interested parties may download the Notice of Competition at www.digitalmedworkshop.com and complete the pre-registration no later than July 10th, 2012.
PRESS OFFICE
Dr. Francesca Luciano
328 61 20 830
fra_luciano@libero.it
For information or subscriptions:
e-mail: info@digitalmedworkshop.com - tel: 089 463126 - 3391542980 …
Coincidentally, that was very close to a project I have in mind (using solely C#, not components). At first sight I thought it could be very easy ... only to discover that it's not.
This definition is an oversimplified version of the one mentioned above (only a C# script is kept that does the "preparation" work and some sort of naive "topology" checks: the yellow spheres are visual aids marking the incompatible strut/R value combos).
You can control the 3 available options from that portion:
In a nutshell ... Exo W behaves in an odd way (at least in my opinion). To get the gist of the issue, stick to this portion of the def and forget the rest:
This portion of the def attempts to create a usual Exo mesh using a Line list (cleaned, and user-controlled as regards the min length) derived from exploded mini Voronoi cells (i.e. brep edges). OK, I can understand the red Exo, since due to the nature of Voronoi breps it is more than possible that small "struts" are present that may yield non-manifold topologies.
But ... the thing is that Exo W is also red in the other mode (non-Voronoi), where the struts are quite big and no potential "engulfed" situations can occur:
And when the 2d Gate mode is set to Envelope ... there are cases (R values) where Exo W works as expected and cases where it doesn't.
Anyway ... if anyone has any bright idea, drop a word
best, Peter
…
project below - should I be learning Grasshopper & Rhino, or just Rhino first?
I'm trying to panel modules with low tolerances- I've prototyped regular shapes like geodesics and am now looking to experiment with irregular shapes with lots of different panel shapes.
I understand some things are best done through Grasshopper when using Paneling Tools - I'm trying to figure out whether I can do what I want to achieve with PT alone, or should do it through Grasshopper (or some other route).
I’m on the Mac WIP - the module was built in SketchUp - all the components seem to be in order as blocks, though I'm having problems running the ptpanel3dcustom command - thinking maybe it's a bug in the WIP, or something wrong with my input, or that I imported the SketchUp file the wrong way (I dropped it into the window). If the 3D command (ptpanel3dcustom) is run it doesn't do anything; if the 2D one (ptpanelgridcustom) is run, it crashes.
The tiling pattern - the green rectangle is a reference. Each tile contains 4 blocks, with 3 more nested in each.
How the module tiles.
The other thing I'm trying to do is specify that most of the lines in the panels don't bend/curve when they are paneled (something like Cage Edit). For my purposes the lengths & angles can change, while the lines must remain straight.
These images show a test tile to be paneled on an ellipsoid. When the tile is mapped to the grid the lines curve; this is an extreme example, but notice that a lot of the tiles far from the hemispheres are also bent slightly.
These two questions have me stumped the most for now. What should I look into to get a better handle on these problem areas? Maybe I should try recreating the work on a Windows machine? Or perhaps I should get started with Grasshopper?
Thanks for reading.
Lu…
mers considering extreme sports reject mainstream retailers and prefer to check out small stores rather than chains and malls. Several smaller retailers discuss trends in sports shoe sales.
Though athletic shoe, sporting goods and outdoor retailers have reported a slight uptick in footwear sales due to the rise in extreme sports, the real beneficiaries of the trend are independent surf and skate specialty stores.
Some West Coast surf and skate shops said teenagers and younger Generation Xers are not only rejecting traditional sports, but are also shunning mainstream retailers and malls in favor of smaller niche shops carrying hard-to-come-by brands.
Eddie Miyoshi, district manager at Atomic Garage, a three-store chain based in Gardena, Calif., said the soaring popularity of skateboard footwear has boosted the retailer's total footwear business 20-30 percent this year versus '95.
Skate shoes currently represent 80-90 percent of Atomic Garage's shoe sales, while a few years back, Dr. Martens and Timberland drove the retailer's footwear business.
Like many retailers, Miyoshi pointed to Airwalk as the trend's catalyst.
However, when Airwalk expanded its distribution to larger chains, which are often located in malls, few skate shoe customers followed. Instead, many young males have turned to skate shops for more elusive brands like Etnies, Duffs, and DC Footwear by Circus. By refusing to sell to bigger retailers or sporting goods stores, these brands are increasing their cachet among young consumers.
"Kids don't want stuff that's been in the stores," Miyoshi added.
Looking ahead, Miyoshi predicted skate shoe sales will remain strong through spring '97 provided "the [hot] vendors don't sell to other [non-specialty shop] retailers."
"Skaters and non-skaters are rebelling against mainstream retailers and going to surf and skate shops for new looks," echoed Mark Richards, co-owner of Val Surf, a three-store chain based in North Hollywood, Calif. Soaring sales of skate shoes have driven total footwear receipts up 25 percent this year versus '95.
"How much of that increase can be attributed to the exposure of extreme games? I'm not sure. [Skate shoes] may also just be the look of the moment," Richards acknowledged. And when it comes to getting the right look, young customers can be very picky.
"Skateboard footwear is a huge category for us, but we can't get the brands, Etnies, Duffs, DC and Nice, simply because they won't sell to us," said Mark Anderson, buyer at Chick's Sporting Goods, a six-store chain in Covina, Calif. "We have people coming in every single day asking for them." Consequently, skate shoes have consistently remained at about 5 percent of Chick's overall footwear business.
Nonetheless, some outdoor, specialty sporting goods and athletic retailers note that the growing popularity and coverage of extreme sports is having a modest impact on footwear sales. Trail-running shoes and approach/outdoor cross-trainers are the two categories benefiting most from the exposure. As in the skate shoe business, some retailers find that styling rather than function often drives sales of these shoes.
"At this point the product is much more visual than functional," said Chet James, gm of Super Jock 'N Jill, Dallas, speaking about trail-running shoes. Still, James noted the current hype over adventure sports helps draw more customer traffic. "The marketing campaigns and media help bring more people in, but they often go away with something that works better for them," he conceded.
John Wilkinson, executive v.p. of the 85-store chain Track 'N Trail, El Dorado Hills, Calif., said the chain has "seen some activity in approach shoes," but he questioned how many consumers buy them for actual sport. And rather than growing total footwear business, Wilkinson speculated that increased sales of approach shoes and trail-runners are eating away at traditional hiking shoe and boot volume.
But Dan Bazinet, president of Overland Trading, a 34-store chain based in Westford, Mass., believes the new looks have breathed life into the wilting hiking boot category. "[Approach-type shoes] don't represent the lion's share of the hiking market, but they have increased the hiking business and given us extra sales," Bazinet said.
He singled out Timberland's Treeline Series and Rockport's Leadville line as strong performers. Unsurprisingly, he noted the new looks appeal to a younger consumer base than traditional hikers.
For the month of June, sales of men's hikers were up 49 percent at Overland versus June '95, while sales of women's hikers were up 17 percent for the month. Bazinet also attributed the increased sales to department stores stepping away from the hiking business, leaving it to the specialists.
Some retailers draw a parallel between the hiking boom of two years ago and the current extreme sport phenomenon. "Plenty of bigger chains will get a certain percentage of the business while [extreme] sports remain a fad, because they're selling price-point type gear," explained Steven Carre, assistant hard goods buyer at Adventure 16, a six-store chain based in San Diego.
"But those [true enthusiasts] will say 'we need real gear' and will come to us. That will help us over time."
…
nd the challenge "Building the Invisible: Informing Digital Design with Real World Data". Information about each Workshop Cluster can be found here:
Cyber Gardens
Use the Force
Urban Feeds
Suspended Dreams
Interacting with the City
Agent Construction
Authored Sensing
Performing Skins
Responsive Acoustic Surfacing
Hybrid Space Structure Typologies
The SmartGeometry 2011 Workshop will take place at CITA http://cita.karch.dk/
Applications to attend the SmartGeometry 2011 Workshop in Copenhagen will close on 31st January 2011. General Conference registration will open within 1 month.
We hope to see you there!
****************************************************
Workshop 28th-31st March
Shop Talk 1 April
Symposium 2 April
Reception 2 April
These events follow the highly successful previous SG events in Barcelona 2010, San Francisco 2009, Munich 2008, New York 2007, Cambridge/London, UK 2006 and multiple preceding events.
This year's Challenge is entitled:
BUILDING THE INVISIBLE
Informing Digital Design with Real World Data
THE PREMISE
Vast streams of data offer a rich resource for designers. By incorporating external information into our design processes the autonomy of the design is challenged. User data, energy calculations, embedded sensing, material and structural simulation, human behaviour and perception, particle flows and force fields allow design to be situated and responsive. From the simulation of megacities to the solid modelling of material systems, design has the potential to be informed by the real. Design sits not separate from its environment but inhabits an ecological system: open, dynamic and interdependent, diverse, partially self-organising, adaptive, and fragile. Across scale and within time we now have the chance to instil architecture with an immanent intelligence, creating new relationships between the user, the built and its ecosphere.
THE OPPORTUNITY
Systems theorists suggest that data is only a raw material. It can be differentiated from information, knowledge and wisdom. Understanding is multi-levelled: understanding of relations, understanding of patterns, understanding of principles. As digital designers our challenge is in harnessing the power of computation to assist us in informing our design process. Computers help us collect, manage and analyse the environment and inform us about an abundance of data. Our challenge is to use these inputs in a meaningful way to help us make better informed design decisions.
THE AIM
SG 2011 explores how the incorporation of real world data challenges existing design thinking. The SG 2011 workshop aim is to create physical prototypes of design systems to be exhibited in the SG 2011 exhibition.
The SmartGeometry Group is a not-for-profit educational organization dedicated to the use of computational tools in architecture and engineering. SG brings professionals, academics, and industry together to explore the next generation of digital design. SG Workshops are non-platform specific, believing it is the methodology, not the tool, that matters.
…
Added by Shane Burger at 11:23am on January 6, 2011