l operations. Aside from its geopolitical position and commercial significance, Thessaloniki has for many centuries been the military and administrative hub of the region and, beyond this, the transportation link between Europe and the Levant. A series of design studies will be put forward to rethink the way in which the city environment of Thessaloniki has been affecting its population according to changing needs, and to visualize such urban shifts in a more hyper-specific, contextualized construction model. Throughout the investigations on the research agenda, current trends in the habits of architectural practice will be revisited.
Innovative urban interventions informed by bottom-up rules extracted from existing city conditions will form the major focus of the design proposals. Design teams will work with simulation tools and digital fabrication methods throughout the design research phase. The design brief will initially be explored through the combinatorial use of different computational design tools. Ways of connecting form-finding methods with form-making techniques will be investigated. Various manufacturing techniques, enabling hands-on experience with the diverse range of digital fabrication systems, will form the starting point for the physical tests. Finally, the design and fabrication of a one-to-one scale pavilion will unify the goals of the programme.
Prominent features of the programme / skills developed:
- Participants will be part of an active learning environment where the low student-to-tutor ratio (5:1) allows for personalized tutorials and debates.
- The toolset of AA Thessaloniki includes Autodesk Maya, Rhinoceros, Grasshopper and Arduino.
- Participants will have access to digital fabrication tools such as a 3-axis CNC router, a laser cutter, and a 3D printer.
- Design seminars and lecture series will support the key objectives of the programme, disseminating knowledge on new design anatomies (machinic control, computational space, complexity in systems) and on innovative urban design approaches.
Eligibility: The workshop is open to architecture and design students and professionals worldwide.
Accreditation: Participants receive the AA Visiting School Certificate upon completion of the Programme.
Fees: The AA Visiting School requires a fee of £600 per participant, which includes a £60 Visiting membership fee. The deadline for applications is 15 October 2015. No portfolio or CV is required.
Discount options are available. Please contact the AA Visiting School Coordinator for more details.
Online application link:
https://www.aaschool.ac.uk/STUDY/ONLINEAPPLICATION/visitingApplication.php?schoolID=316
Programme Director:
Alexandros Kallegias (AA Greece VS Director): alexandros.Kallegias@aaschool.ac.uk…
ino:
Go to "Windows Control Panel", then "Programs and Features", then find "Rhinoceros 5 (64-bit)" and "Rhinoceros", select and "Repaire".
Reply by Heath on August 14, 2013 at 1:13pm
I got it to work, thanks.
Reply by Akche MacEshwa on August 22, 2013 at 8:20pm
Right-click the .rhi file and open it with the Rhino installer wizard, which is located in the Rhino directory. Good luck.
…
Added by Adam Donner at 5:38pm on September 19, 2013
ences, so not terribly important in the end. After all, it's not really worth going through a lot of trouble to get a 15% speed increase; 15% faster than slow is still pretty slow.
Also, processor speed has pretty much peaked these past few years; there have been no significant increases lately. Instead, manufacturers have started putting more cores into their processors, which is something GH unfortunately cannot take advantage of.
Multi-threading (very high on the list for GH2) brings with it a promise of full core utilisation (minus the inevitable overhead for aggregating computed results), but there are some problems that may end up being significant. Here's a non-exhaustive list:
It's not possible to modify the UI from a non-UI thread. This is probably not that big a deal for Grasshopper components, especially since we can make methods such as Rhino.RhinoApp.WriteLine() thread safe.
Not all methods used by component code are necessarily thread safe. There used to be a lot of stuff in the Rhino SDK that simply wouldn't work correctly, or would crash, if the same method was run more than once simultaneously. The Rhino core team has been working hard to remedy this problem, and I'm confident we can fix any problems that still come up, though it may take some time. If components rely on other code libraries then the problem may not be solvable at all. So we need to make sure multi-threading is an optional property of components.
There's overhead involved in multi-threading; it's especially difficult to get a good performance gain when dealing with lots of very fast operations. In those cases the overhead can actually make things run slower (see the sketch after this list).
There's the question of at what level multi-threading should be implemented. Obviously the lower the better, but that means a lot of extra work, complicated patterns of responsibility and a lot of communication between different developers.
There's the question of how the interface should behave during solutions. If all the computation is happening in a thread, the interface can stay 'live'. So what should it look like if a solution takes, say, 5 seconds to complete? Should you be able to see the waves of data streaming through the network, turning components and wires grey and orange like strobe lights? What happens if you modify a slider during a solution? The simple answer is to abort the current solution and start a new one with the new slider value. But as you slowly drag the slider from left to right, you end up computing 400 partial solutions and never getting a final answer, even though you could have computed 2 full solutions in the same time and given better feedback. Does the preview geometry in the Rhino viewports flicker in and out of existence as solutions cascade through the network?
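To make the overhead point concrete, here is a minimal, standalone Python sketch (plain CPython with multiprocessing, not Grasshopper code): farming thousands of trivial operations out to workers is typically slower than just looping over them.

import time
from multiprocessing import Pool

def tiny_task(x):
    # a stand-in for a "very fast operation"
    return x * x

if __name__ == "__main__":
    data = list(range(100000))

    t0 = time.time()
    serial = [tiny_task(x) for x in data]
    t1 = time.time()

    # dispatching each tiny item to worker processes and collecting the
    # results usually costs more than the work itself
    with Pool(4) as pool:
        parallel = pool.map(tiny_task, data)
    t2 = time.time()

    print("serial:   %.3fs" % (t1 - t0))
    print("parallel: %.3fs" % (t2 - t1))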
…
n lofting, though, it makes perfect sense to scale sections independently from the distance between them.
For practical use, I found the graph mapper clumsy; too coarse and approximate. So I adapted the code I wrote here (Maths + Divide Curve) so that a list of numbers drives the spacing and, optionally(!), the scaling (a sketch of the idea follows below).
When 'Scale by Distance' is false, the numbers in the list determine scaling; '1' is actual size, '0.5' is half size, '2' is twice the size, etc.
When 'Scale by Distance' is true, the distance between the points is used for scaling. This is an indirect effect of the list of numbers (which determines point spacing) and the size of the original shape relative to the curve length.
'Tangent 0' is the curve tangent at each point. It works well for lofting.
'Tangent 1' is the vector between each point and its successor. It works well for orienting solids.
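Not the actual definition, but a minimal GhPython sketch of the behaviour described above (assumed inputs: crv as Curve, nums as a list of numbers, scale_by_distance as Boolean):

# Sketch of list-driven spacing/scaling; the inputs crv, nums and
# scale_by_distance are assumptions about the component, not its real code.
total = sum(nums)
length = crv.GetLength()

# cumulative positions along the curve; the first shape sits at the start
positions, acc = [], 0.0
for n in nums:
    positions.append(acc / total)
    acc += n

points, tangents0 = [], []
for p in positions:
    ok, t = crv.LengthParameter(p * length)  # parameter at this arc length
    points.append(crv.PointAt(t))
    tangents0.append(crv.TangentAt(t))       # 'Tangent 0'

# 'Tangent 1': vector from each point to its successor (last one repeated)
tangents1 = [points[i + 1] - points[i] for i in range(len(points) - 1)]
if tangents1:
    tangents1.append(tangents1[-1])

# scaling: either the raw list values, or the distance to the next point
if scale_by_distance:
    scales = [v.Length for v in tangents1]
else:
    scales = list(nums)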
There are still some mysteries... ("Where there is mystery, there is no mastery.")
Lofting doesn't always work well, 'Cap Planar Holes' doesn't work anymore...
I had hoped that this sequence, ".5,1,2,1,.5", would result in:
- two half-size shapes, one at each end of the curve;
- two full-size ("1") and one double-size ("2") shapes, spaced appropriately.
But I have a mental block about how to achieve that...? :( (One possible fix is sketched below.) Instead, I settled for the last of the five shapes being one point short of the end of the curve, and the spacing is off.
Even so, I find this approach easier to use on a practical basis than the graph mapper.
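For what it's worth, one way to make a list like ".5,1,2,1,.5" put the half-size shapes exactly on the curve ends (a suggestion, not part of the original definition): derive the gap between consecutive shapes from the average of their two numbers, then normalize the cumulative gaps to the curve length.

# Hypothetical variant: gaps between consecutive shapes come from averages of
# adjacent numbers, so the first and last shapes land exactly on the ends.
def end_to_end_positions(nums):
    gaps = [(nums[i] + nums[i + 1]) / 2.0 for i in range(len(nums) - 1)]
    total = sum(gaps)
    positions, acc = [0.0], 0.0
    for g in gaps:
        acc += g
        positions.append(acc / total)
    return positions

print(end_to_end_positions([.5, 1, 2, 1, .5]))
# -> [0.0, 0.1667, 0.5, 0.8333, 1.0]: symmetric, with both ends occupied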
…
y stages of design, mainly due to the large uncertainties that exist in these phases. Optimisation in the early phases may be helpful, but it does not provide the designers with more information on "where to go from here". Once the designer changes a parameter to suit a client requirement, legal requirement or other, the optimised result may very well be thrown out because the changed parameter has such a large effect.
I am hosting several workshops and focus groups in the next month (one for students at Victoria University of Wellington, one for architecture practitioners and one for engineering practitioners) to teach the basics of Honeybee and Ladybug within Rhino, as NZ is very new to any form of distributed modelling methods (using visual programming languages such as Grasshopper and Dynamo to communicate between design tools and building simulation tools). In the focus groups, I am not focusing on Honeybee as a tool so much as asking the industry its opinions on the feasibility of, and wishes for, developments such as Honeybee.
Many of the informal interviews I have been conducting point to the question: would you rather know the optimised concept, or the most significant design parameters to be wary of at the early stages of design?
I am amazed at the capabilities of Honeybee, because it has been such a pain to remodel anything for E+ and Radiance in the past. I particularly love the ability to generate hundreds of idfs with varying parameters within 10 minutes, without having to set up some form of macro to do it. The visualisations of Honeybee are awesome, to say the very least. But as someone who is interested in doing a sensitivity analysis, say with Thermal Autonomy, I feel there is an element lacking in the analysis from an engineering and research/academic standpoint.
The way I have set up my files actually creates 300+ idfs covering all the various parameter combinations. The parameters only vary across low, typical and high settings for power densities, WWR, schedules and insulation. These have all been drawn from a large 5-year project in which we monitored commercial buildings here in NZ to gain a better understanding of data for purposes like this. I then run them in parallel as batch files and re-insert the data back into Honeybee.
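As a rough illustration of that kind of sweep (a hedged sketch: the template file, placeholder scheme, paths and values are hypothetical, and the real setup clearly varies more combinations):

# Hypothetical sketch of a low/typical/high parameter sweep that stamps out
# one idf per combination from a template. Names and paths are illustrative.
import itertools, os

levels = {
    "LIGHTING_WPM2":   [5.0, 10.0, 18.0],   # power density, W/m2
    "WWR":             [0.2, 0.4, 0.6],     # window-to-wall ratio
    "OCCUPANCY_HOURS": [8, 10, 12],         # schedule length
    "WALL_R":          [1.2, 2.0, 3.5],     # insulation, m2K/W
}

with open("template.idf") as f:
    template = f.read()

os.makedirs("sweep", exist_ok=True)
keys = sorted(levels)
for i, combo in enumerate(itertools.product(*(levels[k] for k in keys))):
    idf = template
    for key, value in zip(keys, combo):
        idf = idf.replace("{%s}" % key, str(value))  # assumed placeholders
    with open(os.path.join("sweep", "case_%03d.idf" % i), "w") as f:
        f.write(idf)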
What I am playing around with at the moment, though, is that, because the TA component requires so many additional components to analyse the data in that form, and because it does not simply give a numerical value in % for the space's performance, I need to re-evaluate the csv that it produces for further analysis.
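For instance, a minimal post-processing sketch of that re-evaluation (the csv columns and the 18-25 C occupied-hours comfort band are assumptions, not the TA component's actual format):

# Hypothetical sketch: compute a Thermal Autonomy percentage from an hourly
# csv of operative temperatures. Column names and comfort band are assumed.
import csv

occupied = comfortable = 0
with open("zone_temps.csv") as f:
    for row in csv.DictReader(f):
        if int(row["occupied"]):  # 1 during occupied hours
            occupied += 1
            if 18.0 <= float(row["operative_temp"]) <= 25.0:
                comfortable += 1

ta_percent = 100.0 * comfortable / occupied if occupied else 0.0
print("Thermal Autonomy: %.1f%%" % ta_percent)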
I have only just begun trying some form of sensitivity analysis within Honeybee itself, but I was curious whether there are already plugins within grasshopper that allow some form of sensitivity analysis.…
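Short of a plugin, even a crude ranking outside GH can serve as a first pass; a sketch, assuming each run is stored as a dict of its parameter values plus an output value:

# Crude sensitivity ranking (a sketch, not a GH plugin): rank parameters by
# the absolute Pearson correlation with the output across all runs.
def sensitivity_ranking(runs, param_names, output_name):
    n = float(len(runs))
    def column(name):
        return [float(r[name]) for r in runs]
    def corr(xs, ys):
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy) if sx and sy else 0.0
    out = column(output_name)
    scores = {p: abs(corr(column(p), out)) for p in param_names}
    return sorted(scores.items(), key=lambda kv: -kv[1])

# e.g. sensitivity_ranking(runs, ["WWR", "WALL_R"], "ta_percent")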
iles and rad files in this folder:
C:\Users\Sarith\AppData\Local\Temp\radSources (replace my name with yours).
However, the pic file which is created in that folder crashes. So, I tried to hack the process and get an image through a screenshot and ra_bmp.
This is the rendering that I got through Relux:
This is what I got from my own rendering with the image I extracted:
This is surprising because the material is called Glass. However, the Radiance definition of the material is:
void colorpict mat_104~1
9 red green blue surfpic2.pic alignpic.cal tile_u tile_v -s 0.58500
0
1 0.975

mat_104~1 plastic mat_104
0
0
5 1 1 1 0 0
The colorpict in the first definition only modifies color patterns and does not make the material transparent/translucent. Similarly, the plastic type implies that the material won't be transparent anyway.
I don't think this material as defined in Relux is a physically based material. You are probably better off importing the Raytracer materials into Honeybee. Andy McNeil gave an excellent presentation about GlassBlocks at the last Radiance workshop: http://www.radiance-online.org/community/workshops/2015-philadelphi...
The pic file that I generated is here: https://www.dropbox.com/s/5c32layqehdns72/surfpic2.pic?dl=0…
you working on a PV system which will power a domestic hot water boiler?
To answer your questions:

1) Each grasshopper component (ghpython being one of them) uses grasshopper's data-matching algorithm. This algorithm takes care of the complex issues which may arise from combining lists with single items, data trees with different numbers of items per branch, and so on. I think there is a way of introducing a call to other processor threads per inputted surface, but this would be a very difficult job, as it would require writing a custom data-matching algorithm. I do not think I am up to that task. Instead I tried to introduce multithreading only in the final part of the PVsurface component, one of its time-consuming parts: the calculation of sun angles, solar radiation and AC/DC power output. I attached the test file below, but sadly it didn't go well: the multithreaded version mostly runs in the same time as the regular version. I am not qualified enough to say why that is, but it may have something to do with the type of function the multithreading is applied to: the code is supposed to run a few separate functions a couple of thousand times, and work with a couple of lists. From my experience, multithreading works best when a single list or two are supplied to a single function. I may be wrong on this. I am very sorry to say that I can not implement this feature.

2) I am not aware of an open-source PV modules database having been released. But one can always download the data for specific modules from producers' websites; it can then easily be transferred to a .csv or other text file. Ladybug Photovoltaics is based on NREL's PVWatts model. In comparison with other commercial software applications, PVWatts offers a more generalized system model, with some of the values and characteristics being assumed or embedded. The Fuentes empirical thermal model we are currently using follows the same logic: it generalizes the module characteristics. Only the following characteristics are editable: module efficiency, temperature coefficient and module mount type. It may be possible to replace Fuentes with some other, less generalized 5-parameter thermal model. But as an architect, I would definitely need help on this.
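For reference, the usual per-item multithreading pattern in GhPython is ghpythonlib.parallel.run; a minimal sketch (the per-surface function here is a placeholder, not the PVsurface code):

# Minimal GhPython sketch of per-item multithreading. compute_output is a
# placeholder for the real per-surface work (sun angles, radiation, power).
import ghpythonlib.parallel

def compute_output(srf):
    # hypothetical stand-in for an expensive per-surface calculation
    return srf.GetSurfaceSize()

# 'surfaces' is an assumed list input; run() spreads the items across
# threads and (with True) flattens the results back into a single list
results = ghpythonlib.parallel.run(compute_output, surfaces, True)

As noted above, when each call is itself cheap the threading overhead can eat the gains, which matches what the attached test file showed.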
Sorry if my reply did not fulfill your expectations, and thank you for the kind words!…
mething? I think it would be very useful to have a mapping of light intensity over the field of view of the used camera, and possibly an option to overlay it on the luminance mapping. It would in a very visual way provide information about contrast and glare.
Doesn't the falsecolor option already do that for luminance mappings? If not, can you post an image/screenshot of such a mapping from Dialux/AGI32 or any other software?
4. It's just a shoebox-type simulation: 11x11 luminaires pointing down onto simple materials. The default elapsed time was 3m40s. I have found the _RadParameters component in the meantime, and got it down to 0m30s. I have noticed that the simulation doesn't fully tax multiple cpu threads; most of the time the cpu is at 25% during execution.
The under-utilization of CPUs is a known issue with Radiance (the calculation engine) on Windows-based systems. Unfortunately there isn't much that can be done about it at the moment.
5. Is it possible to map different degrees of translucency, diffuse color, absorptance, reflectance, etc..., by means of a bitmap image, expression, or other?
6. There is a feature that I consider absolutely necessary (and I haven't found it yet), which is an emitting-surface feature: the ability to stipulate homogeneous intensity via luminance values (in cd/m^2) or flux, and via mapped distributions of intensities or luminances (in cd or cd/m^2).
By emitting surface I don't mean just a flat rectangular plane such as an area light. It would be absolutely amazing to perform photometric analysis on irregular and convoluted shapes and the light falling on neighbouring surfaces. 3ds Max with Mental Ray provides similar functionality, but without the power of GH + HB.
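For the homogeneous case, plain Radiance can already express this with its light primitive; a small sketch of the conversion (the material name and the Python wrapper are illustrative, not an HB feature):

# Sketch: build a Radiance 'light' primitive from a target luminance.
# Radiance wants radiance in W/(sr*m2); for a neutral source the standard
# conversion divides luminance (cd/m2) by the luminous efficacy factor 179.
def light_material(name, luminance_cd_m2):
    radiance = luminance_cd_m2 / 179.0
    return "void light %s\n0\n0\n3 %.4f %.4f %.4f\n" % (
        name, radiance, radiance, radiance)

print(light_material("panel_mat", 1000.0))  # a 1000 cd/m2 emitter

Assigning such a material to any surface, however irregular, makes it emit; the mapped-distribution case is what the images below illustrate.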
In the image below, the HB logo is assigned as a texture to a glass which then creates a pattern of that on the wall when daylight falls on it.
In the image below, the light from the Batman logo illuminates the scene.
The images above were rendered with Radiance. While these things are possible with Radiance, and therefore HB, the reason they aren't incorporated into the code is that these effects are not "physically based" and are not rooted in reality. Radiance is arguably the most intensively tested and validated lighting simulation software in the world. However, once we start applying such "magic" to it, its results are no longer reliable and therefore no different from those of other photorealistic engines such as V-Ray, Mental Ray etc.
stand completely (I just don't get the math part...). The code can be found here: http://digitalsubstance.wordpress.com/subcode/
So I decided to make my own definition: a cube deformed by 5 attractors. I was wondering if someone can help me solve the meshing at the end of the definition, because when I bake it, it gives me an open mesh and I don't understand why. Open meshes are not suitable for 3D printing... I don't think I've used clean, weld, and unify faces in the right order? Maybe there is a problem with the surfaces?
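A typical cleanup order in a GhPython component looks like this (a sketch against an assumed mesh input; not the poster's definition):

# Sketch of a typical mesh-cleanup order before 3D printing, assuming an
# input 'mesh' (Rhino.Geometry.Mesh). Order matters: cull junk faces first,
# merge duplicate vertices, then unify and weld.
import math

mesh.Faces.CullDegenerateFaces()            # drop zero-area faces
mesh.Vertices.CombineIdentical(True, True)  # merge coincident vertices
mesh.UnifyNormals()                         # consistent face orientation
mesh.Weld(math.pi)                          # weld across all edges
mesh.Compact()

closed = mesh.IsClosed                      # check for watertightness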
Secondly, I'm not very proud of the result of my cube, because it's so deformed that it is not a cube anymore... so I was wondering if a square grid of points can be deformed by an attractor while still keeping the straight boundary of the grid?
I had an idea for that: I make my points, create the vectors between the grid and the attractor points, and calculate the distance between the grid points and the attractors; that gives me a list of distances that I remap to control the strength of my attractors. On the other side I calculate the distance between the boundary of the grid and the grid points, which gives me a second list of numbers. I want to average the two lists in such a way that the closer a point is to an attractor, the more it takes the distance from the first list, and the further it is from the attractor (so the closer it is to the boundary), the more it takes the distance from the second list. I'm sorry for my bad English; even in French it's a little bit hard for me to explain. So what can I do to have a grid attracted by a point without moving the boundary points? (See the sketch below.)
And please don't tell me to cull the boundary points first, deform the grid, and rebuild the grid after... it gives an ugly cube face at the end, even with a lot of polishing with Weaverbird...
If someone has another idea to achieve that please tell me ;)
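One way to realize the weighting described above, sketched in GhPython (pts, attractors, boundary and strength are assumed inputs):

# Sketch of boundary-preserving attraction, assuming GhPython inputs:
# pts (list of Point3d), attractors (list of Point3d), strength (float),
# and boundary (a closed Curve marking the grid edge).
moved = []
for p in pts:
    # distance to the nearest attractor and the pull direction
    nearest = min(attractors, key=lambda a: p.DistanceTo(a))
    d_attr = p.DistanceTo(nearest)
    pull = nearest - p

    # distance to the grid boundary
    ok, t = boundary.ClosestPoint(p)
    d_edge = p.DistanceTo(boundary.PointAt(t))

    # weight goes to 0 at the boundary and to 1 right next to an attractor
    w = d_edge / (d_edge + d_attr) if (d_edge + d_attr) > 0 else 0.0
    moved.append(p + pull * (w * strength))

Because the weight is exactly zero on the boundary, the boundary points stay put without culling anything.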
The first definition, "CleanCubeMeshingHelp", is a little bit heavy, so watch out if you have a small laptop (any ideas to make it work faster are welcome!).
The second one is the one with the two lists of numbers.
Also a last question: what are "Blur Numbers", "Interpolate Data" and "Weighted Average" under math utilities, and when should they be used?
Thank you in advance for your answers, and I apologize for my lack of vocabulary.…