1. From the Thermal Comfort Indices component, Comfort Index 11 (TCI-11):

MRT = f(Ta, Tground, Rprim, e)
with:
- Ta = DryBulbTemperature, coming from the ImportEPW component
- Tground = f(Ta, N), where N comes from the totalSkyCover input. Tground influences the long-wave radiation emitted by the ground in the MRT calculation.
- Rprim, defined as the solar radiation absorbed by a nude man = f(Kglob, hS1, ac)
- ac = the clothingAlbedo in % (bodyCharacteristics input)
- I can't find any definition of Kglob and hS1 in the code. Could you please tell me what those values refer to? Probably the globalHorizontalRadiation, but how?
- e = vapour pressure, calculated from Ta and the Relative Humidity input
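For what it's worth, a common way to derive vapour pressure from Ta and relative humidity is a Magnus-type formula; I'm only assuming the component does something similar:

```python
import math

def vapour_pressure_hpa(ta_c, rh_pct):
    """Water vapour pressure (hPa) from dry-bulb temperature (deg C)
    and relative humidity (%), via a Magnus-type saturation formula."""
    # Saturation vapour pressure over water (hPa)
    es = 6.1094 * math.exp(17.625 * ta_c / (ta_c + 243.04))
    return rh_pct / 100.0 * es

# The ParisOrly hour discussed below (Ta = 6.5 degC, rh = 100 %)
print(round(vapour_pressure_hpa(6.5, 100.0), 2))  # about 9.67 hPa
```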
Do you agree that in this case the MRT does not depend on these inputs: location, meanRadiantTemperature, dewPointTemperature and wind speed? And that it does not depend on the other bodyCharacteristics either, such as bodyPosture, age, sex, met, activityDuration...?
Is the MRT calculated by the TCI-11 method the mean radiant temperature for a vector pointing vertically, with a sky view factor of 100%?

For ParisOrly epw,
2. From the SolarAdjustedTemperature component (which seems to be used more than TCI-11 in the UTCI calculation examples on Hydra).
In contrast to TCI-11, this component distinguishes diffuse and direct radiation and contextualizes the calculation thanks to the _ContextShading input, right? It can also be applied to a mannequin thanks to the CumSkyMatrix input, and thus evaluate the non-uniformity of radiation exposure.
This component does not seem to consider the influence of vapour pressure on the result --> is it then more precise to plug the MRT output (from the TCI) into the meanRadTemperature input of SolarAdjustedTemperature?
The default groundReflectivity is set to 0.25 --> is ground reflectivity taken into account in the Tground or MRT calculation of the TCI component? If so, what ground reflectivity is assumed?
Does the default clothing albedo of 37% (TCI-11 bodyCharacteristics) correspond to a clothing absorptivity of 63%?
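On the last question: if shortwave absorptivity is simply the complement of the clothing albedo (my assumption, not something I verified in the component source), the numbers line up:

```python
def clothing_absorptivity_pct(albedo_pct):
    # Assumed relation: absorptivity = 100 % - albedo
    return 100.0 - albedo_pct

print(clothing_absorptivity_pct(37.0))  # → 63.0
```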
If the CumSkyMatrix input is not supplied, I get 9 results for the mannequin --> where do those points/results come from?
If the CumSkyMatrix input is supplied, I suppose the calculation of the 482 results corresponds to a method similar to the Radiation Analysis component, averaged over the analysis period. Right? But I don't understand why the mannequin is composed of 481 faces while meshFaceResult gives 482 results.
Finally, what is the link between the mesh results, the solarAdjustedMRT and the effective radiant field? Is there a paper with a detailed explanation of the method?
3. Here are some results for the ParisOrly EnergyPlus weather data; you can find the Grasshopper definition attached. There is no shading in this simulation, and the MRT coming from the Thermal Comfort Indices component is very different from the solar-adjusted MRT. Why such a big difference, and which of the results should be plugged into the UTCI calculation component?
Results for ParisOrly.epw, M,D,H: 1,1,12

Ta: 6.5 °C
rh: 100%
globalHorizontalRadiation: 54 Wh/m2
totalSkyCover: 10
MRT (TCI-11): 1.2 °C

_CumSkyMtxOrDirNormRad = directNormalRadiation: 0 Wh/m2
diffuseHorizontalRad: 54 Wh/m2
_meanRadTemp = Ta
solarAdjustedMRT: 10.64 °C
MRTDelta: 4.14 °C

_CumSkyMtxOrDirNormRad = CumulativeSkyMtx
diffuseHorizontalRad: 54 Wh/m2
_meanRadTemp = Ta
solarAdjustedMRT: 10.47 °C
MRTDelta: 3.97 °C

_CumSkyMtxOrDirNormRad = CumulativeSkyMtx
diffuseHorizontalRad: 54 Wh/m2
_meanRadTemp = MRT (TCI-11)
solarAdjustedMRT: 5.17 °C
MRTDelta: 3.97 °C
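A quick sanity check on my side (plain arithmetic, not taken from the component code) suggests MRTDelta is just solarAdjustedMRT minus the supplied _meanRadTemp, which would explain why both CumulativeSkyMtx runs share the same 3.97 °C delta:

```python
# (_meanRadTemp, solarAdjustedMRT, reported MRTDelta) for the three runs
cases = [
    (6.5, 10.64, 4.14),  # directNormalRadiation input, _meanRadTemp = Ta
    (6.5, 10.47, 3.97),  # CumulativeSkyMtx, _meanRadTemp = Ta
    (1.2, 5.17, 3.97),   # CumulativeSkyMtx, _meanRadTemp = MRT (TCI-11)
]
for mrt_in, mrt_adj, delta in cases:
    assert round(mrt_adj - mrt_in, 2) == delta
print("all deltas consistent")
```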
Thanks a lot for your help.
Regards,
Aymeric
…
administration, education and consumption, the contemporary world can increasingly be conceived as a global and systemic environment. All our activities are profoundly influenced by a new condition of fluidity and interdependence of various, and very often unpredictable, parameters and factors, introducing us progressively to a systemic and parametric understanding of the world and of our position in it.

Architecture and the building process reflect this new conception of the world by redefining themselves according to new principles and means. The fast development of digital techniques to simulate, represent and generate architecture promises a continuous design process, including the seamless transfer of information between the involved parties and making performance a key issue in the planning process. In this process, concepts of adaptability, transformability and flexibility are replacing already tested and secure solutions, customization is replacing standardization and metrics, and digital tools are replacing analogue representations. In these new conditions, the scaleless and the seamless appear as the two key pillars of the requested integration in contemporary architectural practice and education.

Do the design and planning practices and construction industries respond with digital synergies to these new requests? Can the curricula of architecture schools escape from the dominance of traditional fragmentation within their structure and the organisation of the modules and academic units, towards more holistic concepts and workflows? How can the traditionally separate courses offered by departments and modules of architectural education institutions be redefined in order to assure a scaleless and seamless thinking about form, materiality and its social and cultural representations, its environmental aspects and its urban and contextual references?
The organisers are inviting architects, teachers and researchers of architecture in Europe to present their views, research outcomes and teaching experiences related to the theme of the Conference.
An abstract of 600-700 words must be submitted by September 5, 2012. Please indicate into which of the five aforementioned themes your abstract falls. You will be asked to submit your final paper by the 22nd of October 2012 for the publication of the proceedings, which will be distributed to all EAAE/ENHSA school members.
For any further queries please do not hesitate to contact us on info@enhsa.net or info@scaleless-seamless.org…
th the most crucial and imposing challenges that Mexico City faces and the ways in which architecture and urbanism can shape the metropolis at different scales. In this sense the programme sees the city as a laboratory where the virtual and experimental tradition of the Architectural Association finds fertile and concrete ground for the application of its methodology in Mexico.
“Manufactured Landscapes/Manufactured Urbanities” explores the metropolitan condition understood as a process manufactured by and for human beings. Hence the traditionally opposed concepts, artificial vs nature, are replaced under the premise that nature does not exist: nature is not natural but naturalised, and the artificial is not an external or imposed construct but manufactured intrinsically.
With this as a starting point, the programme will study two instances of Mexico City’s “Manufactured Landscapes/Manufactured Urbanities”: the ravines in the west of Mexico City, last bastion of the existing “Nature” and crucial to the viability of the city; and social housing, the fundamental construct of the “artificial” habitat in the metropolis’s urban tissue. These “Manufactured Landscapes/Manufactured Urbanities”, and the ways in which they are designed, produced, reinvented and regenerated, show a vast spectrum representative of the crucial urban conditions to be addressed, and they therefore pose an enormous urban and architectonic challenge to confront in order to apply contemporary design methodologies.
To tackle the complexities of the “Manufactured Landscapes/Manufactured Urbanities”, the programme will immerse students and staff in a 10-day intensive workshop within a multidisciplinary environment, where national and international experts from various fields will enrich their proposals. Students will work in teams at the architectural and/or urban scale and will critically assess the impact of their interventions across multiple scales.
A backbone of lectures, talks and seminars, including local and international speakers, is designed to broaden and reflect the relevance and importance of the topic for Mexico City. Finally, a public exhibition of students’ work will be held at Centro Cultural de España in autumn 2013.
…
ectual property that goes nowhere:
In my opinion it's very difficult to determine when someone's intellectual work becomes actual property that you should be able to protect.
There's a big difference between intellectual property and other types of scarce property (like a computer, a chair, etc.). Usually, it's a good idea for scarce resources to be bought and sold in the market instead of shared, because the price mechanism (supply and demand) determines their best possible use at any given moment. Intellectual property, on the other hand, is not scarce once it has been created, so if a 5-year-old with an internet connection downloads a Grasshopper definition I created, it doesn't prevent an architect from using it for a more suitable purpose. Just like, in a practical sense, the more air I breathe doesn't mean the less air other people have left to breathe, because there is so much air that its abundance could be assumed (today, at least) to be infinite. So trading air in the marketplace is nonsensical.
The only reason for copyright and patent laws to artificially make a particular piece of intellectual property scarce is so that people have an economic incentive to innovate and create new intellectual property. The advances in innovation should offset the artificial scarcity.
If that last point is true, it should be a good thing that people are not giving things up for free but rather selling them, because it promotes innovation; but I'm personally not sure this is true. McNeel would probably agree with the last point to some extent and say that maybe patent laws go too far, but that the copyright laws protecting Rhino and Grasshopper (even though Grasshopper is free right now, it is still 'owned' by McNeel) should be in place.
So I end up as I started: it's very difficult to determine when it's a good idea (not just for an individual, but in general) to sell or share this stuff.
If someone is interested in an extreme anti-intellectual-property rant from someone who otherwise defends private property, see this guy: http://www.youtube.com/watch?v=oRqsdSARrgk
…
However, I feel that the rhinoscriptsyntax function CurveCurveIntersection gives a really complex output, repeating the same intersection for each curve considered, and it also seems to me to have some problems with tolerances. It would be really nice to have a command that directly gives out just the intersections as a list of points, exactly as happens when you run the "Intersect" command in the usual Rhino interface.
So I tried to make a function myself, even if it is far from accurate.
import rhinoscriptsyntax as rs
def intersect(crv1, crv2, tol):
    # Sample crv1 densely and keep the samples that lie within tol of crv2
    divisions = 1000
    pts1 = rs.DivideCurve(crv1, divisions)
    outpts = []
    for i in range(len(pts1)):
        par = rs.CurveClosestPoint(crv2, pts1[i])
        pt2 = rs.EvaluateCurve(crv2, par)
        dist1 = rs.Distance(pts1[i], pt2)
        if dist1 < tol:
            if len(outpts) > 0:
                # Skip samples too close to an intersection already found
                index2 = rs.PointArrayClosestPoint(outpts, pts1[i])
                dist2 = rs.Distance(pts1[i], outpts[index2])
                if dist2 > tol * 5:
                    outpts.append(pts1[i])
            else:
                outpts.append(pts1[i])
    return outpts
crv1 = rs.GetObject("crv1", 4)
crv2 = rs.GetObject("crv2", 4)
pts = intersect (crv1,crv2,.01)
if pts: rs.AddPoints(pts)
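The de-duplication step in the function above (only keeping a sample if it is farther than 5x the tolerance from every point already found) can be tested in plain Python, outside Rhino:

```python
import math

def dedupe_points(points, min_dist):
    """Keep a point only if it is at least min_dist from every point
    already kept - same idea as the outpts check in intersect()."""
    kept = []
    for p in points:
        if all(math.dist(p, q) >= min_dist for q in kept):
            kept.append(p)
    return kept

pts = [(0.0, 0.0), (0.01, 0.0), (1.0, 0.0), (1.005, 0.0)]
print(dedupe_points(pts, 0.05))  # → [(0.0, 0.0), (1.0, 0.0)]
```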
I don't know if someone has better ideas/solutions.
Just to give an overall idea, my main goal was to sample (part of) a surface with equal-length segments. To do so, I placed spheres equally spaced in one direction, then got their intersection with the surface (Intersect Brep = curves), then, starting from one sphere and its intersection (curve) in the other direction, got the intersection between the two curves (the curve-curve intersection), drew another sphere from there, and so on.
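The sphere-marching idea can be prototyped without Rhino on a 2D parametric curve - intersecting a "sphere" (circle) of radius r centred at the last sample is equivalent to taking the next point at chord distance r. A rough brute-force sketch, entirely my own and not the original definition:

```python
import math

def equal_chord_samples(f, t0, t1, r, steps=10000):
    """March along the parametric curve f(t), t in [t0, t1], collecting
    samples that are (approximately) a fixed chord length r apart."""
    samples = [f(t0)]
    t, dt = t0, (t1 - t0) / steps
    while t < t1:
        t += dt
        p = f(t)
        if math.dist(p, samples[-1]) >= r:
            samples.append(p)
    return samples

# Example: quarter of a unit circle, chords of length 0.2
quarter = lambda t: (math.cos(t), math.sin(t))
pts = equal_chord_samples(quarter, 0.0, math.pi / 2, 0.2)
```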
I don't know if it is a conceptual problem or a memory problem, but it looks like at a certain point the iteration just stops working and the intersections, no matter what the tolerance, start to fail.…
ld see were the set of basic tutorials. I've run through a few other folk's video tutorials also.
The test case I chose because it is a super simplification of an actual space I'm trying to model (a large school sports complex - see below). I've modelled the test case as a closed volume with a few solid objects inside it; the actual space is much less box-shaped, with a ceiling that is not flat and a significant lattice of acoustic panelling that encloses the roof trusses.
The volume of this space is around 50,000 cubic metres, which, if I followed the guideline of 50-100 rays per cubic metre, would be 2.5-5 million rays. I ran a simulation on the simplified test box with 100k rays, which took about 2 hours running on a MacBook Pro booted into Windows. Perhaps I need to find a much more serious machine to run this on. Would it be a reasonable assumption that as more rays are added, the results converge on a particular solution? If so, if you had to guess, how many rays/m3 would be required to get a solid estimate of reverb time to within +/- 0.1 s?
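Just to make the arithmetic above explicit (my own back-of-envelope numbers, not a Pachyderm recommendation):

```python
volume_m3 = 50_000  # approximate hall volume
for rays_per_m3 in (50, 100):  # the quoted guideline range
    total_rays = volume_m3 * rays_per_m3
    print(rays_per_m3, total_rays)  # 2,500,000 and 5,000,000 rays
```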
I don't mean to imply that Pachyderm isn't up to scratch - simply that I'm trying to find some way of determining whether a given set of simulation parameters is going to give a result reliable enough to make decisions about the surface materials and treatments that will be required. I tried a bunch of different methods and simulation parameters to see if they were even remotely similar, and unsurprisingly, they weren't. I'm not an acoustic engineer; I'm an architect who has studied some acoustics in addition to my regular subjects. I know enough to be dangerous, but I'm trying to convert that into enough to be useful. :) I'm totally open to any advice anyone might offer.
One last thing: could you confirm that the T-30 parameter really is T-30 (and so needs to be doubled to get RT60)?
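For reference, if T-30 here follows the usual convention (decay time measured over a 30 dB range, extrapolated to the full 60 dB decay), the conversion would just be a doubling:

```python
def rt60_from_t30(t30_s):
    # Assumed convention: RT60 extrapolated by doubling the 30 dB decay time
    return 2.0 * t30_s

print(rt60_from_t30(0.85))  # → 1.7
```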
Thanks for responding,
Ben
…
ys to make use of it.
What it does...
This plug-in allows one to "connect" a Rhino document with Grasshopper documents (referred to throughout the plug-in as pairing), so that you can remember which Grasshopper documents are used by, or reference data from, the Rhino document.
How to use it...
Right now, the plug-in is just one command, "PairGHFiles", which has five (5) different options.
PairAllActiveGHDocs - Pairs all of the documents that are currently active in the GH Editor to the current Rhino document.
PairSelectedGHDocs - Shows a dialog that allows you to pick from all the currently active documents in the GH Editor. The selected documents will be paired to the current Rhino document.
OpensAllPairedGHDocs - Opens all the GH documents that are currently paired with the Rhino document.
RemovePairedGHDocs - Shows a list of the currently paired GH documents and allows you to select which ones to remove.
CurrentlyPairedGHDocs - Prints to the command line all of the GH document paths that are currently paired to the Rhino document.
The plug-in automatically saves all the necessary data, so you don't need to remember to save any additional files. Do keep in mind that only GH documents that have been saved and have a valid path can be paired to the Rhino document.
Installation
Place the rhp file in a safe, static location, then drag and drop it onto a running instance of Rhino. Or run the PlugInManager command, click the Install button towards the bottom of the window, and choose the rhp file.
If anyone has any questions, feedback, suggestions, or issues, feel free to post here or email me. Also, for people looking to do the "opposite" of this (pairing a Rhino document to a GH document), check out Visose's post below.
http://news2.mcneel.com/scripts/dnewsweb.exe?cmd=article&group=rhino&item=353734&utag=
This plug-in is provided without any written or expressed guarantee. By downloading and installing the plug-in you release the author of any liability in regards to anything this plug-in may or may not do.
Best Regards,
Damien
Develop | Research | Design
e| damien[AT]liquidtectonics.com
w| liquidtectonics.com…
Added by Damien Alomar at 12:27pm on October 26, 2010
ss 2010.
It is mainly to understand how to create the relationship between Rhino / VB.NET / RhinoCommon, somewhat like how Grasshopper works.
The error which comes up is the following:
Could not load file or assembly 'RhinoCommon, Version=5.0.15005.0, Culture=neutral, PublicKeyToken=552281e97c755530' or one of its dependencies. The system cannot find the file specified.
It seems to be an issue with the RhinoCommon.dll file.
I have loaded this and made sure Copy Local was set to false.
David you mentioned "
To make a .NET plugin for Rhino5 (rhp) you need to reference only RhinoCommon.dll and make sure you don't 'Copy Local'.**"
Now, am I going about this the wrong way? Because the setup I'm doing now builds a Windows application file, not an rhp. I would assume that you would be able to create an application in this manner to run operations in Rhino. Perhaps I am wrong.
I have a gut feeling that the setup to create a plug-in is much more complex than just importing the Rhino, Rhino.Geometry and Rhino.Collections libraries. Would you have to create some type of link to the Rhino active window/application? Any thoughts or insights are greatly appreciated whenever you have some free moments.
Many thanks as always!
…
Added by Madu Mohan at 10:35pm on January 28, 2011
each space needs to be created out of 4 sweeps, otherwise the sweep does not work. I tried Loft, but it only works in the direction of one section curve.
One definition uses 4 planes (floors) as basis since spaces interact with each other vertically.
I started applying Grasshopper definitions to surfaces in order to bake them. I managed to bake the first four components... and then, at the fifth one, all of a sudden the sweeps were not as clean as in the previous ones.
The curves just go crazy somewhere near the intersection point. I don't have a clue what could have gone wrong all of a sudden, because I haven't changed the definition or any of the sliders.
This is very odd, because even when I open a new Rhino file, create new surfaces and try applying the most basic definition for just one suspended space, which I know worked a couple of hours ago, the same thing happens.
Maybe it is some overall setting either in rhino or grasshopper that went crazy... Could anyone please assist me with this?
I am attaching two snapshots (one is the part of the GH definition where I suspect something went wrong, and the other is a Rhino snapshot of one component).
Thank you so much in advance!
Best
Luka…
ond class to my C# project to get a second component, which doesn't work. When I load the .gha file, only my first component appears.
So I have a C# project with two classes, like this:
namespace myUtilities
{
    public class My1stComponent : GH_Component
    {
        public My1stComponent()
            : base("My1stComponentName", "MFC", "do something", "myTab", "my1stToolBox")
        {
        }

        protected override void RegisterInputParams(GH_Component.GH_InputParamManager pManager)
        {
            // myInputs
        }

        protected override void RegisterOutputParams(GH_Component.GH_OutputParamManager pManager)
        {
            // myOutputs
        }

        protected override void SolveInstance(IGH_DataAccess DA)
        {
            // myAlgorithm
        }

        public override Guid ComponentGuid
        {
            get { return new Guid("88e6231b-d998-4de2-85dc-451b0158c599"); }
        }
    }
}
namespace myUtilities
{
    public class My2ndComponent : GH_Component
    {
        public My2ndComponent()
            : base("My2ndComponentName", "MSC", "do something else", "myTab", "my2ndToolBox")
        {
        }

        protected override void RegisterInputParams(GH_Component.GH_InputParamManager pManager)
        {
            // myInputs
        }

        protected override void RegisterOutputParams(GH_Component.GH_OutputParamManager pManager)
        {
            // myOutputs
        }

        protected override void SolveInstance(IGH_DataAccess DA)
        {
            // myAlgorithm
        }

        public override Guid ComponentGuid
        {
            get { return new Guid("c5aaf8ea-3a02-4d6e-86ee-a8e35ba2b96d"); }
        }
    }
}
Can anybody tell me what's wrong with that?
Another problem I have is getting my own icon to show. I added one to my C# project like this:
protected override Bitmap Icon
{
    get { return myUtilities.Properties.Resource_icon.myFirstComponent; }
}
In Rhino 5 my icon appears, but not in Rhino 4 :(.
Thanx for your answers.
…
Added by max wittich at 8:00am on September 17, 2012