Path Shift is like flattening your data partially: chopping an index off the end of the branch paths without obliterating the tree entirely. When working with one "set" of input data, a flatten works to get these lists to match up, but when working with multiple sets, we need to be careful to preserve the original branch indices that keep all four of your original regions separate. As a rule, whenever you're feeding two data trees into a component, they should have the same number of branches (or, in other cases, one should have branches and the other should be a flat list).
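To make that concrete, here is a plain-Python sketch of the idea (not Grasshopper's internals): a branch path is just a tuple of indices.

# A branch path is a tuple of indices; shifting trims the last one,
# flattening collapses everything. Illustration only, not GH's code.
path = (0, 3, 8)
shifted = path[:-1]    # (0, 3): one level shallower, original branches preserved
flattened = (0,)       # a full flatten collapses every path into one branch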
The rule of thumb I tend to teach is this:
In 90% of cases...
For lists, all your inputs should either have 1 item or N items. That is to say, if you're feeding 4 items into one input and 9 items into another, something is probably wrong.
For trees, all your inputs should have either 1 branch or M branches. That is to say, if you're feeding a tree with branches {0;0} to {0;3} into one input, and a tree with branches {0;0;0} to {0;3;8} into the other input, something is probably wrong.
Grasshopper essentially matches up branches first, then lists second. By "matching" I mean it processes them together. Take the simple example of the Line component: it will match the first branch of points in the A input to the first branch of points in the B input, creating lines between those points, then match the second branches, the third branches, etc. Then it applies the same logic at the level of the list: within a pair of matched branches (say {0;2}), it matches all the items in those branches to each other, first item in one branch to the first item in the other branch, etc.
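Here is a rough plain-Python sketch of that two-level matching (branches first, then items), using the usual "reuse the last element" rule. This is an illustration of the logic, not Grasshopper's actual code:

# Sketch of Grasshopper-style matching: branches first, then items.
def match(a, b):
    n = max(len(a), len(b))
    return [(a[min(i, len(a) - 1)], b[min(i, len(b) - 1)]) for i in range(n)]

tree_a = {(0, 0): [1, 2, 3], (0, 1): [4, 5]}
tree_b = {(0, 0): [10],      (0, 1): [20, 30, 40]}

for path_a, path_b in match(sorted(tree_a), sorted(tree_b)):
    print(path_a, path_b, match(tree_a[path_a], tree_b[path_b]))
# (0, 0) (0, 0) [(1, 10), (2, 10), (3, 10)]
# (0, 1) (0, 1) [(4, 20), (5, 30), (5, 40)]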
This is a tricky concept but it seems like you're already well on your way to understanding it from your definition - "PShift" is a critical tool in your path management arsenal. I hope this (overly long) response helps clear things up for you!
…
The TOF and TSRF indices show how "distant" your _PV_SWHsurface is from the optimal surface in terms of tilt and azimuth angles. However, in your case we are not interested in the TOF and TSRF indices. We would just like to know the optimal tilt and azimuth angles, regardless of the supplied _PV_SWHsurface.
So the circular surface supplied to the "TOF" component's _PV_SWHsurface input is irrelevant. It can be of any area, and any tilt/azimuth angle.
The PV_SWHsurfacesArea output of the "PV SWH system size" component depends on a couple of factors:
- moduleActiveAreaPercent_ (leave it at 90%),
- moduleEfficiency_,
- systemSize_.
Calculation of systemSize_ depends on your electricity demand, the cost of the PV system, the type of the object, the country, local regulations, etc. This is something that an engineer needs to determine. For example, in the USA, for a residential house in the Sunbelt, depending on finances, a household would try to cover 100% of its annual electricity needs with its PV system. That means the systemSize_ you choose needs to cover the annual electricity consumption. You can perform an EnergyPlus simulation or use any other way to get the annual electricity consumption.
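As a back-of-envelope illustration only (every number below is made up; the real sizing should account for the factors above):

# Hypothetical sizing sketch; all numbers are made-up examples.
annual_consumption = 12000.0  # kWh/year the household wants to cover
specific_yield = 1600.0       # kWh per kW per year; depends heavily on the site
system_size = annual_consumption / specific_yield
print(system_size)            # -> 7.5 kW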
Ladybug "Photovoltaics Performance" component can calculate the optimal systemSize_ by given the annual electricity consumption.However the component is made to address fixed tilt and azimuth PV systems only.An approximate way to overcome this is to calculate the optimal systemSize_ for fixed tilt and azimuth PV system, and then multiply it with the "difference in %s" panel at the very right of the fixed_vs_tracker_PV2.gh file. Again, this is not what Ladybug "Photovoltaics Performance" component is made to do, but it will probably get you in a ball park.
The inputted 32 degrees for the north_ direction is actually 328 degrees (360 - 32 = 328). This is because Ladybug Photovoltaics is based on the NREL model, which uses a clockwise angle convention. This convention is also the one most commonly used in solar radiation analysis.
Dubai weather data files are uploaded here.
…
GH, same as using the Sweep2 command in Rhino.
The one on the right is what I got so far (the output smooths out the kink of the original rails). Basically I am just following the method provided by the SDK sample: http://wiki.mcneel.com/developer/sdksamples/sweep2 .
The following is the function I copied and use directly from the SDK sample. Using this function, I can generate the sweep surface on the right, but what I want is the one in the middle, with the kink edges. Can anyone show me how and where to modify the settings? I guess some sweep arguments need to be changed? I have tried a couple, such as m_simplify, m_bSimpleSweep, m_bSameHeight, m_rebuild_count... but still cannot find the right combination for this function to output the sweep surface I want. Any suggestions or help are very much appreciated. Thanks for your help and time on this.
'Sweep2 function'----------------
Sub Sweep2(ByVal Rail1 As IOnCurve, _
           ByVal Rail2 As IOnCurve, _
           ByVal sCurves As List(Of IOnCurve), _
           ByRef Sweep2_Breps As List(Of OnBrep))

    'Define a new class that contains the sweep2 arguments
    Dim args As New MArgsRhinoSweep2

    'Set the 2 rails
    Dim Edge1 As New MRhinoPolyEdge
    Dim Edge2 As New MRhinoPolyEdge
    Edge1.Append(Rail1.DuplicateCurve())
    Edge2.Append(Rail2.DuplicateCurve())

    'Add the rails to the sweep arguments
    args.m_rail_curves(0) = Edge1
    args.m_rail_curves(1) = Edge2
    args.m_bClosed = False

    'Loop through the sections and record the rail parameter where each
    'section meets each rail (try the section's start point, then its end
    'point, then fall back to its midpoint)
    Dim section_curves As New List(Of OnCurve)
    For Each Section As IOnCurve In sCurves
        Dim sCurve As OnCurve = Section.DuplicateCurve()
        section_curves.Add(sCurve)

        'Rail 1
        Dim t0 As Double = 0
        If Not Edge1.GetClosestPoint(sCurve.PointAtStart(), t0) Then
            If Not Edge1.GetClosestPoint(sCurve.PointAtEnd(), t0) Then
                Dim s As Double = 0
                sCurve.GetNormalizedArcLengthPoint(0.5, s)
                Edge1.GetClosestPoint(sCurve.PointAt(s), t0)
            End If
        End If
        args.m_rail_params(0).Append(t0)

        'Rail 2
        Dim t1 As Double = 0
        If Not Edge2.GetClosestPoint(sCurve.PointAtStart(), t1) Then
            If Not Edge2.GetClosestPoint(sCurve.PointAtEnd(), t1) Then
                Dim s As Double = 0
                sCurve.GetNormalizedArcLengthPoint(0.5, s)
                Edge2.GetClosestPoint(sCurve.PointAt(s), t1)
            End If
        End If
        args.m_rail_params(1).Append(t1)
    Next

    'Set the shapes
    args.m_shape_curves = section_curves.ToArray()

    'Set the rest of the parameters
    args.m_simplify = 0
    args.m_bSimpleSweep = False
    args.m_bSameHeight = False
    args.m_rebuild_count = -1 'Sample point count for rebuilding shapes
    args.m_refit_tolerance = RMA.Rhino.RhUtil.RhinoApp.ActiveDoc.AbsoluteTolerance()
    args.m_sweep_tolerance = RMA.Rhino.RhUtil.RhinoApp.ActiveDoc.AbsoluteTolerance()
    args.m_angle_tolerance = RMA.Rhino.RhUtil.RhinoApp.ActiveDoc.AngleToleranceRadians()

    'Perform the sweep and collect the resulting breps
    Dim sBreps() As OnBrep = Nothing
    If (RhUtil.RhinoSweep2(args, sBreps)) Then
        For Each b As OnBrep In sBreps
            Sweep2_Breps.Add(b)
        Next
    End If

    Return
End Sub
…
The script generates levels of detail by subdividing a 6-sided cube mesh and projecting its vertices according to a referenced height map. This is one of the standard conventions for building full-size planets. At the lowest level (0) the mesh planet is made of 6 pieces (each 32x32 resolution). The next level down (1) is made of 24 pieces: 6 pieces × 4 = 24. Level (2) is 96 quads, etc. The script will generate each quad at its subdivision level and compare edge vertices to neighboring quads. It will then make sure any shared vertices are in fact at the same projected vector. This ensures that each planet quad's edge vertices match its neighbors'.
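For reference, the piece counts follow 6 * 4**level; a one-line Python check matching the 6/24/96 numbers above:

# Quad count per LOD level of a subdivided cube-sphere
for level in range(4):
    print(level, 6 * 4 ** level)   # 0:6, 1:24, 2:96, 3:384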
The problem comes in texturing each quad.
If I build the quad as a NURBS surface from points, I can place the texture easily, because each surface UV maps squarely to my texture map (which is also square).
If I build the quad as a mesh, I cannot just apply the square texture to the mesh UVs. This is because when you unwrap the UVs from a mesh, they will not unwrap like a NURBS surface's UVs. Therefore, to get the correct mapping, I would have to manipulate each UV back to an evenly aligned array (which is 1024 points in a 32x32 resolution UV). Maya and Blender have 'relax UV' and 'align UV' functions, but they don't do the trick, and manual corrections are out of the question. So why not skip the mesh method and use the NURBS method?
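For what it's worth, when the quad really is a regular 32x32 vertex grid, the UVs don't have to come from unwrapping at all; they can be written out directly as the evenly aligned array mentioned above and assigned to the mesh's texture coordinates in vertex order. A plain-Python sketch of the idea:

# Explicit, evenly spaced UVs for a regular 32x32 vertex grid, so a square
# texture maps exactly like it does on the NURBS version of the quad.
res = 32
uvs = [(i / (res - 1.0), j / (res - 1.0))
       for j in range(res) for i in range(res)]
# len(uvs) == 1024; uvs[0] == (0.0, 0.0), uvs[-1] == (1.0, 1.0)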
I did this and there is a trade-off. The NURBS will accept the material texture I want with no extra work on my end, but when I export the object as an .obj, Rhino creates its own mesh to describe the NURBS (with various unsatisfactory setting options). This works great up to a point, because at some level the interpreted mesh will have vertices that do not match at the edges, i.e. creating visible seams in the mesh. The picture below is the nearly seamless planet at LOD(1), made of 24 quads, each with 32x32 vertex resolution and a 512x512 JPG texture, running in Unity3d 5. It works, but up close there are seams. This will be resolved simply by having the next LOD(x) instantiate before the camera gets close enough to see the seam, but at core nerd level I want the seamless mesh.
So, I can make the seamless mesh, but I cannot realistically texture map it. I can also make the NURBS surface from points and texture it, at the expense of the edge vertices matching. I am at a fork in the road, but I want to have my cake and eat it too. Thoughts, comments, trolls...?
Thanks for reading =)
Footnote: for you pros, I am not using seamless noise across the map; I am using Grasshopper to sew up my otherwise imperfect edges.
Other programs in the pipeline:
-WorldMachine 2
-Wilbur
-Photoshop
-Unity3d…
ive collaborative environment.
TYPE : Course module and Workshop
The event is open to anybody interested, from all fields of design, including: architecture, interior design, furniture design, product design, fashion design, scenography, and engineering.
1. COURSE MODULE (20-23 April 2014) - optional
+ type: 3-day intensive course covering basic knowledge of parametric design (LEVEL 1)
+ software: Rhinoceros & Grasshopper
+ plugins: Kangaroo, Weaverbird, LunchBox, gHowl, Geco
+ achievements:
- getting acquainted with the components & the concept of Generative Design
- understanding the strategies of Algorithmic Design
- how to easily insert simple mathematical equations into the project to gain more control
- how to utilize the proper plugins with respect to the nature of the project
- interacting with different analysis platforms such as Ecotect & remote controllers
- solving several exercises at different scales (2D-3D) during each phase of the workshop
2. WORKSHOP (23-27 April 2014)
A 5-day Design-Based Research Workshop exploring new techniques in Digital Architecture/Fabrication, with a specific focus on the use of generative systems and parametric modeling as tools for creative expression.
Our ultimate goal is to increase the efficiency of utilizing digital tools in parallel with the geometric performance of the primitive design agent.
+ + CONCEPT
Fashion and Architecture are both based on basic life necessities – clothing and shelter.
However, they are also forms of self-expression – for both creators and consumers.
Both fashion and architecture affect our emotional being in many ways.
The agenda of this workshop is to investigate the overlap between these areas of design, art & fashion.
Fashion and architecture express ideas of personal, social and cultural identity, reflecting the concerns of the user and the ambition of the age. Their relationship is a symbiotic one and throughout history, clothing and buildings have echoed each other in form and appearance. This only seems natural as they not only share the primary function of providing shelter and protection for the body, but also because they both create space and volume out of flat, two-dimensional materials.
While they have much in common, they are also intrinsically different: both address the human scale, but the proportions, sizes and shapes differ enormously.
+ + + OBJECTIVES
So far, architects have been using techniques such as folding, bending, etc. to create spaces, structural roofs, or other structural shapes.
The agenda of this workshop goes further, investigating algorithmic thinking through generative tools integrated in design.
The challenge is creating a bridge that connects these two areas of design, architecture and fashion, which perform at two opposite scales.
+ + + + TECHNICAL BRIEF
In the early stages physical models and low-tech strategies will be used, allowing the participants to gain a greater understanding of materials, fabrication and assembly methods as well as simple, yet pragmatic structural solutions.
Later in the workshop these strategies will be digitized and elaborated using software tools such as Rhinoceros and the algorithmic plug-in Grasshopper.…
, an Engineer and Researcher from France with broad programming experience. He is the author of the City in 3D Rhinoceros plugin, which creates buildings from a geojson file, with real elevations. Guillaume has already created a new component, "Address to Location", which gets latitude and longitude values for a given address:
2) Support of Bathymetry data: automatic creation of underwater (sea/river/lake floor) terrain. This feature is now available through the new source_ input of the "Terrain Generator" component. Here is an example of the terrain of the Loihi underwater volcano, off the coast of Hawaii:
3) A new terrain source has been added: ALOS World 3D 30m. ALOS World 3D is a Japanese global terrain dataset. The Gismo "Terrain Generator" component has been using SRTM 30m terrain data, which is not global: it is limited to the -56 to +60 latitude range. With this addition, it is possible to switch between the SRTM and ALOS World 3D 30m models by using the source_ input.
4) 9 new components have been added:
- "Address To Location" - finds latitude and longitude coordinates for a given address.
- "XY To Location" - finds latitude and longitude coordinates for given Rhino XY coordinates.
- "Location To XY" - the reverse of the previous component: finds Rhino XY coordinates for given latitude and longitude coordinates.
- "Z To Elevation" - finds the elevation of a particular Rhino point.
- "Rhino text to number" - converts numeric text from Rhino to a Grasshopper number.
- "Rhino unit to meters" - converts Rhino units to meters.
- "Deconstruct location" - deconstructs an .epw location.
- "New Component Example" - explains how to make a new Gismo component, in case you are interested in making one. We welcome new developers, even if you contribute only a single component to Gismo!
- "Support Gismo" - gives suggestions on how to make Gismo better, how to improve it and support it.
5) Ladybug "Terrain Generator" component now supports all units, not only Meters. So any Gismo example file which uses this component, can now use Rhino units other than Meters as well. Thank you Antonello Di Nunzio for making this happen!!
Basically just forget about this yellow panel:
This panel is no longer valid, so just use any unit you want.
6) A number of bugs reported in topics over the last couple of weeks have been fixed. We would like to thank the members of the community who invested their time in testing, finding these bugs and reporting them: Rafat Ahmed, Peter Zatko, Mathieu Venot, Abraham Yezioro, Rafael Alonso. Thank you guys!!! Apologies if we forgot to mention someone.
Version 0.0.2 can be downloaded from here:
https://github.com/stgeorges/gismo/zipball/master
And example files from here:
https://github.com/stgeorges/gismo/tree/master/examples
Any new suggestions, testing and bug reports are welcome!!…
(1) I have been exporting small sections of a larger model from Rhino into Maya as FBX. In Maya I rotate and scale the models (-90 in X, scale XYZ 0.001). The Named Views are being saved, but they do not import successfully into the Maya model. They do not appear as they do in Rhino, and the problem is not solved by scaling or rotating the cameras.
(2) If I try going the other direction, the cameras exported from Maya as FBX also do not align with the model in Rhino the way they do in Maya. I will do my best to post some images of the problem and hope you can help.
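One thing that might be worth checking is whether the camera location and target receive the exact same transform as the geometry, rather than being scaled or rotated separately. A small numpy sketch, assuming the transform described above (rotate -90 about X, then scale by 0.001; the camera coordinates are made up):

import numpy as np

# Assumed model transform: rotate -90 degrees about X, then scale XYZ by 0.001.
def rx(deg):
    a = np.radians(deg)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, np.cos(a), -np.sin(a)],
                     [0.0, np.sin(a),  np.cos(a)]])

M = 0.001 * rx(-90.0)

cam_pos = np.array([10.0, 5.0, 2.0])  # made-up Rhino camera location
cam_tgt = np.array([0.0, 0.0, 0.0])   # made-up Rhino camera target
print(M @ cam_pos, M @ cam_tgt)       # where the camera should land in Maya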
error !!
This is what the named views look like
Here I am trying to go the other way, with a good view from Maya:
strange placement..
This is the best result I can achieve, after I scale the camera by 1000
Any Advice???
Thanks, Robert.
…
ysim.ning.com/
When you run the simulation you will notice in the batch terminal that Daysim is also being called, so you may want to consider how Daysim uses Radiance files & data.
Regarding your current problem, I think you stumbled onto something weird and interesting.
Interior and exterior readings appear to differ by 40 in the best-case scenarios. Even setting the transmittance to 1 yields similar results. I tried changing from a cumulative sky to a climate-based sky and got similar values. Changing the test points did nothing either.
I think (though I'm too lazy to prove this) that the difference in values stems from diffuse radiation over the sky dome.
If you delete everything except the glass, you'll notice that the interior values are around 80-90% of the exterior values (this seems like the expected behaviour with a transmittance of 1). So, if we consider that a vertical window, part of an opaque box, receives radiation from 25% of a sphere, then as you inset the interior test points, the radiation they receive will be a fraction of that 25%.
Let me try to explain this better... The exterior surface receives radiation from a section of a sphere spanning 180 degrees in the xy plane (let's call this angle theta) and 90 degrees in azimuthal elevation (let's call this angle phi). If you integrate this over spherical coordinates (theta from 0 to pi; phi from 0 to pi/2), you will find that it comes to a quarter of a sphere. By comparison, the interior surface will not integrate theta from 0 to 180 degrees, nor phi from 0 to 90 degrees; instead, it sees the angle subtended by the exterior surface as a function of their separation: the farther in you go, the smaller the view of the outside.
If my hypothesis is correct, there shouldn't be that much difference, since the separation is only 10 cm... the subtended angle would be something like 170 instead of 180 degrees for theta, and 85 instead of 90 for phi. Overall, if you integrate both spherical areas, there should only be a difference on the order of 10%.
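A quick numeric check of that estimate in plain Python (the solid angle of a sky patch with azimuth span theta and elevations 0 to phi works out to theta * sin(phi)):

import math

# Solid angle of a sky patch spanning "theta" degrees of azimuth and
# elevations from 0 up to "phi" degrees: theta_rad * sin(phi).
def patch(theta_deg, phi_deg):
    return math.radians(theta_deg) * math.sin(math.radians(phi_deg))

exterior = patch(180.0, 90.0)     # quarter sphere: pi steradians
interior = patch(170.0, 85.0)     # the guessed subtended angles above
print(1.0 - interior / exterior)  # ~0.06, within the ~10% ballpark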
In conclusion, I believe the unexpected behaviour stems from the subtended-angle effect described above. If direct radiation were the only factor, the difference would be the aforementioned ~10%, which suggests that an additional source of energy is also affected by this: perhaps indirect and diffuse radiation from other areas of the sky dome.
I'm definitely intrigued as to why this is happening. Please post if you figure it out.
Regards,
Mauricio
…
TB of RAM. I think I'm going to start a GoFundMe campaign to buy one for myself :)
2- The server costs about $13 an hour. I get free access to a supercomputer through my university and xsede.org, because I earned an NSF Honorable Mention last March; however, the supercomputers available through both resources are a little complicated for me to use, as opposed to the one available from Amazon, which has Windows Server 2012 already installed.
3- I wanted to run 400 annual glare simulations for 400 different views.
4- I tried to perform an annual glare simulation for one view on my Dell XPS, which has an Intel Core i7-6700HQ processor and 16 GB of memory. The simulation took 2 hours to complete. The Radiance parameter ab was set to 6.
5- I wanted to obtain the batch file for each view so I could run them on the server. So I used the fly component to run all 400 simulations and closed the cmd windows. That wasn't bad (for me at least) because I asked my son to do this job for me; he was just glad to help me :)
6- I created one batch file using this cmd command:
dir /s /b *.bat > runall.bat
This created a file with the path to each .bat file. I edited this file in Notepad++ to add the word "start" at the beginning of each line, using the "find and replace" dialogue box.
7- I split my newly created batch file into 3 batch files, each with about 130 file names and "start" before each file name (see the sketch after this list).
8- Installed Radiance on my server.
9- Ran the first batch file on the server. This started 130 cmd windows performing my simulations; CPU usage was anywhere between 90% and 100%, and about 105 GB of RAM was used.
10- It took about 5 hours to complete all 130 simulations. I expected them all to run in 2 hours, but I can't complain, because this would've taken about 260 hours on my laptop. After those simulations were done, I ran the second and then the third batch file (about 15 hours in total).
11- I got 400 valid .dgp files. Couldn't be happier!
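For anyone repeating steps 6-7, here is a hedged Python sketch that does the same gathering and splitting (the runner file names and the chunk count are my own choices, not from the original workflow):

import os

# Sketch of steps 6-7: gather every .bat under the current folder, prefix
# each line with "start" so they launch in parallel, split into 3 runners.
bats = []
for root, dirs, files in os.walk("."):
    bats += [os.path.join(root, f)
             for f in files
             if f.endswith(".bat") and not f.startswith("runall")]

chunks = 3
per = -(-len(bats) // chunks)  # ceiling division
for i in range(chunks):
    with open("runall_%d.bat" % (i + 1), "w") as out:
        for path in bats[i * per:(i + 1) * per]:
            out.write('start "" "%s"\n' % path)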
…
the time to work with it.
The project is about facade strips which twist along their height. The top angle is parallel to the facade and the bottom is twisted by max. 90 degrees, but the strips should twist differently to achieve a more dynamic look.
First I tried to achieve this by calculating the rotation angle from the distance between the points of the grid and a single point.
Then I tried to add some more affecting points, and used the distance to the divided surface (the circles are just to control the area of effect):
I lofted it manually.
The result is a bit annoying, because the points that affect the angle are always visible:
I tried to solve this by drawing a line and dividing it to receive points along the bottom of the geometry. The result is not working properly:
Anyway,
There must be a better/smoother way to achieve this. I would like to drive the twist of the surfaces by the distance to a spline, but I'm just lost. Can you help me please?
The problems I'm encountering (see the sketch after this list):
0- distance from spline to grid to drive the angle
1- list of x/y coordinates and the angle of rotation for each point of the grid
2- export points to Excel
3- lofting lines in one direction only (x1, x2, x3...)
4- reducing the list data to 2 decimals (0.00)
5- maybe converting the angle from radians to degrees
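On points 0, 4 and 5, a GhPython-style sketch of driving the rotation angle by distance to a spline might look like this (pts, crv, max_angle and falloff are assumed component inputs, not names from the original file):

# Assumed GhPython inputs: pts (grid points, list of Point3d),
# crv (the guide spline, a Curve), max_angle (degrees at the spline),
# falloff (distance at which the twist dies out).
angles = []
for pt in pts:
    rc, t = crv.ClosestPoint(pt)            # parameter of the closest point
    d = pt.DistanceTo(crv.PointAt(t))       # distance from grid point to spline
    f = max(0.0, 1.0 - d / falloff)         # linear falloff, clamped at 0
    angles.append(round(max_angle * f, 2))  # degrees, rounded to 2 decimals
a = angles                                  # component output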
thx…