what they really mean by that, as in what buttons to push, so I assume it's a Windows Path entry?
2.) Modify PATH
Add the install location on the path, this is usually: C:\Program File\IronPython 2.7
But on 64-bit Windows systems it is: C:\Program File (x86)\IronPython 2.7
As a check, open a Windows command prompt and go to a directory (which is not the above) and type:
> ipy -V
PythonContext 2.7.0.40 on .NET 4.0.30319.225
Tutorial on setting a Windows environmental variable (path):
http://www.computerhope.com/issues/ch000549.htm
But this fails to point out that PATH already contains many entries separated by semicolons, so if I merely added a new variable called "path" I would likely destroy existing program function. There's no info on how to just tack on another entry, and the Windows 7 edit box doesn't even show the whole collection, only one item (!), so I copied the existing PATH into a text editor to see the whole collection, and added the C:\Program Files (x86)\IronPython 2.7 entry after a semicolon, correcting for an Enthought page typo of no 's' on the end of "Program Files". I also checked the other entries, and many pointed to old, missing directories, so I deleted those.
...and the test fails: "ipy" is not recognized as a command, even though the new entry now shows up when I type "path" in the Windows CMD window (again only fully viewable by right-clicking to copy everything and pasting it into a text editor). I can run it from its own directory just fine.
The rabbit hole was indeed deep. Killing Explorer with the Task Manager (Ctrl-Alt-Delete) and restarting "Explorer" via Run in the menu, along with restarting the Windows CMD window, worked. I can now invoke IronPython ("ipy") from the command line in any directory. For the PATH, I edited the entry in the System Variables, not the User Variables. And no, you don't have to type that whole crazy line above just to test the path variable; just type "ipy" (and Ctrl-Z to quit IronPython) in a CMD window opened by typing "cmd" into the Start menu search box.
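A quick way to sanity-check the result without squinting at that one-line edit box: a minimal Python sketch (run it with "ipy" itself, or any Python) that prints each PATH entry on its own line, so stale or broken entries are easy to spot.

# Print each PATH entry on its own line (works in IronPython 2.7 too).
import os

for entry in os.environ.get("PATH", "").split(os.pathsep):
    print(entry)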
From the CMD line this step did work fine:
3.) ironpkg
Bootstrap ironpkg, which is a package install manager for binary (egg-based) Python packages. Download ironpkg-1.0.0.py and type:
> ipy ironpkg-1.0.0.py --install
Now the ironpkg command should be available:
> ironpkg -h
(some useful help text is displayed here)
But of course Step 4 fails, giving pages of what seem to be error messages:
C:\Users\Nik>ironpkg scipy
Traceback (most recent call last):
  File "C:\Program Files (x86)\IronPython 2.7\lib\site-packages\enstaller\utils.py", line 92, in write_data_from_url
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 126, in urlopen
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 397, in open
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 509, in http_response
...
Why can't I just download NumPy as a normal file, which would also make it easy for other users to install when they use my scripts? This is just crazy and lazy. The Enthought developer has turned this into a computer game, with a missing registration link, and then the last step spits out errors with utterly no information on how to fix things manually.
This Step 4 error is covered here:
http://discourse.mcneel.com/t/trying-to-import-numpy-in-rhino-python-but-im-getting-this-error-cannot-import-multiarray-from-numpy-core/12912/16…
Added by Nik Willmore at 2:36pm on October 11, 2015
You will see all of the available components on a ribbon at once, so there is no need to keep clicking drop-down menus.
It's all about discoverability with GH. What if you're a beginner and don't know about the Create Facility (double-click the canvas): how can you find Extr?
Even if you hover over every component or use the drop-down lists, you will not see the name Extr appear anywhere.
Sure, it makes sense that Extr is short for Extrude, but it's also the nickname of the Extrude to Point component,
so you can easily miss the fact that one has a Distance input versus a Point input.
I think I made the move to Icons around about the move from version 0.5 to 0.6, possibly before. I initially thought that I would go back to text because I loved the monochromatic look of the text, but I soon realised that Icons were the way forward. The greatest benefit is speed: you don't need to digest and decipher every component name (which is written at 90 degrees to the norm).
I'm not saying you should move to Icons forthwith but at least consider that once you have a better knowledge and understanding of GH, Icons will set you free.
Here are my top ten tips that I would highly recommend to anyone wanting to better themselves with GH:
1) Turn on Draw Icons
2) Turn on Draw Fancy Wires
3) Turn on Obscure Components
4) Use the Create Facility like a command line, e.g. "Slider=-1<0.75<2" or "Shiftlist=-1"
5) Use Component Aliases to customise your use of the Create Facility, e.g. giving the Point XYZ component an alias of XYZ will bring it up as the first option in the Create Facility, ahead of the other possibilities.
6) Try to answer other people's questions even if they're not relevant to your own area. Working on a problem outside your comfort zone and then posting your results is very rewarding, and it also lets you see the other approaches that get posted in a new light.
7) Take the time to understand Data/Path structures.
8) Buy a second monitor - nothing compares to screen real estate when working in Grasshopper.
9) Read Rajaa Issa's Essential Mathematics
10) Pick a panel in a tab on the ribbon and get to know every component inside and out and then move on. Start with the Sets Tab > List Panel…
ne diverse digital design methodologies and the use of different tools such as Autodesk Maya, Rhinoceros and Grasshopper.
Building up technical skills will provide the attendees with a solid platform from which to start rethinking and exploring innovative architectural ideas in collaboration with the team and the tutors.
URBAN FIELDS
Phase I
In the first part of the workshop, attendees will be looking at field conditions and how to generate and design fields that can help structure a possible urban condition in Florence.
We will be exploring dynamic systems, geometric systems and network theories to generate and design an abstract field condition that extends the urban experience of the city onto the vertical dimensions of towers. Simple operations that span variations from an initial state will give rise to a high level of complexity.
The goal of this exercise is to create a rich, diversified and intelligible urban space that can later be subjected to local interventions, zooming in to locally enhance each design.
AGENT-BODIES POLYMORPHISM
Phase II
The second part of the workshop will build upon the first phase; participants will select one archetype (a high-rise tower) as a study model for further development.
Besides engaging with multi-agent algorithmic design strategies, attendees will address the strategic utilisation of structurally and environmentally generated morphologies to design coherent and highly differentiated tower exoskeletons.
Tutors will introduce agent-bodies polymorphism in order to explore the generation of structurally aware and capable geometries through agent-based formation of non-linear hierarchies and emergent patterns. These agent-bodies will operate in a complex spatial manner to form structure, partitions or enclosure, and will operate across scales, creating a poly-scalar level of detail.
Attendees will speculate on how autonomous systems can create new structures and intelligent distributions of structural elements, and on new collaborative strategies of construction and the performativity they will evoke (performance, effects, responsiveness, interaction).
Fees
Early registration (before 1st June)
Students 390€ - Professionals 440€
Late registration (after 1st June)
Students 490€ - Professionals 540€
More info and Applications
https://www.ax-om.com/edu/polymorphism/
…
Vertical Sky Component (VSC), and now Sky Exposure Factor (SEF). For everyone else following this post, this discussion has been ongoing in these other threads:
http://www.grasshopper3d.com/forum/topics/sky-view-factor-vs-vertical-sky-component?groupUrl=ladybug&xg_source=msg_com_gr_forum&groupId=2985220%3AGroup%3A658987&id=2985220%3ATopic%3A1377260&page=1#comments
https://github.com/mostaphaRoudsari/ladybug/issues/230
Grasshope, you have gone right to Oke, the grandfather of urban climatology, whose papers I have read several times, and yet somehow I always missed the finer details of the sky view calculation. From his definition, I had always thought of Sky View Factor as a purely solid-angle or "view factor" calculation in the sense of Mean Radiant Temperature. However, the numbers and formulas that you give here clearly show that Oke meant this metric for quantifying and understanding urban heat island to refer back to the urban surfaces and their orientation in relation to the sky. It cannot simply be the view from points in space.
To clarify the distinction in simple geometric terms: the key difference is that Sky Exposure refers to the sky seen by a point in space, while Sky View refers to the sky seen by a surface. Both involve either projected-ray or solid-angle calculations to the sky (since both are "view" calculations). However, while Sky Exposure treats each patch of the sky with relatively equal weight, Sky View weights these patches by their area after being projected into the plane of the surface being evaluated. In other words, the sky view calculation for a horizontal surface would give more importance to the sky patches directly overhead than to those near the horizon, because the overhead patches are "in front" of the surface (as opposed to off to the side).
To express this difference in the trigonometric terms you cite here:
Wall View = 0.5 (sin²θ + cos θ − 1) / cos θ
Wall Exposure = θ / π
In both cases:
θ = tan⁻¹(H / (0.5 W))  ← this is the solid-angle or ray-tracing part
Sky View or Exposure = 1 − 2 × (Wall View or Exposure)
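As a quick sanity check of these formulas (my own arithmetic, not from the original thread): for a canyon whose height equals half its width, θ is 45°, so Sky Exposure should come out to 0.5, and Sky View reduces algebraically to cos θ, the classic result for the canyon floor.

# Numeric check of the canyon formulas above (hypothetical dimensions).
import math

H, W = 10.0, 20.0                       # canyon height and width
theta = math.atan(H / (0.5 * W))        # = 45 degrees for these numbers

wall_view = 0.5 * (math.sin(theta) ** 2 + math.cos(theta) - 1) / math.cos(theta)
wall_exposure = theta / math.pi

print(1 - 2 * wall_view)       # 0.7071... = cos(theta)
print(1 - 2 * wall_exposure)   # 0.5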
To put this in simpler terms for the View Analysis component: all I actually have to do to convert sky exposure to sky view is multiply each traced view ray by 2 cos(ϕ), where ϕ is the angle between the surface normal and the given view ray being traced.
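For illustration, here is a minimal sketch of that weighting (the names are hypothetical, not the actual Ladybug source): every unobstructed ray counts as 1 toward Sky Exposure, and as 2 cos(ϕ) toward Sky View.

# Sketch only: rays and normal are unit 3-vectors (tuples); sees_sky is a
# bool per ray saying whether the traced ray reached the sky unobstructed.
def sky_exposure_and_view(rays, normal, sees_sky):
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    exp_hits = view_hits = view_total = 0.0
    for ray, visible in zip(rays, sees_sky):
        w = 2.0 * max(0.0, dot(ray, normal))   # the 2*cos(phi) weight
        view_total += w
        if visible:
            exp_hits += 1.0
            view_hits += w
    return exp_hits / len(rays), view_hits / view_total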
I have done this by adding this line of code, and I have verified that I get the values from Oke's paper that you cite above, Grasshope. Accordingly, the View Analysis component now has the option to compute either Sky Exposure or Sky View. You can see this happening in this new example file:
http://hydrashare.github.io/hydra/viewer?owner=chriswmackey&fork=hydra_2&id=Sky_Exposure,_Sky_View,_and_Sky_Component&slide=0&scale=1&offset=0,0
To (once and for all!) clearly define the difference between the three metrics at the top of my reply and to explain how to calculate each with Ladybug Honeybee:
Sky Exposure Factor - The percentage of the overlying hemispherical sky that is directly visible from a given POINT or set of POINTS. This is equivalent to a geometric solid angle calculation or ray-tracing calculation from points. It is useful for evaluating one's general visual connection to the sky at a given point and should be applied to cases where direct views to the sky are the parameter in question.
Sky exposure is calculated with the Ladybug_View Analysis component like so:
Sky View Factor – The percentage of the overlying hemispherical sky that is directly visible from a given SURFACE or set of SURFACES. While Sky Exposure treats each patch of the sky with relatively equal weight, Sky View weights these patches by their area projected into the plane of the surface being evaluated. In other words, Sky View for a horizontal surface would give more importance to the sky patches that are overhead and less to those near the horizon. Sky View is an important factor in modelling urban heat island, since the inability of warm urban surfaces to radiate heat to a cool night sky is one of the largest contributors to the heat island effect.
Sky View is calculated with either the Ladybug_View Analysis component, like so:
Or with the Honeybee_Vertical Sky Component Recipe like so:
Sky Component - The portion of the daylight factor (at a surface indoors) contributed by luminance from the sky, excluding direct sunlight. This is essentially the same as Sky View Factor, but it often incorporates a sky condition that is not uniform, such as a cloudy sky or a sky that is more indicative of diffuse skylight. Another way of conceiving of this metric is as a Daylight Factor calculation without any light bounces. It is useful for understanding the direct daylight contribution of diffuse skylight and, although many consider it an older (and perhaps outdated) daylight metric, it is still required by some codes and standards.
Sky Component can be calculated with the Honeybee_Vertical Sky Component Recipe like so:
In addition to the added capability in the view analysis component, I have revised the component description to include the definitions above. I have also corrected the Hydra example file in which I cite sky view as an urban heat island metric to use the new formula:
http://hydrashare.github.io/hydra/viewer?owner=chriswmackey&fork=hydra_2&id=Sky_View_in_an_Urban_Canyon&slide=1&scale=1&offset=0,0
Finally, all of this discussion has made me realize that the Vertical Sky Component recipe for Honeybee might not always be evaluating VERTICAL sky. The sky component might be vertical, horizontal, or in any direction, depending on how the input test surface is placed and how the pts vectors are oriented. Accordingly, Mostapha, I think we should change the name of the component to simply "Sky Component" instead of "Vertical Sky Component". Please let me know if you agree.
Thanks again, Grasshope, for all of the great work! All of this never would have made sense without your research.
-Chris…
y in English.

Presenters

Robert (Bob) McNeel (McNeel & Associates founder)
Robert (Bob) McNeel is the founder and president of Robert McNeel & Associates (RMA). Founded in 1978, RMA originally focused on developing accounting software for accounting, architecture, engineering, and other personal-services firms. Within a few years, RMA expanded its services to include selling and supporting microprocessor-based engineering and design software, including AutoCAD. By 1985, the main focus of the business had shifted to AutoCAD sales, service, training, and software development. Bob McNeel grew up in the mountains of southern Washington State on a subsistence dairy farm. To pay for college, he worked in construction as a carpenter, welder, and cement finisher. Bob has a BA in Accounting from Washington State University. Prior to founding McNeel & Associates, he was a practicing Certified Public Accountant and the comptroller for a large construction company in Spokane.

Andrés González (RhinoFabLab director)
Andrés has been a software trainer and developer since the 1980s. He has developed applications for diverse design markets, as well as training materials for different CAD and design software, including the training-materials community www.Rhino3D.TV. Andrés has been working with the Rhino team since the very early stages. He is now the head of the McNeel Southeast US & Latin American Division. He is the worldwide director of the digital fabrication community RhinoFabLab (www.RhinoFabLab.com), as well as the Generative Jewelry & Fashion Design community GJD3D (www.GJD3d.com) and the Generative Furniture Design community GFD3D (www.GFD3d.com).

1981-1985: University of North Carolina at Charlotte, NC, USA - B.S., Bachelor of Science in Engineering
…
Added by Yusuke Oono at 9:28pm on October 16, 2013
components (radiation, sunlight-hours and view analysis), which let you study the effect of your building's orientation on the analysis result. When you come to a question like "what is the orientation in which the building receives the most/least radiation?", that is probably the right time to use this component.
HOW?
I'll try to explain the steps using a simple example. Here are my design geometries. The building in the center is the building to be designed, and the rest of the buildings are context. I want to see the effect of orientation on the amount of radiation on the test building's surfaces from the start of October to the end of February, for Chicago.
First I need to set up the normal radiation analysis and run it for the building as it is right now. [I'm not going to explain how to set this up, since you can find it in the sample file (download the sample file from here).]
Now I need to set up the parameters for orientation study using orientationStudyPar component. You can find it under the Extra tab:
At minimum I need to input the divisionAngle and the totalAngle, and set runTheStudy to True. In this case I put 45 for the divisionAngle and 180 for the totalAngle, which means I want the study to be run for the angles 0, 45, 90, 135 and 180.
[Note 1: The totalAngle should be divisible by the divisionAngle.]
[Note 2: If you don't provide any point for the basePoint, the component will use the center of the geometry as the center of the rotation.]
[Note 3: You can also rotate the context with the geometry! Normally you don't have the chance to change the context to make your design work but if you got lucky the rotateContext input is for you! Set it to True. The default is set to False.]
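As a small illustration of Note 1 (my own sketch, not the Ladybug source), this is how the two inputs define the orientations that get tested, and why the total must divide evenly:

# divisionAngle = 45, totalAngle = 180 gives five test orientations.
division_angle, total_angle = 45, 180
assert total_angle % division_angle == 0, "totalAngle must be a multiple"
print(list(range(0, total_angle + 1, division_angle)))  # [0, 45, 90, 135, 180]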
You're all set for the orientation study, just connect the orientationStudyPar output to OrientationStudyP input in the component and wait for the result!
The component will run the study for all the orientations and preview the latest geometry. To see the results, just grab a Quick Graph and connect it to totalRadiation. As you can see in the graph, 135 is the orientation at which I receive the maximum radiation. Dang!
If you want to see all the result geometries, set bakeIt to True, and the results will be baked under LadyBug > RadiationStudy > [projectname]. The layer name starts with a number, which is the totalRadiation.
Mostapha…
her people) a tremendous amount of time creating them by hand. Dog Treat was far from perfect; however, it was good enough to use almost daily.
Three years is a long time. Since 2016 my Gh knowledge has expanded and I’ve seen how dodgy some of the scripting is. With this in mind I started work on a new build. Many things have been tweaked and some things have been rebuilt from the ground up.
Everything has been designed to be leaner and to be a general solution to the problem of creating dog-bone corners on geometry for quick, efficient and safe CNC fabrication.
Some of these things are:
Adding prompts about user geometry, to make users aware of open curves, varying curve heights, and whether their geometry has been altered (mostly removing unnecessary points on curves).
Smooth Transfers. If you're in a rush and need to speed through cutting, smooth transfers mean that lead-in geometry is now created alongside the actual dog-bone arc. This means the router bit doesn't have to come to a minute stop at every corner. This is turned on by default.
Acute Angle Condition. If the angle between the two curves adjacent to a dog-bone point is acute, the dog-bone corner was previously useless, because the distance between the end points of the dog-bone arc was less than the diameter of the router bit. There are many ways this condition could be addressed; I chose to circumscribe a larger arc based on the original angle between the adjacent curves. While it removes more material from the corner, it minimises tool wear and any potential for the material to burn. (A sketch of the basic dog-bone idea follows after this list.)
Single Curve A single curve can now be input into Dog Treat. It will be output with both internal and external treatments.
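For anyone curious, here is an illustrative sketch of the basic dog-bone placement (my own maths and naming, not Dog Treat's actual code): the relief circle has the router-bit radius and passes through the corner, with its centre one bit radius along the corner bisector on the material side.

# Sketch only: all points are (x, y); on_edge1/on_edge2 lie on the two
# edges that meet at the corner.
import math

def dogbone_center(corner, on_edge1, on_edge2, bit_radius):
    def unit(vx, vy):
        m = math.hypot(vx, vy)
        return vx / m, vy / m
    u1 = unit(on_edge1[0] - corner[0], on_edge1[1] - corner[1])
    u2 = unit(on_edge2[0] - corner[0], on_edge2[1] - corner[1])
    bx, by = unit(u1[0] + u2[0], u1[1] + u2[1])   # interior bisector
    # For an internal (pocket) corner the material lies opposite the
    # interior bisector, hence the minus sign; Dog Treat works the correct
    # side out for you on real geometry.
    return corner[0] - bx * bit_radius, corner[1] - by * bit_radius

print(dogbone_center((0.0, 0.0), (10.0, 0.0), (0.0, 10.0), 3.0))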
I’ll continue to update Dog Treat as the need arises, it’s become somewhat of a hobby now. Maybe one day it will become part of a Plug-in… once I learn to code it though!
Happy Treating!
Hi Everyone,
Here's a tool I've been working on for the past 4 months or so in my free time. It's a dog-bone corner generator; however, it's a little different to some of the existing ones. It's designed to be used on large amounts of geometry and, as such, it avoids any curve Boolean operations, which are computationally taxing. You don't have to split your curves up into internal and external lots either; it works it all out, so you can be lazy. I've also incorporated LunchBox's Object Bake component for a one-click operation that bakes geometry back out to Internal and External profile layers.
Let me know how it goes, will update where necessary.
Best,
Darcy
Change Log
06/11/19 - Version 2.0 SECOND DINNER - Rebuild
29/09/17 - Version 1.3 - Now with smooth corners option, True for smooth default/False for original
18/05/17 - Version 1.2 - Now includes variable angle domain input (defaults at 90°) for angled corners
13/11/16 - slight change to enable acceptance of very large interior curves
…
Added by Darcy Zelenko at 8:44pm on November 9, 2016
lly it should not make much of a difference - random number generation is not affected, and neither is mutation. Crossover is a bit more tricky: I use Simulated Binary Crossover (SBX-20), which was introduced back in 1994:
Deb K., Agrawal R. B.: Simulated Binary Crossover for Continuous Search Space, IITK/ME/SMD-94027, Convenor, Technical Reports, Indian Institute of Technology, Kanpur, India, November 1994
Abstract. The success of binary-coded genetic algorithms (GAs) in problems having discrete search space largely depends on the coding used to represent the problem variables and on the crossover operator that propagates building blocks from parent strings to children strings. In solving optimization problems having continuous search space, binary-coded GAs discretize the search space by using a coding of the problem variables in binary strings. However, the coding of real-valued variables in finite-length strings causes a number of difficulties: inability to achieve arbitrary precision in the obtained solution, fixed mapping of problem variables, the inherent Hamming cliff problem associated with binary coding, and processing of Holland's schemata in continuous search space. Although a number of real-coded GAs have been developed to solve optimization problems having a continuous search space, the search powers of these crossover operators are not adequate. In this paper, the search power of a crossover operator is defined in terms of the probability of creating an arbitrary child solution from a given pair of parent solutions. Motivated by the success of binary-coded GAs in discrete search space problems, we develop a real-coded crossover (which we call the simulated binary crossover, or SBX) operator whose search power is similar to that of the single-point crossover used in binary-coded GAs. Simulation results on a number of real-valued test problems of varying difficulty and dimensionality suggest that real-coded GAs with the SBX operator are able to perform as well as or better than binary-coded GAs with the single-point crossover. SBX is found to be particularly useful in problems having multiple optimal solutions with a narrow global basin, and in problems where the lower and upper bounds of the global optimum are not known a priori. Further, a simulation on a two-variable blocked function shows that the real-coded GA with SBX works as suggested by Goldberg, and in most cases the performance of the real-coded GA with SBX is similar to that of binary GAs with a single-point crossover. Based on these encouraging results, this paper suggests a number of extensions to the present study.

7. Conclusions
In this paper, a real-coded crossover operator has been developed based on the search characteristics of the single-point crossover used in binary-coded GAs. In order to define the search power of a crossover operator, a spread factor has been introduced as the ratio of the absolute differences of the children points to that of the parent points. Thereafter, the probability of creating a child point for two given parent points has been derived for the single-point crossover. Motivated by the success of binary-coded GAs in problems with discrete search space, a simulated binary crossover (SBX) operator has been developed to solve problems having continuous search space. The SBX operator has search power similar to that of the single-point crossover.

On a number of test functions, including De Jong's five test functions, it has been found that real-coded GAs with the SBX operator can overcome a number of difficulties inherent in binary-coded GAs in solving continuous search space problems - the Hamming cliff problem, the arbitrary precision problem, and the fixed mapped coding problem. In the comparison of real-coded GAs with an SBX operator and binary-coded GAs with a single-point crossover operator, it has been observed that the performance of the former is better than the latter on continuous functions, and that the performance of the former is similar to the latter in solving discrete and difficult functions. In comparison with another real-coded crossover operator (i.e., BLX-0.5) suggested elsewhere, SBX performs better in difficult test functions. It has also been observed that SBX is particularly useful in problems where the bounds of the optimum point are not known a priori and where there are multiple optima, of which one is global.

Real-coded GAs with the SBX operator have also been tried in solving a two-variable blocked function (the concept of blocked functions was introduced in [10]). Blocked functions are difficult for real-coded GAs, because local optimal points block the progress of the search towards the global optimal point. The simulation results on the two-variable blocked function have shown that on most occasions the search proceeds as predicted in [10]. Most importantly, it has been observed that real-coded GAs with SBX work similarly to binary-coded GAs with single-point crossover in overcoming the barrier of the local peaks and converging to the global basin. However, it is premature to conclude whether real-coded GAs with the SBX operator can overcome the local barriers in higher-dimensional blocked functions.

These results are encouraging and suggest avenues for further research. Because the SBX operator uses a probability distribution for choosing a child point, real-coded GAs with SBX are one step ahead of binary-coded GAs in terms of achieving a convergence proof for GAs. With the direct probabilistic relationship between children and parent points used in this paper, cues from classical stochastic optimization methods can be borrowed to achieve a convergence proof for GAs, or a much closer tie between classical optimization methods and GAs is on the horizon.
In short, according to the authors, my SBX operator using real gene values is as good as older operators specially designed for discrete searches, and better in continuous searches. As far as I know, SBX has meanwhile become a standard general-purpose crossover operator.
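For reference, here is a minimal sketch of SBX for one pair of real-valued genes (the textbook formulation from the paper above, not necessarily Octopus's exact code); eta = 20 gives the SBX-20 I use, and a larger eta keeps children closer to their parents.

# SBX for a single gene pair; assumes genes share the same scale.
import random

def sbx_pair(x1, x2, eta=20.0):
    u = random.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2)
    c2 = 0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2)
    return c1, c2

print(sbx_pair(0.2, 0.8))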
But:
- there might be better ones out there that I just haven't seen yet. Please tell me.
- besides tournament selection and mutation, crossover is just one part of the breeding pipeline. There is also the elite management for MOEAs, which is AT LEAST as important as the breeding itself.
- depending on the problem, there are almost always better problem-specific ways to code the mutation and crossover operators. But Octopus is meant to stay general for the moment - maybe there's a way to provide an interface for coding those things yourself..!?
2) Elite size = SPEA-2 archive size, yes. The rate depends on your convergence behaviour, I would say. I usually start off with at least half the size of the population, but mostly the same size (which is hard-coded in the new version, I just realize) is big enough.
4) The non-dominated front is always put into the archive first. If the archive size is exceeded, the least important individuals (per SPEA-2's truncation strategy) are removed one by one until the size is reached. If the front is smaller than the archive, the fittest dominated individuals are put into the elite; the latter happens at the beginning of a run, when the front hasn't been discovered well yet.
3) Yes it is. This is a custom implementation I figured out myself. However, I'm close to having the HypE algorithm working in the new version, which natively has the ability to articulate preference relations on sets of solutions.
…
he example file to this file so you can give it a try with any version of Honeybee that you're already using. The only requirement is to have OpenStudio installed, as the component uses OpenStudio libraries to parse gbXML files. If you're using the latest version available on GitHub, the component is also available under the WIP tab.
Why?
The main purpose of developing this component is to save time and effort when importing Revit models for energy and daylight analysis. It bothers me to see a lot of smart people spend a lot of time coming up with solutions just to get the geometry from Revit to Honeybee for analysis. This component does not solve all the issues, but it is a first step forward. In an ideal world, the future version of Honeybee, which works under both DynamoBIM and Grasshopper, should address this issue, but that can take some time to be fully ready!
How?
To use this component you need to export your Revit model as gbXML and then use the file path to load the file into Grasshopper. There are several resources available online on how to prepare the analytical model in Revit and export the gbXML file. Here is an image of importing the Revit 2017 sample model using the default settings. As you can see, the model will be just as good as your original gbXML file from Revit.
What can be improved?
Well, there are several items that can be improved, and they are mostly not on us. To get started, here are what I think are the 3 main shortcomings, and my thoughts on how they can be addressed in the future. Feel free to add what you think is missing from this list in the comments section.
1. Revit analytical models, and as a result gbXML files, are by design not intended to be clean. Watch this presentation from Autodesk University to see the logic behind this approach, which in short is that it doesn't matter for a large-scale, early-stage energy model. Well, this is quite a problem for the studies you can do with Honeybee, including but not limited to daylight and comfort analysis.
The best solution that I can think of, until Autodesk fixes their exporter, is to use Revit Rooms and Spaces and generate a clean model from scratch. We have already tried this approach in Revit, but since the Revit API doesn't provide access to Room openings, we had a very hard time getting it to work.
That's why I opened an idea on Revit Ideas to get over this issue. With your support we already have 81 votes, but it hasn't been enough to make them consider the idea for an official review. If you haven't voted already and you think this would be a helpful feature, take a moment and vote so we can have it implemented at some point in the future.
2. There is no way (that I know of) to export only part of the model. The way gbXML export is set up in Revit is to export the whole model at once. As a result, if you have a huge model with 100 rooms and you want to get one of the rooms into Honeybee using this component, you have to export the whole model, which can take some time, and then import it all back into Grasshopper. To partially address this issue, I added an input to the component that accepts a list of names for the rooms you want loaded into Grasshopper. You can use the name of the room/space in Revit as the input (see the sketch after this list).
3. The component doesn't import adjacencies, loads, schedules or HVAC systems. I wasn't able to export a gbXML file from Revit with any of this data except the adjacencies, but even if you can do that, the component currently only imports geometries and constructions. I hope we get API access for item 1 above, so we don't have to use the XML-file approach at all, but if that takes a very long time then we will add these features to the component.
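On item 2 above: if you want to find the room/space names to feed the component's name input, here is a hedged sketch in plain Python (not the Honeybee component itself; the file path is just an example, and it assumes the standard gbXML namespace):

# List the id and Name of every Space element in a gbXML export.
import xml.etree.ElementTree as ET

GBNS = "{http://www.gbxml.org/schema}"
root = ET.parse(r"C:\path\to\exported_model.xml").getroot()
for space in root.iter(GBNS + "Space"):
    name = space.find(GBNS + "Name")
    print(space.get("id"), name.text if name is not None else "")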
Happy 2017!
Mostapha…
describes a set of machine movements in the X, Y and Z directions (Z being pen up and pen down). It is very closely related to G-code in this way - just slightly simpler than G-code overall.
For tool selection you use the Select Pen command - SPx - where x is the number of the pen you are using. As I'm using a vinyl cutter without a pen/tool changer, I just use SP1 in the file header/ini of the cutter.
Without knowing the full spec of your machine it is hard to say for certain, BUT in all of my experience with CNC machines - of all sizes and spec levels - the actual control files are pretty much the same. Very simple text-based HPGL or G-code files run all motion control - even on things like 7-axis robot arms. For plotting, I'd expect you'd be able to get a usable HPGL/PLT file without a lot of work - it's just a matter of matching the file to what the machine is expecting.
To answer your question about getting the file to the printer, it's maybe best to explain it this way: there are two parts to this project:
1/ Create the correctly formatted text/HPGL/PLT file ready to send to the printer
2/ Send the file to the printer
For part 1/ the procedure is:
- Select the curves you want to print
- Convert the curves into a set of points
- Format these points into HPGL
- Save this HPGL as a text file
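To make the formatting step concrete, here is a minimal sketch (my own illustration; the attached script is the real thing). HPGL works in plotter units of 0.025 mm, i.e. 40 units per mm, with PU (pen up) travel moves and PD (pen down) draw moves:

# Turn polyline points in mm into an HPGL command string.
def polyline_to_hpgl(points_mm, pen=1):
    pu = lambda v: int(round(v * 40.0))            # mm -> plotter units
    cmds = ["IN;", "SP%d;" % pen]                  # initialise, select pen
    x0, y0 = points_mm[0]
    cmds.append("PU%d,%d;" % (pu(x0), pu(y0)))     # move to start, pen up
    for x, y in points_mm[1:]:
        cmds.append("PD%d,%d;" % (pu(x), pu(y)))   # draw, pen down
    cmds.append("PU;")                             # lift the pen when done
    return "".join(cmds)

# e.g. a 50 x 25 mm rectangle:
print(polyline_to_hpgl([(0, 0), (50, 0), (50, 25), (0, 25), (0, 0)]))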
For 2/ we need a way to stream the text file to a printer port
To do this I've used an old DOS command-line technique that allows you to 'copy' a text file to a printer LPT or COM port:
copy /b c:\spool\ini.plt LPT1
Type the above into a DOS command line and it will send a text file called ini.plt to the printer on the LPT1 port. As you'll see in my attached code, I use os.system calls in my Python code to send files when needed.
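In Python that's just (same example path as above):

import os
os.system(r"copy /b c:\spool\ini.plt LPT1")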
So your original code was doing some strange things with the conversion from curves to points. Lines/polylines were OK - the code just used the line end points. For curves and polycurves, the code was exploding these into segments and then dividing them into sets of points. However, this led to two issues:
- curves that started off as closed polycurves would end up being plotted as open curve segments - not very good for a cut file and not very smooth for a plot file;
- the division of the curves into points was by distance, and if this wasn't an exact division of the length of the curve, the end point would not match up with the next line - again not ideal for a cutting file, which needs to be a closed curve.
To solve the above I changed to using rs.ConvertCurveToPolyline, with the tolerance set to match the HPGL resolution of 0.025 mm. This converts all curves needed to plot into polylines, leaves everything closed, and the end points line up perfectly.
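The call looks roughly like this (rhinoscriptsyntax, with the tolerance matched to the HPGL resolution; a sketch, not the attached script verbatim):

# Convert a picked curve to a polyline and grab its vertices.
import rhinoscriptsyntax as rs

curve_id = rs.GetObject("Select curve to plot", rs.filter.curve)
polyline_id = rs.ConvertCurveToPolyline(curve_id, tolerance=0.025)
points = rs.PolylineVertices(polyline_id)   # feed these to the HPGL formatter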
I had one other problem with my setup - I ran into an upper limit on file size/curve count/plotting points. A small number of curves would cut/plot fine; however, at a certain number in one file, the print driver would throw an error and the plotter would not even start plotting. I could not work out where in the system this limit was being imposed. The current working version of my code is attached - it gets around this limit by creating a separate print file for each curve required and sending them to the plotter in sequence. Not as tidy as I'd like, since it flashes up a cmd window on every loop - but plots/cuts are perfect.
The final nice touch for the project: I've created a custom toolbar button to run the script - all I have to do to cut a file is hit the button on the toolbar, select the curves and hit Enter = SO EASY!
I've attached my latest code, a sample HPGL file to plot a rectangle, and a screen shot of setting up the custom toolbar button.
Cheers
DK…