GH, the same as using the Sweep2 command in Rhino.
The one on the right is what I got so far (the output smooths out the kinks of the original rails). Basically I am just following the method provided by the SDK sample: http://wiki.mcneel.com/developer/sdksamples/sweep2 .
The following is the function I copied and use directly from the SDK sample. By using this function, I can generate the sweep surface at right. But what I want is the one in the middle, with the kinked edges. Can anyone show me how and where to modify the settings? I guess some sweep arguments need to be changed? I have tried a couple, such as m_simplify, m_bSimpleSweep, m_bSameHeight, m_rebuild_count... but still cannot find the right combination for this function to output the sweep surface I want. Any suggestions or help would be very appreciated. Thanks for your help and time on this.
'Sweep2 function'----------------
Sub Sweep2(ByVal Rail1 As IOnCurve, _
           ByVal Rail2 As IOnCurve, _
           ByVal sCurves As List(Of IOnCurve), _
           ByRef Sweep2_Breps As List(Of OnBrep))

  'Define a new class that contains the sweep2 arguments
  Dim args As New MArgsRhinoSweep2

  'Set the 2 rails
  Dim Edge1 As New MRhinoPolyEdge
  Dim Edge2 As New MRhinoPolyEdge
  Edge1.Append(Rail1.DuplicateCurve())
  Edge2.Append(Rail2.DuplicateCurve())

  'Add the rails to the sweep arguments
  args.m_rail_curves(0) = Edge1
  args.m_rail_curves(1) = Edge2
  args.m_bClosed = False

  Dim section_curves As New List(Of OnCurve)

  'Loop through the sections to set the rail parameters
  For Each Section As IOnCurve In sCurves
    Dim sCurve As OnCurve = Section.DuplicateCurve()
    section_curves.Add(sCurve)

    Dim t0 As Double = 0
    If Not Edge1.GetClosestPoint(sCurve.PointAtStart(), t0) Then
      If Not Edge1.GetClosestPoint(sCurve.PointAtEnd(), t0) Then
        Dim s As Double = 0
        sCurve.GetNormalizedArcLengthPoint(0.5, s)
        Edge1.GetClosestPoint(sCurve.PointAt(s), t0)
      End If
    End If
    args.m_rail_params(0).Append(t0)

    Dim t1 As Double = 0
    If Not Edge2.GetClosestPoint(sCurve.PointAtStart(), t1) Then
      If Not Edge2.GetClosestPoint(sCurve.PointAtEnd(), t1) Then
        Dim s As Double = 0
        sCurve.GetNormalizedArcLengthPoint(0.5, s)
        Edge2.GetClosestPoint(sCurve.PointAt(s), t1)
      End If
    End If
    args.m_rail_params(1).Append(t1)
  Next

  'Set the shape curves
  args.m_shape_curves = section_curves.ToArray()

  'Set the rest of the parameters
  args.m_simplify = 0
  args.m_bSimpleSweep = False
  args.m_bSameHeight = False
  args.m_rebuild_count = -1 'Sample point count for rebuilding shapes
  args.m_refit_tolerance = RMA.Rhino.RhUtil.RhinoApp.ActiveDoc.AbsoluteTolerance()
  args.m_sweep_tolerance = RMA.Rhino.RhUtil.RhinoApp.ActiveDoc.AbsoluteTolerance()
  args.m_angle_tolerance = RMA.Rhino.RhUtil.RhinoApp.ActiveDoc.AngleToleranceRadians()

  Dim sBreps() As OnBrep = Nothing
  If RhUtil.RhinoSweep2(args, sBreps) Then
    For Each b As OnBrep In sBreps
      Sweep2_Breps.Add(b)
    Next
  End If
End Sub
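[Editor's note: one approach known to preserve kinks is to split both rails at their tangent (G1) discontinuities and sweep each segment pair separately, then join the pieces. Below is a minimal sketch of that idea, written against the newer RhinoCommon SDK rather than the Rhino_DotNet SDK used above. The index-based segment pairing assumes both rails kink "in step", which may not hold for arbitrary rails.]

using System;
using System.Collections.Generic;
using Rhino.Geometry;

// Split a rail at its tangent (G1) discontinuities so the kinks can
// survive the sweep: each smooth piece is swept on its own.
static Curve[] SplitAtKinks(Curve rail)
{
  var kinks = new List<double>();
  double t0 = rail.Domain.Min;
  double t;
  while (rail.GetNextDiscontinuity(Continuity.G1_continuous, t0, rail.Domain.Max, out t))
  {
    kinks.Add(t);
    t0 = t;
  }
  return kinks.Count > 0 ? rail.Split(kinks) : new[] { rail };
}

static List<Brep> SweepWithKinks(Curve rail1, Curve rail2, List<Curve> shapes, double tol)
{
  Curve[] segs1 = SplitAtKinks(rail1);
  Curve[] segs2 = SplitAtKinks(rail2);
  var breps = new List<Brep>();
  // Assumption: both rails kink at corresponding places, so segments pair up by index.
  for (int i = 0; i < Math.Min(segs1.Length, segs2.Length); i++)
  {
    // In practice only the shape curves that actually touch this segment
    // pair should be passed in; here all of them are handed over.
    breps.AddRange(Brep.CreateFromSweep(segs1[i], segs2[i], shapes, false, tol));
  }
  // The pieces could then be joined back with Brep.JoinBreps, keeping the kinks.
  return breps;
}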
…
you working on a PV system which will power a domestic hot water boiler?
To answer your questions:

1) Each Grasshopper component (GhPython being one of those too) uses Grasshopper's data-matching algorithm. This algorithm takes care of the complex issues which may arise from combining lists with single items, data trees with different numbers of items per branch, and so on. I think there is a way of introducing a call to other processor threads per inputted surface, but this would be a very difficult job, as it would require writing a custom data-matching algorithm. I do not think I am up to that task.

Instead I tried to introduce multithreading only in the final part of the PVsurface component, one of its time-consuming parts: the calculation of sun angles, solar radiation and AC/DC power output. I attached the test file below, but sadly it didn't go well: the multithreaded version mostly runs in the same time as the regular version.

I do not think I am qualified enough to answer why that is so, but I think it may have something to do with the type of function that the multithreading is applied to: the code is supposed to run a few separate functions a couple of thousand times, and work with a couple of lists. From my experience, multithreading works best when a single list or two are supplied to a single function. I may be wrong on this. I am very sorry to say that I cannot implement this feature.

2) I am not aware of an open-source PV module database having been released. But one can always download the data for specific modules from the producers' websites. It can then easily be transferred to a .csv file or another text file.

Ladybug Photovoltaics is based on NREL's PVWatts model. In comparison with other commercial software applications, PVWatts offers a more generalized system model, with some of the values and characteristics being assumed or embedded. The Fuentes empirical thermal model we are currently using follows the same logic: it generalizes the module characteristics. Only the following characteristics are editable: module efficiency, temperature coefficient and module mount type. It may be possible to replace Fuentes with some other, less generalized 5-parameter thermal model. But as an architect, I would definitely need help on this.
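[Editor's note: to make the "single list supplied to a single function" pattern from point 1 concrete, here is a minimal C# sketch (a GhPython script would use System.Threading.Tasks the same way). ComputeACOutput is a hypothetical stand-in for the per-hour sun-angle/radiation/power calculation; the point is that each index is independent, so no data matching is involved.]

using System.Threading.Tasks;

class ParallelSketch
{
  // Hypothetical per-hour calculation: each result depends only on its own input.
  static double ComputeACOutput(double hourOfYear)
  {
    return 0.0; // sun angles, radiation, dc->ac conversion would go here
  }

  static double[] RunYear(double[] hours)
  {
    var results = new double[hours.Length];
    // Each iteration is independent, so Parallel.For can split the range
    // across cores without any custom data-matching logic.
    Parallel.For(0, hours.Length, i => results[i] = ComputeACOutput(hours[i]));
    return results;
  }
}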
Sorry if my reply did not fulfill your expectations, and thank you for the kind words!…
r graphics get saved as 24x24 pixel images before they are put into the grasshopper application, which means the icons look like crap when you zoom in. This is the aforementioned problem that needs to be addressed in GH2. There have historically been two approaches to this issue:
Provide pixel images with several sizes.
Render vector graphics directly.
Option 1 is common for apps that do not have variable levels of zoom, such as Windows Explorer. When Explorer shows file icons it shows them in 16x16, 32x32, 48x48, 96x96, or these days, various HUGE sizes. As a result *.ico files allow you to put in different images for all these target sizes. Since Grasshopper has variable zoom levels, this is not an ideal solution. Also, it requires a lot more work per icon.
Option 2 is becoming more and more popular as increased graphics speed now allows for the real-time rendering of vector graphics. Yet, you still need a renderer that knows how to draw vector geometry crisply at low sizes. All vector renderers I know just interpolate the geometry linearly and if a line happens to end up 'between pixels' it's just fuzzy.
I don't have hard and fast rules for the icons, but I try to adhere to at least these:
Keep a border of 2 pixels free around the icon content. So basically only use the inner 20x20 pixels rather than the 24x24 you're allowed. This is needed because the drop shadow needs to go there.
Only draw silhouette edges around shapes, not inner creases. Typically a 1-pixel line will do. I prefer to use a dark version of the fill colour rather than black for edges.
Loose curves can be drawn in 1 or 2 pixel thicknesses, depending on how important the curve is.
Try to avoid text in your icons (not always possible).
Stick to 1 colour family per icon, preferably per icon family. You can add highlights with another colour if you must, but too many hues make an icon hard to read (for example the [Voronoi] icon has red, green and blue and it's a bit of a mess; on the other hand [Colour Wheel] has the full spectrum and seems to work quite well...).
Very roughly speaking, if there's both black and red geometry in an icon, it means the red is component input and the black is component output.
Drop shadows are pixel effects, applied to the 24x24 image. They have a blurring radius of 2 pixels, a horizontal offset of 1 pixel to the right, a vertical offset of 1 pixel to the bottom and they are 65% black.
When you use high contrast shapes (for example black edges on a light background) the anti-aliasing provided by vector renderers such as Xara or Illustrator won't be enough to make it look smooth. I'd recommend avoiding high contrast if at all possible, but if not possible then draw a 1-pixel line around the dark bits in 95% transparent black. This effectively extends the anti-aliasing range from 1.5 to 2.5 pixels and it helps make things look smoother.
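[Editor's note: to put numbers on that last trick, 95% transparent black is roughly alpha 13 out of 255 in ARGB terms. A minimal GDI+ sketch of the idea follows — not Grasshopper's actual icon pipeline, just one way to widen the anti-aliasing falloff.]

using System.Drawing;
using System.Drawing.Drawing2D;

class SoftEdgeSketch
{
  // Soften a high-contrast silhouette by under-drawing a slightly wider,
  // 95%-transparent-black halo before the 1-pixel edge itself.
  static void DrawSoftEdge(Graphics g, Point[] silhouette)
  {
    g.SmoothingMode = SmoothingMode.AntiAlias;
    using (var halo = new Pen(Color.FromArgb(13, 0, 0, 0), 3f))   // ~5% opaque black
    using (var edge = new Pen(Color.FromArgb(255, 60, 40, 20), 1f)) // dark fill-colour edge
    {
      g.DrawPolygon(halo, silhouette); // widens the falloff by ~1px each side
      g.DrawPolygon(edge, silhouette);
    }
  }
}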
--
David Rutten
david@mcneel.com…
ive collaborative environment.
TYPE : Course module and Workshop
The event is open for anybody interested from all the fields of design, including: architecture, interior design, furniture design, product design, fashion design, scenography, and engineering.
1. COURSE MODULE (20-23 April 2014) - optional
+ type: 3-day intensive course on basic knowledge of parametric design (LEVEL 1)
+ software: Rhinoceros & Grasshopper
+ plugins: Kangaroo, Weaverbird, LunchBox, gHowl, Geco
+ achievements:
- becoming acquainted with the components & the concept of Generative Design
- understanding the strategies in Algorithmic Design
- how to easily insert simple mathematical equations into the project to gain more control
- how to utilize the proper plugins with respect to the nature of the project
- interacting with different analysis platforms such as Ecotect & remote controllers
- solving several exercises at different scales (2D, 3D) during each phase of the workshop
2. WORKSHOP (23-27 April 2014)
A 5-day Design-Based Research Workshop exploring new techniques in Digital Architecture/Fabrication, with a specific focus on the use of generative systems and parametric modeling as tools for creative expression.
Our ultimate goal is to increase the efficiency of utilizing digital tools in parallel with the geometric performance of the primitive design agent.
+ + CONCEPT
Fashion and Architecture are both based on basic life necessities – clothing and shelter.
However, they are also forms of self-expression – for both creators and consumers.
Both fashion and architecture affect our emotional being in many ways.
The agenda of this workshop is to investigate the overlap between these areas of design, art & fashion.
Fashion and architecture express ideas of personal, social and cultural identity, reflecting the concerns of the user and the ambition of the age. Their relationship is a symbiotic one and throughout history, clothing and buildings have echoed each other in form and appearance. This only seems natural as they not only share the primary function of providing shelter and protection for the body, but also because they both create space and volume out of flat, two-dimensional materials.
While they have much in common, they are also intrinsically different – both address the human scale, but the proportions, sizes and shapes differ enormously.
+ + + OBJECTIVES
So far, architects have been using techniques such as folding, bending etc. to create space, structural roofs or various other structural shapes.
The agenda of this workshop goes further, investigating algorithmic thinking through generative tools integrated into design.
The challenge is creating a bridge that connects these two areas of design, architecture and fashion, which perform at two opposite scales.
+ + + + TECHNICAL BRIEF
In the early stages physical models and low-tech strategies will be used, allowing the participants to gain a greater understanding of materials, fabrication and assembly methods as well as simple, yet pragmatic structural solutions.
Later in the workshop these strategies will be digitized and elaborated using software tools such as Rhinoceros and the algorithmic plug-in Grasshopper.…
; GH, this one came up and Rhinoceros disappeared... like this:
It said "Rhinoceros 5 has stopped working. A problem occurred, so Rhinoceros 5 can no longer work correctly," and then I had no choice but to terminate Rhinoceros.
There are some discussions about installing numpy under Rhino's IronPython, but no one seems to have the same problem as me, so please, somebody tell me!!
And one more question... just in case, I tried to install numpy into IronPython 2.7:
C:\Program Files (x86)\IronPython 2.7>ipy "C:\Program Files (x86)\IronPython 2.7\ironpkg-1.0.0.py" --install
Bootstrapping: c:\users\owner\appdata\local\temp\tmp2nand1\ironpkg-1.0.0-1.egg
118 KB [.................................................................]

C:\Program Files (x86)\IronPython 2.7>ironpkg -h
Usage: ironpkg-script.py [options] [name] [version]
.
.
.
C:\Program Files (x86)\IronPython 2.7>ironpkg scipy
Wrote configuration file: C:\Users\owner\.ironpkg
=============================================================================
Traceback (most recent call last):
  File "C:\Program Files (x86)\IronPython 2.7\ironpkg-script.py", line 10, in <module>
  File "C:\Program Files (x86)\IronPython 2.7\lib\site-packages\enstaller\main.py", line 364, in main
  File "C:\Program Files (x86)\IronPython 2.7\lib\site-packages\enstaller\indexed_repo\chain.py", line 27, in __init__
  File "C:\Program Files (x86)\IronPython 2.7\lib\site-packages\enstaller\indexed_repo\chain.py", line 67, in add_repo
  File "C:\Program Files (x86)\IronPython 2.7\lib\site-packages\enstaller\utils.py", line 92, in write_data_from_url
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 435, in open
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 407, in _call_chain
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 654, in http_error_302
  File "C:\Program Files (x86)\IronPython 2.7\Lib\httplib.py", line 1261, in __init__
  File "C:\Program Files (x86)\IronPython 2.7\lib\site-packages\enstaller\utils.py", line 73, in open_url
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 154, in urlopen
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 547, in http_response
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 467, in error
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 429, in open
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 446, in _open
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 407, in _call_chain
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 1240, in https_open
  File "C:\Program Files (x86)\IronPython 2.7\Lib\urllib2.py", line 1167, in do_open
AttributeError: 'module' object has no attribute '_create_default_https_context'

C:\Program Files (x86)\IronPython 2.7>
how can I deal with this error?…
well, very similar input data must result in wildly different hashes. For example, imagine we have an algorithm which computes hashes of text, and the hashes it computes are all numbers between 0 and 999. We then apply this algorithm to a piece of text:
"When Spring comes back with rustling shade" = 385
So far so good. Now imagine we change the text slightly, for example by removing a single "l":
"When Spring comes back with rusting shade" = 973
Minor change -> very different hash. There are of course way more unique texts than there are numbers between 0 and 999. This must therefore mean that a lot of text will result in the same hash. For example "When Spring brings back blue days and fair." may also result in a hash of 385. Because of the pigeonhole principle, there is nothing to be done about this.
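[Editor's note: the effect is easy to reproduce. A minimal C# sketch hashing the two verses above — the 0–999 hashes in the example are invented for illustration; real MD5 digests are 128 bits.]

using System;
using System.Security.Cryptography;
using System.Text;

class AvalancheDemo
{
  static void Main()
  {
    using (var md5 = MD5.Create())
    {
      byte[] a = md5.ComputeHash(Encoding.UTF8.GetBytes("When Spring comes back with rustling shade"));
      byte[] b = md5.ComputeHash(Encoding.UTF8.GetBytes("When Spring comes back with rusting shade"));
      // A one-letter edit flips roughly half the bits of the digest.
      Console.WriteLine(BitConverter.ToString(a));
      Console.WriteLine(BitConverter.ToString(b));
    }
  }
}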
Now for the tricky bit. Hashes are often used to validate executable code. Say your friend James at MI6 sends you a small program that will allow you to eavesdrop on Angela Merkel, and -over the phone- he tells you the hashcode for that application. You can then hash the application yourself, verify that it indeed results in the same hashcode and then you know you can trust the executable.
But now Jack from the FBI intercepts the email and adds a few sneaky lines of code to the original application allowing him to determine from your internet search history with up to 95% accuracy whether you like extra cheese on your pizza. The application has now been tampered with, it can no longer be trusted and you should be able to figure this out as it will no longer result in the same hash code.
But wait! Some hashing algorithms are more secure than others. MD5 is now officially considered to be 'hacked' and it is no longer recommended for doing naughty spying. Specifically, Jack will be able to inject his own code in such a way that it does not result in a different hash. Instead, the SHA family of hashers are to be used, as it is not yet known how to trick these hashers.
This is where the problem comes in, because apparently the US government has forcefully disabled the use of MD5 for all purposes. This is a shame because I use it to quickly compare bitmap icons for identicalness so I only have to store an icon in memory once. There is no security hole due to this, because I'm not hashing secure data. MD5 is somewhat faster than SHA, and since I have to hash several hundred icons on Grasshopper start, I opted for the faster one.
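[Editor's note: the icon-deduplication idea amounts to keying a cache on the digest of the raw pixel bytes. A hedged sketch of the SHA alternative follows; as far as I know, under an enforced Windows FIPS policy MD5.Create() and the purely managed SHA classes throw, while the CryptoServiceProvider variants are permitted.]

using System;
using System.Collections.Generic;
using System.Security.Cryptography;

// Sketch: store each icon's pixel buffer only once, keyed by its digest.
class IconCache
{
  private readonly Dictionary<string, byte[]> _store = new Dictionary<string, byte[]>();
  // SHA256CryptoServiceProvider instead of MD5, since the managed MD5/SHA
  // classes throw when the Windows FIPS policy is enforced.
  private readonly SHA256 _hasher = new SHA256CryptoServiceProvider();

  public byte[] Intern(byte[] pixels)
  {
    string key = Convert.ToBase64String(_hasher.ComputeHash(pixels));
    byte[] existing;
    if (_store.TryGetValue(key, out existing))
      return existing; // identical icon already in memory
    // (A paranoid implementation would also compare bytes on a digest match.)
    _store[key] = pixels;
    return pixels;
  }
}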
(Very) long story short; you're hosed. Grasshopper uses MD5; USgov does not like; Grasshopper does not run on USgov computers.
I'll do some testing to see if I can switch to SHA and then we can see whether or not that solves the problem. This however will take a while as I'm going on a business trip next week and have yet to prepare my presentations.
--
David Rutten
david@mcneel.com…
hopper) and High Definition visualizations (V-Ray) and exploring its scientific innovations supporting the users' platform philosophical ideas.
SESSIONS: 5 sessions of 8 hours (40 hours total)
E-MAIL: educacion@chconsultores.net
REGISTRATION: (55) 56 62 57 93
TECHNICAL INFO: 044 (55) 31 22 71 83
INSTRUCTORS: They have past experience working at Gehry Technologies, and have participated in studios with Eric Owen Moss and Tom Wiscombe at SCI-Arc (Southern California Institute of Architecture).
Day 1: Introduction to MAYA tools, 3D exercise start.
Day 2: Continue 3D exercise.
Day 3: Original 3D architecture design.
Day 4: Grasshopper optional application on 3D architecture design.
Day 5: V-Ray Application on 3D architecture design.
30 DAY TRIAL SOFTWARE DOWNLOAD:
MAYA 2012: http://www.autodesk.com/products/autodesk-maya/free-trial
RHINO 4: http://s3.amazonaws.com/files.na.mcneel.com/rhino/4.0/2011-02-11/eval/rh40eval_en_20110211.exe
3DS MAX 2010: http://www.autodesk.com/products/autodesk-3ds-max/free-trial
VRAY FOR 3DS MAX: http://www.vray.com/vray_for_3ds_max/demo/thankyou.shtml#thankyou
PHOTOSHOP & ILLUSTRATOR: https://creative.adobe.com/apps?trial=PHSP&promoid=JZXPS
www.helenico.edu.mx
www.scifi-architecture.com/#!workshops/c1wua
LIKE US ON: www.facebook.com/scifiarchitecture
…
size component supported only ground PV panels and angled roof PV panels.
Download the newest PV SWH system size component from here (click on "View Raw" to download it, then move the downloaded .ghuser file to File->Special Folders->User Objects Folder, and confirm overwriting the one already located there).
Just a few opinions on the project you are currently working on:

This kind of fixed, non-transparent (overhang) PV panel attached to a building facade is very convenient for locations at higher latitudes. The reason is that fixed overhang PV panels are dimensioned according to the sun position at summer solstice, and solstice elevation angles at higher-latitude locations are lower than those at lower-latitude locations. Due to Incheon's low latitude (37), you will get a rather short PV panel length*: less than 10 centimeters (0.097 meters in the attached .gh file below). As you have mentioned, Galapagos needs to be used too.

I will just mention some of the good and bad ways in which this issue could be somewhat avoided:

1) Increase the vertical distance between PV panels (PV panels appear above every second window).

2) Increase the tilt angle. This will also increase the length of the PV panels, but will decrease the final annual AC energy output. An example of this solution has been applied at the FKI building in Seoul (latitude: 37N). I already did some tests (with tilt angles: 40, 45, 55) and this does not seem like a good solution, though.

3) Shrink the "sun window" by using the minimalSpacingPeriod_ input. In photovoltaics, a planner is supposed to keep the 9h to 15h part of the sun window free of any obstructions. If you decrease the "sun window" to 10h to 14h, the length of your PV panels will increase. You can experiment a little with this (set your minimalSpacingPeriod_ to 21st of June, 10 to 14 hours). In general, shrinking the sun window at summer solstice is not a good planning principle.

4) Use tracking PV panels instead of fixed ones. But Ladybug Photovoltaics components do not support this kind of PV system; they only support fixed ones.

I would personally go with the first option. You can also experiment with the second and third ones. Comment back if you have any other questions.

-----------------------
* By "length of the PV panels" I mean the tiltedArrayHeight_ input of the PV SWH system size component.…
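[Editor's note: for reference, the textbook no-self-shading relation behind this — a simplification, not necessarily the exact model inside the PV SWH system size component — is sketched below. The usable panel length for a given available depth follows from the design-moment sun altitude.]

using System;

class PanelSpacingSketch
{
  // Simplified no-shading spacing for a tilted panel row: a panel of
  // length L at tilt beta needs a depth of
  //   D = L * cos(beta) + L * sin(beta) / tan(alpha)
  // where alpha is the sun altitude at the design moment (e.g. noon at
  // summer solstice). At latitude 37N that altitude is roughly
  // 90 - 37 + 23.45 ≈ 76.5 degrees, so tan(alpha) is large and the
  // admissible panel length for a small available depth becomes short.
  static double MaxPanelLength(double availableDepth, double tiltDeg, double sunAltitudeDeg)
  {
    double beta = tiltDeg * Math.PI / 180.0;
    double alpha = sunAltitudeDeg * Math.PI / 180.0;
    return availableDepth / (Math.Cos(beta) + Math.Sin(beta) / Math.Tan(alpha));
  }
}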
ed file and code below:
Color ColorAt(Mesh mesh, int faceIndex, double t0, double t1, double t2, double t3)
{
  var color = Rhino.Display.Color4f.Black;
  if (mesh.VertexColors.Count != 0)
  {
    // Test to see if the face exists.
    if (faceIndex >= 0 && faceIndex < mesh.Faces.Count)
    {
      // Barycentric quad coordinates for the point on mesh.Faces[faceIndex]:
      // - If the face is a triangle, disregard t3 (it should be set to 0.0).
      // - If the face is a quad and is split between vertices 0 and 2, then t3
      //   will be 0.0 when the point is on the triangle defined by vi[0], vi[1], vi[2],
      //   and t1 will be 0.0 when the point is on the triangle defined by vi[0], vi[2], vi[3].
      // - If the face is a quad and is split between vertices 1 and 3, then t2
      //   will be -1 when the point is on the triangle defined by vi[0], vi[1], vi[3],
      //   and t0 will be -1 when the point is on the triangle defined by vi[1], vi[2], vi[3].
      MeshFace face = mesh.Faces[faceIndex];

      // Collect the corner colours and remap the weights for barycentric evaluation.
      Color p0, p1, p2;
      if (face.IsTriangle)
      {
        p0 = mesh.VertexColors[face.A];
        p1 = mesh.VertexColors[face.B];
        p2 = mesh.VertexColors[face.C];
      }
      else if (t3 == 0)
      {
        // Point is on subtriangle {0,1,2}.
        p0 = mesh.VertexColors[face.A];
        p1 = mesh.VertexColors[face.B];
        p2 = mesh.VertexColors[face.C];
      }
      else if (t1 == 0)
      {
        // Point is on subtriangle {0,2,3}.
        p0 = mesh.VertexColors[face.A];
        p1 = mesh.VertexColors[face.C];
        p2 = mesh.VertexColors[face.D];
        t1 = t2;
        t2 = t3;
      }
      else if (t2 == -1)
      {
        // Point is on subtriangle {0,1,3}.
        p0 = mesh.VertexColors[face.A];
        p1 = mesh.VertexColors[face.B];
        p2 = mesh.VertexColors[face.D];
        t2 = t3;
      }
      else
      {
        // Point must be on the remaining subtriangle {1,2,3}.
        p0 = mesh.VertexColors[face.B];
        p1 = mesh.VertexColors[face.C];
        p2 = mesh.VertexColors[face.D];
        t0 = t1;
        t1 = t2;
        t2 = t3;
      }

      // Blend the three corner colours with the barycentric weights.
      // (The original C++ SDK version did this with ON_Color.SetFractionalRGB.)
      var c0 = new Rhino.Display.Color4f(p0);
      var c1 = new Rhino.Display.Color4f(p1);
      var c2 = new Rhino.Display.Color4f(p2);
      float s0 = (float) t0;
      float s1 = (float) t1;
      float s2 = (float) t2;
      float R = s0 * c0.R + s1 * c1.R + s2 * c2.R;
      float G = s0 * c0.G + s1 * c1.G + s2 * c2.G;
      float B = s0 * c0.B + s1 * c1.B + s2 * c2.B;
      color = new Rhino.Display.Color4f(R, G, B, 1);
    }
  }
  return color.AsSystemColor();
}
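[Editor's note: a typical way to obtain the faceIndex and barycentric t values expected above is Mesh.ClosestMeshPoint, whose MeshPoint result carries both. A short usage sketch — mesh stands for the Mesh in question and testPoint for any Point3d sample:]

// Usage sketch: evaluate the vertex-colour field at an arbitrary point.
// MeshPoint.T holds the four barycentric coordinates for MeshPoint.FaceIndex.
Rhino.Geometry.MeshPoint mp = mesh.ClosestMeshPoint(testPoint, 0.0);
if (mp != null)
{
  System.Drawing.Color c = ColorAt(mesh, mp.FaceIndex, mp.T[0], mp.T[1], mp.T[2], mp.T[3]);
}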
…
bi-directional link, the link is unidirectional (downflow only), because of the use of proxies.
Matrix transforms and persistent constraints: I don't think this is true. The parts can have mates to other parts that preserve geometric relationships like 'coincident', 'aligned' etc. These are essentially bi-directional. GH's algorithmic approach does not do relationships in the same flexible way. In GH, the 'relationship' has to be part of the generation method, which depends on the creation sequence, e.g. draw line 2 perpendicular to the end point of line 1 (see the sketch below). If you are thinking about parts or assemblies sharing, or referencing, parameters as part of the regen process, this is also possible. iLogic does this, and adds scripting. So does Catia. Inventor/iLogic can also access Excel and have all the parameter processing done centrally, if required.
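[Editor's note: in RhinoCommon terms, that kind of sequential "relationship" is literally the construction order — a minimal sketch:]

using System;
using Rhino.Geometry;

class SequenceSketch
{
  // GH-style generative "relationship": the result is derived from line1,
  // so the perpendicular-at-endpoint constraint exists only because it is
  // baked into the creation sequence - no constraint solver maintains it.
  static Line PerpendicularAtEnd(Line line1, double length)
  {
    Vector3d dir = line1.Direction;
    dir.Rotate(Math.PI / 2.0, Vector3d.ZAxis); // rotate 90 degrees in plan
    dir.Unitize();
    return new Line(line1.To, line1.To + dir * length);
  }
  // Moving line1 afterwards does nothing to the result unless the whole
  // sequence is re-run - which is exactly what GH's solver does on each change.
}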
Consequently, scripting the placement of components is irrelevant in GH, unless you decide that each component needs to be contained in its own separate file.
I wouldn't be too hasty here. Yes, you are right about compartmentalisation. I think this needs to happen with GH, in order to deal with scalability and everyday interoperability requirements. Confining projects to one script is not sustainable. MCAD apps have been doing this for ages with 'relational modeling'.

The Adaptive Components placement example illustrates that it is beneficial to be able to script some 'hints' that can be used on placement of the component. Say, if your component requires points as inputs, then it should be able to find the nearest points to the cursor as it moves around. I think Aish's D#/DesignScript demoed this kind of behaviour a few years ago. Similarly, Modo's Toolpipe reminds me how a lot of UI-based transactions can be captured as scripts (macro recorder etc.). Allowing this input to be mixed in and/or extended by GH will, I think, yield a lot of 'modeling efficiency' around the edges. This is (mis)using GH as a user-programmable 'jig' for placing/manipulating 'dumb' elements in Rhino. It may even give the 'dumb' elements a bit more 'intelligence' by leaving behind embedded attributes, like links to particular construction planes etc.

Even if we confine ourselves to scripting: GH is a visual or graphic programming interface, and a lot of 'insert and connect' tasks can be done more easily using graphic methods. If we need to select certain vertices on a mesh as inputs for, say, a facade panel, it's going to be quicker to do this 'graphically' (like the AC example) than ferreting out the relevant indices in the data tree et al. The 'facade panel' script would then have some coding to filter/prompt the user as to what inputs were acceptable, and so on.
This also brings up the point that generating components and assemblies in MCAD is not as straightforward. In iParts and iAssemblies, each configuration needs to be generated as a "child" (the individual file needs to be created for each child) before those children can be used elsewhere.
Not sure what you mean here. If the iParts are built up using sketches/profiles or other more rudimentary features (like Revit's profile/face etc. family templates), then reuse should be fairly straightforward. I suppose you could make it like GH scripting, if you cut and paste or include script snippets that generate the desired Inventor features.
One of the reasons why the distributed file approach makes perfect sense in MCAD, is that in industry you deal with a finite set of objects. Generative tools are usually not a requirement. Most mechanical engineers, product engineers and machinists would never have any use for that.
I don't think this is true. Look at the automotive body design apps, which are mostly Catia-based. All of the body parts are pretty much 'generative', generated from splines in a procedural way, using very similar approaches to GH. Or sheet-metal design. It's not always about configuring off-the-shelf items like bolts. And the constraints manager is available to arbitrate which bit of script fires first, so your mundane workaday associative dimensions etc. can update without getting run over by the DAG(s) :-)
…