are invisible in the picture.
So what you see is a common band that has lost all the characteristics of the original, in order to protect the process.
We also did an "invisible setting" prototype which has built-in flexibility.
If you are in the jewelry industry you will know what I mean: it is close to a miracle.
It's a shame I cannot share details, and this is why I am planning my next major work on something at least 10 times more complex than this.
It will be for my own business and for the jewelry industry as well.
I hate to tease people and then not be able to produce anything more than an image.
But I thought it would be better than nothing, at least for jewelry designers, so they can see that there are more and more users, and that complexity is not something to shy away from. It is worth the time spent, because the returns on production are far larger than for special orders, and this is why GH is useful.
We can usually design a piece of jewelry in less than an hour, so for a one-off piece GH is not really worth the time.
But for production, with so many variables (finger sizes controlling most of the outcome, together with stone sizes, etc.), GH is a MUST!
I really appreciate everyone's comments and suspicions and I understand why.
99% of the people out there do not really understand the complexity of jewelry at the industrial level. It's not just the form; the post-production is the killer.
This industry is still a hybrid of technology and art, and due to the lack of old-school pros, unfortunately, we face very lousy and unpredictable execution in post-production (after the casting process). This leaves you with a design process and intention that requires a lot of control over every possible variant of the object.
One wrong design decision is multiplied thousands of times at the benches (once for every single piece) = bad profits!
It sounds more serious than it is, but very few companies are willing to do it (deliver a good product rather than a low-quality one; this also happens because the consumer is no longer aware of the difference. So whoever does keep up quality does it only out of integrity, third-party QA, or just pride).
This is why GH is invaluable, and why that def looks out of proportion for such a (visually) simple band.
It is because there are dozens and dozens of variables affecting everything else. In fact, as it stands it does not even cover everything, only the most critical variables.
Sorry for the long replies. I am an instructor and a professional jeweler by trade since I was very young, and I love to teach, so I overflow with explanations... and components :)).
Next time it will be "in the open" as they say...…
Arguments:
1. You are targeting CATIA, aren't you? (not exactly tomorrow, but... soon) and/or SolidWorks (hello C#, haven't we met before?).
2. You MUST deal with nested block instances instead of what you are trying to do right now (I'm talking about the real MERO things, not abstract lines and points). This is not doable with GH components, I'm afraid, but it's rather easy with code (see the sketch after this list).
3. You MUST deal with an RDBMS in order to keep track of what's going on in your company, per project, per case, per designer (who sells that bolt? what's his cat's name? is he a reliable supplier? what am I doing in life? ... that sort of "queries"). At this point CATIA is 1% CAD things and 99% PLM stuff (Product Lifecycle Management). We do want that, since the 21st century is running, don't we?
I hear you: but these are 3 arguments... indeed, but... hey, who's counting? he he.
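On point 2, a minimal GhPython/RhinoCommon sketch of what "easy with code" could look like: a nested block is just an instance definition whose geometry includes an InstanceReferenceGeometry pointing at another definition. The block names here ("BOLT", "NODE") are made up for illustration.

    import scriptcontext as sc
    from Rhino.Geometry import Point3d, Transform, InstanceReferenceGeometry

    # Assumes a block named "BOLT" already exists in the document
    # (the one-argument Find is available in Rhino 6 and later).
    bolt = sc.doc.InstanceDefinitions.Find("BOLT")

    # Wrap a reference to BOLT as geometry, then add it to a new
    # definition: "NODE" now nests "BOLT".
    nested = InstanceReferenceGeometry(bolt.Id, Transform.Identity)
    node_index = sc.doc.InstanceDefinitions.Add(
        "NODE", "MERO node with a nested bolt", Point3d.Origin, [nested])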
Method:
A. The def attached has a very simple C# that gets mesh points and makes a nice U/V-style collection of points (a DataTree, in plain English).
B. Then we go to that umbrella-sticks thingy: we can calculate anything (the thing already does "some"), plus your collections of divided points (done the right way, he he) VS a given node: you said (on Skype) that you want to calculate angles with these (from 2 to 6) in mind: obvious, since you are doing real-life MERO things.
C. Then we could calculate the appropriate planes for PlaneToPlane transformations: take a nested instance definition (the red things that you showed me yesterday) placed at 0,0,0 (Plane.WorldXY) and put it in every plane collection related to every node (clash detection is an obvious must). A rough sketch of A-C follows below.
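A hedged GhPython sketch of steps A-C (the original def uses C#; row_length, node, and idef_index are assumptions made for illustration, and the instance placement targets the Rhino document):

    import Rhino
    from Rhino.Geometry import Point3d, Vector3d, Plane, Transform
    from Grasshopper import DataTree
    from Grasshopper.Kernel.Data import GH_Path
    import scriptcontext as sc

    # A: mesh vertices into a U/V-style DataTree, one branch per row
    # (assumes a regular grid mesh with row_length points per row).
    def mesh_points_tree(mesh, row_length):
        tree = DataTree[Point3d]()
        for i, v in enumerate(mesh.Vertices):
            tree.Add(Point3d(v), GH_Path(i // row_length))
        return tree

    # B: angle (in radians) at a node between two member directions.
    def member_angle(node, pt_a, pt_b):
        return Vector3d.VectorAngle(pt_a - node, pt_b - node)

    # C: place a block instance on each node plane via PlaneToPlane.
    def place_at_nodes(idef_index, node_planes):
        ids = []
        for plane in node_planes:
            xform = Transform.PlaneToPlane(Plane.WorldXY, plane)
            ids.append(sc.doc.Objects.AddInstanceObject(idef_index, xform))
        return ids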
Case resolved, closed: what about that Vodka?
More in Skype
…
merely automates finding clear intersections between pairs of objects and then splits the objects along those intersection *curves*, deletes the trims, then joins the remains, and cycles on. But wherever surfaces just barely come together, within Rhino's confusing tolerance setting, there *is* *no* clear intersection curve. So it bugs out and stops working EVERY time you try more than a dozen or two spheres.
Some software can do this by switching to volumetric pixels (voxels). The $9K-$30K Geomagic Freeform is an example. It also fails sometimes, often due to memory issues, as you can imagine, since it needs to fill the entire inner space of each sphere with 3D pixels.
Materialise Magics, for $16K, can often handle such Booleans well. It will take a seeming lifetime to figure out such kludgy (and often pirated) software, though.
One thing you can try, though, is to simply drape a mesh or NURBS plane onto the top of your spheres.
There's a well-known *reason* your Booleans are failing, and nobody here has yet even hinted at it:
The main reason is that the Rhino/Grasshopper developers don't care about the human element. The math exists to make this work very fast, every time. It just has to join things *right*, incorporating human knowledge of kissing surfaces, instead of acting stupidly, like some pocket calculator. But that would involve hacks that make 99% of complex Booleans work instead of 10%, and we can't have that, since it would be SLOWER for the other 1% that happen to have no nearly kissing or actually kissing surfaces.
You could also use the new Cocoon plugin to create a surface *around* your structures, with a given radius of extension beyond the spheres, then offset that surface back by the same radius. That is 100% robust, but won't give quite as sharp intersections; they come out more rounded, like most everybody wants anyway.
You can *test* for Boolean failures by running a Grasshopper intersection component to see the intersection curves, and zooming in to see how bad many of them are: knotted, twisted, or even with gaps, often with gaps.
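A minimal GhPython sketch of that test, assuming breps is a list input of closed Breps; pairs whose intersection curves come back open (or missing) are the likely Boolean killers:

    import Rhino
    from Rhino.Geometry.Intersect import Intersection

    tol = Rhino.RhinoDoc.ActiveDoc.ModelAbsoluteTolerance
    curves, suspect_pairs = [], []
    for i in range(len(breps)):
        for j in range(i + 1, len(breps)):
            ok, crvs, pts = Intersection.BrepBrep(breps[i], breps[j], tol)
            if ok and crvs:
                curves.extend(crvs)
                # an open intersection loop between two closed solids
                # usually means gaps -- the Boolean will choke there
                if any(not c.IsClosed for c in crvs):
                    suspect_pairs.append((i, j))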
It's a math problem nobody at McNeel wants to solve, sorry.
Just write a check for $25K and spend six months taking notes, like I did, and you can finally merge your simple spheres.…
Added by Nik Willmore at 6:33pm on October 20, 2015
lts.
In the visualization, points are an interesting option; it's a matter of aesthetics, I guess. I go with surfaces :) You can also try selecting Filters -> Slice (you can also find it in the icons above the pipeline browser). In the Slice options below the pipeline, press Z Normal, and for the Z coordinate enter some height relevant to the buildings (e.g. 1.75 m, a typical human scale). That will show you the flow around the buildings at that height. Experiment with selecting other normals and values, and keep playing with the filters; there are some cool things in there. Also check out the mailing list and the extensive ParaView documentation.
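If you prefer scripting it, a rough pvpython equivalent of those clicks might look like this (assuming your case is opened through a file named case.foam; the file name is illustrative):

    from paraview.simple import OpenFOAMReader, Slice, Show, Render

    # open the case and slice it at a typical human height (1.75 m)
    reader = OpenFOAMReader(FileName='case.foam')
    slc = Slice(Input=reader)
    slc.SliceType.Normal = [0, 0, 1]     # Z normal
    slc.SliceType.Origin = [0, 0, 1.75]  # slice height
    Show(slc)
    Render()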
Concerning the errors: apologies, I only just downloaded your case.
It appears that the decomposeParDict is not included in the system folder. I am not sure if this is because BF has not gone through the whole workflow yet or an omission on our side. Please feel free to add it on GitHub; I will also note it down and pass it to Mostaph to check. In the meantime, please find attached a VERY detailed decomposeParDict file. I took the liberty of setting it to 4 processors (the numberOfSubdomains value) and also selected (that is, uncommented) the scotch decomposition method. It's the easiest method to use, since it is automatic and doesn't require any further input on how the domain is decomposed in the x, y, z directions (which would require you to change values in the attached file).
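For reference, the essentials of such a file look like this (a minimal sketch, not the attached one):

    FoamFile
    {
        version     2.0;
        format      ascii;
        class       dictionary;
        object      decomposeParDict;
    }

    numberOfSubdomains  4;        // run on 4 processors

    method              scotch;   // automatic decomposition, no extra input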
Now, the different folders created are simply snapshots of the current solution at specific timesteps. To control how often the solver saves, change the writeInterval number in the controlDict file. You can also change almost all of these values on the fly, while OF is running.
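For example, the relevant controlDict entries might look like this (values are illustrative):

    writeControl      timeStep;   // write every N steps
    writeInterval     50;         // N = 50
    runTimeModifiable true;       // re-read dictionaries while running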
Finally, concerning the other paraFoam errors: it seems paraFoam is somehow reading the initial condition names instead of the actual results from the solution files, and it doesn't like that.
Does this happen only when you open the case (i.e. at time 0), or does it also happen when you move to another timestep?
Also, are you using paraFoam, paraview or the paraFoam -builtin method?
The extension of the paraFoam file seems to be .foam, which means you are probably using the built-in reader. That might be the issue, but I'm not sure.
Can you try running paraview, navigate to your case folder, open the .foam file and see if there is still an error?
Also, if it isn't much trouble can you zip one of the time folders and attach it here? I'd like to take a look at what's inside to check against what the error report says.
Once again thanks for testing!
Kind regards,
Theodore.…
The focus of the workshop is to integrate and correlate the synergistic effects associated with the simultaneous presence of different digital design and analysis tools in an ongoing design process. The main attention is on an easy-to-handle interface, to be used at an early stage of conceptual design to respond to external and internal influences in an intelligent and sustainable way. Participants will use Grasshopper, a parametric modeling plug-in for Rhino: this graphical algorithm editor is tightly integrated with Rhinoceros 3D's modeling tools and opens up the possibility of constructing highly complex parametric models. To generate this complexity, live links to the following programs will be used: Autodesk Ecotect Analysis via GECO, and the FEA software GSA via SSI. During these 3 intense days, participants will learn the workflow of the plug-ins with the help of examples, get an overview of the different software packages and their possibilities for evaluating the performance of a design, and learn to use additional tools so as not to be chained to a single system (e.g. parametric accentuation, parametric formation, parametric reaction). Details: Instructors: Thomas Grabner & Ursula Frick from [uto]. Course language: English (support tutors will be available, but a basic knowledge of English is required).
Registration fees (min 12, max 20 places): educational*: €280.00 + VAT; professional: €450.00 + VAT. *Students, teachers, researchers, PhD candidates, and graduates up to one year from their graduation date. EARLY BIRD SPECIAL OFFER: the first 5 registration applications received by 31 December 2011 are entitled to a 20% discount on the registration fee. E.B. SPECIAL fees: E.B. SPECIAL educational*: €224.00 + VAT; E.B. SPECIAL professional: €360.00 + VAT. Further info, details, and registration: http://www.co-de-it.com/wordpress/nexus-advanced-grasshopper-workshop-with-uto.html…
nside the zone. I would move your comfort evaluation surface to be 1 meter off the ground in order to be representative of typical human height.
Also, you did not intersect the ground with the rest of the zone geometry, resulting in an incorrect energy simulation. After the intersection, you also get one surface of the ground zone that is not inside any building. I fixed these two things in the attached file and it works:
I would also recommend breaking the top surface of the ground up into sub-surfaces, so that you can capture the variation in ground surface temperature across the outdoor space. Second, I would recommend putting some windows on your buildings, as the exterior surface temperature of windows can be very different from that of opaque surfaces. Finally, keep in mind that the outdoor maps assume a very basic outdoor wind profile by default; to accurately understand outdoor comfort, you really should incorporate wind patterns after running a CFD simulation. This discussion has some information about importing CFD results from other programs into GH:
http://www.grasshopper3d.com/group/ladybug/forum/topics/import-cfd-result-to-honeybee
-Chris…
innovation technologies and academic realms. We believe that this association will allow participants to be part of the art community of the Guggenheim museum and the academic environment of the UPV, more particularly the ETSASS (Escuela Tecnica Superior de Arquitectura de San Sebastian). Moreover, the partnership with Tecnalia will enable us to work together with their most advanced media, research, and innovation groups.
The program will start with a symposium in June at the Guggenheim in Bilbao, followed by a workshop in July at the Alhondiga in Bilbao and the ETSASS digital fabrication laboratory in San Sebastian. The works produced at the visiting school will be exhibited from September onwards at the Alhondiga in Bilbao:
-To introduce the research topic of the visiting school, the workshop will be preceded by a one-day symposium. This event will initiate a debate between professionals, theoreticians, and scientists from the field, to discuss alternative and critical methods of environmental adaptability.
-The workshop will investigate new design processes to produce context sensitive environments from a critical perspective. Local ‘materials’ such as user behaviour, social patterns or environmental analysis will inform the design process.
In order to construct this agenda, the workshop will invest in digital design and fabrication strategies, studying data-feed protocols, environmental simulation software, and algorithmic design. To work within actual conditions, the site will be proposed by the Bilbao authority as part of its future city and planning intentions.
-A prototype of the best project from the workshop will be fabricated at the Tecnalia facilities, taking advantage of Tecnalia's robotics and fabrication department. In September 2012, an exhibition will be held at the Alhondiga comprising the works developed at the visiting school and the prototype produced in collaboration with Tecnalia.…