e path overlap to reduce tool wear. 50% might be a good estimate.
- Most of the time you will route in two steps (roughing for volume, finishing for the surface), each with a different tool.
That will get you a rough length of the overall tool path. Add some more for tool positioning.
Now for the time factor:
Cutting speed (vc) for wood is about 300 m/min. That's the speed of a single tooth or blade through the material.
You can calculate the optimal RPM by
n [RPM] = (vc [m/min] *1000) / (3.14 * Ød1 [mm])
The travel speed (feed rate) can be estimated by
f [mm/min] = n * fz * z
with z = number of cutting edges and fz = feed per tooth, which depends on material and tool diameter (0.05 mm is a rough estimate)
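As a quick sanity check, the two formulas above fit into a few lines of Python (the 12 mm diameter and two-flute cutter are just example values):

```python
import math

def spindle_rpm(vc_m_min, d_mm):
    """n [RPM] = (vc [m/min] * 1000) / (pi * d [mm])"""
    return vc_m_min * 1000.0 / (math.pi * d_mm)

def feed_rate(n_rpm, fz_mm, z):
    """f [mm/min] = n * fz * z"""
    return n_rpm * fz_mm * z

# Example: vc = 300 m/min for wood, 12 mm two-flute cutter, fz = 0.05 mm
n = spindle_rpm(300, 12)   # roughly 7958 RPM
f = feed_rate(n, 0.05, 2)  # roughly 796 mm/min
```

Multiply the estimated path length by 1/f to get the cutting time in minutes.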
There are some online calculation tools that help you... still, you will need a bit of knowledge about the technology to use them.…
re 2 solutions: (last boolean toggle)
true - interpolated point while rebuilding surface from points;
this means the final surface will touch the input curve and the curvature is almost the
same as original (but not perfect, this is still a rebuild).
false - not-interpolated;
the final surface will not touch the input curve but the surface nurbs point will...
the surface will be visibly different from original
P.S. the slider at "50" decides what percentage of divisions to put above the curve and what percentage under...
If you need an even more relaxed/uniform point grid, we could divide the "V" isocurves with incremental t (curve parameter) values...
tell if...
bye!
maje
…
orders from bottom to top. There are three pre-set heights: 20,000 / 50,000 / 80,000 (or 20, 50, 80 meters, I'm working in mm).
So if the three cantilevering volumes were numbered 0, 1 and 2, then I'm looking for a way to let Grasshopper generate the following:
A = 20m-position
B = 50m-position
C = 80m-position
(A,B,C):
0,1,2
0,2,1
1,0,2
1,2,0
2,0,1
2,1,0
Attached is what I have so far; I have only managed to make simple translations per geometry.
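For what it's worth, the six orderings listed above are exactly the permutations of (0, 1, 2), so in a GHPython component (or any Python) itertools generates them directly:

```python
from itertools import permutations

heights = [20, 50, 80]  # metres: positions A, B, C
orderings = list(permutations(range(3)))
# [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]

# Pair each position letter with the volume index it gets per ordering:
for order in orderings:
    assignment = {chr(65 + i): vol for i, vol in enumerate(order)}
    # e.g. {'A': 0, 'B': 1, 'C': 2} means volume 0 sits at the 20 m position
```

Each tuple can then drive a translation of the matching volume to its height.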
…
it works is as follows:
The GH_Document has a list of objects that have scheduled a solution (or rather, it maintains a list of callback delegates those objects have registered).
It also contains a TimeSpan field, which remembers the shortest schedule.
If the ScheduleSolution method is called during a solution, the timer won't be started until the solution finishes. So the schedule doesn't control how often the solutions will occur, it controls the delay between solutions.
Let's imagine the following scenario (with insanely scaled up time spans):
1. At noon exactly, a new solution starts. It doesn't matter what triggered it.
2. While the solution is still running at 12:01, a component (A) schedules a new solution 15 minutes later. This component registers a callback delegate along with the schedule.
3. While the solution is still running at 12:02, another component (B) schedules a solution with a 5 minute delay. Since 5 minutes is less than 15 minutes, the document forgets about the 15 minute schedule and instead switches to a 5 minute schedule. (B) does not register a callback.
4. While the solution is still running at 12:03, a third component (C) schedules a solution with a 10 minute delay. 10 minutes is further into the future than 5 minutes, so the document does not accept this new schedule. (C) does however register a callback.
5. At 12:05, the solution finishes. The SolutionEnd event is raised, viewports and canvasses are redrawn.
6. Also at 12:05, the document starts a timer that will fire an event 5 minutes from now, at 12:10.
7. Nothing happens in this interval, and at 12:10 the schedule timer fires.
8. The document notices it has a list of two callbacks (registered to A and C respectively), so it invokes them. This allows (A) and (C) to perform some sort of preparation. The most common action is to expire the component that gets the callback, so it'll get included in the imminent solution.
9. Once the document has invoked all schedule callbacks, it starts a new solution.
10. Note that (A) and (C) got called back way earlier than they requested. They scheduled solutions for 15 and 10 minutes respectively, but instead got the call only 5 minutes in. They can either play ball and accept the new schedule, or they can choose to not expire themselves and instead schedule a new solution for the future.
If at point 7 instead of nothing happening, a new solution was triggered by some other event (user dragging a slider or changing a wire), all the callbacks are still handled, but now even earlier than they expected.
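The bookkeeping described above can be sketched in a few lines (hypothetical names, not the actual GH_Document API; the timer itself is elided, only the "shortest delay wins, callbacks accumulate" rule is shown):

```python
class ScheduleSketch:
    """Mimics the document-level schedule: keeps only the shortest
    requested delay, but collects every registered callback."""

    def __init__(self):
        self.delay = None      # shortest scheduled delay so far
        self.callbacks = []    # callbacks to invoke before the next solution

    def schedule_solution(self, delay, callback=None):
        # A shorter delay replaces the current schedule; a longer one is ignored.
        if self.delay is None or delay < self.delay:
            self.delay = delay
        # Callbacks are registered regardless of whose delay "won".
        if callback is not None:
            self.callbacks.append(callback)

    def fire(self):
        # Timer elapsed (or an external event triggered a solution early):
        # invoke all callbacks, then reset for the next round.
        for cb in self.callbacks:
            cb()
        self.delay, self.callbacks = None, []

# Replaying the 12:00 scenario:
doc = ScheduleSketch()
called = []
doc.schedule_solution(15, lambda: called.append("A"))  # component A
doc.schedule_solution(5)                               # component B, no callback
doc.schedule_solution(10, lambda: called.append("C"))  # component C
# doc.delay is now 5: B's shorter delay won, C's longer request was ignored.
doc.fire()  # both A and C get called back after only 5 minutes
```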
-----
You can always schedule a solution, which is what makes this solution more flexible than other approaches. It doesn't matter that a solution is already running. It doesn't matter that the schedule comes from another thread*.
On the other hand, schedules are annoying because the time you request is not necessarily the time you get.
Also, if you have 50 components that all want to schedule, you must pick a delay big enough so that they all manage to register their callbacks before the new solution starts. This may be tricky.
* I'm actually not 100% sure about the threadsafety, it could be that there's bugs under rare conditions.…
Added by David Rutten at 7:00am on February 11, 2015
not sure which is the correct term).
So far I have managed it pretty well out of an Excel file and Grasshopper. I get the number of the cell, the RGB colour and a text (in my case a Pantone name). From this setup I can seed different solutions in Grasshopper, then decide which poster I want and from that point manually arrange the images, but I wonder if with some programming I could create something like 30 or 40 different posters.
My .gh file is a bit messy even though it is simple. It takes the data from an Excel sheet (I attach it), but to simplify things I have internalized the data into the .gh file, so I hope you can understand what I am trying to do.
So the idea is to get different images such as the ones shown in "Capture.jpg", but with the help of some code instead of manually organizing them.
I am trying with Python, but I am not that good at it yet.
Any idea how I could do this, or where I should ask for help?
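One simple way to script this is to shuffle the cell list with a different random seed per poster. Here is a sketch in plain Python, not tied to the .gh file; `cells` stands in for the colour/text list read from Excel:

```python
import random

def make_posters(cells, count):
    """Return `count` arrangements of `cells`, one per seed,
    so each poster is reproducible from its seed number."""
    posters = []
    for seed in range(count):
        rng = random.Random(seed)      # fixed seed -> repeatable shuffle
        arrangement = cells[:]         # copy, keep the original order intact
        rng.shuffle(arrangement)
        posters.append(arrangement)
    return posters

# e.g. 30 different posters from the same set of cells
cells = [("#FF0000", "Pantone 485"), ("#0000FF", "Pantone 286"),
         ("#FFFF00", "Pantone 102"), ("#00FF00", "Pantone 354")]
posters = make_posters(cells, 30)
```

Each arrangement can then be written back into the grid the same way the manual version is.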
Thanks in advance.
Javier Zaratiegui…
re thrilled by your enthusiasm to develop custom components and widen the base of the Grasshopper platform. We feel it is important your products become an integral part of Grasshopper, and are therefore shocked... shocked and appalled at the quality of icons we've seen so far (yes, Jon, that means you).
In order to facilitate this process, the GDT (unanimously) agreed to make the source for all the original icon images available online. They have been attached to this post for the time being, but we hope to centralize all developer resources someday soon.
You are free to copy and modify these graphics in any way you see fit. They come free of copyright and copyleft restrictions.
You'll need XaraX to open these files (a free 30-day evaluation is available, but it's cheap and really quite good, the GDT highly recommends it for any sort of vector-based computer graphics development).
Also note that almost all icons used in Grasshopper have a drop-shadow applied to them (2 pixel blur, 1 pixel offset towards the right and bottom, 50% opacity). This typically means you should refrain from drawing anything in the 2 outermost pixel columns/rows, otherwise the shadow will visually intersect with the icon edge.
--
GDT…
a machine that is light and very sturdy. I have taken my MacBook Pro all around the world, carried it with me every day, even dropped it a few times, and it's still totally fine. It's thin and light.
2) You get some actual support for your hardware even a few years down the line. My MacBook Pro is from 2012 and I can still walk into any Apple Store and get help with it, which I have done many, many times in different places around the world; I never had to show a receipt or was charged any money for help. There is no PC/laptop manufacturer in the world with anything close to that, because companies like Asus, Dell, etc. bring out dozens of new laptop models every year, so it's much harder to service them after a few years.
3) This is the most important one, which people usually forget when they say that MacBooks are overpriced: resale value. If you have ever tried to sell an old PC/laptop (I have, a few times), you will know how little value they have even after just 2-3 years. MacBooks retain their value very well, and even after 4 years you can still get 50% of your original price.
4) Of course you can install Windows on it and it runs perfectly. I have macOS and Windows on it and both run absolutely fine. On the Windows side I have Rhino+GH, Maya and a few others. Having Windows is good because some software still only runs on Windows (looking at you, 3DSMax!). Most other software also runs on macOS. In the interest of sanity it is great to have an alternative to Windows for all the day-to-day stuff, like Mail, Calendar, Photos, Presentations, etc., that just always works.
5) As for performance: yes, MacBook Pros don't necessarily have the latest and greatest in graphics cards (the rest is on par with PC laptops), but unless you want to play games you will not need it. VRay RT can do GPU rendering, but you won't get great performance from a notebook GPU anyway, and it doesn't make much sense to do rendering on a laptop (especially since you have a workstation). You could get one of the older MacBook Pro Retina Late 2013 or Mid 2014 models with the Nvidia GTX 750M, which will be usable for rendering with VRay RT, though without huge performance. Better to invest in a good used graphics card for your workstation, like an Nvidia GTX 980 Ti, which is the best value for money for GPU rendering right now (lots of used ones available).
So at least consider also getting a MacBook Pro. You can buy refurbished models (depending on where you are) that are like new but a lot cheaper, or even get an older one that's used. It will be a worthwhile investment.
Take it from someone who has used dozens of PCs and Macs in his lifetime and has to do the IT support here at work (where we also use both).
I still have my MacBook Pro Retina from 2012 and it's still running perfectly, super fast, and I can use Rhino and GH for huge files, do GPU rendering with Octane Render and all sorts of other heavy computing stuff.
Hope that helps.…
Added by Armin Seltz at 11:12am on September 19, 2016
I change parameters in GH sliders, and I see the progress in Rhino with a 4-5 second lag.
I think it is not a hardware issue; this is my configuration:
Mainboard SuperMicro
Dual CPU Xeon x5650 (24 total core)
12GB DDRIII ECC
Hdd velociraptor 300Gb
Vga Nvidia Quadro FX3800
Hi, I attach the project files. When I move the sliders, trying to go from one small value to another (like 50, 120, etc.), the result is slow to appear on screen.
My questions: with the PC configuration listed above, is this normal for this project? Are there parameters I can set in Grasshopper or Rhino to get faster results? Is there a specific driver or configuration for my video card to improve performance?
I would be very interested to know whether the hardware in my system allows optimal performance with Grasshopper, or whether it is normal for such a short definition (even if it is the Voronoi) to refresh this slowly when I move the slider.
Thank you for your time…
ram of creating sightline.
In the diagram the formula is N = [((R+C) * (D+T)) / D] - R, where R is the vertical distance of the eye above the point of focus and D is the horizontal distance from the eye to the point of focus.
So I have very simple test script.
Call main()

Sub main()
    Dim D, R, N    ' distance, rise, and the computed N
    Dim T, C       ' constants from the formula
    T = 1
    C = 1
    D = 0
    Do While D < 5
        D = D + T
        R = 0
        ' Grow R step by step until the rise exceeds the limit
        Do While R <= 5
            N = (((R + C) * (D + T)) / D) - R
            R = R + N
            Call Rhino.AddPoint(Array(D, R, N))
        Loop
    Loop
End Sub
Basically I want to turn the variables D, R, T and C into parametric number sliders in GH,
and repeat D = D + T, R = R + N and N = (((R+C)*(D+T))/D) - R a certain number of times.
The question is how to make an incremental loop in GH.
If anyone thinks there is a better solution for this, please teach me.
Thanks for your time!!
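For reference, the same loop translated to plain Python, as a sketch of the logic only; in Grasshopper this kind of incremental loop is easiest inside a GHPython or C# script component rather than built from standard components:

```python
def sightline_points(T=1.0, C=1.0, d_max=5.0, r_max=5.0):
    """Walk the distance D outwards in steps of T; at each distance,
    grow R by N = ((R+C)*(D+T))/D - R until R exceeds r_max."""
    points = []
    D = 0.0
    while D < d_max:
        D += T
        R = 0.0
        while R <= r_max:
            N = ((R + C) * (D + T)) / D - R
            R += N
            points.append((D, R, N))  # same order as the AddPoint call
    return points

pts = sightline_points()
# first point: D=1, N=((0+1)*(1+1))/1 - 0 = 2, so (1.0, 2.0, 2.0)
```

The four defaults (T, C, d_max, r_max) would each map to one number slider feeding the script component.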
…
bounding box won't work because it will seldom be square shaped. This square bounding surface should always be larger than the open or closed curve.
2. dividing the curve into points, then testing each point's closest point to the center of the bounding square plane.
3. creating rectangles with the same size as the divisions of the bounding plane.
This works perfectly, but there is one last problem that needs attention.
It generates as many closest points as the number specified for the curve division. E.g. the curve can be closed with, for instance, 35 squares, but if the curve was divided into, let's say, 50 points, the curve will still be closed with the 35 squares, with an additional 15 squares on top of the 35.
So if someone can tell us how to remove duplicate data from a list of points (the points with the same coordinates), then the final solution will have no duplicate data.
the duplicate data can be seen in the second image
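Removing the duplicates is easy to script. Here is a sketch in plain Python that keeps the first point seen within a given tolerance (newer Grasshopper versions also ship a Cull Duplicates component that does the same job on point lists):

```python
def cull_duplicate_points(points, tol=0.001):
    """Keep the first occurrence of each point; points whose
    coordinates agree within `tol` count as duplicates."""
    seen = set()
    unique = []
    for p in points:
        # Quantize coordinates so nearby points share the same key
        key = tuple(round(c / tol) for c in p)
        if key not in seen:
            seen.add(key)
            unique.append(p)
    return unique

pts = [(0.0, 0.0, 0.0), (0.0004, 0.0, 0.0), (1.0, 0.0, 0.0)]
# the second point is within tolerance of the first, so it is culled
```

Run the 50 closest points through this before building the rectangles and the extra 15 squares disappear.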
Gordon
…