Nevana
like it!
Feb 28, 2013
Nick Tyrer
Haha, are you ordering people to like this image? Sounds good to me!
Mar 1, 2013
Artyom Maxim
Just thought of an idea, is there a way to feed the rhino viewport camera position into GH? This way it'll be possible to add some distance fog to the images... What do you think?
Mar 1, 2013
Artyom Maxim
Horster camera control + white viewport background + GH material node with a gradient fed into the diffuse and emissive inputs:
Mar 1, 2013
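For reference, the viewport camera position that horster exposes can also be read straight from RhinoCommon. A minimal GhPython sketch, with the output name a assumed; note it only refreshes when the solution recomputes, so a timer is needed while orbiting:

```python
# GhPython sketch: read the active viewport's camera location.
import Rhino

view = Rhino.RhinoDoc.ActiveDoc.Views.ActiveView
a = view.ActiveViewport.CameraLocation  # Point3d; feed into the fog gradient
```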
Nick Tyrer
Artyom, currently I'm using the mesh component and the 'vertices colour' input. Would that work with your idea?
Mar 3, 2013
Artyom Maxim
Yep, you just have to remap the distance from each vertex to the camera and feed it into the Gradient component. However, when I tried this approach on those spheres, it worked impossibly slowly...
Mar 3, 2013
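To make the remap concrete, here is a minimal GhPython sketch of the same idea (Artyom's setup uses native components; the input names M for the mesh, C for the camera point, and d0/d1 for the fog start and end distances are assumptions, as is the output a):

```python
# GhPython sketch: per-vertex distance fog.
# Vertices near the camera keep the base color; far ones fade to
# white, matching a white viewport background.
import Rhino
from System.Drawing import Color

base, fog = Color.Black, Color.White

M.VertexColors.Clear()
for i in range(M.Vertices.Count):
    p = Rhino.Geometry.Point3d(M.Vertices[i])
    t = (p.DistanceTo(C) - d0) / (d1 - d0)  # remap distance to 0..1
    t = max(0.0, min(1.0, t))
    M.VertexColors.Add(Color.FromArgb(
        int(base.R + t * (fog.R - base.R)),
        int(base.G + t * (fog.G - base.G)),
        int(base.B + t * (fog.B - base.B))))

a = M  # mesh with vertex colors applied
```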
Artyom Maxim
Apr 10, 2013
Nick Tyrer
Dude, that is sweet! So what pipeline did you end up using? Was it what you explained in the previous comment?
Apr 10, 2013
Artyom Maxim
Thx, I finally decided to play a bit with that skeletal mesh definition)
Here are two parts: one for the distance fading and the other for the "fresnel" effect.
Apr 10, 2013
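The "fresnel" part boils down to the angle between the vertex normal and the view direction: surfaces seen at a grazing angle get more of the rim color. A hedged GhPython sketch of that term (Artyom's definition uses native components; M, C, and the exponent are assumed names):

```python
# GhPython sketch: "fresnel"-style rim effect from the view angle.
import Rhino
from Rhino.Geometry import Point3d, Vector3d
from System.Drawing import Color

power = 3.0  # higher exponent = thinner rim
M.Normals.ComputeNormals()
M.VertexColors.Clear()
for i in range(M.Vertices.Count):
    p = Point3d(M.Vertices[i])
    v = C - p                          # vertex-to-camera direction
    v.Unitize()
    n = Vector3d(M.Normals[i])
    rim = (1.0 - abs(n * v)) ** power  # 0 facing the camera, 1 at grazing
    c = int(255 * rim)
    M.VertexColors.Add(Color.FromArgb(c, c, c))
a = M
```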
Nick Tyrer
Pretty neat solution, I gave up after trying to use the material diffuse/emissive settings you suggested. I've got a few deadlines coming up but i will have a go when i have some time
Apr 10, 2013
Nick Tyrer
Ok, I made some time to play with it; it's fun. I never really properly considered utilising the preview options of GH geometries. I can see this technique being used to make some fantastic animations.
Muscle 006: http://www.grasshopper3d.com/photo/muscle-006?context=user
Apr 10, 2013
Vicente Soler
To add more depth, you can also bake ambient occlusion into the mesh vertices by making a ton of random mesh-ray intersections:
Apr 11, 2013
Artyom Maxim
Apr 11, 2013
Vicente Soler
For each vertex, generate a bunch of random rays that shoot from it in random directions no more than 90 degrees from the vertex normal (i.e. within a hemisphere).
The percentage of these rays that intersect any other geometry is the occlusion value for that vertex. You can also add a distance limitation (if the distance to the intersection is greater than x, treat it as if there was no intersection).
If you are working on a subdiv mesh, to speed things up, instead of calculating the AO on the vertices of the final mesh subdivision (say 3 subdivs), first subdivide it once, bake the AO onto it, then subdivide it twice more. You can also use the blur mesh component from Weaverbird. It might even look better, since it smooths things out (like using a biased rendering method).
Apr 11, 2013
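In case the recipe is easier to read as code, here is a rough GhPython version of that loop (Vicente does this with native mesh-ray components; M is the mesh, N the rays per vertex, D the distance cutoff, all assumed input names):

```python
# GhPython sketch: bake ambient occlusion into vertex colors.
import random
import Rhino
from Rhino.Geometry import Point3d, Vector3d, Ray3d
from Rhino.Geometry.Intersect import Intersection
from System.Drawing import Color

M.Normals.ComputeNormals()
M.VertexColors.Clear()
for i in range(M.Vertices.Count):
    p = Point3d(M.Vertices[i])
    n = Vector3d(M.Normals[i])
    hits = 0
    for _ in range(N):
        # Sample a random direction within the hemisphere around the normal.
        while True:
            d = Vector3d(random.uniform(-1, 1),
                         random.uniform(-1, 1),
                         random.uniform(-1, 1))
            if 0.0 < d.Length <= 1.0 and d * n > 0.0:
                d.Unitize()
                break
        # Offset the origin slightly so the ray doesn't hit its own vertex.
        t = Intersection.MeshRay(M, Ray3d(p + n * 0.001, d))
        if 0.0 <= t <= D:  # hits beyond distance D count as no intersection
            hits += 1
    c = int(255 * (1.0 - float(hits) / N))  # more occluded = darker
    M.VertexColors.Add(Color.FromArgb(c, c, c))
a = M
```

Baking this on a once-subdivided mesh and letting the later subdivisions (or Weaverbird's blur) interpolate the baked colors is where the speedup described in the last paragraph comes from.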
Ángel Linares
LOL! AMAZING discovery! Rendering with GH!!! It's not a very fast approach, but it's interesting for experiments.
Apr 11, 2013
Artyom Maxim
Apr 11, 2013
Ángel Linares
Mmm... shaders are quite different, because there you code how pixels are drawn directly on the screen; vertex and fragment shaders are usually the way to go. Here we are drawing colors onto vertices in 3D space using distances and other parameters, and the real OpenGL shaders used by Rhino translate everything into pixels. That is why I was talking about rendering, or painting meshes, with GH.
It would be amazing to play at the shader level in GH o.O!
Apr 11, 2013
Vicente Soler
I guess "rendering with GH" will be this:
This is a very simple raytracer using only regular components. The right image is a mesh plane with vertices being colored by GH. The small plane and surface at the bottom is the camera and the point next to it is the light source (a simple point light used for lambertian reflectance).
Apr 11, 2013
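The Lambertian term is just the cosine between the surface normal and the direction to the light, zeroed when a shadow ray is blocked. A simplified GhPython sketch of only that shading step, not Vicente's full camera/image-plane setup (M is the mesh and L the light point, both assumed names):

```python
# GhPython sketch: per-vertex Lambertian shading from a point light.
import Rhino
from Rhino.Geometry import Point3d, Vector3d, Ray3d
from Rhino.Geometry.Intersect import Intersection
from System.Drawing import Color

M.Normals.ComputeNormals()
M.VertexColors.Clear()
for i in range(M.Vertices.Count):
    p = Point3d(M.Vertices[i])
    n = Vector3d(M.Normals[i])
    l = L - p                   # vertex-to-light vector
    dist = l.Length
    l.Unitize()
    lambert = max(0.0, n * l)   # cosine between normal and light direction
    # Shadow ray: anything between the vertex and the light darkens it.
    t = Intersection.MeshRay(M, Ray3d(p + n * 0.001, l))
    if 0.0 <= t < dist:
        lambert = 0.0
    c = int(255 * lambert)
    M.VertexColors.Add(Color.FromArgb(c, c, c))
a = M
```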
Nick Tyrer
Now that is impressive! Have you been brewing that idea for a while, Vicente, or was it spur of the moment?
Apr 12, 2013
Vicente Soler
I put together the definition 'spur of the moment', prompted by Angel's comment. However, I remember doing something similar years ago that worked on breps and created bitmaps.
It doesn't really make much sense doing this in GH. The interesting part of building a raytracer is in making it fast.
Apr 12, 2013
Nick Tyrer
Yeah, I suppose. I was having an 'Inception' moment: if you combined all the ideas, your 'mesh render' would be baked onto an object, and the camera would be referenced with Horster. So as you orbited around the object, the geometry you are looking at would be rendered onto the object itself.
I don't know if that makes sense or if I'm babbling... Even if it is possible, I can't think of a use other than it being trippy.
Apr 12, 2013
Vicente Soler
Beware of "inception moments". Listen to Stan's mom, she's the voice of reason: http://www.youtube.com/watch?feature=player_detailpage&v=8cineV...
Apr 12, 2013