Muscle 002

Grasshopper preview, no baking, no rendering www.tyrertecture.com

  • Nevana

    like it!

  • Nick Tyrer

    haha are you ordering people to like this image? Sounds good to me!

  • Artyom Maxim

    Just thought of an idea: is there a way to feed the Rhino viewport camera position into GH? That way it would be possible to add some distance fog to the images... What do you think?

  • Artyom Maxim

    Horster camera control + white viewport background + GH material node with a gradient fed into the diffuse and emissive inputs:

  • Nick Tyrer

    Artyom, currently I'm using the mesh component and the 'vertex colour' input. Would that work with your idea?

  • Artyom Maxim

    Yep, you just have to remap the distance from each vertex to the camera and feed it into the gradient component.
    However, when I tried this approach on those spheres, it was painfully slow...
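
    A rough GhPython-style sketch of that remapping (not the actual definition; the mesh, camera, near and far inputs and the grayscale ramp are assumptions):

    # GhPython sketch: colour mesh vertices by their distance to the camera ("distance fog").
    # Assumed component inputs: mesh (Mesh), camera (Point3d), near (float), far (float).
    import Rhino.Geometry as rg
    import System.Drawing as sd

    def lerp(a, b, t):
        return a + (b - a) * t

    m = mesh.DuplicateMesh()
    m.VertexColors.Clear()
    rng = max(far - near, 1e-6)
    for i in range(m.Vertices.Count):
        d = rg.Point3d(m.Vertices[i]).DistanceTo(camera)
        t = (d - near) / rng                     # remap distance to 0..1
        t = max(0.0, min(1.0, t))
        v = int(lerp(40, 255, t))                # dark up close, fading to white far away
        m.VertexColors.Add(sd.Color.FromArgb(v, v, v))

    a = m   # the vertex colours show up directly in the GH viewport preview
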
  • Artyom Maxim

  • Nick Tyrer

    Dude, that is sweet! So what pipeline did you end up using? Was it what you explained in the previous comment?

  • Artyom Maxim

    Thx, finally decided to play a bit with that skeletal mesh definition)

    Here are two parts, one for distance fading and the other for the "fresnel" effect.
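
    A rough GhPython-style sketch of the "fresnel" idea (not the actual definition; the mesh and camera inputs are assumptions):

    # GhPython sketch: fake "fresnel"/rim lighting from the view angle vs. the vertex normal.
    # Assumed component inputs: mesh (Mesh), camera (Point3d).
    import Rhino.Geometry as rg
    import System.Drawing as sd

    m = mesh.DuplicateMesh()
    m.Normals.ComputeNormals()
    m.VertexColors.Clear()
    for i in range(m.Vertices.Count):
        p = rg.Point3d(m.Vertices[i])
        n = rg.Vector3d(m.Normals[i])
        view = camera - p
        view.Unitize()
        rim = 1.0 - abs(n * view)                # 0 facing the camera, 1 at grazing angles
        v = int(255 * rim)
        m.VertexColors.Add(sd.Color.FromArgb(v, v, v))

    a = m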

  • Nick Tyrer

    Pretty neat solution. I gave up after trying to use the material diffuse/emissive settings you suggested. I've got a few deadlines coming up, but I will have a go when I have some time.

  • Nick Tyrer

    OK, I made some time to play with it; it's fun. I never really properly considered utilising the preview options of GH geometries. I can see this technique being used to make some fantastic animations.

    Muscle 006: http://www.grasshopper3d.com/photo/muscle-006?context=user

  • Vicente Soler

    To add more depth, you can also bake ambient occlusion into the mesh vertices by making a ton of random mesh-ray intersections:

  • Artyom Maxim

    AO! Nice) but how exactly are the rays generated?

  • Vicente Soler

    For each vertex, generate a bunch of random rays that shoot from it in a random direction no more than 90 degrees from the vertex normal (i.e. over a hemisphere).

    The percentage of these rays that intersect any other geometry is the occlusion value for that vertex. You can also add a distance limitation (if the distance to the intersection is greater than x, then it's as if there was no intersection).

     

    If you are working on a subdiv mesh, to speed things up, instead of calculating the AO on the vertices of the final mesh subdivision (say 3 subdivs), first subdivide it once, bake the AO onto it, then subdivide it twice more. You can also use the blur mesh component from Weaverbird. It might even look better, since it smooths things out (like using a biased rendering method).
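
    A rough GhPython-style sketch of that sampling (the occluder, samples and max_dist inputs are assumptions, not the exact setup described above):

    # GhPython sketch: bake ambient occlusion into mesh vertex colours by shooting
    # random rays over the hemisphere around each vertex normal.
    # Assumed component inputs: mesh (Mesh), occluder (Mesh), samples (int), max_dist (float).
    import random
    import Rhino.Geometry as rg
    import System.Drawing as sd

    def random_hemisphere_dir(normal):
        # random direction, flipped so it stays within 90 degrees of the normal
        v = rg.Vector3d(random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        v.Unitize()
        if v * normal < 0:
            v.Reverse()
        return v

    m = mesh.DuplicateMesh()
    m.Normals.ComputeNormals()
    m.VertexColors.Clear()
    for i in range(m.Vertices.Count):
        p = rg.Point3d(m.Vertices[i])
        n = rg.Vector3d(m.Normals[i])
        hits = 0
        for _ in range(samples):
            d = random_hemisphere_dir(n)
            ray = rg.Ray3d(p + d * 0.001, d)     # small offset to avoid self-intersection
            t = rg.Intersect.Intersection.MeshRay(occluder, ray)
            if t >= 0 and ray.PointAt(t).DistanceTo(p) <= max_dist:
                hits += 1
        occlusion = hits / float(samples)        # fraction of blocked rays
        v = int(255 * (1.0 - occlusion))         # darker where more occluded
        m.VertexColors.Add(sd.Color.FromArgb(v, v, v))

    a = m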

  • Ángel Linares

    LOL! AMAZING discovery! Rendering with GH!!! It's not a very fast approach, but it's interesting for experiments.

  • Artyom Maxim

    Better to say shaders in GH)

  • Ángel Linares

    Mmm... shaders are quite different, because you code how pixels are drawn directly on the screen... vertex and fragment shaders are usually the way to go. Here we are drawing colours over vertices in 3D space using distances and other parameters, and the real OpenGL shaders used by Rhino translate everything into pixels. That is why I was talking about rendering or painting meshes with GH.

    It could be amazing to play at the shader level in GH o.O!

  • Vicente Soler

    I guess "rendering with GH" will be this:

    This is a very simple raytracer using only regular components. The right image is a mesh plane with its vertices being coloured by GH. The small plane and surface at the bottom are the camera, and the point next to it is the light source (a simple point light used for Lambertian reflectance).
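
    A rough GhPython-style sketch of the idea (not the actual definition; scene, image, camera and light are assumed inputs, with the image mesh acting as the screen whose vertices get coloured):

    # GhPython sketch: per-vertex raytracer with Lambertian shading.
    # Assumed component inputs: scene (Mesh, the geometry), image (Mesh plane, the "screen"),
    #                           camera (Point3d, the eye), light (Point3d, the point light).
    import Rhino.Geometry as rg
    import System.Drawing as sd

    img = image.DuplicateMesh()
    img.VertexColors.Clear()
    scene.Normals.ComputeNormals()

    for i in range(img.Vertices.Count):
        pixel = rg.Point3d(img.Vertices[i])
        d = pixel - camera
        d.Unitize()
        ray = rg.Ray3d(camera, d)
        t = rg.Intersect.Intersection.MeshRay(scene, ray)
        if t >= 0:
            hit = ray.PointAt(t)
            n = scene.NormalAt(scene.ClosestMeshPoint(hit, 0.0))
            to_light = light - hit
            to_light.Unitize()
            lam = max(0.0, n * to_light)         # Lambertian reflectance term
            v = int(255 * lam)
            img.VertexColors.Add(sd.Color.FromArgb(v, v, v))
        else:
            img.VertexColors.Add(sd.Color.White) # ray missed: background colour

    a = img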

  • Nick Tyrer

    Now that is impressive! Have you been brewing that idea for a while, Vicente? Or was that spur of the moment?

  • Vicente Soler

    I put together the definition on the spur of the moment because of Angel's comment. However, I remember doing something similar years ago that worked on breps and created bitmaps.

     

    It doesn't really make much sense doing this in GH. The interesting part of building a raytracer is in making it fast.

  • Nick Tyrer

    Yeah, I suppose. I was having an 'Inception' moment: if you combined all the ideas, your 'mesh render' would be built onto an object, and the camera would be referenced with Horster. So as you orbited around the object, the geometry you are looking at would be rendered onto the object itself.

    I don't know if that makes sense or if I'm babbling... Even if it is possible, I can't think of a use other than it being trippy.

  • Vicente Soler

    Beware of "inception moments". Listen to Stan's mom, she's the voice of reason: http://www.youtube.com/watch?feature=player_detailpage&v=8cineV...