The idea was to explore new modes of perception through human navigation of a room that is continuously changing. Each movement is tracked and processed to drive the room's output, creating an immersive virtual-reality environment. The projections are controlled interactively in real time by the occupant, producing illusions of space (e.g. shifting physical boundaries, transforming the room's shape, rotating the room). The toolchain consists of Grasshopper and the Firefly plug-in, which connects the Kinect input device to the digital environment and computes movements in real time. The output is a set of images projected onto each side of the room by separate projectors, updated interactively according to the input data. The project aims to understand the difficulties the human mind faces in an illusory virtual room, and to offer users who join it new modes of perception in a playful environment.
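The data flow described above, tracked position in, per-projector parameters out, can be sketched in plain Python. This is a minimal illustrative sketch, not the project's actual Grasshopper/Firefly definition: the names (`Wall`, `projection_params`) and the distance-based mapping are assumptions introduced here for clarity.

```python
# Hypothetical sketch of the tracking-to-projection mapping described above.
# Assumption: the occupant's tracked 2D position (from Kinect, via Firefly)
# modulates a distortion strength for the projection on each wall.
from dataclasses import dataclass


@dataclass
class Wall:
    name: str
    normal: tuple  # outward unit normal of the wall in room coordinates


def projection_params(occupant_xy, walls, room_half_size=2.0):
    """Map a tracked 2D position to a per-wall distortion strength in [0, 1].

    The closer the occupant stands to a wall, the stronger that wall's
    projected distortion -- simulating a boundary that recedes or rotates.
    """
    x, y = occupant_xy
    params = {}
    for wall in walls:
        nx, ny = wall.normal
        # Signed distance toward the wall, normalised by the room half-size.
        d = (x * nx + y * ny) / room_half_size
        # Clamp to [0, 1]: 0 at the far wall, 1 when touching this wall.
        params[wall.name] = max(0.0, min(1.0, (d + 1.0) / 2.0))
    return params


walls = [Wall("north", (0, 1)), Wall("south", (0, -1)),
         Wall("east", (1, 0)), Wall("west", (-1, 0))]

# Occupant standing near the east wall: the east projection distorts most.
p = projection_params((1.8, 0.0), walls)
print(p["east"] > p["west"])
```

In the installation itself this per-frame mapping would run inside the Grasshopper definition, with each wall's parameter feeding the projector assigned to that side of the room.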