'In Time'
Physical computing
Networking
Machine vision
MAX jitter
Python
Rig design/fabrication
Proposal

Our idea of time is a constructed bubble: although it works very well in daily life, it is not a fixed thing, not absolute, only relative to the world we construct. There is no objective ‘now.’ We intuitively feel that time flows, that it is part of the fundamental structure of human existence; we can think about reality without space, without things, but it is very hard to think about reality without time. Einstein's theory of relativity, however, tells us that there is no absolute time and no absolute space: everything is relative.

This artwork sits at the intersection of physics, neuroscience and philosophy, exploring the relationship between how we understand time and how we experience it. The installation challenges the notion of linear time by blurring the boundaries between past, present and future, disrupting the rules of time relative to yourself. An automated mechanical rig creates a computational time bubble: a network of twelve cameras positioned around the rig records a time slice of a participant. The recording is then activated by walking around the perimeter of the installation, creating an interaction ‘in time’ with yourself.
The artwork is essentially a large automated rig: a 4 m diameter aluminium ring hung from the ceiling, with twelve fixed cameras positioned evenly around its circumference.

Sequence of operation & technology:

1. A spotlight lights the ground in the centre of the installation, controlled by a Raspberry Pi Zero and a relay.

2. The person stands in the light, which triggers a second set of spotlights (directed at their torso) and starts the cameras recording. The lights are controlled by a Raspberry Pi Zero and relay, triggered by a distance sensor read by a Raspberry Pi 3B+ that communicates over Wi-Fi using Python sockets. The 3B+ also sends a message to all twelve camera-equipped Raspberry Pi Zeros to start filming.
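Step 2's trigger can be sketched with plain Python sockets. This is a minimal sketch: the camera IP addresses, the port, and the `START` command are all assumptions, not the installation's real values.

```python
import socket

# Hypothetical addresses for the twelve camera Pis on the local network.
CAMERA_HOSTS = [f"192.168.1.{100 + i}" for i in range(12)]
PORT = 5005  # assumed port for the start-recording command

def send_command(host, port, command=b"START"):
    """Open a short-lived TCP connection and send a single command."""
    with socket.create_connection((host, port), timeout=2) as s:
        s.sendall(command)

def trigger_all(hosts=CAMERA_HOSTS, port=PORT):
    """Tell every camera Pi to start filming; return any hosts that failed."""
    failed = []
    for host in hosts:
        try:
            send_command(host, port)
        except OSError:
            failed.append(host)
    return failed
```

On each camera Pi a matching listener would accept the connection and begin its recording when it receives `START`.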

3. The person turns 360 degrees. After 14.4 seconds the lights are turned off and the recordings are sent to a main laptop running MAX. Each Raspberry Pi Zero sends its film automatically, via FTP, using a Python script. The files are delivered directly into the folder MAX is using, which loads the films into the patch. A further spotlight is lit to direct the participant to the centre of the installation.
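The per-camera upload in step 3 can be sketched with Python's standard `ftplib`. The naming scheme and connection parameters below are assumptions, not the script the piece actually uses:

```python
from ftplib import FTP

def remote_name(cam_id, take=0):
    """Name each clip after its camera so the MAX patch can map the
    file back to the camera position that filmed it (assumed scheme)."""
    return f"cam{cam_id:02d}_take{take:03d}.h264"

def upload_clip(path, cam_id, host, user, password):
    """Send one recording into the folder that MAX loads films from."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name(cam_id)}", f)
```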

4. The rear projection screen is retracted by a 12 V DC motor, controlled by the central Raspberry Pi via an L298N H-bridge.

5. The main computer's screen is cast to a portable projector over Wi-Fi.

6. The participant activates an LED badge, which is used to locate their position relative to the screen. The central Raspberry Pi 3B+ has a wide-angle camera mounted above the projection screen, looking at the participant. Using OpenCV in Python, the Pi measures how far the red LED is from the centre of the frame. This distance is mapped to the speed of a stepper motor (also controlled by the central Pi): the further the person is from the centre of the frame, the faster the motor turns the projection screen, so it follows the person around the space.
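The distance-to-speed mapping in step 6 reduces to a small pure function. A sketch assuming a 1280-pixel frame, a 400 steps-per-second maximum, and a small dead band (all made-up values; in the installation the LED position would come from OpenCV's colour tracking):

```python
def offset_to_speed(led_x, frame_width=1280, max_sps=400, deadband=0.05):
    """Map the LED's horizontal offset from the frame centre to a signed
    stepper speed in steps per second: further off-centre means faster."""
    centre = frame_width / 2
    norm = (led_x - centre) / centre      # -1.0 (far left) .. 1.0 (far right)
    if abs(norm) < deadband:              # ignore jitter near the centre
        return 0
    return int(norm * max_sps)            # sign selects rotation direction
```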

7. The projector and MAX patch present the film relative to where the person is standing, i.e. the projector plays the film from the camera nearest the viewer. An orientation sensor is positioned above the person's head, directly opposite the projector. Its data is sent over Wi-Fi from an ESP32 module to the MAX patch. The data ranges from 0 to 359 degrees; MAX uses this heading to trigger the film from the camera closest to the one that originally filmed it.
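Choosing which of the twelve films to play from the heading is a small lookup. A sketch assuming the cameras are evenly spaced at 30-degree intervals, with camera 0 at 0 degrees:

```python
def nearest_camera(heading_deg, n_cameras=12):
    """Return the index of the camera closest to the reported heading."""
    spacing = 360 / n_cameras             # 30 degrees between cameras
    return round(heading_deg / spacing) % n_cameras
```

For example, a heading of 170 degrees would select camera 6, and 359 degrees wraps back around to camera 0.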

8. The MAX patch also controls the measure of time: playback starts in real time, then slows to half speed, then speeds up again. Audio of a ticking clock is sent to four speakers surrounding the ring; its tick rate varies with the recordings' playback rate.
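One possible shape for step 8's time manipulation, written as a rate schedule. The 30-second breakpoints and the final doubling are assumptions; the piece only specifies real time, then half speed, then speeding up:

```python
def playback_rate(t, cycle=90.0):
    """Playback rate over one 90-second interaction: real time for the
    first third, half speed for the second, then a linear ramp up to
    double speed (assumed breakpoints)."""
    t = t % cycle
    if t < 30:
        return 1.0
    if t < 60:
        return 0.5
    return 0.5 + 1.5 * (t - 60) / 30      # ramp from 0.5x to 2.0x

# The ticking-clock audio would be resampled by the same factor.
```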

After 90 seconds of interaction the process is reset and starts again.
Using Python sockets and a designated Wi-Fi router, the twelve Raspberry Pis can send data across their own private network.
Projector and Stepper motor/ driver.
A Raspberry Pi controls a relay to switch the spotlights, which light the user while they are being filmed and also direct them around the installation.
Video documentation of installation.
A DC geared motor is used to retract the rear projection screen.
There are twelve Raspberry Pi camera modules positioned equidistantly around the ring.
An ESP32 Feather board with an Adafruit BNO055 orientation sensor sends data over Wi-Fi using Open Sound Control. This data triggers the relevant video to play on the projector.
A distance sensor linked to a Raspberry Pi is used to trigger the twelve cameras to record the person standing in the centre of the installation.
A wide-angle camera module is linked to a Raspberry Pi to track the position of a person using OpenCV. This information is used to control the stepper motor, positioning the projector and screen so they are always facing the viewer.
A stepper motor is used to rotate the projector and projection screen so it is always facing the viewer.
Installation diagram.
Video documentation of the internal aluminium frame with the rear projection screen, rotating inside the suspended external frame.
Documentation of 'In Time': the viewer being recorded by a network of twelve Raspberry Pi cameras.
Documentation of 'In Time', the viewer looking at a 1:1 scale projection of themselves. They are able to walk around the 360-degree video of themselves.
System diagram.
Diagram of installation in proposed site. Stage one: recording the participant.
Diagram of installation in proposed site. Stage two: replaying the videos of the participant as they walk around the installation.
Screenshot of all the Raspberry Pis connected using Python sockets on their own intranet, via an independent router.
Details of the installation fabrication.