id |
acadia20_188p |
authors |
Puckett , Nick |
year |
2020 |
title |
Pulse V2 |
source |
ACADIA 2020: Distributed Proximities / Volume II: Projects [Proceedings of the 40th Annual Conference of the Association of Computer Aided Design in Architecture (ACADIA) ISBN 978-0-578-95253-6]. Online and Global. 24-30 October 2020. edited by M. Yablonina, A. Marcus, S. Doyle, M. del Campo, V. Ago, B. Slocum. 188-191 |
summary |
Pulse v2 is an interactive installation designed to investigate how real-time lidar data can be used to develop new spatial relationships between people and an autonomous digital agent through dynamic visual expressions. The first iteration of this research, Pulse v1, used a single-point lidar with a 160° FOV in conjunction with 240 servo-actuated antennas that visualized the position and movement of visitors via their vibrations. This second iteration blends digital and physical materiality to create a synthetic organism that fully integrates sensing, computation, and response into its form. Simultaneously, the raw data feed it “sees” is projected onto the wall in real time, allowing visitors to experience both the response and the logic. The data feed is supplied by a 360° FOV 2D lidar scanner. This type of scanner is typically used by small autonomous robots to map and navigate their environments. However, in this installation, the relationship is inverted to allow a stationary agent to respond to a dynamically changing environment. The sensor is mounted under the displays and provides a real-time slice of the space at a height of 20 cm. An algorithm filters this data stream into trackable blobs by recognizing people via their ankles. The agent analyzes this stream of data and filters it through a series of micro and macro expressions that play out on the screen in the form of a digital microorganism. |
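The blob-tracking step described in the summary (filtering a 2D lidar slice into trackable blobs at ankle height) can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the function names, the gap threshold, and the assumption of evenly spaced range readings are all mine.

```python
import math

def scan_to_points(ranges, fov_deg=360.0):
    """Convert a sweep of lidar range readings (meters) to (x, y) points.
    Assumes readings are evenly spaced across the field of view."""
    step = math.radians(fov_deg) / len(ranges)
    return [(r * math.cos(i * step), r * math.sin(i * step))
            for i, r in enumerate(ranges) if r > 0]

def cluster_blobs(points, gap=0.15, min_points=3):
    """Group consecutive scan points into blobs: a jump larger than
    `gap` meters between neighboring points starts a new blob.
    Returns one (x, y) centroid per blob, e.g. one per detected ankle."""
    blobs, current = [], []
    for p in points:
        if current and math.dist(p, current[-1]) > gap:
            if len(current) >= min_points:
                blobs.append(current)
            current = []
        current.append(p)
    if len(current) >= min_points:
        blobs.append(current)
    return [(sum(x for x, _ in b) / len(b),
             sum(y for _, y in b) / len(b)) for b in blobs]

# Two tight groups of points separated by a large jump -> two blobs.
pts = [(0.0, 0.0), (0.05, 0.0), (0.1, 0.0),
       (2.0, 2.0), (2.05, 2.0), (2.1, 2.0)]
centroids = cluster_blobs(pts)
```

Downstream, each centroid would be matched frame-to-frame to yield the continuous tracks the agent responds to; that association step is omitted here.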
series |
ACADIA |
type |
project |
email |
|
full text |
file.pdf (1,744,638 bytes) |
references |
|
last changed |
2021/10/26 08:08 |
|