Wings combines pneumatic elements with curve folding principles to generate adaptive, lightweight structural elements that change shape in response to human behaviour. The research explored different shapes, topologies and curve folding principles, generating a catalogue of shape-changing building elements. In parallel to the material development, a custom feedback control system was developed to sense people's motions and emotions and translate them into material states in real time. The work questions the relationship between humans and space, proposing a soft, responsive environment connected to human emotion and perception.
Wings aims to create a responsive lightweight building system, combining pneumatic elements with curve folding principles. The system is robotically controlled, enabling shape changes in response to human behaviour.
Wings units can be combined into a larger system, generating a wide range of transformable spaces and dynamic spatial experiences.
People spend more than 80% of their time indoors, and most of their experiences are shaped by daily routines. It is therefore important that architecture responds to people's emotional needs.
Wings modules are designed using numerical form-finding techniques. Different patterns and topologies can be tested and materialised through a custom fabrication approach to generate inflatable folding elements.
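As a rough illustration of what such numerical form-finding can look like, the sketch below uses dynamic relaxation: particles connected by springs settle into an equilibrium shape under a pressure-like force. This is a minimal, hypothetical example, not the project's actual form-finding code; the function names and parameters are assumptions.

```python
# Minimal dynamic-relaxation sketch (hypothetical): particles joined by
# springs relax into an equilibrium shape under an inflation-like force.
import numpy as np

def relax(points, edges, rest_lengths, pressure_dirs,
          steps=500, k=1.0, p=0.05, damping=0.9, dt=0.01):
    """Iteratively move particles until spring and pressure forces balance."""
    pos = points.copy()
    vel = np.zeros_like(pos)
    for _ in range(steps):
        force = pressure_dirs * p                 # crude stand-in for inflation pressure
        for (i, j), L0 in zip(edges, rest_lengths):
            d = pos[j] - pos[i]
            length = np.linalg.norm(d)
            f = k * (length - L0) * d / length    # Hooke's law along the edge
            force[i] += f                         # stretched edge pulls ends together
            force[j] -= f
        vel = damping * (vel + dt * force)        # damped explicit integration
        pos += dt * vel
    return pos
```

Different crease patterns and topologies would enter such a model through the edge connectivity and rest lengths; the relaxed geometry can then inform the fabrication of the inflatable folding elements.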
The Wings system comprises airbag modules with creases. The shape of the airbags and the pattern of the creases determine the folding behaviour. Multiple topologies and patterns have been simulated, fabricated and tested in the physical world.
Wings modules can be arranged in multiple configurations. They can self-form and morph into different shapes to interact with users and articulate the space.
A user interacting with Wings' physical robotic system in its closed state.
The robotic control system processes input data (from ultrasonic sensors and a camera) and computes control outputs (air valves and lights) through a custom design and control interface.
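The sketch below shows one plausible shape for such a sense-compute-actuate loop. The sensor and actuator objects (`camera`, `ultrasonic`, `valves`, `lights`, `controller`) are hypothetical placeholders, not the project's actual API.

```python
# Hypothetical sense-compute-actuate loop; all device interfaces are assumed.
import time

def control_loop(camera, ultrasonic, valves, lights, controller, hz=20):
    """Read sensor inputs, map them to material states, drive the outputs."""
    period = 1.0 / hz
    while True:
        frame = camera.read()                        # latest camera image
        distance = ultrasonic.read_cm()              # proximity of nearby users
        state = controller.update(frame, distance)   # custom control mapping
        for valve, opening in zip(valves, state.valve_openings):
            valve.set(opening)                       # inflate/deflate each module
        lights.set_color(state.light_rgb)            # light feedback tied to state
        time.sleep(period)                           # hold the loop rate
```

Running the loop at a fixed rate keeps valve and light updates predictable, so the material responds smoothly rather than jittering with every sensor reading.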
Users' movements are detected by the camera and processed by the custom control algorithm, which translates them into control values.
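One simple way to derive a control value from camera motion, shown here as an assumption rather than the project's exact algorithm, is frame differencing with OpenCV: the fraction of changed pixels gives a 0..1 motion level.

```python
# Assumed motion-to-control mapping: OpenCV frame differencing.
import cv2
import numpy as np

def motion_level(prev_gray, gray, threshold=25):
    """Return the fraction of pixels that changed between two frames."""
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    return np.count_nonzero(mask) / mask.size

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    level = motion_level(prev, gray)   # 0 = still scene, 1 = everything moving
    prev = gray
    # `level` could then be mapped to a valve opening or fold angle.
```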
Users' facial expressions are detected by the camera. An algorithm based on OpenCV and a convolutional neural network (CNN) translates the expressions into control signals.
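A hedged sketch of such an expression-to-signal pipeline follows: OpenCV locates the face, and a CNN classifies the crop. The label set, input size and `model` object are assumptions for illustration; the project's actual network is not specified here.

```python
# Sketch of a face-detection + CNN expression classifier; the label set
# and the Keras-style `model` are assumptions, not the project's network.
import cv2
import numpy as np

LABELS = ["neutral", "happy", "sad", "surprised"]  # assumed label set
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def expression_signal(frame, model):
    """Detect the largest face and return per-expression probabilities."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None                                    # no face in view
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3]) # keep largest face
    crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
    probs = model.predict(crop[np.newaxis, ..., np.newaxis])[0]
    return dict(zip(LABELS, probs))  # e.g. a high "happy" score -> opening gesture
```

The resulting probability vector can then be mapped to material states, so the installation's folds and lights respond to how users appear to feel, not only to how they move.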