Installations. Exhibitions. Research Projects.

Set in London's busy Covent Garden Piazza, The Air We Share was a public art group installation created to raise awareness and inspire action against air pollution. We designed an immersive, interactive environment using color, data, and balloons to communicate our research.

Heaven to the Cloud was my final MA research project, presented as an exhibition and live performance. It explored digital rituals as culturally familiar forms of ceremony, in an attempt to better illuminate the complexities of the commodified internet and the algorithmic integrity of artificial intelligence.

Amid, created with Inés Cámara Leret, was a real-time sonification of air, responding to minute-to-minute changes in nitrogen, carbon monoxide, nitrogen dioxide, and ozone levels. The aim of the project is to recontextualize the experience of climate change by creating a new experience that transcends political biases. Each gas is assigned specific tones and notes; as detected levels rise and fall in the space, the musicality moves up and down the scale.

Rrose Sélavy's Dada Extravaganza was a group installation for the Royal Academy of Arts celebrating its Salvador Dalí / Marcel Duchamp exhibition. I provided live sound, performance, scripting, and production.

Tadra is a sonically guided drawing experience created with Processing, exploring the relationship between drawing, communication, and sound. As a participant draws, changes in audio frequency and color follow the pen's position along the X- and Y-axes. The project suggests possible development of new HCI creative tools and alternative learning methods.

DataScent combines online behavior with the sensory power of perfume via artificial intelligence. It began in March 2018 at the Royal College of Art as a speculative digital afterlife project using the freely available IBM Watson personality analysis API.
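The gas-to-tone mapping described for Amid above could be sketched as follows. This is an illustrative Python sketch only: the scale, the note registers per gas, and the value ranges are my assumptions, not the installation's actual implementation.

```python
# Hypothetical sketch of a gas-to-tone mapping like Amid's:
# each gas gets its own register, and rising levels climb a scale.

SCALE = [0, 2, 4, 7, 9]  # pentatonic intervals within an octave (assumed)

GAS_BASE_NOTE = {          # MIDI base notes per gas (assumed registers)
    "nitrogen": 48,        # C3
    "carbon_monoxide": 60, # C4
    "nitrogen_dioxide": 72,# C5
    "ozone": 84,           # C6
}

def gas_to_note(gas: str, level: float, min_level: float, max_level: float) -> int:
    """Map a detected gas level to a note: higher level -> higher scale step."""
    span = max(max_level - min_level, 1e-9)
    frac = min(max((level - min_level) / span, 0.0), 1.0)  # clamp to 0..1
    step = round(frac * (len(SCALE) - 1))
    return GAS_BASE_NOTE[gas] + SCALE[step]
```

In a live setting, each minute-to-minute sensor reading would be fed through `gas_to_note` and the resulting note sent to a synthesizer, so the music tracks the air in real time.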
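The kind of position-to-sound-and-color mapping Tadra performs can be sketched in a few lines. Tadra itself was built in Processing; this Python sketch, with assumed frequency and hue ranges and an assumed assignment of frequency to X and color to Y, only illustrates the idea.

```python
# Illustrative mapping of pen position to frequency (X) and hue (Y);
# the ranges and axis assignments here are assumptions.

def x_to_frequency(x: float, width: float,
                   f_min: float = 220.0, f_max: float = 880.0) -> float:
    """Map horizontal pen position to an audio frequency (A3..A5)."""
    frac = min(max(x / width, 0.0), 1.0)
    # Exponential mapping so equal distances sound like equal pitch intervals.
    return f_min * (f_max / f_min) ** frac

def y_to_hue(y: float, height: float) -> float:
    """Map vertical pen position to a hue angle in degrees (0..360)."""
    frac = min(max(y / height, 0.0), 1.0)
    return 360.0 * frac
```

Called on every draw event, these two functions would let a single gesture simultaneously steer pitch and color, which is what makes the drawing "sonically guided."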
Collected Future is an ongoing knitted material exploration of data and online identity as a memory object. Personality analysis output, derived from my social media engagement via the IBM Watson natural language AI system, was translated into binary code. The binary patterns were then punched by hand onto punchcards and knitted on a Brother KH836 punchcard knitting machine, a descendant of the Jacquard punchcard systems that also informed early punchcard computing.
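The text-to-punchcard translation behind Collected Future could work roughly as below. This is a hedged sketch: the 24-stitch row width matches standard Brother punchcards, but the exact encoding the project used is an assumption.

```python
# Sketch of translating text into punchcard rows: characters become
# 8-bit binary, and the bit stream is cut into 24-stitch card rows.
# 'O' marks a punched hole, '.' a blank stitch (notation is my own).

CARD_WIDTH = 24  # standard Brother punchcard width in stitches

def text_to_bits(text: str) -> str:
    """Encode each character as 8-bit binary (UTF-8 bytes)."""
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def bits_to_card_rows(bits: str, width: int = CARD_WIDTH) -> list[str]:
    """Split the bit stream into punchcard rows, padding the last row."""
    rows = []
    for i in range(0, len(bits), width):
        row = bits[i:i + width].ljust(width, "0")
        rows.append("".join("O" if b == "1" else "." for b in row))
    return rows
```

Each output row then maps one-to-one onto a row of the physical card, so the knitted fabric literally carries the binary form of the personality analysis.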