Fall 2017
What if, instead of navigating the world primarily via perception of light through our eyes, we had to "see" with sound?
This object is an exploration of my fascination with human perception. Specifically, I wonder why and how our perception is composed of a mix of different "energy sensors," such as eyes, ears, and skin (touch), and how that mix influences the way we navigate the world. To investigate, I explored what it would be like if the physics of perception were different. What if we could "hear" sonic energy with our skin, "smell" aromas with our eyes, or "see" light with our ears?
The resulting object, pictured below, is a "camera" that converts what you see into sound. Specifically, the hue and brightness of incoming light are transformed into the frequency and volume of a tone. The pulses of sound transmitted to the wearer's ears represent the color and the relative darkness or lightness of wherever the camera is pointed.
This project was my first experience programming in C. Most of the work involved taking input from the Raspberry Pi's camera module and transforming it into a format that could be output as sound. A challenging aspect of this was managing the difference between the frame rate of the camera and the sample rate of the audio library.
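Below is a minimal sketch of one way to bridge that mismatch, assuming a simple sine-wave tone and an audio callback that requests samples at 44.1 kHz; the names and structure here are illustrative, not the project's actual code. The idea is to update a shared frequency/volume pair once per camera frame and let the audio callback reuse the latest values for every sample it generates in between frames.

```c
#include <math.h>

#define SAMPLE_RATE 44100.0
#define TWO_PI      6.283185307179586

/* Updated roughly 30 times per second, once per camera frame. */
static double current_freq   = 440.0;  /* Hz            */
static double current_volume = 0.5;    /* range 0.0-1.0 */
static double phase          = 0.0;

/* Called by the audio library whenever it needs more output samples.
 * Each sample simply reuses whatever frequency and volume the most
 * recent camera frame produced. */
void fill_audio_buffer(float *buffer, int num_samples)
{
    for (int i = 0; i < num_samples; i++) {
        buffer[i] = (float)(current_volume * sin(phase));
        phase += TWO_PI * current_freq / SAMPLE_RATE;
        if (phase >= TWO_PI)
            phase -= TWO_PI;   /* keep the phase accumulator bounded */
    }
}
```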
To transform the camera input into sound, I averaged all of the pixel data from the camera, then calculated a single hue value from the red, green, and blue channels of that average. The hue was mapped to a wave frequency within the human audible range, and the overall brightness (light vs. dark), also averaged across all pixels, was mapped to a volume range. The resulting sound is a continuous tone that gets higher as you point the camera toward warmer colors, lower when pointed toward cooler colors, quieter in dark spaces, and louder in bright spaces.
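As a rough illustration of that per-frame mapping (the function name, frequency range, and hue formula here are my own assumptions, not the original program):

```c
#include <math.h>
#include <stdint.h>

#define MIN_FREQ  110.0   /* Hz, low end of the tone range  */
#define MAX_FREQ 1760.0   /* Hz, high end of the tone range */

/* Average an RGB frame, then map hue -> frequency and brightness -> volume. */
void frame_to_tone(const uint8_t *rgb, int num_pixels,
                   double *freq_out, double *volume_out)
{
    unsigned long r = 0, g = 0, b = 0;
    for (int i = 0; i < num_pixels; i++) {
        r += rgb[3 * i];
        g += rgb[3 * i + 1];
        b += rgb[3 * i + 2];
    }
    double rf = (double)r / num_pixels / 255.0;
    double gf = (double)g / num_pixels / 255.0;
    double bf = (double)b / num_pixels / 255.0;

    /* Standard RGB-to-hue conversion, normalized to [0, 1). */
    double max = fmax(rf, fmax(gf, bf));
    double min = fmin(rf, fmin(gf, bf));
    double hue = 0.0;
    if (max > min) {
        if (max == rf)      hue = fmod((gf - bf) / (max - min), 6.0);
        else if (max == gf) hue = (bf - rf) / (max - min) + 2.0;
        else                hue = (rf - gf) / (max - min) + 4.0;
        hue /= 6.0;
        if (hue < 0.0)
            hue += 1.0;
    }

    /* Crude warm/cool mapping: hues near red give a high pitch,
     * hues toward blue give a low pitch. */
    *freq_out = MAX_FREQ - hue * (MAX_FREQ - MIN_FREQ);

    /* Average brightness drives volume: dark scenes are quiet, bright loud. */
    *volume_out = (rf + gf + bf) / 3.0;
}
```

Each new frame would then overwrite the frequency and volume that the audio callback shown earlier reads.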
If I were to push the design further, I would love to make the program recognize the multiple colors that make up an image and layer the resulting frequencies, and potentially rhythms, on top of each other to create a more complex sound.
To house the electronic components, I constructed a low-cost box out of white core board. Its top is secured with Velcro for easy access to the internals. On the exterior, a leather pad serves as a handhold.
I am passionate about crafting magical experiences around complex data, information, and sensory landscapes. I strive to lower the barrier for non-technical people to use technology successfully for work, curiosity, and play. My work has taken the form of artistic experiments, prototypes of devices and machines, software applications, and developer tools.
I co-founded and am the Head of Product at Comake, where we're working to improve interoperability and composability between software tools for productivity, data ownership, and enterprise intelligence. This work has centered on developing methods and abstractions for scalable software integration, data deduplication and correlation, and information retrieval.
When not designing or building digital systems and experiences, I love to run, cycle, hike, paint, sculpt, and sometimes build furniture.
An open source software package providing developers with a single SDK to integrate and interact with any API.
2023
An open source TypeScript implementation of a mapper for the RDF Mapping Language (RML).
2023
A living memory for your browser to help curb tab, app, and account overload.
2020 - 2021
A productivity-enhancing browser extension. Never touch your mouse again!
2020
A machine built to quickly prototype 3-dimensional forms using wire.
Fall 2018
An experiment in seeing with sound
Fall 2017
A delivery service for hotel guests replacing room service with food from local restaurants.
2015 - 2016