Interactive Audio-Visual Artwork

Created 2015

The goal of this group project was to design and develop an interactive visual artwork with openFrameworks, demonstrating the skills we had learned in Xcode, such as MIDI, keyboard input, vectors, the Haar finder, the video grabber, ofMesh, VBOs, and classes.

Concept

Our project deals with the visualization of sound, specifically visualizing the human voice in an interactive and immersive way. The driving idea behind the piece is how we use specific sensory channels to perceive sound and imagery: ordinarily the auditory system interprets sound characteristics, whereas visual perception handles what we see. Our project explores the crossover that can occur between the two. The behaviour of the visual depends entirely on how users interact with it, so the output varies from person to person, and it is interesting to consider how different people’s interactions contrast and what the resulting visuals will look like. We wanted to emulate the process of hearing visually in our piece: real-life vibrations in the air come to fruition through our ‘exploding sphere’.

Design

The design of our visualization aims to be simple and concise, so that the user can seamlessly connect their vocal input to the abstract visual mirrored on screen. Two key components make up the output: a 3D sphere and a moving graph. Both exist within a 3D spherical grid space, introducing an element of depth. The sphere rotates about a fixed position and is built from a vector of triangles that can be split. Its behaviour is driven by ofSoundStream: the incoming amplitude is monitored, scaled, and mapped in real time, and the scaled volume determines how far the triangles that make up the sphere disperse. The louder the incoming sound, the greater the distance the triangles travel. Unless the system is in Mode 3, the sphere always resumes its original shape. Additionally, three ofLight sources shine on the sphere, enhancing its 3D presence.
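
A minimal openFrameworks (C++) sketch of that flow is below, assuming the classic 0.8-era audioIn callback; the Triangle struct, member names, and constants are illustrative rather than the project's actual code.

```cpp
#include "ofMain.h"

// Illustrative only: the Triangle struct, member names and constants are assumptions.
struct Triangle {
    ofVec3f a, b, c;      // base vertex positions on the sphere
    ofVec3f normal;       // direction the triangle is pushed along
    float offset = 0;     // current displacement from the sphere surface
};

class ofApp : public ofBaseApp {
public:
    void setup(){
        soundStream.setup(this, 0, 2, 44100, 256, 4);     // stereo mic input
    }

    void audioIn(float * input, int bufferSize, int nChannels){
        float curVol = 0.0;
        for (int i = 0; i < bufferSize; i++){
            float left  = input[i * 2]     * 0.5;
            float right = input[i * 2 + 1] * 0.5;
            curVol += left * left + right * right;         // accumulate squared samples
        }
        curVol = sqrt(curVol / (bufferSize * 2.0));        // RMS of this buffer
        smoothedVol = 0.93 * smoothedVol + 0.07 * curVol;  // smooth across buffers
    }

    void update(){
        // Map the smoothed mic level into 0..1 (clamped), then displace each
        // triangle along its normal: louder input -> greater dispersal.
        scaledVol = ofMap(smoothedVol, 0.0, 0.17, 0.0, 1.0, true);
        for (auto & tri : triangles){
            tri.offset = scaledVol * maxDisplace;
        }
    }

    ofSoundStream soundStream;
    std::vector<Triangle> triangles;
    float smoothedVol = 0, scaledVol = 0;
    float maxDisplace = 120;   // assumed maximum dispersal distance
};
```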

The on-screen graph acts as a secondary representation of the audio. It sits in the background of the piece and records recent sound activity: each new buffer reading is appended to a vector, volHistory, and the oldest reading is deleted, so the display scrolls from left to right. This further reinforces the bond between sound and visuals.
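
The bookkeeping behind the graph can be sketched as follows, assuming a std::vector<float> member named volHistory (as in the text) and an arbitrary history length of 400 readings.

```cpp
// volHistory, scaledVol and the 400-reading length are assumed for illustration.
void ofApp::update(){
    volHistory.push_back(scaledVol);            // newest reading goes on the end
    if (volHistory.size() > 400){
        volHistory.erase(volHistory.begin());   // drop the oldest reading
    }
}

void ofApp::draw(){
    ofPolyline graph;
    for (size_t i = 0; i < volHistory.size(); i++){
        // oldest samples on the left, newest on the right
        graph.addVertex(i, ofGetHeight() - volHistory[i] * 200.0);
    }
    graph.draw();
}
```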

The piece can also assume various states. The behaviour of the shape ultimately reflects user interaction, but its aesthetics and dynamics can be altered with the mouse and keyboard. The system has three modes: Mode 1 textures the sphere with a .png image, Mode 2 uses the live webcam feed as the texture, and Mode 3 returns to the .png image. Mode 3 differs in its dynamics: as the triangles disperse they do not reform into the original shape, introducing a deconstruction of the sphere that remains until the mode changes.
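
Extending the earlier update() sketch, the mode could steer both the texture and the reform behaviour roughly like this; image, grabber, sphereMesh, and the constants are assumptions, and the texture calls follow the 0.8-era API.

```cpp
// Illustrative extension of the earlier update() sketch; image, grabber,
// sphereMesh and the constants here are assumptions, not the project's code.
void ofApp::update(){
    if (mode == 2) grabber.update();             // refresh the live webcam frame

    if (mode == 3){
        // Mode 3: displacement accumulates and is never pulled back,
        // so the deconstructed shape persists until the mode changes.
        for (auto & tri : triangles) tri.offset += scaledVol * 2.0;
    } else {
        // Modes 1 and 2: displacement follows the current volume,
        // so the sphere reforms as soon as the input goes quiet.
        for (auto & tri : triangles) tri.offset = scaledVol * maxDisplace;
    }
}

void ofApp::draw(){
    // Mode 2 textures the sphere with the webcam; modes 1 and 3 use the .png.
    ofTexture & tex = (mode == 2) ? grabber.getTextureReference()
                                  : image.getTextureReference();
    tex.bind();
    sphereMesh.draw();
    tex.unbind();
}
```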

In addition to shifting between these modes with the left and right arrow keys, the user can choose the number of triangle splits by pressing keys 1-4, with 4 producing the largest number of splits. Pressing the mouse controls the speed of rotation, which follows the position of the cursor on the X-axis while the button is held. A series of booleans also toggle states such as the wireframe, the split points, and the fill of the shape.
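
A sketch of the input handling, assuming openFrameworks key constants, a hypothetical rebuildSphere() helper, and 'w'/'p'/'f' as the toggle keys (the actual bindings are not documented here).

```cpp
// splitLevel, mode, rotationSpeed and the boolean toggles are assumed members;
// rebuildSphere() is a hypothetical helper that re-splits the mesh.
void ofApp::keyPressed(int key){
    if (key == OF_KEY_LEFT)  mode = ofClamp(mode - 1, 1, 3);
    if (key == OF_KEY_RIGHT) mode = ofClamp(mode + 1, 1, 3);

    if (key >= '1' && key <= '4'){
        splitLevel = key - '0';                  // 1..4 levels of triangle splitting
        rebuildSphere(splitLevel);
    }

    if (key == 'w') drawWireframe = !drawWireframe;
    if (key == 'p') drawPoints    = !drawPoints;
    if (key == 'f') drawFill      = !drawFill;
}

void ofApp::mouseDragged(int x, int y, int button){
    // Rotation speed follows the cursor's X position while the mouse is pressed.
    rotationSpeed = ofMap(x, 0, ofGetWidth(), -2.0, 2.0);
}
```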

Demonstration

FYP Video Gallery

Demonstration Presentation:

Brief Code Overview:

Demo Day:

Prototype Logging:

FYP Image Gallery

Demo Day

Outdoor Station

Observer

Visualization

Finished Station

User Interface

Final Product Visuals

Circuit

Schematic

Exposing Wind Sensor

Flow Chart Visuals

Flow Chart Audio

Max Patch Development

Weather Average to select scale

Scale Intensities 1-10

Processing: Twitter Balloons

Created 2014:

The objective of this project was to create an audio-visualizer of data from a web stream. It runs fullscreen on a screen of any resolution and is an exported application that uses a settings.txt file to set up the parameters of the system. Info-visualizations need to tell the story of the data in a minimal and attractive way. The system should:

1. Acquire & Parse the data stream
a. Computer Science

2. Filter & Mine for only the data that you need
a. Mathematics & Statistics

3. Represent as information reveals story/pattern behind the data
a. Graphic Design

The real-time data I used for this visualization was acquired from Twitter. Using the Twitter API, people's names, screen names, keywords, topics, followings, locations, and so on could be streamed. This data stream was then filtered for only the information I needed, and the values obtained were used to scale the outputs; for example, the length of a screen name determined how big a balloon appeared. The nature of the data Twitter provides reflects the personality of a user, creating a digital clone of that user living in this ‘cloud’.
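
The mapping from stream data to graphics can be sketched in Processing roughly as follows; the Balloon class, the drift speed, and the 6-20 character bounds are illustrative choices, and the actual Twitter streaming setup is omitted.

```java
// Illustrative Processing sketch: a screen name from the stream becomes a balloon
// whose diameter scales with the name's length. Values here are assumptions.
ArrayList<Balloon> balloons = new ArrayList<Balloon>();

void setup(){
  size(800, 600);             // the real application runs fullscreen
}

void draw(){
  background(180, 210, 240);
  for (Balloon b : balloons){
    b.update();
    b.display();
  }
}

// Called whenever the stream delivers a new screen name.
void addBalloon(String screenName){
  float d = map(screenName.length(), 6, 20, 30, 120);  // longer name -> bigger balloon
  balloons.add(new Balloon(random(width), height + d, d));
}

class Balloon {
  float x, y, d;
  Balloon(float x, float y, float d){ this.x = x; this.y = y; this.d = d; }
  void update(){ y -= 0.5; }                           // drift upward
  void display(){ ellipse(x, y, d, d); }
}
```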

Initial Design

Twitter is a medium for users to share their thoughts at a particular instant with the world. People willingly hand that information over to a higher power, this ‘cloud’, where data accumulated from all over the world, with thoughts, opinions, topics and more, all coexists. The fundamental idea of the graphic design is to depict how all these opinions float out there separately yet are held together by their common denominator, Twitter. I created what looks like balloons floating in the sky.

Development with PImage

As the programme runs, the number of tweets increases. As they accumulate, the user can observe how opinions vary on certain topics, all tied together by this cloud of data. The centre circle slowly grows as the tweets build up and things become more chaotic, while fewer tweets on screen make the scene appear calmer. The user can also compare the ratio of people having followers to people following others through the number/bar display, and by changing (or adding to) the settings .txt file they can compare the frequency of keywords being tweeted.

Demonstration:

Music – YogaBrickCinema: https://www.youtube.com/watch?v=BUaFugdLWyE

Video Explanation:

What I enhanced:

  • The use of classes
  • A better understanding of ArrayLists
  • How to import real-time data and use it as I like.

Processing: Generative Screensaver

Created 2014:

For this project, I created a standalone generative visual in Processing that runs fullscreen on a computer screen of any resolution. The visual is open-ended so that it can run indefinitely. In the initial stages of the project, I researched what kind of animation could be most engaging to a viewer. I wanted to create something that looped over and over while repeatedly changing variables such as colour, scale, and direction of movement, a kaleidoscope in its simplest form: the image changes, but previous elements of the movement remain. I drew influence from the kaleidoscope works of Jordi Bofill of Cosmo Arts.

The graphic design is made up of five stars, each with an ellipse at its centre, and both stars and ellipses have outer stroke colours. The colours of each star and ellipse are the same (bar the centre); it is the outer strokes and how they behave with movement that make the visual interesting, with the centre ellipse acting as the focal point. This ‘screensaver’ has numerous dynamics: it is always zooming in and out, it rotates continuously (in the direction selected by key), and it can be clicked and dragged to change the centre point of the image. Most interesting are the random colour changes applied to the outer strokes and to the diameter of the centre ellipse. Pressing space resets the angle and zoom to begin again. Its behaviour is somewhat similar to a Spirograph.

Using ‘if’ statements I was able to place limits on the dynamics. The image can only scale so far before an if statement flips a boolean and reverses the rate I designed, causing the scale to either increase or decrease. The direction keys also act as a threshold of sorts, since they dictate which direction/angle the image should move in.
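
A minimal Processing sketch of that logic is below, with illustrative limits of 0.5x-2x zoom and a plain square standing in for the star/ellipse graphic.

```java
// Minimal sketch of the bounded dynamics: a boolean flips when the zoom hits a
// limit, arrow keys set the rotation direction, and space resets angle and zoom.
float zoom = 1.0;
float zoomRate = 0.005;
boolean zoomingOut = false;
float angle = 0;
int direction = 1;

void setup(){
  size(600, 600);
  rectMode(CENTER);
}

void draw(){
  background(0);

  if (zoom > 2.0) zoomingOut = true;     // upper limit reached: start shrinking
  if (zoom < 0.5) zoomingOut = false;    // lower limit reached: start growing
  zoom += zoomingOut ? -zoomRate : zoomRate;

  angle += 0.01 * direction;

  translate(width / 2, height / 2);
  rotate(angle);
  scale(zoom);
  noFill();
  stroke(255);
  rect(0, 0, 200, 200);                  // stand-in for the star/ellipse graphic
}

void keyPressed(){
  if (keyCode == LEFT)  direction = -1;
  if (keyCode == RIGHT) direction =  1;
  if (key == ' ') { angle = 0; zoom = 1.0; }   // space resets angle and zoom
}
```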

Final Product: