
Week 1
This week I’m figuring out how to animate video pixel arrays using a particle system in Processing, applying forces locally with the mouse position, and returning scattered pixels to their original positions.
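The core of the idea can be sketched in plain JavaScript (no p5.js): each particle remembers the pixel position it was sampled from, gets pushed when the mouse comes near, and springs back home. The names and constants here (`repelRadius`, `springK`, etc.) are my own illustration, not the actual Processing code.

```javascript
class PixelParticle {
  constructor(x, y) {
    this.home = { x, y };        // original pixel position
    this.pos = { x, y };
    this.vel = { x: 0, y: 0 };
  }
  // local force: only particles within repelRadius of the mouse react
  applyMouseForce(mx, my, repelRadius = 50, strength = 5) {
    const dx = this.pos.x - mx, dy = this.pos.y - my;
    const d = Math.hypot(dx, dy);
    if (d > 0 && d < repelRadius) {
      // push away from the mouse, stronger when closer
      const f = strength * (1 - d / repelRadius) / d;
      this.vel.x += dx * f;
      this.vel.y += dy * f;
    }
  }
  update(springK = 0.02, damping = 0.9) {
    // spring back toward the home position, damped so it settles
    this.vel.x += (this.home.x - this.pos.x) * springK;
    this.vel.y += (this.home.y - this.pos.y) * springK;
    this.vel.x *= damping;
    this.vel.y *= damping;
    this.pos.x += this.vel.x;
    this.pos.y += this.vel.y;
  }
}
```

With the mouse away, the damped spring pulls every scattered particle back until the image reassembles.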

Week 2
The work above is a variation of the code I created last week. I changed the global gravity force into a wind force coming from the side and coded a sine-wave LFO to vary the wind strength. Again, the forces are applied locally via the mouse position.
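The LFO idea is just a sine function of the frame count modulating the force magnitude. A minimal sketch (my own parameter names, not the original Processing code):

```javascript
// Wind strength oscillates between base - depth and base + depth,
// cycling at `rate` radians per frame.
function windForce(frameCount, base = 0.05, depth = 0.04, rate = 0.01) {
  return base + depth * Math.sin(frameCount * rate);
}

// Per frame, something like:
//   const w = windForce(frameCount);
//   for each particle near the mouse: particle.vel.x += w;
```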
I also spent a bit of time looking into how to apply a dynamic border to particle positions. Previously I coded an update function within the particle system that reverses a particle’s velocity when it reaches the edge of the screen, so the particle appears to bounce off the side of the window. I was interested in what happens if you use a different shape to restrict the path of the particles, e.g. a circle or a more complex shape. This would be useful when projection mapping onto a non-rectangular surface as part of an installation.
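For a circular border, one approach is to reflect the velocity about the circle’s surface normal whenever a particle crosses it, then snap the particle back onto the boundary. This is a sketch of that idea with invented names, not my actual update function:

```javascript
// Keep particle p inside a circle centred at (cx, cy).
function bounceInCircle(p, cx, cy, radius) {
  const dx = p.pos.x - cx, dy = p.pos.y - cy;
  const d = Math.hypot(dx, dy);
  if (d > radius) {
    const nx = dx / d, ny = dy / d;        // outward surface normal
    // reflect velocity: v' = v - 2(v·n)n
    const dot = p.vel.x * nx + p.vel.y * ny;
    p.vel.x -= 2 * dot * nx;
    p.vel.y -= 2 * dot * ny;
    // put the particle back on the boundary
    p.pos.x = cx + nx * radius;
    p.pos.y = cy + ny * radius;
  }
}
```

For an arbitrary complex shape the same pattern works if you swap the distance check for a point-in-polygon test and compute the normal of the nearest edge, which would suit a mapped projection surface.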

Week 3
A rock is a river... I’ve been looking at Maya Rochat’s work this week for inspiration. This is a quick work in progress that uses simplex noise vectors to melt and flow image pixels in Processing.
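The melting effect boils down to sampling a noise field at each pixel and using the value as a flow direction to look up a displaced pixel. Processing gives you `noise()` for free; below is a tiny hash-based value noise standing in for the simplex noise the sketch actually uses (a different algorithm playing the same role), plus the displacement lookup. All names and constants here are my own illustration.

```javascript
// Deterministic pseudo-random value in 0..1 for an integer lattice point.
function hash2(ix, iy) {
  let h = ix * 374761393 + iy * 668265263;        // large primes
  h = (h ^ (h >> 13)) * 1274126177;
  return ((h ^ (h >> 16)) >>> 0) / 4294967295;    // 0..1
}

// Smooth 2D value noise: bilinear blend of lattice hashes with a fade curve.
function valueNoise(x, y) {
  const ix = Math.floor(x), iy = Math.floor(y);
  const fx = x - ix, fy = y - iy;
  const sx = fx * fx * (3 - 2 * fx);              // smoothstep fade
  const sy = fy * fy * (3 - 2 * fy);
  const lerp = (a, b, t) => a + (b - a) * t;
  const top = lerp(hash2(ix, iy), hash2(ix + 1, iy), sx);
  const bot = lerp(hash2(ix, iy + 1), hash2(ix + 1, iy + 1), sx);
  return lerp(top, bot, sy);                      // 0..1
}

// For pixel (x, y), return the coordinates to sample the source image at:
// the noise value becomes an angle, pushing the lookup `amount` pixels away.
function displace(x, y, scale = 0.01, amount = 20) {
  const angle = valueNoise(x * scale, y * scale) * Math.PI * 2;
  return { sx: x + Math.cos(angle) * amount, sy: y + Math.sin(angle) * amount };
}
```

Because neighbouring pixels get similar angles, the image smears in coherent streams rather than dissolving into static.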

Week 4
Playing around with pixel arrays in Processing. This time I thought I’d try messing around with time displacement by putting video frames into an array and using simplex noise to drive the displacement effects.

Week 4
This week I tried a slightly different approach to manipulating the pixels, incorporating time displacement techniques. The code takes a video stream and layers the frames into an array. Simplex noise drives the displacement effect: the brightness of the noise at each pixel selects which layer of the array to sample from. I used an image sequence by Eadweard Muybridge as a test.
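The layering mechanism in miniature: keep the last N frames in a ring buffer, then map a 0..1 control value (the noise brightness) to an index into that buffer. A sketch with my own names, not the original code:

```javascript
class FrameBuffer {
  constructor(depth) {
    this.depth = depth;
    this.frames = [];            // oldest first
  }
  push(frame) {
    this.frames.push(frame);
    if (this.frames.length > this.depth) this.frames.shift();
  }
  // t = 0 → oldest stored frame, t = 1 → newest
  sample(t) {
    const i = Math.min(
      this.frames.length - 1,
      Math.floor(t * this.frames.length)
    );
    return this.frames[i];
  }
}
```

Per pixel the idea is then something like `buffer.sample(noiseBrightness).pixels[index]`, so bright noise regions show recent frames and dark regions lag behind in time.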
When a horse trots or gallops, does it ever become fully airborne? This was the question photographer Eadweard Muybridge set out to answer in 1878. Railroad tycoon and former California governor Leland Stanford was convinced the answer was yes and commissioned Muybridge to provide proof. Muybridge developed a way to take photos with an exposure lasting a fraction of a second and, with reporters as witnesses, arranged 12 cameras along a track on Stanford’s estate.

Week 5
A reconfigured eye. This is some work in progress turning numbers into an image. I thought I’d try extracting RGB colour data from a video and then arranging the colour values into a TV subpixel array to recreate the image in Processing.
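One way to lay this out: each source pixel becomes three vertical stripes, one per channel, like a CRT/LCD subpixel triad. This sketch returns the stripes to draw for one pixel (in Processing each stripe would be a `rect()`); the cell dimensions and layout are my guess at the approach, not the original code.

```javascript
// For source pixel (x, y) with channel values r, g, b, return three
// vertical stripes, each carrying only one colour channel.
function subpixelStripes(x, y, r, g, b, cellW = 9, cellH = 9) {
  const w = cellW / 3;
  return [
    { x: x * cellW,         y: y * cellH, w, h: cellH, color: [r, 0, 0] },
    { x: x * cellW + w,     y: y * cellH, w, h: cellH, color: [0, g, 0] },
    { x: x * cellW + 2 * w, y: y * cellH, w, h: cellH, color: [0, 0, b] },
  ];
}
```

Viewed from a distance the stripes optically mix back into the original colour, which is exactly how a screen does it.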

Week 6
Gravitational collapse. I’ve been reworking some code for an event coming up. The black hole image released yesterday gave me an idea to see if I could apply gravity forces to image pixels using a particle system.
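The gravity idea in miniature: attract each pixel-particle toward a centre of mass with an inverse-square force, clamped so it doesn’t blow up near the centre. Parameter names are mine, not the event code’s.

```javascript
// Pull particle p toward the attractor at (cx, cy).
function applyGravity(p, cx, cy, G = 200, minDist = 10) {
  const dx = cx - p.pos.x, dy = cy - p.pos.y;
  const d = Math.max(Math.hypot(dx, dy), minDist); // clamp the singularity
  const f = G / (d * d);                           // inverse-square magnitude
  p.vel.x += (dx / d) * f;
  p.vel.y += (dy / d) * f;
}
```

Without the `minDist` clamp, particles that pass close to the centre get an enormous kick and the “collapse” explodes instead of swirling.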

Week 7
This is a quick little idea. I thought I’d see if I could make an image precipitate out of a cloud of particles.

Week 8 - First Site
Here’s some documentation for my projection work that activates the Archway in First Site Gallery.
Photographs were once recognized as the epitome of truth. Now photographic information can be manipulated via software to create images that can be an indistinguishable simulation. Raster considers the malleability of the electronic image by transforming the pixel information of the image of the supermassive black hole at the center of the Messier 87 Galaxy into flowing abstraction. The work explores how our experience of the world is increasingly based on images created by algorithms and displayed through screens and virtual media rather than tangible reality.

Week 10
This week I have been knee-deep in HTML and JavaScript. I’m making progressive web apps out of p5.js sketches so you can install them on your iPhone/iPad/mobile device. They work on a desktop through a web browser too.
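For installability, the standard ingredients are a web app manifest (linked from the sketch’s HTML with `<link rel="manifest" href="manifest.json">`) plus a service worker for offline caching. A minimal manifest looks something like this; the names, colours, and icon paths are placeholders, not my actual setup:

```json
{
  "name": "Realness",
  "short_name": "Realness",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#000000",
  "icons": [
    { "src": "icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```

`"display": "standalone"` is what hides the browser chrome so the sketch fills the screen like a native app.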

Week 12
I am exhibiting an interactive online work in Site Eight Gallery at RMIT (and everywhere else) for the next few days as part of our honours trial presentations. You can mess around with it at https://realness.wesleydowling.com .

Week 13
This week’s work grew out of the trial presentations. In the trials, I presented an artwork label with a QR code installed in the gallery space so viewers could access the work via their phone/device. I noticed that people seemed to treat the label itself as the artwork, which I think diminished some of the concepts of the work. My work is about the condition of the networked digital image, which has qualities of simultaneity and multiplicity; this is why I am creating website-based work. The work, I felt, should have multiple points of entry rather than a single point via the label. My idea to get around this problem is to distribute QR codes across multiple sites and take an image of each QR code as a document. I then had an idea to use those images to create a photo mosaic, which refers back to how the image of the black hole was created: algorithms using image data sets to reconstruct an image. This is a quick idea made with Processing. The next step is to see if I can get it to run on a website.
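The mosaic algorithm itself is simple: divide the target image into cells, compute each cell’s average colour, and pick the library image (here, a QR-code photo) whose average colour is nearest in RGB space. A sketch of the matching step with invented names, not the Processing code itself:

```javascript
// tiles: [{ avg: [r, g, b], img: ... }, ...] — average colours precomputed once.
// Returns the tile whose average colour is closest to cellAvg.
function nearestTile(cellAvg, tiles) {
  let best = null, bestDist = Infinity;
  for (const tile of tiles) {
    const dr = cellAvg[0] - tile.avg[0];
    const dg = cellAvg[1] - tile.avg[1];
    const db = cellAvg[2] - tile.avg[2];
    const dist = dr * dr + dg * dg + db * db; // squared distance: no sqrt needed
    if (dist < bestDist) { bestDist = dist; best = tile; }
  }
  return best;
}
```

Precomputing each tile’s average once keeps the per-cell work to a single linear scan, which matters when the cell grid gets dense.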

Week 14
This is based on the previous photo mosaic code in Processing. I couldn’t get the code to port over to HTML, which was very frustrating. To get around this, I generated a photo mosaic in Processing and saved a frame as a JPG. I then took some of my previous Realness code and reworked it so it loaded the QR images as particles, making it possible to disperse the image. I think I’ll have to leave this one and move on to a new project. Working with the reworked code helped me figure out how to load images as sprites/particles instead of drawing shapes as I had done previously. I’ll work on that next.

Week 15
This week I'm building on the code I made previously to cut up and fragment an image as the mouse passes through it. I'm using the webcam on the desktop for now; it would be great to get this running on a phone, but I think performance is going to be an issue.

Week 16
Last week's work gave me a clue about how to use a mobile phone camera with a particle system and still get smooth performance. I think the main issue is using alpha values with the tint() function, which seems to cause a massive slowdown. With that in mind, I ported the code to a new project, using the Realness website as a template. I coded the site so it appears as a mobile phone camera on the device, then applied the fragmentation once the shutter button is clicked.
Queering the Photograph: Primary Archive - Semester One














