In addition to still photography and video, we had to produce a certain amount of other imagery for “Daerijeon: The Proxy War.” In the end, we used two different pieces of software to generate that footage. We’ll discuss both here:

  1. Universe Sandbox²
  2. Google DeepDream

Universe Sandbox²

Bzzlalz Candidate

Universe Sandbox² (website) is a fun piece of software that lets you create scenarios starting at the planetary or solar-system level and moving on up to the galactic level (so far). As the name implies, you can run simulations modeling the physics (with limitations) and play with them to see how changing variables changes the result. Given our needs, we barely scratched the surface of what the software can do, but it served our purposes well, rendering beautiful images of unreal planets and animations of two stylized star systems: Earth’s, and an imaginary one called the Bello System, where a Dyson Swarm fills the available space.
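For the curious, the core of that kind of simulation is just numerical integration of Newtonian gravity. Here’s a toy sketch in Python of the general idea. To be clear, this is emphatically not Universe Sandbox²’s engine, just an illustration: the two bodies, the time step, and the simple Euler integration are all placeholder simplifications.

# Toy N-body gravity step, illustrating the kind of orbital physics a
# sandbox simulator integrates each frame. Not Universe Sandbox²'s code.

G = 6.674e-11  # gravitational constant (m^3 kg^-1 s^-2)

def step(bodies, dt):
    """Advance every body one time step using simple Euler integration.

    Each body is a dict with 'mass' (kg), and 'pos'/'vel' as [x, y] lists.
    """
    # Accumulate the gravitational acceleration on each body from every other body.
    for b in bodies:
        ax = ay = 0.0
        for other in bodies:
            if other is b:
                continue
            dx = other['pos'][0] - b['pos'][0]
            dy = other['pos'][1] - b['pos'][1]
            r2 = dx * dx + dy * dy
            r = r2 ** 0.5
            a = G * other['mass'] / r2
            ax += a * dx / r
            ay += a * dy / r
        b['acc'] = (ax, ay)

    # Update velocities, then positions.
    for b in bodies:
        b['vel'][0] += b['acc'][0] * dt
        b['vel'][1] += b['acc'][1] * dt
        b['pos'][0] += b['vel'][0] * dt
        b['pos'][1] += b['vel'][1] * dt

# Example: Earth orbiting the Sun, stepped forward one hour at a time.
sun = {'mass': 1.989e30, 'pos': [0.0, 0.0], 'vel': [0.0, 0.0]}
earth = {'mass': 5.972e24, 'pos': [1.496e11, 0.0], 'vel': [0.0, 29780.0]}
for _ in range(24):
    step([sun, earth], dt=3600.0)

A real simulator uses far better integrators and handles collisions, tides, and so on, but the “change a variable, re-run, see what happens” loop is the same.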

Dyson Swarm (Still Shot)

Universe Sandbox² is still in alpha, so it’s under active development. That means some of its functions aren’t fully fleshed out, and others are a bit glitchy. That said, even now it’s an incredible and fun application to work with. Here’s a gallery of some of the more interesting images. Note: there’s video in there too, but apparently you need to click through to the set on Flickr to see the videos (the last few items in the set):

Google DeepDream

Just as we were putting the finishing touches on the video and casting about for a way to make the “transmission glitches” more interesting, Google released its DeepDream code into the digital wild. By now probably everyone online has seen the images and has some vague idea of how it works and what it’s for: figuring out what the neural networks Google has trained for image recognition actually see when they’re “looking” at our jpgs online. If you don’t already know, Google has a good writeup here: Inceptionism: Going Deeper into Neural Networks.

Daerijeon DeepDream Processed Images

For us, this was a windfall: Daerijeon is integrally concerned with FTL (faster-than-light) interstellar communications of a brain-to-brain type, which makes it probable that the ansible system involves some kind of AI bridge mapping across cognitive barriers and manipulating neural circuitry. Images and video processed this way seemed like a wonderful fit.


Video was beyond our capabilities, unfortunately, but Ryan Kennedy’s guide and code let us process single-image shots, which we then composited and manipulated into the three transmission glitches in the film. Kennedy’s guide was a big help, though if (unlike us) you’re running the latest macOS and want to play, you might consider shelling out for the Deep Dreamer software, which installs like a normal app, automates a lot of the work, adds a nice GUI, and processes video too.
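If you’re curious what that processing actually does under the hood, here’s a rough sketch of the general DeepDream technique in Python with PyTorch. To be clear, this is not Google’s original Caffe code, Kennedy’s setup, or what Deep Dreamer runs; the network, layer choice, step size, iteration count, and filenames are all placeholder assumptions, just enough to show the gradient-ascent idea: instead of adjusting the network to fit the image, you adjust the image to amplify whatever a chosen layer already “sees” in it.

# Minimal DeepDream-style sketch: gradient ascent on the image itself to
# amplify a pretrained network's activations. Illustrative only.

import torch
from torchvision import models, transforms
from PIL import Image

device = 'cuda' if torch.cuda.is_available() else 'cpu'
net = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.to(device).eval()
LAYER = 26  # which conv layer to "dream" at; lower layers give textures, higher give object-like forms

def dream(path, steps=30, lr=0.02):
    img = Image.open(path).convert('RGB')
    x = transforms.ToTensor()(img).unsqueeze(0).to(device).requires_grad_(True)
    for _ in range(steps):
        out = x
        for i, layer in enumerate(net):   # run the image up to the chosen layer
            out = layer(out)
            if i == LAYER:
                break
        loss = out.norm()                 # "amplify whatever this layer responds to"
        loss.backward()
        with torch.no_grad():
            x += lr * x.grad / (x.grad.abs().mean() + 1e-8)  # normalized ascent step
            x.clamp_(0, 1)
            x.grad.zero_()
    return transforms.ToPILImage()(x.squeeze(0).detach().cpu())

# Hypothetical usage on a single exported frame:
# dream('frame_0001.png').save('frame_0001_dream.png')

Run repeatedly (and at multiple scales, which the real implementations do), and the network hallucinates eyes, arches, and dog faces into whatever you feed it.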

Homeless Ajeoshi: Get Me Outta Here!

The glitches pass by in an instant, but we found we’d actually produced some really interesting images worth sharing. Some were pretty, some were very creepy, and a few were downright fascinating. Here’s a gallery of the most interesting ones:

Categories: News & Announcements

One Response so far.

  1. […] also wrote up a little post (with image galleries) on some of the software we used to produce images like these for the […]

Leave a Reply