
I really wanted to understand the Vuforia AR SDK, and had a couple of ideas that would prompt me
to actually sit down and get it working. I found that the online tutorials are a bit confusing,
and nothing seemed to work ‘out of the box’ (at least for me).
Who knows, it could be my lack of understanding or my personal machine configuration.
However, once I got things working, it turned out to be an absolute delight.
The process of registering an image with the database and a project is actually pretty simple.
The idea was this:
have an image target embedded into a spreadsheet. When recognized, the visualization would appear.
Simple enough. However, the target image could exist anywhere — on a computer screen, on a phone,
printed on a piece of paper. This gave me the idea that you could simulate the process of taking your
3D visuals with you, and use any device to rotate, tilt or zoom in and out. The video that I captured here
illustrates the process on an iPhone and an iPad.
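For the curious, the Unity side of the recognition step is small. Here is a minimal sketch in the style of the older Vuforia sample scripts; the `visualizationRoot` reference and its wiring are my own assumptions:

```csharp
using UnityEngine;
using Vuforia;

// Hedged sketch: shows/hides a visualization root when the Vuforia
// image target is found or lost, following the classic
// ITrackableEventHandler pattern from the Vuforia Unity samples.
public class VisualizationToggle : MonoBehaviour, ITrackableEventHandler
{
    // Assumed reference to the parent of the 3D scatterplot objects.
    public GameObject visualizationRoot;

    private TrackableBehaviour trackableBehaviour;

    void Start()
    {
        trackableBehaviour = GetComponent<TrackableBehaviour>();
        if (trackableBehaviour != null)
            trackableBehaviour.RegisterTrackableEventHandler(this);
        visualizationRoot.SetActive(false);
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        visualizationRoot.SetActive(found);
    }
}
```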
For the moment it is just conceptual. The image target is static (just a QR code that I
generated and downloaded to the device), and recognizing it displays a simple 3D scatterplot of the same data
in the spreadsheet. I also put a translucent copy of the QR code behind
the data. However, I can imagine a process whereby the data (a spreadsheet, SQL query... anything)
updates the visuals, and creates a unique image target marker (which is then uploaded to the Vuforia
image target database).
Ideally, you could encapsulate the data to be visualized into a visual target. It’s not very realistic
right now, but I foresee the technology progressing to the point that a high resolution, multi-color image
can actually encode multi-megabytes of data. This would clearly need very high resolution cameras and
ultra-fast processing to decode the image and re-interpret the source data.
The video, illustrating both iPhone and iPad use, can be seen here.
This is a slick global population growth visualization that was created by Abdullah Aldandarawy. I found it on
the Unity store, and, with a little bit of effort, ported it to the HoloLens. The actual asset did allow for some
interactivity, but this version simply shows off the beauty of its execution.
The Unity store asset can be found here.
Often, in time-series visualizations, it is effective to give an indication of trend or direction. I was hoping that the
Unity TrailRenderer would be useful for showing where a visual element has been, and therefore add meaning to where it currently is
as one steps through a time series.
This example also illustrates some subtle animation between subsequent dates the way Gapminder
does so elegantly.
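A sketch of the mechanic, assuming one precomputed position per date (the component and field names are mine):

```csharp
using UnityEngine;

// Hedged sketch: moves a data point toward its position for the next
// date while a TrailRenderer leaves a fading record of where it was.
[RequireComponent(typeof(TrailRenderer))]
public class TimeSeriesMover : MonoBehaviour
{
    public Vector3[] positionsByDate;   // one position per time step (assumed)
    public float secondsPerStep = 1f;

    private int index;
    private float t;

    void Start()
    {
        // Keep roughly one time step's worth of trail visible.
        GetComponent<TrailRenderer>().time = secondsPerStep;
    }

    void Update()
    {
        if (positionsByDate == null || index >= positionsByDate.Length - 1)
            return;
        t += Time.deltaTime / secondsPerStep;
        // Smooth, Gapminder-style interpolation between dates.
        transform.position = Vector3.Lerp(
            positionsByDate[index], positionsByDate[index + 1], t);
        if (t >= 1f) { index++; t = 0f; }
    }
}
```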
For whatever reason, and for every technology, I start with some variation of this simple,
hypnotic curve. It looks pretty and organic, and I could watch it for hours.
It also is a bit of a quick cheat: I can use it for testing positive and negative values,
there aren't any nasty spikes or awkward edge cases, and it is naturally well suited to
animation.
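The curve is nothing more than a travelling sine wave; something in this spirit reproduces it (a sketch, with all parameters arbitrary):

```csharp
using UnityEngine;

// Sketch of the "hypnotic curve": a travelling sine wave applied to
// the Y position of a line of instantiated primitives.
public class HypnoticCurve : MonoBehaviour
{
    public GameObject cubePrefab;   // any small primitive works
    public int count = 40;
    public float amplitude = 1f;
    public float wavelength = 4f;
    public float speed = 2f;

    private Transform[] cubes;

    void Start()
    {
        cubes = new Transform[count];
        for (int i = 0; i < count; i++)
        {
            var go = Instantiate(cubePrefab, transform);
            go.transform.localPosition = new Vector3(i * 0.25f, 0f, 0f);
            cubes[i] = go.transform;
        }
    }

    void Update()
    {
        for (int i = 0; i < count; i++)
        {
            // Smoothly varying positive and negative values, no spikes.
            float y = amplitude * Mathf.Sin(
                cubes[i].localPosition.x / wavelength * 2f * Mathf.PI
                + Time.time * speed);
            var p = cubes[i].localPosition;
            cubes[i].localPosition = new Vector3(p.x, y, p.z);
        }
    }
}
```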
This exercise gave me my first chance to dive into Unity vertex shaders.
In the example above, I adapted a simple red to green function, based on the Y value
(height) of the mesh. Moving forward, I then plugged in my favourite test data sets:
US stocks and US crime data.
Since this data set did not have negative values, I decided to change the colour scale
(low values green, high values red). It also made me realize that organizing data in this
way doesn’t exactly reveal any insights. However, I would consider this a successful test
of the surface visual, done completely from scratch. They are both fairly small, however,
with only 20 by 20 ‘cells’ displayed. I was a bit reluctant to push too much geometry on
the device, especially as it was my first attempt.
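The mapping itself is just a lerp on height. The same effect can be sketched CPU-side by writing vertex colours for a vertex-colour shader to display; the range fields here are assumptions:

```csharp
using UnityEngine;

// Sketch of the red-to-green height mapping, done by writing vertex
// colours that a vertex-colour shader then displays. Assumes the
// mesh heights have already been set from the data.
[RequireComponent(typeof(MeshFilter))]
public class HeightColorizer : MonoBehaviour
{
    public float minY = -1f;   // expected height range (assumed)
    public float maxY = 1f;
    public bool invert;        // low = green, high = red for all-positive data

    void LateUpdate()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] verts = mesh.vertices;
        Color[] colors = new Color[verts.Length];
        for (int i = 0; i < verts.Length; i++)
        {
            float t = Mathf.InverseLerp(minY, maxY, verts[i].y);
            if (invert) t = 1f - t;
            colors[i] = Color.Lerp(Color.red, Color.green, t);
        }
        mesh.colors = colors;  // picked up by a vertex-colour shader
    }
}
```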
The biggest thing missing is the ‘tooltip’ drill-down, which gives the actual numeric
values in the form of a floating text grid as the viewer gazes at a ‘cell’ in the visualization.
The visualization affords a basic level of interaction (sort of) in the form of cycling
through different metrics as you air-tap/click on the X or Z axes.
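For the record, the gaze mechanic behind such a tooltip is a per-frame raycast from the head. A rough sketch, with `DataCell` as a hypothetical stand-in for the per-cell data component:

```csharp
using UnityEngine;

// Sketch of a gaze tooltip: raycast from the head each frame and park
// a floating text object over whichever cell the viewer is looking at.
public class GazeTooltip : MonoBehaviour
{
    public GameObject tooltip;      // world-space text panel (assumed)
    public TextMesh tooltipText;

    void Update()
    {
        var cam = Camera.main.transform;
        RaycastHit hit;
        if (Physics.Raycast(cam.position, cam.forward, out hit, 10f))
        {
            var cell = hit.collider.GetComponent<DataCell>();
            if (cell != null)
            {
                tooltip.SetActive(true);
                tooltip.transform.position = hit.point + Vector3.up * 0.05f;
                tooltip.transform.rotation = Quaternion.LookRotation(
                    tooltip.transform.position - cam.position);
                tooltipText.text = cell.label + "\n" + cell.value.ToString("N2");
                return;
            }
        }
        tooltip.SetActive(false);
    }
}

// Minimal stand-in for the per-cell data component (hypothetical).
public class DataCell : MonoBehaviour
{
    public string label;
    public float value;
}
```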
This is an experimental holographic CAVE created for
the HoloLens. The idea is to immerse yourself within the visuals. In a typical CAVE, the visuals are presented using rear-projection screens, or panel displays.
Presenting them holographically obviously requires much less hardware, and can be used wherever there is enough space. In this experiment, I used five different visualizations, all using the same S&P dataset.
1. Scønes view. This is a thought experiment using a scatterplot of inverted cone-like objects, with a mini sparkline graph displayed on each inverted base.
The idea here was to show the current state of the element, but also its historical value.
2. Grid Array view. This is a simple 3D bar chart with row and column layout that allows the rows and columns to be sorted.
3. 3D scatter plot. This is the view on the right wall. Each data row is represented in X, Y, Z space as a spheroid.
4. Stalactites view. This is the view that is mapped onto the CAVE ceiling (hence the name). This uses the TubeRenderer object and
creates a spiralized tube shape; see the sketch after this list. For now, this is just using random data.
5. Ring view. This can be seen on the left wall. I wanted to re-purpose the rings used for the Këpler visualization, and
realized that there are many additional visual features that can be used with the ring. Inner and outer thicknesses can be set to create
interesting shapes. How useful it is remains to be seen.
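The spiral behind the Stalactites view can be sketched as a simple point generator whose output is then handed to a TubeRenderer-style component; the taper and turn count here are arbitrary choices:

```csharp
using UnityEngine;

// Sketch of generating the 'Stalactites' spiral points before they
// are handed to a tube-rendering component.
public static class SpiralPoints
{
    public static Vector3[] Build(int count, float turns, float height,
                                  float startRadius)
    {
        var pts = new Vector3[count];
        for (int i = 0; i < count; i++)
        {
            float t = i / (float)(count - 1);
            float angle = t * turns * 2f * Mathf.PI;
            float radius = startRadius * (1f - t);   // taper to a point
            pts[i] = new Vector3(Mathf.Cos(angle) * radius,
                                 -t * height,        // hang downward
                                 Mathf.Sin(angle) * radius);
        }
        return pts;
    }
}
```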
Four different visual representations of stock market data.
1. The Këpler view. Each row is represented as a 'planet' with a ring. Different metrics are applied to planet and ring size, planet and ring color, orbit speed, rotation speed etc.
2. Grid Array view. This is a simple 3D bar chart with row and column layout that allows the rows and columns to be sorted.
3. 3D scatter plot. Each data row is represented in X, Y, Z space as a spheroid.
4. Tubes view. This is something that is still at the concept stage. I was thinking that it would be interesting to
present data in a sorted grid view, but allow the visualization to show where that element was over time. Imagine a stock
market dataset, showing end of day prices. However, instead of a bar, it would be presented as a tube, with the opening
price at the base of the 'tube' and the closing price at the top. This would highlight how much (or little) the stock had
changed over the course of the day. The data represented in the video is simulated random data.
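A sketch of that open/close mapping, using Unity's default two-unit-tall cylinder; the grid coordinates and price scale are assumed:

```csharp
using UnityEngine;

// Sketch of the Tubes idea: a cylinder stretched between the day's
// opening and closing price rather than rising from zero.
public static class OpenCloseTube
{
    public static void Place(Transform tube, float open, float close,
                             float x, float z, float priceScale)
    {
        float bottom = Mathf.Min(open, close) * priceScale;
        float top = Mathf.Max(open, close) * priceScale;
        // Unity's default cylinder is 2 units tall, so halve the height.
        tube.position = new Vector3(x, (bottom + top) * 0.5f, z);
        tube.localScale = new Vector3(0.1f, (top - bottom) * 0.5f, 0.1f);
    }
}
```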
A more customary scatter plot visualization of the S&P 500 data set. Once again, drilldown details are given in the form of a tooltip
as the user gazes at each spheroid. It actually started its life as a scatter plot of cubes, but when I changed the shape prefab in
Unity to a sphere with a metallic material, it looked so glossy and shiny I couldn't resist. This implementation has clickable axis labels
so that the viewer can cycle through the metrics in the data set by 'clicking' on them (I'm using the clicker peripheral instead of air tapping).
The interactivity of this one makes it far more useful (and really speeds up the testing process). How the user will select the colour
still eludes me... unless there is some sort of Color Picker UI object out there (or I need to build one from scratch).
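The cycling itself is simple. A sketch, with `ScatterPlot.SetMetric` as a hypothetical stand-in for the real plumbing:

```csharp
using UnityEngine;

// Sketch of the axis-label interaction: each click (clicker or air
// tap) advances that axis to the next metric in the data set.
public class AxisLabelCycler : MonoBehaviour
{
    public string[] metrics;     // e.g. "Price", "Volume" (assumed)
    public int axis;             // 0 = X, 1 = Y, 2 = Z

    private int current;

    // Invoked by the HoloLens input handler when this label is selected.
    public void OnSelect()
    {
        current = (current + 1) % metrics.Length;
        GetComponent<TextMesh>().text = metrics[current];
        // Hypothetical hook into the plot itself.
        FindObjectOfType<ScatterPlot>().SetMetric(axis, metrics[current]);
    }
}

// Minimal stand-in so the sketch compiles on its own (hypothetical).
public class ScatterPlot : MonoBehaviour
{
    public void SetMetric(int axis, string metric) { /* re-map data */ }
}
```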
A visualization of all stocks in the S&P 500, using the planetary visualization construct that I have been experimenting with. Drilldown details are given in the form of a tooltip as the user gazes at each planetary element. Unfortunately, the video does not do it justice. The text in the tooltip is very crisp and highly readable, and does not compete with the visuals. Notably missing is a way of dynamically setting the visual features and applying them to the data dimensions. I'm hoping that voice commands will not be too arduous to implement, as this would give the most control without losing the immersiveness of the visualization.
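Unity's KeywordRecognizer (in UnityEngine.Windows.Speech) looks like the obvious route. A minimal sketch, with the phrases and the re-mapping hook as my own assumptions:

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Sketch of the voice-command idea using Unity's KeywordRecognizer.
public class VoiceMapping : MonoBehaviour
{
    private KeywordRecognizer recognizer;
    private readonly string[] phrases =
        { "map size to volume", "map color to sector" };  // assumed phrases

    void Start()
    {
        recognizer = new KeywordRecognizer(phrases);
        recognizer.OnPhraseRecognized += OnPhrase;
        recognizer.Start();
    }

    private void OnPhrase(PhraseRecognizedEventArgs args)
    {
        Debug.Log("Heard: " + args.text);
        // Hypothetical hook: parse the phrase and re-map a visual feature.
    }

    void OnDestroy()
    {
        if (recognizer != null)
        {
            if (recognizer.IsRunning) recognizer.Stop();
            recognizer.Dispose();
        }
    }
}
```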
This is an experimental scatter plot visualization that represents many dimensions simultaneously, in what I'm calling
Project Këpler. This uses visual features like planet color, ring color, size, ring arc, orbit speed and direction, rotation speed
(among others).
In this example, random data is used, and the lines were created with the very useful Vectrosity Unity asset.
This started purely as a thought experiment, to see how many visual dimensions I could add to a scatter plot. The hardest part, amazingly,
was building the mesh for a partial 3D ring. There wasn't anything built into Unity, and I had to build it from scratch, including
the 'end caps' of each partial ring.
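A condensed sketch of the core of that mesh problem: a flat annulus sector built triangle by triangle (the real ring extrudes this and adds the end caps; segment count and radii are arbitrary):

```csharp
using UnityEngine;

// Sketch: build a flat partial ring (annulus sector) as a Unity Mesh.
public static class PartialRing
{
    public static Mesh Build(float inner, float outer,
                             float arcDegrees, int segments)
    {
        var verts = new Vector3[(segments + 1) * 2];
        var tris = new int[segments * 6];
        for (int i = 0; i <= segments; i++)
        {
            float a = Mathf.Deg2Rad * arcDegrees * i / segments;
            var dir = new Vector3(Mathf.Cos(a), 0f, Mathf.Sin(a));
            verts[i * 2] = dir * inner;       // inner edge vertex
            verts[i * 2 + 1] = dir * outer;   // outer edge vertex
        }
        for (int i = 0; i < segments; i++)
        {
            int v = i * 2, t = i * 6;         // two triangles per quad
            tris[t] = v; tris[t + 1] = v + 1; tris[t + 2] = v + 2;
            tris[t + 3] = v + 2; tris[t + 4] = v + 1; tris[t + 5] = v + 3;
        }
        var mesh = new Mesh { vertices = verts, triangles = tris };
        mesh.RecalculateNormals();
        return mesh;
    }
}
```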
I wanted to try my hand at a simple game for the HoloLens. This is a loving tribute to one of my favourite games of all time, Space Invaders. However, I wanted to explore some of the features of the HoloLens, and see how integration of the Xbox One controller would work. There are some interesting twists with this treatment, namely the addition of 'shields' to some classes of the invaders, as well as a 'fire-and-forget' seeking missile weapon for the player. I can't take credit for the particle effects used for the laser and explosions. Those were created using the Hyperbit Arsenal Particle Systems Unity Store asset.
This is a prototype of a game I created to test out the Xbox Bluetooth controller and the HoloLens. The idea here was to have a shooter type game, with the option of three air-based missile launchers, similar to the game Missile Command. Missiles are launched with the X, A, and B buttons on the controller, and a laser is fired with the trigger. This is also my first real attempt at integrating particle effects and explosions. The video was captured in a nearby park (Saint Andrew's playground) on a bright fall day, and I was impressed at how well the HoloLens performed.
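A sketch of the button mapping using Unity's legacy input: on Windows the pad's A/B/X buttons usually arrive as joystick buttons 0/1/2, but the exact indices can vary, so treat these as assumptions (the trigger axis name here is a project-defined Input Manager axis):

```csharp
using UnityEngine;

// Sketch of mapping the Xbox controller to launchers and the laser.
public class LauncherInput : MonoBehaviour
{
    void Update()
    {
        // Typical Windows mappings: A = 0, B = 1, X = 2 (assumed).
        if (Input.GetKeyDown(KeyCode.JoystickButton2)) Launch(0); // X
        if (Input.GetKeyDown(KeyCode.JoystickButton0)) Launch(1); // A
        if (Input.GetKeyDown(KeyCode.JoystickButton1)) Launch(2); // B
        // "RightTrigger" is a project-defined Input Manager axis (assumption).
        if (Input.GetAxis("RightTrigger") > 0.5f) FireLaser();
    }

    void Launch(int launcherIndex) { /* spawn a missile (omitted) */ }
    void FireLaser() { /* raycast + particle beam (omitted) */ }
}
```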
I wanted to extend my experiments with the HoloLens in an outdoor setting. This creation was made using a TubeRenderer Unity construct, with textures by Toronto artist Mike Parsons. The idea here was to create some organic-looking structures that could be overlaid onto a park setting, and see how they looked. I was blown away by the ability of the device to render the visual elements on a bright (but overcast) day. Interestingly, a passerby was pacing through the park, on his cell phone and oblivious to my HoloLens recording. It really does look like he is moving between the 'creatures'.
This is an homage to an old Cinematronics video arcade game called Star Castle. The premise of the original game
was to blast away at an enemy that was protected by three layers of counter rotating shields. I thought it
might be interesting to re-create a HoloLens version that did a similar thing, but in 3D (of course).
It took quite a bit of time to locate a hexagonal sphere type of geometry - something that wasn't too
heavyweight, and could be customized and textured.
The missile logic was implemented using a very inexpensive Unity Asset Store package called '3D Homing Missiles'.
I thought it might be interesting to have the 'base' launch defensive missiles at the player, who would then
have to shoot them down to avoid being destroyed.
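The essence of the homing behaviour (the Asset Store package does the real work) is a capped turn toward the target plus forward motion. A sketch with arbitrary constants:

```csharp
using UnityEngine;

// Sketch of basic homing: rotate toward the target at a limited rate
// while always moving forward.
[RequireComponent(typeof(Rigidbody))]
public class HomingMissile : MonoBehaviour
{
    public Transform target;        // the player, or an incoming missile
    public float speed = 3f;
    public float turnDegreesPerSec = 90f;

    void FixedUpdate()
    {
        var rb = GetComponent<Rigidbody>();
        if (target != null)
        {
            Quaternion look = Quaternion.LookRotation(
                target.position - transform.position);
            rb.MoveRotation(Quaternion.RotateTowards(
                transform.rotation, look,
                turnDegreesPerSec * Time.fixedDeltaTime));
        }
        rb.velocity = transform.forward * speed;
    }
}
```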
I wanted to experiment with animation controllers in Unity, and for my first stab at it, decided to go with
the excellent (and free... for now) Mixamo models and animations. These are professionally rigged 3D models
that can be specifically targeted for Unity development. I picked one (he's called Whiteclown) that looked like
it might be fun, and
added two sets of animations. The cubes were added just to experiment with the collision detection settings.
Once I recorded the videos, it looked so rhythmic and chaotic, I thought it might go nicely with the music of
Yusuke Hasegawa, who performs these amazing percussion + didgeridoo pieces here in Toronto.
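The controller side of this is small. A sketch of switching between two animation sets via Animator triggers; the trigger names are placeholders for whatever states the controller actually defines:

```csharp
using UnityEngine;

// Sketch: drive a Mixamo-rigged model through an Animator Controller
// by firing triggers to move between animation states.
[RequireComponent(typeof(Animator))]
public class ClownController : MonoBehaviour
{
    private Animator animator;

    void Start() { animator = GetComponent<Animator>(); }

    void Update()
    {
        // Swap animation sets on a key press while testing in the editor.
        if (Input.GetKeyDown(KeyCode.Alpha1)) animator.SetTrigger("DanceA");
        if (Input.GetKeyDown(KeyCode.Alpha2)) animator.SetTrigger("DanceB");
    }
}
```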
My first attempt at a game (more of a simulation, really) for the HoloLens. This is a 3D representation
of the old school Lunar Lander video arcade game. There's no point system here, and just a very basic
win/lose kind of situation. The challenge here was trying to get meaningful input with the HoloLens,
given that (at the time) all that was available was the clicker peripheral. The application takes click events
from the device and applies a small amount of vertical thrust for each click. Too much vertical speed on
touchdown means a crash. Just the right amount earns you a polite round of applause.
The ground textures were done by Mike Parsons.
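The thrust-and-touchdown logic reduces to a few lines. A sketch, with the force and speed thresholds as guesses:

```csharp
using UnityEngine;

// Sketch of the lander: each clicker event adds a pulse of upward
// thrust; vertical speed at touchdown decides the outcome.
[RequireComponent(typeof(Rigidbody))]
public class Lander : MonoBehaviour
{
    public float thrustPerClick = 1.5f;
    public float maxSafeSpeed = 0.5f;

    private Rigidbody rb;

    void Start() { rb = GetComponent<Rigidbody>(); }

    // Called by the input handler when the clicker is pressed.
    public void OnSelect()
    {
        rb.AddForce(Vector3.up * thrustPerClick, ForceMode.Impulse);
    }

    void OnCollisionEnter(Collision collision)
    {
        float verticalSpeed = Mathf.Abs(collision.relativeVelocity.y);
        if (verticalSpeed > maxSafeSpeed)
            Debug.Log("Crash!");          // swap in the explosion effect
        else
            Debug.Log("Safe landing.");   // polite round of applause
    }
}
```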
This is a simple experiment with a force-directed graph using the HoloLens. Additional nodes are added randomly, with the layout handled by the EpForceDirected project on CodeProject. Adding the lines with Unity was actually the most difficult part. I ended up using the Vectrosity asset from the Asset Store, which saved quite a bit of time (and avoided a lot of headaches).
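For reference, one step of a force-directed layout of this kind (the project itself leans on EpForceDirected) is pairwise repulsion plus spring forces along the edges. A bare-bones sketch with arbitrary constants:

```csharp
using UnityEngine;

// Sketch of one force-directed layout step: node repulsion plus
// spring attraction along edges, then a damped Euler integration.
public static class ForceStep
{
    public static void Apply(Vector3[] pos, Vector3[] vel, int[,] edges,
                             float repulsion = 0.5f, float spring = 0.1f,
                             float restLength = 1f, float damping = 0.85f,
                             float dt = 0.02f)
    {
        var force = new Vector3[pos.Length];
        for (int i = 0; i < pos.Length; i++)          // pairwise repulsion
            for (int j = i + 1; j < pos.Length; j++)
            {
                Vector3 d = pos[i] - pos[j];
                float distSq = Mathf.Max(d.sqrMagnitude, 0.01f);
                Vector3 f = d.normalized * (repulsion / distSq);
                force[i] += f; force[j] -= f;
            }
        for (int e = 0; e < edges.GetLength(0); e++)  // spring attraction
        {
            int a = edges[e, 0], b = edges[e, 1];
            Vector3 d = pos[b] - pos[a];
            Vector3 f = d.normalized * (d.magnitude - restLength) * spring;
            force[a] += f; force[b] -= f;
        }
        for (int i = 0; i < pos.Length; i++)          // integrate
        {
            vel[i] = (vel[i] + force[i] * dt) * damping;
            pos[i] += vel[i] * dt;
        }
    }
}
```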
This is another scatterplot that uses size and color, as well as a rudimentary force-directed
algorithm. The real experiment here was to use semi-transparent 'nodes' to limit occlusion. Rather
than use straight-up transparency, I created some geometry with Blender that had holes in it.
Not the most elegant looking thing, but interesting to look at!
This is an experimental use of the HoloLens for a GIS visualization, displaying oil platforms
and pipelines in the Gulf of Mexico using a simulated data set. The idea here was to present a
large network of data for the purposes of monitoring or security, using layers to offset key elements.
Because of the density of the circular elements, it was a problem to display all of them at once - there are
just too many of them. Offsetting from the base plane allowed many more dots to be displayed at a
greater size.
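A sketch of the layer-offset trick: each category of marker gets lifted to its own height above the base map so dense clusters stop fighting for the same pixels (the categories and spacing are assumed):

```csharp
using UnityEngine;

// Sketch: offset each category of map marker onto its own layer
// above the base plane.
public static class LayerOffset
{
    public static void Apply(Transform[] markers, int[] category,
                             float layerSpacing = 0.05f)
    {
        for (int i = 0; i < markers.Length; i++)
        {
            Vector3 p = markers[i].position;
            // e.g. platforms on layer 0, pipelines on 1, alerts on 2.
            markers[i].position = new Vector3(
                p.x, category[i] * layerSpacing, p.z);
        }
    }
}
```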
This was the first time I really bumped into the limitations of the HoloLens, and needed a way to cap the
geometry that was being sent to the device.
The background map is nothing fancy. It's simply an image of the region for visual context.