The Projected Instrument Augmentation system (PIANO) was developed by pianists Katja Rogers and Amrei Röhlig and their colleagues at the University of Ulm in Germany. A screen attached to an electric piano has colourful blocks projected onto it that represent the notes. As the blocks of colour stream down the screen they meet the correct keyboard key at the exact moment that each one should be played.
Florian Schaub, who presented the system last month at the UbiComp conference in Zurich, Switzerland, said that users were impressed by how quickly they could play relatively well, which is hardly surprising given how easily we adapt to most screen interfaces these days.
But while there is real potential for PIANO as a self-guided teaching aid, in my view it’s the potential for a really tight feedback loop that makes this most interesting, and potentially more widely applicable.
When a piano teacher corrects a student’s mistake, they might specify one or two things that need improving. This approach, by contrast, can sense every incorrect note and provide an immediate visual response, flashing red for instance, conditioning the student towards success more quickly.
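That feedback loop is simple enough to sketch. Here’s a toy Python fragment of the idea, not the actual PIANO system: the note values and function names are my own assumptions, with notes modelled as MIDI numbers.

```python
# Toy sketch of PIANO-style per-note feedback (not the real system).
# Notes are MIDI numbers; the "score" is the expected sequence.

def feedback_colour(expected_note, played_note):
    """Return the colour to flash for a single key press."""
    if played_note == expected_note:
        return "green"   # correct note: reinforce success
    return "red"         # wrong note: immediate visual correction

def grade_performance(score, performance):
    """Compare a played sequence against the score, note by note."""
    return [feedback_colour(e, p) for e, p in zip(score, performance)]

# Example: the student fluffs the third note of a C major phrase.
score = [60, 62, 64, 65]        # C, D, E, F
performance = [60, 62, 63, 65]  # E flat instead of E
print(grade_performance(score, performance))  # ['green', 'green', 'red', 'green']
```

The point is the immediacy: the correction arrives on the very key press that went wrong, rather than at the end of the piece.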
I’ve written about Augmented Reality extensively in the past, but since the days of immersing myself in the purely theoretical potential of the medium, a few key players have rooted themselves in a very commercial reality that now powers the fledgling industry.
And while B2B-focused vendors such as ViewAR remain behind the scenes, the likes of Aurasma and Blippar have soared in prominence thanks to some quite excellent packaging and an impressive sales proposition. They are the standard bearers, at least in the eyes of the public.
I like Aurasma. But I also like Blippar. So which is better? Well, let’s find out… Here are some provocations I’ve been toying with. See if they help you decide, and let me know which side you fall on in the comments.
[twocol_one][dropcap]A[/dropcap]urasma has more technological power behind it. They have (supposedly) incorporated academic research into their proprietary tech and have a heritage in pattern-recognition systems – but remember their core business: integrating with business-critical processes and then slowly ramping up prices, as they do across all other Autonomy products. Consider too that they are an HP property, whose business is hardware, not software. I believe Aurasma are using this period of their lifespan to learn what does and doesn’t work, get better at it, gain status and equip users to enjoy AR, before developing a mobile chipset (literally, hardware optimised for AR) that can be embedded in mobile devices, making HP buckets of royalties. They are chasing install base, but not because they want advertising bucks: they want to white-label their tech (e.g. Tesco, Heat & GQ) and then disappear into the background.[/twocol_one]
[twocol_one_last][dropcap]B[/dropcap]lippar have a proprietary AR engine, but are listed as using Qualcomm’s Vuforia engine, which is free to use. They seem focused on innovations in the augmented layer. Reading their interviews, they speak of AR not as a tech, platform or medium, but as a kind of magic campaign juice: language that reveals they are extremely focused on delivering a good consumer experience paid for by advertisers, with them as the connective tissue. To this end, they too are chasing install base, but ultimately with a different goal in mind. Being Qualcomm-backed, their future lies in flexing their creative muscles and helping make AR a mass-market medium by normalising the behaviour. Big rivals: Aurasma in the short term, but I imagine that one day Aurasma will revert to being a tech platform, and companies like Blippar will provide the surface experience: where good content, not tech, is what sells.[/twocol_one_last]
At work, where social features & discovery apps help me find new stuff
On my mobile, where offline playlists provide the backdrop to my travel
And since I no longer play physical CDs, nor use iTunes or any other media player (barring web apps such as SoundCloud, Hype Machine, Mixcloud etc.), Spotify has become the main hub and jumping-off point for whatever type of music I’m after.
Spotify leaves it to its users to build, subscribe to and share playlists, its primary organisational schema, however they see fit. But with millions of tracks and carte blanche to curate a personal library of preferences comes a unique challenge: how should one filter, organise and archive one’s preferences with access to the world’s biggest music collection?
There is no self-populating iTunes-esque ‘smart playlist’ feature, no editorialised ‘recommended playlists’ feature, and until recently there was no way to search playlists without third-party involvement. Users have to come up with their own organisational approach, and I use my patented Star System™. Here’s how it works:
Play whatever music you want
Star the tracks you particularly love
These self-populate a ‘Starred Tracks’ playlist
Set this playlist to ‘Available Offline’ and they’ll download automatically
Carry on jamming, removing stars from any tracks if they get boring
After a period of time, move all starred tracks into a playlist of their own
Release this playlist to the public to critical acclaim!
Repeat steps 1-7 with a blank slate
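The archiving part of that cycle can be sketched as a toy Python model. This is just an illustration of the workflow, not Spotify’s API: the playlist names and functions here are entirely my own.

```python
# Toy model of the Star System's archiving steps: move everything in
# 'Starred Tracks' into a dated "Star Mix" playlist, then reset.
# A "library" is just a dict of playlist name -> list of track names.

from datetime import date

def archive_starred(library, today=None):
    """Archive the starred playlist into its own mix and blank the slate."""
    today = today or date.today()
    starred = library.get("Starred Tracks", [])
    if starred:
        mix_name = f"Star Mix {today:%b %Y}"
        library[mix_name] = list(starred)      # the new public playlist
        library["Starred Tracks"] = []         # blank slate for the next cycle
    return library

lib = {"Starred Tracks": ["Track A", "Track B"]}
archive_starred(lib, date(2011, 6, 1))
print(lib)  # {'Starred Tracks': [], 'Star Mix Jun 2011': ['Track A', 'Track B']}
```

In the real thing, of course, the Spotify client does the heavy lifting: starring, offline syncing and playlist creation are all built in.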
So without further ado, here are my Star Mix Playlists for your listening pleasure, along with some tasting notes.
As a graphic designer, Brodbeck is drawn to particular style details, but as a generative coder he’s interested in exploring the role of graphic design in analysing these same details.
He picked the medium of film as his ‘data-set’ and came up with something genuinely novel: rather than analysing the metadata around a film (e.g. from IMDb), he uses the movies themselves.
The project seeks to ‘fingerprint’ films (a bit like the recent moviebarcode site) and turn them into interactive models. The models can be manipulated to allow users to identify differences or trends in the graphics via a sexy looking interface, all of which he’s now open-sourced on GitHub.
Here’s a demo:
Brodbeck defines the project as “an experiment to find out if the data that is inherent in the movie can be used to make something visible that otherwise would remain unnoticed.” It’s a really interesting area for academic inquiry, one for which he set out the following goals:
Measuring and visualizing movie data to reveal the characteristics of movies and to create some sort of unique “fingerprint” for them.
Extracting and analyzing information – such as the editing structure, use of colors, speech or motion – and transforming it into graphic representations, so that movies can be seen as a whole and easily interpreted or compared.
Working experimentally and presenting the work both in print and digital media.
A side effect is that the system he’s built is great at comparing films: seeing the differences between originals and remakes, within similar genres, across a string of sequels, or between similar filmmaking styles and certain directors.
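To give a flavour of the “fingerprint” idea, here’s a minimal Python sketch of one such reduction, in the spirit of the moviebarcode approach mentioned above: collapse each frame to its average colour, producing one coloured stripe per frame. This is my own illustration, not Brodbeck’s code; frames are modelled as nested lists of (R, G, B) tuples, where a real implementation would decode video with something like OpenCV or ffmpeg.

```python
# Sketch: reduce a film to a moviebarcode-style colour strip,
# one average colour per frame.

def average_colour(frame):
    """Mean (R, G, B) over every pixel in one frame."""
    pixels = [px for row in frame for px in row]
    n = len(pixels)
    return tuple(sum(channel) // n for channel in zip(*pixels))

def fingerprint(frames):
    """One average colour per frame: the film's colour 'barcode'."""
    return [average_colour(f) for f in frames]

# Two tiny 1x2-pixel 'frames': one reddish, one bluish.
frames = [
    [[(200, 0, 0), (100, 0, 0)]],
    [[(0, 0, 200), (0, 0, 100)]],
]
print(fingerprint(frames))  # [(150, 0, 0), (0, 0, 150)]
```

Plot those stripes side by side for every frame of a film and you get a compact visual signature that two cuts, remakes or sequels can be compared against.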
In this post I’ll introduce you to my new pet project: an experiment in Twitter automation. The Strategy Bot (pictured) is ‘programmed’ to select & retweet key digital media resources, case studies and news items that deepen understanding of what makes for good digital strategy.
Some context… I will typically have the odd side project on the go at any one time. Recent examples have included:
Recategorising all my RSS feeds for mobile, web & iPad
Linking up Instapaper / ReaditLater / Pinboard & Twitter
Testing Facebook ads to see if I can drive Twitter followers
Playing with XFBML, the new Follow button and Google +1
Sketching people’s Twitter avatars with my new stylus
All of the above would be worthy of a blog post, and that might happen for a couple of them, but there’s been one project I’ve been thinking about for a while that I reckon just needs to be shared, because, dear reader, I need your help!
I’ve been interested in getting the most out of Twitter for a while, and I’ve been certain there is some utility among the network’s parasites: the lowly twitterbot. I’d love to perform an autopsy on one to see how they really work, as there are some excellent cases of these automata being actually quite useful or cool. For example:
Spotibot – @replies suggested music based on your requests
Easy Joke – RTs with “that’s what she said” on certain phrases
There are loads more listed on the Twitter Fan Wiki, and of course there are millions of spambots that behave in similar ways. But I wanted to make something that would be primarily useful to me, and that others might enjoy too.
The idea arose from the need to detect, share and archive truly excellent links, without cluttering my personal Twitter feed. Did you know you can automatically add Twitter links to Pinboard for archiving? It’s a bloody useful way to passively log the stuff that’s held your attention. And did you know you can create a self-hosted archive of all your tweets? I use Tweetnest to this end, where I’ve been logging my personal tweets here. Try searching for something!
Mr. Strategy Bot is just another way to add useful stuff to my own personal content library. But throughout the course of his life, I’d like him to be useful to everyone. Or at least, everyone that works in digital media (you gotta have a niche). So how should I automate him to this end?
In my attempts to pin down what makes these robots work, I found a number of approaches, typically making use of Twitterfeed (a pretty blunt RSS syndication tool) or the Twitter API (way over my head). I needed something that would let me ‘scrape’ the top links from a list of Twitter users, and automatically RT the top five links.
I have totally failed in my attempts, even after a whole evening spent in the depths of Yahoo! Pipes. For now, I’ve had to settle on the manual way. Yep, I’m manually RT’ing the links until I find a better solution, five a day, with a bit of prose each time to help round out his character.
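For what it’s worth, the harvesting half of the problem is easy to sketch once the tweets are in hand; it’s the fetching (Twitter API, Twitterfeed, Yahoo! Pipes…) that defeated me. Here’s a rough Python illustration with names of my own invention: given a batch of raw tweet texts from a trusted list of users, pull out the links and rank them by how often they’re shared.

```python
# Sketch of the harvesting logic: extract links from tweet texts and
# rank them by share count. Actually fetching the tweets is the hard
# part, and is deliberately left out here.

import re
from collections import Counter

URL_RE = re.compile(r"https?://\S+")

def top_links(tweets, n=5):
    """Return the n most-shared links across a batch of tweet texts."""
    counts = Counter(url for t in tweets for url in URL_RE.findall(t))
    return [url for url, _ in counts.most_common(n)]

tweets = [
    "Great read on digital strategy http://example.com/a",
    "RT: http://example.com/a is worth your time",
    "Campaign case study http://example.com/b",
]
print(top_links(tweets, n=2))  # ['http://example.com/a', 'http://example.com/b']
```

Wire something like this up to a daily fetch and a retweet call and you’d have the five-a-day behaviour I’m currently doing by hand.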
I will continue to research ways of automating his behaviour, as I think one’s own personal virtual pet social robot is a really powerful idea. Wouldn’t you agree?
[box]Please leave a comment if you can help create virtual life! Let’s give this guy his own A.I. existence out in the digital ether.[/box]
In the meantime, you should follow him on Twitter here.
He’s programmed to follow back!
ONE upload per day.
Each piece must not have taken longer than a day to make.
So with these in mind, here is a slideshow of their submissions, best viewed fullscreen (but it can take a while to load):
So far the group has attracted 2,972 members, who’ve contributed 33,950 unique creations. That’s 11.4 submissions per member on average, so it’s evident people aren’t being religious about uploading every day, but what the hell.
What I find really interesting about the group is that there is very little conversation – a lowly 97 comments in total – in the main group discussion forum. All of the chatter is around individual works of art (especially the best stuff). This tells me that viewers and contributors are far more interested in the content than in the delivery framework. Rightly so, I think.
The lesson to learn here is that despite Flickr enjoying a highly creative user base, it is very hard to engage those users with a campaign idea (and I’m not just talking advertising). Flickr just wasn’t designed for community engagement, as I’ve learned on past advertising campaigns that have used it as a platform.
But that’s OK, because people upload great art to the site every day, and the quality of the comments they do attract far outweighs Facebook’s throwaway commentary and (largely) poor photography, any day of the week.