Anyone for Tonsil Tennis?

This is pretty cool I guess. The idea is that your partner “helps you” to play a video game by letting you snog them in different ways (while you’re looking at a computer screen and therefore not really paying attention).

It’s a bit gross, but it’s still a novel idea, so have a look:

What’s the mechanic here?

The Kiss Controller interface has two components: a customized headset that functions as a sensor receiver and a magnet that provides sensor input. The user affixes a magnet to his/her tongue with Fixodent. Magnetic field sensors are attached to the end of the headset and positioned in front of the mouth. As the user moves her tongue, this creates varying magnetic fields that are used to control games.

We demonstrate the Kiss Controller bowling game. One person has a magnet on his/her tongue and the other person wears the headset. While they kiss, the person who has the magnet on his/her tongue controls the direction and speed of the bowling ball for 20 seconds. The goals of this game are to guide the ball so that it maintains an average position in the center of the alley and to increase the speed of the ball by moving the tongue faster while kissing.
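Reading between the lines, the mapping seems simple enough: the headset's magnetic field sensors report where the tongue is and how quickly it's moving, and those two values steer the ball and set its speed. Here's a minimal sketch of that kind of mapping; the actual Kiss Controller code isn't public, so the sensor names, ranges and scaling factors below are all invented for illustration:

```python
# Hypothetical sketch of a Kiss Controller-style mapping: magnetometer readings
# from two headset sensors steer a bowling ball's direction and speed.
# All sensor names, ranges and scaling factors are assumptions.

def update_ball(left_field, right_field, prev_fields, dt):
    """left_field/right_field: field strength at each headset sensor.
    prev_fields: (left, right) readings from the previous frame.
    Returns (steer, speed_boost) for the game's bowling ball."""
    # Left/right imbalance in field strength -> which way the tongue points.
    steer = (right_field - left_field) / max(right_field + left_field, 1e-6)

    # Rate of change of the field -> how fast the tongue is moving,
    # which the game rewards with extra ball speed.
    d_left = abs(left_field - prev_fields[0]) / dt
    d_right = abs(right_field - prev_fields[1]) / dt
    speed_boost = 0.5 * (d_left + d_right)

    return steer, speed_boost


# Example frame update with made-up readings:
steer, boost = update_ball(left_field=42.0, right_field=47.5,
                           prev_fields=(41.0, 44.0), dt=1 / 30)
print(f"steer={steer:+.3f}, speed_boost={boost:.1f}")
```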

And what’s the point?

I literally do not know. If I were the developers I'd have focused on highlighting their innovative use of the tongue as an input device: it's one of the most dexterous muscles in the body, and its use is often one of the few faculties that remain available to people with paralysis.

Couldn't this be a remote control for wheelchairs or something similar, rather than a Wii Sports rip-off? Come on, guys…

More details here: Kiss Controller.


Bibliography

So that’s it, my series is over. All that’s left to do now is credit the academic sources that influenced and aided in the construction of my argument. Thanks to everyone below, and thanks to you, dear reader, for coming along for the ride.

References:

Baudrillard, Jean (1983). Simulations. New York: Semiotext(e).

Baudrillard, Jean (1988). Selected Writings, ed. Mark Poster. Cambridge: Polity Press.

Baumann, Jim (date unknown). ‘Military applications of virtual reality’ on the World Wide Web. Accessed 20th March 2007. Available at http://www.hitl.washington.edu/scivw/EVE/II.G.Military.html

Benjamin, Walter (1968). ‘The Work of Art in the Age of Mechanical Reproduction’, in Walter Benjamin Illuminations (trans. Harry Zohn), pp. 217–51. New York: Schocken Books.

Bolter, J. D., MacIntyre, B., Gandy, M. & Schweitzer, P. (2006). ‘New Media and the Permanent Crisis of Aura’ in Convergence: The International Journal of Research into New Media Technologies, Vol. 12 (1): 21-39.

Botella, C. M., Juan, M. C., Baños, R. M., Alcañiz, M., Guillén, V. & Rey, B. (2005). ‘Mixing Realities? An Application of Augmented Reality for the Treatment of Cockroach Phobia’ in CyberPsychology & Behavior, Vol. 8 (2): 162-171.

Clark, N. (1995). ‘The Recursive Generation of the Cyberbody’ in Featherstone, M. & Burrows, R. (eds) Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment. London: Sage.

Featherstone, Mike. & Burrows, Roger eds. (1995). Cyberspace/ Cyberbodies/ Cyberpunk: Cultures of Technological Embodiment. London: Sage.

Future Image (author unknown) (2006). ‘The 6Sight® Mobile Imaging Report’ on the World Wide Web. Accessed 22nd March 2007. Available at http://www.wirelessimaging.info/

Genosko, Gary (1999). McLuhan and Baudrillard: The Masters of Implosion. London: Routledge.

Kline, Stephen, De Peuter, Greig & Dyer-Witheford, Nick (2003). Digital Play: The Interaction of Technology, Culture, and Marketing. Montreal & Kingston: McGill-Queen’s University Press.

Levinson, Paul (1999). Digital McLuhan: a guide to the information millennium. London: Routledge.

Liarokapis, Fotis (2006). ‘An Exploration from Virtual to Augmented Reality Gaming’ in Simulation Gaming, Vol. 37 (4): 507-533.

Manovich, Lev (2006). ‘The Poetics of Augmented Space’ in Visual Communication, Vol. 5 (2): 219-240.

McLuhan, Marshall (1962). The Gutenberg galaxy: The Making of Typographic Man. Toronto, Canada: University of Toronto Press.

McLuhan, Marshall (1964). Understanding Media: The Extensions of Man. New York: McGraw-Hill.

McLuhan, Marshall & Powers, Bruce R. (1989). The Global Village: Transformations in World Life in the 21st Century. New York: Oxford University Press.

Milgram, Paul & Kishino, Fumio (1994). ‘A Taxonomy of Mixed Reality Visual Displays’ in IEICE Transactions on Information Systems, Vol. E77-D, No.12 December 1994.

Reitmayr, Gerhard & Schmalstieg, Dieter (2001). ‘Mobile Collaborative Augmented Reality’ in Proceedings of the IEEE 2001 International Symposium on Augmented Reality, 114-123.

Roberts, G., A. Evans, A. Dodson, B. Denby, S. Cooper, R. Hollands (2002) ‘Application Challenge: Look Beneath the Surface with Augmented Reality’ in GPS World, (UK, Feb. 2002): 14-20.

Stokes, Jon (2003). ‘Understanding Moore’s Law’ on the World Wide Web. Accessed 21st March 2007. Available at http://arstechnica.com/articles/paedia/cpu/moore.ars

Straubhaar, Joseph D. & LaRose, Robert (2005). Media Now: Understanding Media, Culture, and Technology. Belmont, CA: Wadsworth.

Thomas, B., Close, B., Donoghue, J., Squires, J., De Bondi, P., Morris, M. & Piekarski, W. (2000). ‘ARQuake: An Outdoor/Indoor Augmented Reality First-Person Application’ in Proceedings of the Fourth International Symposium on Wearable Computers (Atlanta, GA, Oct. 2000), 139-141.

Wagner, D., Pintaric, T., Ledermann, F., & Schmalstieg, D. (2005). ‘Towards massively multi-user augmented reality on handheld devices’. In Proc. 3rd Int’l Conference on Pervasive Computing, Munich, Germany.

Weiser, M. (1991) ‘The Computer for the Twenty-First Century’ in Scientific American 265(3), September: 94–104.

Williams, Raymond (1992). Television: Technology and Cultural Form. Hanover and London: University Press of New England and Wesleyan University Press.

Further Reading:

Bolter, Jay D. & Grusin, Richard (1999). Remediation: Understanding New Media. Cambridge, MA: MIT Press.

Cavell, Richard (2002). McLuhan in Space: a Cultural Geography. Toronto: University of Toronto Press.

Galloway, Alexander R. (2006). Gaming: Essays on Algorithmic Culture. Minneapolis: University of Minnesota Press.

Horrocks, Christopher (2000). Marshall McLuhan & Virtuality. Cambridge: Icon Books.

Jennings, Pamela (2001). ‘The Poetics of Engagement’ in Convergence: The International Journal of Research into New Media Technologies, Vol. 7 (2): 103-111.

Lauria, Rita (2001). ‘In Love with our Technology: Virtual Reality A Brief Intellectual History of the Idea of Virtuality and the Emergence of a Media Environment’ in Convergence: The International Journal of Research into New Media Technologies, Vol. 7 (4): 30-51.

Lonsway, Brian (2002). ‘Testing the Space of the Virtual’ in Convergence: The International Journal of Research into New Media Technologies, Vol. 8 (3): 61-77.

Moos, Michel A. (1997). Marshall McLuhan Essays: Media Research: Technology, Art, Communication. London: Overseas Publishers Association.

Pacey, Arnold (1983). The Culture of Technology. Oxford: Basil Blackwell.

Salen, Katie & Zimmerman, Eric (2004). Rules of Play: Game Design Fundamentals. Cambridge, MA: MIT Press.

Sassower, Raphael (1995). Cultural Collisions: Postmodern Technoscience. London: Routledge.

Wood, John ed. (1998). The Virtual Embodied: Presence/Practice/Technology. London: Routledge.


What is AR and What is it Capable Of?

Presently, most AR research is concerned with live video imagery and its processing, which allows the addition of live-rendered 3D digital images. This augmented reality is viewable through a suitably equipped device incorporating a camera, a screen and a CPU capable of running specially developed software. That software is written by specialist programmers with knowledge of optics, 3D image rendering, screen design and human interfaces. The work is time-consuming and difficult, and because there is little competition in the field, the rare breakthroughs that do occur tend to come from capital investment: something not willingly given to developers of such a nascent technology.

What is exciting about AR research is that once the work is done, its potential is immediately apparent, since in essence it is a very simple concept. All that is required from the user is their AR device and a real-world target. The target is an object in the real-world environment that the software is trained to identify. Typically, these are specially designed black-and-white cards known as markers:

An AR marker; this one relates to a 3D model of Doctor Who's Tardis in Gameware's HARVEE kit

These assist the recognition software in judging viewing altitude, distance and angle. Upon identification of a marker, the software will project or superimpose a virtual object or graphical overlay above the target, which becomes viewable on the screen of the AR device. As the device moves, the digital object orients in relation to the target in real-time:

Augmented Reality in action: multiple markers in use on the HARVEE system on a Nokia N73
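Gameware's HARVEE kit isn't publicly documented, but the same detect-then-pose loop described above can be sketched with OpenCV's ArUco markers: find the marker in each camera frame, estimate the camera's pose relative to it, and hand that pose to the renderer so the virtual object sits on the card. The camera intrinsics and marker size below are placeholders, and the ArUco API shown is the one in OpenCV 4.7+; treat this as an illustrative stand-in rather than how HARVEE actually works:

```python
# Sketch of a marker-based AR tracking loop using OpenCV's ArUco module (>= 4.7).
# Calibration values and marker size are placeholders, not real measurements.
import cv2
import numpy as np

MARKER_SIZE = 0.05  # marker edge length in metres (assumed)
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])  # placeholder intrinsics
dist_coeffs = np.zeros(5)                    # assume an undistorted lens

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

# 3D corners of the marker in its own coordinate frame.
half = MARKER_SIZE / 2
object_points = np.array([[-half, half, 0], [half, half, 0],
                          [half, -half, 0], [-half, -half, 0]], dtype=np.float32)

cap = cv2.VideoCapture(0)
while True:
    grabbed, frame = cap.read()
    if not grabbed:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        for marker_corners in corners:
            # Camera pose relative to the marker: this is what lets the
            # renderer draw the virtual object at the right distance and angle.
            found, rvec, tvec = cv2.solvePnP(object_points,
                                             marker_corners.reshape(-1, 2),
                                             camera_matrix, dist_coeffs)
            if found:
                cv2.drawFrameAxes(frame, camera_matrix, dist_coeffs,
                                  rvec, tvec, MARKER_SIZE / 2)
    cv2.imshow("AR marker tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```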

The goal of some AR research is to free devices from markers altogether: to teach them to make judgements about spatial movement without fixed reference points. This is the cutting edge of the field: markerless tracking. Most contemporary work, however, relies on either marker-based or GPS information to process an environment.

Marker-based tracking is suited to local AR on a small scale, such as the Invisible Train Project (Wagner et al., 2005) in which players collaboratively keep virtual trains from colliding on a real world toy train track, making changes using their touch-screen handheld computers:

The Invisible Train Project (Wagner et al., 2005)

GPS tracking is best applied to large-scale AR projects, such as ARQuake (Thomas et al., 2000), which uses a scale virtual model of the University of Adelaide and a modified Quake engine to place on-campus players into a ‘first-person shooter’. This application employs a headset, a wearable computer and a digital compass, which together create the effect that enemies walk the corridors and ‘hide’ around corners. Players shoot with a motion-sensing arcade gun, but the overall effect is quite crude:

ARQuake (Thomas et al., 2000)
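The GPS side of this can be pictured with a little coordinate arithmetic: convert the player's GPS fix and each virtual enemy's fixed position into a local east/north frame, then combine the result with the compass heading to decide where the enemy should appear in view. The sketch below uses an equirectangular approximation, which is fine over campus-sized distances; the coordinates and heading are made up and this is not ARQuake's actual code:

```python
# Rough sketch of GPS-based placement for large-scale AR.
# Converts latitude/longitude into local east/north metres around the player,
# then into a bearing relative to the player's compass heading.
import math

EARTH_RADIUS = 6_371_000.0  # metres

def local_offset(player_lat, player_lon, target_lat, target_lon):
    """East/north offset (metres) of target from player; equirectangular approx."""
    d_lat = math.radians(target_lat - player_lat)
    d_lon = math.radians(target_lon - player_lon)
    north = d_lat * EARTH_RADIUS
    east = d_lon * EARTH_RADIUS * math.cos(math.radians(player_lat))
    return east, north

def relative_bearing(east, north, heading_deg):
    """Angle of the target relative to the direction the player is facing."""
    absolute = math.degrees(math.atan2(east, north)) % 360
    return (absolute - heading_deg + 180) % 360 - 180  # -180..180, 0 = dead ahead

# Example: a virtual enemy roughly 70 m north-east of a player facing due north
# (coordinates invented).
east, north = local_offset(-34.9185, 138.6045, -34.9180, 138.6050)
distance = math.hypot(east, north)
bearing = relative_bearing(east, north, heading_deg=0.0)
print(f"enemy {distance:.0f} m away, {bearing:+.0f} deg from centre of view")
```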

More data input would make the game run more smoothly and would provide a more immersive player experience. The best AR applications will exploit multiple data inputs, so that large-scale projects can have the precision of marker-based tracking whilst remaining location-aware.
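One simple way to picture "multiple data inputs" is a complementary filter: trust GPS for a coarse, always-available position, and let an occasional marker sighting pull the estimate towards something more precise. This is a toy sketch with arbitrary weights, not any particular system's implementation:

```python
# Toy complementary filter: blend a coarse GPS position estimate with an
# occasional, more precise marker-derived position. Weights are arbitrary.

def fuse(gps_pos, marker_pos=None, state=(0.0, 0.0),
         gps_weight=0.2, marker_weight=0.9):
    """state: current (east, north) estimate in metres. Returns the new estimate."""
    x, y = state
    gx, gy = gps_pos
    # Nudge the estimate towards the GPS fix (noisy but always available).
    x += gps_weight * (gx - x)
    y += gps_weight * (gy - y)
    if marker_pos is not None:
        mx, my = marker_pos
        # A marker sighting is rare but precise, so pull hard towards it.
        x += marker_weight * (mx - x)
        y += marker_weight * (my - y)
    return (x, y)

estimate = (0.0, 0.0)
estimate = fuse(gps_pos=(3.0, -2.0), state=estimate)            # GPS only
estimate = fuse(gps_pos=(3.2, -1.8), marker_pos=(2.5, -1.5),
                state=estimate)                                  # marker fix seen
print(estimate)
```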

Readers of this blog will be aware that AR’s flexibility as a platform lends it applicability to a huge range of fields:

  • Current academic work uses AR to treat neurological conditions: AR-enabled projections have successfully cured cockroach phobia in some patients (Botella et al., 2005);
  • There is a wide range of civic and architectural uses: Roberts et al. (2002) have developed AR software that enables engineers to observe the locations of underground pipes and wires in situ, without the need for schematics;
  • AR offers a potentially rich resource to the tourism industry: the Virtuoso project (Wagner et al., 2005) is a handheld computer program that guides visitors around an AR-enabled gallery, providing additional aural and visual information suited to each artefact.

The first commercial work in the AR space was far more playful, however. AR development in media presentations for television has led to such primetime projects as Time Commanders (Lion TV for BBC2, 2003-2005), in which contestants oversee an AR-enabled battlefield and strategise to defeat the opposing army, and FightBox (Bomb Productions for BBC2, 2003), in which players build avatars to compete in an AR ‘beat-em-up’ filmed in front of a live audience. T-Immersion (2003- ) produces interactive visual installations for theme parks and trade expositions. Other work is much simpler: in one case the BBC commissioned an AR remote-control virtual Dalek for mobile phones, due for free download from BBC Online:

A Dalek, screenshot taken from HARVEE's development platform (work in progress)

The next entry in this series is a case study in AR development. If you haven’t already done so, please follow me on Twitter or grab an RSS feed to be alerted when my series continues.