Mobile LiveTouch

Mobile LiveTouch is a multi-platform framework that lets multiple users interact with digital displays in real time through touch-based input. The framework is built on open-source technologies and works on any phone with a WebKit-based browser.



Capture user input and send it to a digital display in real time using AJAX and socket-based connections.

Touch Input

Input includes touch, swipes (left, right, up, down, or arbitrary vector), pinch, rotate, device orientation, geolocation, and text.
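The swipe gestures listed above can be distinguished from raw touch deltas. A minimal sketch (function and threshold names are illustrative, not part of the framework):

```javascript
// Classify a touch delta (dx, dy) as one of the four swipe directions.
// SWIPE_THRESHOLD and classifySwipe are hypothetical names for illustration.
var SWIPE_THRESHOLD = 30; // minimum travel in pixels to count as a swipe

function classifySwipe(dx, dy) {
  if (Math.abs(dx) < SWIPE_THRESHOLD && Math.abs(dy) < SWIPE_THRESHOLD) {
    return null; // too short: treat as a plain touch
  }
  if (Math.abs(dx) >= Math.abs(dy)) {
    return dx > 0 ? 'right' : 'left';
  }
  return dy > 0 ? 'down' : 'up'; // screen y grows downward
}
```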

Low Barrier to Use

No app to install. Users can interact with the display within seconds, and developers can drop the libraries into their project and start building quickly.

Rich Display

Using Flash, the digital display can turn the input into a rich interactive experience. Create multiplayer games, allow a user to interact with a store display, or just do something fun.

Downloads - Coming Soon

The framework is being rewritten to be more developer-friendly. If you'd like to be notified of its release, send us your email address using the form on the right. You will only receive email regarding the launch of Mobile LiveTouch.




WebKit

WebKit provides a JavaScript API for capturing touch-based input. The latest WebKit browsers now ship on the iPod Touch, iPhone, Palm Pre, and Android devices.
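A sketch of how that capture might look with WebKit's touch events (handler names and the returned delta shape are assumptions for illustration; the `document` wiring is guarded so the logic can run outside a browser):

```javascript
// Record where a touch starts and compute the delta when it ends.
// onTouchStart / onTouchEnd are hypothetical names, not the framework's API.
var startX = 0, startY = 0;

function onTouchStart(e) {
  var t = e.touches[0]; // first active finger
  startX = t.pageX;
  startY = t.pageY;
}

function onTouchEnd(e) {
  var t = e.changedTouches[0]; // the finger that was just lifted
  return { dx: t.pageX - startX, dy: t.pageY - startY };
}

// Wire up the listeners only when a DOM is present (i.e. in the browser).
if (typeof document !== 'undefined') {
  document.addEventListener('touchstart', onTouchStart, false);
  document.addEventListener('touchend', onTouchEnd, false);
}
```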

JavaScript / AJAX

Input commands are parsed and sent to a server in real time using AJAX (XMLHttpRequest). A JS library is provided that includes events for most gestures (swipe, pinch, zoom, rotate, etc.).
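A minimal sketch of sending a gesture command over XMLHttpRequest. The endpoint path (`/command.php`) and parameter names are assumptions for illustration, not the framework's real API:

```javascript
// Encode a gesture command as a URL-encoded form body.
function encodeCommand(userId, gesture, value) {
  return 'user=' + encodeURIComponent(userId) +
         '&gesture=' + encodeURIComponent(gesture) +
         '&value=' + encodeURIComponent(value);
}

// POST the command to the server asynchronously (browser-only).
function sendCommand(userId, gesture, value) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/command.php', true); // hypothetical endpoint
  xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
  xhr.send(encodeCommand(userId, gesture, value));
}
```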

Web Server / PHP

Apache (or any server capable of running PHP) handles the requests from the devices. The PHP script parses each request, connects to the socket server, and forwards the command.

Socket Server / Socket2Me

Socket2Me is the XML socket server used to relay commands between the clients and the display. It is an open-source, general-purpose XML socket server developed separately by the team.
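Socket2Me's actual wire format isn't documented here, so as an illustration only, a command could be framed as a single XML element. One detail that is fixed by the platform: Flash XML sockets terminate each message with a null byte.

```javascript
// Build one XML command frame for an XML socket.
// The <cmd> element and its attribute names are assumptions, not
// Socket2Me's documented protocol.
function buildXmlCommand(clientId, type, data) {
  function esc(s) {
    return String(s).replace(/&/g, '&amp;').replace(/</g, '&lt;')
                    .replace(/>/g, '&gt;').replace(/"/g, '&quot;');
  }
  return '<cmd client="' + esc(clientId) + '" type="' + esc(type) + '">' +
         esc(data) +
         '</cmd>\0'; // Flash XML socket frames are null-terminated
}
```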

Flash / Flex / AS3

Flash runs on the digital display. It connects via an XML socket to the Socket2Me server, retrieves the latest commands, and renders the result on screen. Commands can be sent back to individual clients if necessary.


Yahoo OpenHack Demo

Mobile LiveTouch came out of a Yahoo OpenHack competition in October 2009. Twenty-four hours of coding turned into the "LiveTouch Disco", an experience that allowed anyone with an iPhone to create an avatar and perform dance moves on the big screen. We had about 50 users on screen at the same time. The display uses a rag-doll physics engine for the avatars.

Valid commands are:

  • Move left arm => swipe left
  • Move right arm => swipe right
  • Jump => swipe up
  • Do the splits => swipe down
  • Rotate head => perform circle gesture
  • Scale head => pinch / zoom

=> Click Here to go to the display!
Then, go to: on your compatible mobile device to interact.
Razorfish Company Beach Party

Before the Yahoo event, Jack Howell came up to me with the idea of having a website talk to another website in real time. Later he showed me the new WebKit gesture support. A day later, we were coding away at what ended up being the "Company Beach Party". At the party, we set up a few LCD displays running the "Display Application" and gave people a short code they could text message in order to interact with the display. Users were text messaged back a URL that let them customize their "beach ball" with a name and color. The users were then all put in a "beach environment" where they could bounce around, collide with other users' beach balls, and send messages.

=> Click Here to go to the display!
Then, go to: on your compatible mobile device to interact.