
Machin-X: Digital Puppetry



A blog about real-time animation, interactive technology and how pixels and puppetry are coming together



Updated: 2017-07-09T06:48:40.087-04:00

 



Digital Wayang Kulit

2015-04-13T20:30:45.511-04:00

[Embedded video: https://www.youtube-nocookie.com/embed/pcig_BN2tKg?rel=0]

Here's a nine-minute demo of a digital Wayang Kulit (Indonesian shadow puppetry) program developed in the MSc in Digital Education program at the University of Edinburgh. The 2D figure is controlled by the digital Dalang (puppeteer) using a Gametrak controller and a Wiimote.

There have been a lot of shadow-puppet-inspired digital puppetry demos created over the years (this one is from 2013), but I love how fluid the movement in this one is.

Special thanks to Jane for submitting this!



Character Animator - A 2D Real-Time Animation tool from Adobe

2015-04-12T22:36:30.932-04:00

[Embedded video: https://www.youtube-nocookie.com/embed/lmPo0_WZyPU?rel=0]

Adobe has just unveiled a new 2D digital puppetry - or have we all agreed to call this field real-time animation now? - application called Character Animator. A demo of the software is provided in the video above; essentially, it's a tool for animating still 2D bitmap (Photoshop) and vector (Illustrator) characters in real time using a camera, microphone, head tracking and facial mo-cap.

In addition to basic mo-cap and lip-sync capabilities, it also allows users to create programmable behaviors and will support the creation of third-party plug-ins. I haven't tried it myself yet, but it looks like it could be a very easy-to-use and potentially powerful tool for creating basic 2D animated characters in real time.

Adobe Character Animator is currently in beta and available for testing by After Effects CC users. You can learn more here.

Via The Labyrinth.



Unbelievably impressive demo of the Unreal Engine 4

2015-03-20T00:20:55.453-04:00

[Embedded video: https://www.youtube.com/embed/JNgsbNvkNjE]

It's been far, far too long since I posted an update here on Machin-X, but I may (finally) be returning to some digital puppetry work in the near future, and when I saw this demo of the Unreal Engine 4 I had to share it.

It is, well, pretty unreal:
The Kite open world demo created in Unreal Engine 4 features a diverse and beautifully realized 100 square mile landscape. Everything is generated completely in real-time at 30fps and includes fully dynamic direct and indirect illumination, cinematic depth of field and motion blur, and procedurally placed trees and foliage.
Real-time 3D sure has come a long way since I first worked on a TV pilot for a proposed kids' series using cardboard cut-outs and blob tracking with a webcam to create some 2D flash animation almost a decade ago.

I wonder where this technology will go in the next ten years or so?




Henson Digital Puppetry Studio

2014-01-28T15:06:29.099-05:00

[Embedded video: //www.youtube.com/embed/kuzF22sR-lo?rel=0]

This is a brand new promotional reel for the Henson Digital Puppetry Studio, the patented real-time animation/digital puppetry system developed by Jim Henson's Creature Shop.




2014-02-04T17:30:22.791-05:00

[Embedded video: //www.youtube.com/embed/GkmxPea_xjI?rel=0]

This is a simple, but nonetheless very effective example of 2D digital puppetry created using Unity and a Kinect.



Animatic Digital Puppetry System

2013-07-03T17:06:05.101-04:00

[Embedded video: //www.youtube.com/embed/TMCUeav-XDw?rel=0]

A look at "Animatic", a digital puppetry system that was developed by Luis Leite (see previous post) using 3D Studio Max and Macromedia Director in 2006. The system was developed as part of his research thesis Marionetas Virtuais.



Digital Puppeteer Mario Mey

2013-05-23T13:54:22.742-04:00

[Embedded video: http://www.youtube.com/embed/BF53HSzLPXY?rel=0]

This is a new demo reel for Argentinian digital puppeteer Mario Mey that shows off his digital characters performing en Español at various live events (his character Pinokio 3D was mentioned here back in 2010). He creates and performs his "Marionetas Digitales" (digital puppets) using Blender 3D and PureData, a real-time graphical dataflow programming environment for audio, video, and graphics.

You can see Mario at work and get a look at his production process in this video, though note that it was recorded in Spanish.



Faceshift Markerless Motion Capture

2013-05-16T14:30:01.506-04:00

[Embedded video: http://www.youtube.com/embed/0AFFWPkcOmE?rel=0]

Faceshift is software that promises "markerless motion capture at every desk". It works with consumer-level cameras like the Kinect to track and analyze the facial expressions of a performer and uses them to animate a virtual character in real time. It also offers the option of recording a performance so that it can be edited and polished in post-production.

There are lots of potential applications for this kind of software in game and film production and, of course, digital puppetry applications!
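For anyone curious what drives the character under the hood: facial capture systems like this typically output a weight per tracked expression ("blendshape") every frame, and the rendered face is just the neutral mesh plus the weighted per-vertex deltas. Here's a rough Python sketch of that idea (the expression names and data are invented for illustration; this isn't Faceshift's actual API):

```python
# Blendshape blending sketch: the tracker supplies a weight per
# expression each frame; the output mesh is the neutral mesh plus the
# weighted per-vertex deltas. (Illustrative only.)

def apply_blendshapes(neutral, deltas, weights):
    """neutral: list of (x, y, z) vertices; deltas: {name: per-vertex
    (dx, dy, dz) offsets}; weights: {name: float in [0, 1]} from tracking."""
    result = [list(v) for v in neutral]
    for name, w in weights.items():
        for i, (dx, dy, dz) in enumerate(deltas[name]):
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

# Example: one vertex, two expressions tracked at partial intensity.
neutral = [(0.0, 0.0, 0.0)]
deltas = {"smile": [(1.0, 0.0, 0.0)], "jaw_open": [(0.0, -2.0, 0.0)]}
print(apply_blendshapes(neutral, deltas, {"smile": 0.5, "jaw_open": 0.25}))
```

The appeal for puppetry is that the expensive part (tracking) is decoupled from the cheap part (blending), so the same weights can drive any character with a matching blendshape set.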

You can learn more at www.faceshift.com.



Hakanaï: Dancing with Digital Puppetry

2013-05-15T07:14:42.060-04:00

[Embedded video: http://player.vimeo.com/video/65175919?title=0&byline=0&portrait=0]

Hakanaï is one of the more unconventional examples of a digital puppetry performance I've discovered (although, is there anything truly "conventional" about any form of digital puppetry?). Its creators describe it as a "haiku dance performance taking place in a cube of moving images projected live by a digital performer".

The performance involves a dancer performing live, whose movements are tracked in real-time and used as the basis for an interactive, digitally animated environment that is projected around them:




It was created by the French Company Adrien M / Claire B using their proprietary software eMotion. Here's more from their description of the project:
 ...Performed by an artist as a “digital score”, it is generated and interpreted live. The dancer’s body enters into a dialogue with the moving images in motion. These simple and abstract black and white shapes behave according to physical rules that the senses recognise and to mathematical models created from the observation of nature.
The audience experiences the performance in several stages. They first discover the exterior of the installation. As the dancer arrives, they gather around to watch the performance. When the choreography has ended, the audience can then take some time to wander amongst the moving images.

Through a minimalist transposition, this piece is based on images drawn from the imaginary realm of dreams, their structure and their substance. The box in turns represents: the bedroom where, once the barrier of sleep is passed, walls dissolve and a whole new inner space unfolds; the cage, of which one must relentlessly test the limits; the radical otherness, as a place of combat with an intangible enemy; the space where impossible has become possible, where all the physical points of reference and certitudes have been shaken.

Through the encounter of gesture and image, two worlds intertwine. The synchronicity between the real and the virtual dissolves and the boundary that was keeping them separate disappears, forming a unique space filled with a high oneiric charge. 
Very cool, no? You can learn more from the video's description on Vimeo.



Activision unveils impressive real-time character demo

2013-03-31T01:37:24.999-04:00

[Embedded video: http://www.youtube.com/embed/l6R6N4Vy0nE?rel=0]

Activision unveiled some new real-time rendering technology for human characters at the Game Developers Conference last week. This is the result of several years of research into creating photorealistic human characters for video games. Although the animation itself is a bit off and suffers from the infamous "Uncanny Valley" effect, on a purely technical level this is pretty impressive.

From the video's description on YouTube:

This animated character is being rendered in real-time on current video card hardware, using standard bone animation. The rendering techniques, as well as the animation pipeline are being presented at GDC 2013, "Next Generation Character Rendering" on March 27.

The original high resolution data was acquired from Light Stage Facial Scanning and Performance Capture by USC Institute for Creative Technologies, then converted to a 70 bones rig, while preserving the high frequency detail in diffuse, normal and displacement composite maps.

It is being rendered in a DirectX11 environment, using advanced techniques to faithfully represent the character's skin and eyes.

More technical details can be found here.

Via Cartoon Brew.



A digital dragon puppet

2013-03-27T18:35:37.283-04:00

[Embedded video: http://player.vimeo.com/video/48985899?title=0&byline=0&portrait=0]

A nice example of a digital shadow puppet, made by Luis Leite using Kinect and Unity 3D. To animate the puppet, a human body is tracked in real-time using the Kinect sensor, with one hand controlling the head and the other controlling the tail. The physical movement of the performer's body is remapped on to the virtual shadow puppet using Inverse Kinematics via Unity's Mecanim animation system.
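For the curious, the two-bone IK solve that an animation system like Mecanim performs when a tracked hand drives a limb chain can be written down in a few lines using the law of cosines. Here's a rough Python sketch in 2D (the bone lengths and setup are purely illustrative, not Luis' actual rig):

```python
import math

# Two-bone planar IK via the law of cosines: given two bone lengths
# rooted at the origin, find joint angles so the chain's tip reaches a
# tracked target point. (Illustrative sketch, not Unity's implementation.)

def two_bone_ik(l1, l2, tx, ty):
    """Return (root, bend) angles in radians; 'bend' is the second bone's
    rotation relative to the first."""
    d = math.hypot(tx, ty)
    d = min(d, l1 + l2 - 1e-9)  # clamp targets the chain can't reach
    # Interior angle at the middle joint, from the law of cosines.
    cos_interior = (l1**2 + l2**2 - d**2) / (2 * l1 * l2)
    bend = math.pi - math.acos(max(-1.0, min(1.0, cos_interior)))
    # Root angle: direction to target minus the first bone's offset.
    cos_off = (l1**2 + d**2 - l2**2) / (2 * l1 * d)
    root = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_off)))
    return root, bend

# A tracked hand at (1.5, 0.5) drives a two-segment chain of length 1 + 1.
root, bend = two_bone_ik(1.0, 1.0, 1.5, 0.5)
```

The same solve runs every frame, which is why the puppet follows the performer's hands with no perceptible lag.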

Luis was also responsible for a Kinect-based digital puppet that was mentioned in a post about Kinect-based digital puppetry on Machin-X two years ago.



Digital Puppetry with the PS4

2013-02-22T18:58:00.429-05:00

[Embedded video: http://www.youtube.com/embed/whc7kAi5QgU?rel=0]

Earlier this week game development studio Media Molecule gave an R&D presentation at the launch event for Sony's new PlayStation 4 (PS4) video game console. The console appears to have some amazing new capabilities - like the ability to sculpt, create and animate in real time - that offer phenomenal potential for digital puppetry applications. Media Molecule won't say too much about what they're working on (yet), but their demo utilizing the PS4 and the often-derided PlayStation Move controller looks amazing (skip ahead to the 5:15 mark to see all the digital puppetry goodness).

Very exciting!

Via Puppeteers Unite.



Karagoz goes digital

2013-01-30T01:26:49.944-05:00

[Embedded video: http://www.youtube.com/embed/Y0AXvuDoccA?rel=0]

Indonesian shadow puppetry has gone digital, so why not Turkish shadow puppetry too?

iKaragoz is an app for iPhone and Android developed by a Turkish firm called Anakule. They are promoting it as the first puppet application for mobile devices, but it's definitely not (several others, including iPuppeteer and Pollock's Toy Theatre app, have been on the market for years).

The app allows the user to control the characters onscreen intuitively by simply moving their smartphone or tablet. In addition to the traditional Turkish Karagoz puppets, additional packs with characters from Cambodian, Chinese, Indian, Indonesian, Thai and Greek puppetry traditions are also available.

Here's the Wayang Kulit version in action:

[Embedded video: http://www.youtube.com/embed/uKCccvSP4bo?rel=0]

iKaragoz was designed by Uğur Doğan with the assistance of Turkish puppeteer Mehmet Saylan. It's available to download from the iTunes Store and Google Play. I haven't had a chance to try it out myself yet, but if someone does please let me know what you think!

Via Puppetry News.



Bryn Oh, Imogen and the pigeons

2013-01-30T01:00:02.513-05:00

[Embedded video: http://www.youtube.com/embed/85udU1fVFX8]

Bryn Oh is the avatar and pseudonym of a professional oil painter here in Toronto who has been creating mesmerizing and challenging multilayered installations inside the virtual world of Second Life for several years (see previous post). I'm especially impressed by Bryn's latest work, Imogen and the pigeons, an "immersive narrative exhibited in the virtual world called Second Life...a layered story told through poetry."

I find it difficult to classify work like this. Is it Machinima, immersive interactive art, digital puppetry, all of the above, or something else entirely? While I'm not entirely sure what the answer to that question is, I do know that I like it. A lot. It's inspiring to see the innovative ways that Bryn Oh is exploring and expanding how this still new medium can be used.

You can explore Imogen and the pigeons inside Second Life (click here and follow the instructions to join Second Life) and/or see more of Bryn Oh's previous Second Life builds on YouTube.



Digital Shadow Puppet Installation

2013-01-20T21:05:23.088-05:00

[Embedded video: http://player.vimeo.com/video/57154561?title=0&byline=0&portrait=0]

A nice look at an interactive, Kinect-based digital shadow puppet installation created by New York based motion designer Yang Yang.



Digital Guignol Theater

2013-01-10T00:08:46.385-05:00

[Embedded video: http://www.youtube.com/embed/o_u0xy5qDUQ]

I've been doing a lot of Punch and Judy research lately and stumbled across this Digital Guignol Theater, created by Wizarbox in 2009. A proof-of-concept demo intended to aid children during rehabilitation, it allows them to control a digital Guignol (the French equivalent of Punch) show using their voice and one or more Wiimotes.



Digital Wayang Kulit Music Video

2012-12-05T03:25:48.392-05:00

[Embedded video: http://www.youtube.com/embed/yzuGheu09Cg?rel=0]

We don't see a lot of music videos that feature digital puppetry, so when one comes along it's worth noting!

This video for Boshra Al Saadi's Snowyman was created using a modified version of Antonius Wiriadjaja's Java-based Wayang Kinect project. Hit the link to learn more about his work and download the digital Wayang Kulit applet so you can try it out yourself!



Flaming Skull Face Tracking Demo

2012-11-23T02:02:31.621-05:00

[Embedded video: http://www.youtube.com/embed/SVt62qagEcA?rel=0]

Some clever face tracking from a Spanish studio called Paradox D&D. They've written a custom application that positions a 3D skull on top of the user's head, capturing the user's body movements and using them to control a 3D model in real time. This is an example of a "virtual dresser" that allows a user to "wear" 3D elements, much like augmented reality.

Via KinectHacks.



Researchers Demo New Kinect-based 3D Puppetry System

2012-11-04T11:26:33.759-05:00

[Embedded video: http://www.youtube.com/embed/Z_G_ESvDmIY?rel=0]

This video highlights a new Kinect-based 3D puppetry system developed by researchers at the University of Washington and the University of California, Berkeley that allows users to create 3D animation with minimal experience.

What's somewhat unique about this system is that, unlike most Kinect-based systems, instead of using body movement to control an onscreen character or object, users manipulate physical objects to control corresponding 3D models on a virtual set. I like this approach to digital puppetry a lot, because it retains the basic concept of puppetry (real-time manipulation of a physical object). To demonstrate how accessible they believe this approach is, the researchers had novices in puppetry and animation test the system.

Although the system these researchers have developed appears to be somewhat limited - note that the objects in the demo are solid objects without any kind of skeleton or use of Inverse Kinematics or Forward Kinematics - systems like this are very positive developments for digital puppetry. It would be great to see this get out of the lab and be developed into some kind of open source platform.
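Because the props are rigid, the core of a system like this is refreshingly simple: each frame the tracker produces a rotation and a translation for the physical object, and every vertex of the virtual model is carried along by that pose. Here's a rough pure-Python sketch of the idea (purely illustrative; this is not the researchers' code):

```python
# Rigid-pose puppetry sketch: no skeleton or IK needed -- the tracked
# object's pose (rotation matrix R plus translation t) maps every model
# vertex by v' = R @ v + t each frame.

def apply_pose(vertices, R, t):
    """vertices: list of (x, y, z); R: 3x3 row-major rotation; t: (tx, ty, tz)."""
    out = []
    for v in vertices:
        out.append(tuple(
            sum(R[i][j] * v[j] for j in range(3)) + t[i] for i in range(3)
        ))
    return out

# Example frame: a 90-degree turn about the z axis, then a shift along x.
R = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
t = (5, 0, 0)
print(apply_pose([(1, 0, 0), (0, 1, 0)], R, t))
```

That simplicity is exactly the limitation noted above: articulated characters would need per-part tracking or a skeleton on top of this.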

A technical paper on this project can be found here.



Multitouch Puppetry

2012-10-17T22:34:16.079-04:00

[Embedded video: http://player.vimeo.com/video/28926436?title=0&byline=1&portrait=0]

There has been an explosion of experiments with various forms of digital puppetry in the past couple of years, but we're still lacking a true "killer app" that makes real-time manipulation of digital characters simple and intuitive. That doesn't mean that progress isn't being made, though; little breakthroughs happen all the time. I've seen a number of interesting experimental interfaces in the past year, including this one created by Quan Nguyen and Michael Kipp, who have attempted to design a simple, user-friendly multitouch system that can be used to control the complex movement of the human arm.

From their research:
"Controlling a high-dimensional structure like a 3D humanoid skeleton is a challenging task. Intuitive interfaces that allow non-experts to perform character animation with standard input devices would open up many possibilities. Therefore, we propose a novel multitouch interface for simultaneously controlling the many degrees of freedom of a human arm. We combine standard multitouch techniques and a morph map into a bimanual interface, and evaluate this interface in a three-layered user study with repeated interactions. The multitouch interface was found to be as easy to learn as the mouse interface while outperforming it in terms of coordination...Our results show that even complex multitouch interfaces can be easy to learn and that our interface allows non-experts to produce highly coordinated arm-hand animations with subtle timing." 
Quan and Michael are members of EMBOTS (Embodied Agents Research Group), a research group based in Germany. They've worked on a number of interesting projects with digital puppetry applications; you can find an overview of their work here.
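The "morph map" idea is worth unpacking: example arm poses are pinned to points on the touch surface, and a single finger position blends between them, so one touch drives many joint angles at once. Here's a rough Python sketch of a bilinear version (the poses and joint names are invented for illustration; the paper's actual morph map may well differ):

```python
# Morph-map sketch: four example poses sit at the corners of a touchpad,
# and the normalized finger position (u, v) bilinearly blends every
# joint angle between them. (Illustrative, not the EMBOTS implementation.)

def morph_blend(corners, u, v):
    """corners: ((p00, p10), (p01, p11)) pose grid, each pose a
    {joint: angle} dict; (u, v) in [0, 1]^2 is the touch position."""
    (p00, p10), (p01, p11) = corners  # rows correspond to v = 0 and v = 1
    blended = {}
    for joint in p00:
        a = (1 - u) * p00[joint] + u * p10[joint]
        b = (1 - u) * p01[joint] + u * p11[joint]
        blended[joint] = (1 - v) * a + v * b
    return blended

rest   = {"shoulder": 0.0,  "elbow": 0.0}
reach  = {"shoulder": 90.0, "elbow": 10.0}
wave_a = {"shoulder": 45.0, "elbow": 90.0}
wave_b = {"shoulder": 80.0, "elbow": 120.0}
pose = morph_blend(((rest, reach), (wave_a, wave_b)), 0.5, 0.5)
print(pose)
```

The payoff is exactly what the study measured: a single touch trajectory produces coordinated multi-joint motion, something a mouse can only do one joint at a time.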



Kinect Digital Puppetry Experiment

2012-10-05T18:09:27.550-04:00

[Embedded video: http://player.vimeo.com/video/50856914?title=0&byline=0&portrait=0]

Here's a recent experiment in 2D digital puppetry, created using the Kinect and KinectSDK block for Cinder / C++ (compiled using Microsoft Visual Studio 2010). If you're able to program, you can find the code that was used to create this digital puppet here and try it yourself!



How to make your own Waldo

2012-08-17T10:15:00.210-04:00

[Embedded video: http://player.vimeo.com/video/22761859]

Ever wanted to create your own Waldo-like control and perform real-time digital characters? Friedrich Kirchner demonstrates how it's done using an arduino microcontroller, sensors, buttons and his free Moviesandbox software.
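If you're wondering what the software side of a rig like this looks like: an Arduino typically just streams raw analog sensor readings over the serial port, and the host normalizes them into 0..1 control channels for the puppet. Here's a rough Python sketch of the host-side parsing (the line format and channel names are my assumptions for illustration, not Moviesandbox's actual protocol):

```python
# Waldo host-side sketch: each serial line carries comma-separated 10-bit
# ADC readings (0..1023), one per sensor; normalize them into named
# control channels. (Line format and channel names are assumptions.)

def parse_waldo_line(line, channels=("jaw", "head_x", "head_y", "button")):
    """Map one serial line like '512,340,1023,0' to {channel: 0..1}."""
    raw = [int(x) for x in line.strip().split(",")]
    return {name: value / 1023.0 for name, value in zip(channels, raw)}

print(parse_waldo_line("1023,0,511,1023"))
```

In a real setup you'd read the lines with something like pySerial and feed the normalized channels to whatever drives the character each frame.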



Georgia Tech unveils new way to control CG characters without skeletons

2012-08-15T16:21:02.030-04:00

[Embedded video: http://www.youtube.com/embed/quB2iZsGrAk]

A team of researchers at Georgia Tech has developed an innovative new way to control CG characters and objects that lack a skeletal structure, like worms, Jello-like blobs of goo and even the human tongue. The system eschews the bones-based skeleton control approach commonly used in computer animation in favour of a new approach based on simulating soft bodies.

The researchers claim that the result is non-skeletal characters that can be easily animated using simple point-and-click mouse movements and/or touchscreen gestures, which opens up all kinds of exciting possibilities for accessible animation including possible applications for real-time animation. You can find more information and a detailed technical explanation here.

Via Cartoon Brew Biz.



How to use anything as a digital interface

2012-05-06T22:49:32.436-04:00

[Embedded video: http://www.youtube.com/embed/E4tYpXVTjxA?rel=0]

Here's an interesting project from Disney Research and Carnegie Mellon University...a system called Touché that can be used to create gesture controls on everyday, non-computer objects. Cool stuff.

Hmmm...can anyone think of possible digital puppetry applications for this?

Via Ars Technica.



Scooter on Puppets vs. CG

2012-03-01T16:33:09.686-05:00

[Embedded video: http://www.youtube.com/embed/DoQyB7ayeZE?rel=0]

I've always believed that computer graphics and puppetry are complementary art forms, but there are of course a lot of people who insist that the two mediums have a competitive - and possibly combative - relationship. While I don't personally subscribe to that theory, apparently certain Muppets do.

Earlier this week Scooter appeared at the TED 2012 conference, making a tongue-in-cheek presentation called Can Tactile Icons Survive in an Integer-Driven Environment? in which he made the case for puppets over computer generated characters and got pretty worked up about it. Unfortunately, the quality of the video isn't great, but hopefully a better version will surface online soon.

Cross posted from PuppetVision; via Tough Pigs.