Ultimate Coder Challenge – Going Perceptual and Testing the Apps

Posted on 17 April 2013, last updated on 21 September 2019


After a few days of setting up and testing, I’ve spent a good amount of time with all of the submissions to the Ultimate Coder Challenge, where showcase applications have been built for a Lenovo Yoga 13 Ultrabook to demonstrate the Intel Perceptual Computing hardware. It has at times been frustrating, but as I write this I feel a rush of excitement for a new wave of input methods. It will take some years to perfect the technology, the software and the methods but, without a doubt, there’s something special going on here. Pay attention!

Before I talk briefly about the apps I want to mention a number of points I noted during testing this week. Firstly, there’s the word ‘fuzzy.’ It cropped up in Week 3 of my reports and remains an issue right to the end. The data coming from the camera and offered to the programmer through the SDK is very raw and very jumpy, which has meant a lot of work on smoothing. Smoothing needs to be part of the hardware or related silicon and offered through the SDK; it would improve the experience and save a lot of CPU-generated heat and battery usage.
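To make the smoothing point concrete, here’s a minimal sketch (Python, purely illustrative; the class name, the alpha value and the sample coordinates are all invented) of the per-axis exponential smoothing teams ended up writing for themselves. A production filter, such as the well-known One Euro filter, would adapt the smoothing factor to movement speed to trade jitter against lag.

```python
# Minimal sketch: per-axis exponential smoothing of jumpy tracking data.
# Lower alpha = smoother output but more lag behind the real hand.

class ExponentialSmoother:
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # smoothing factor, 0 < alpha <= 1 (illustrative value)
        self.state = None    # last smoothed (x, y, z) sample

    def update(self, sample):
        if self.state is None:
            self.state = sample  # first sample passes through unchanged
        else:
            self.state = tuple(
                self.alpha * new + (1.0 - self.alpha) * old
                for new, old in zip(sample, self.state)
            )
        return self.state

# Invented, jittery hand positions standing in for raw SDK samples
smoother = ExponentialSmoother(alpha=0.3)
for raw in [(0.10, 0.50, 0.9), (0.14, 0.48, 1.1), (0.09, 0.52, 0.8)]:
    print(smoother.update(raw))
```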

My second note is about creating a common set of hand gestures. SOMEONE NEEDS TO DRIVE A COMMON GESTURE SET. It would help early adoption of new gesture-driven software and help the SDK designers optimize the features. How about a common gesture for ‘undo’ or ‘back’? Should we use a thumbs-up, the diver’s ‘OK’ or a victory sign for ‘select’?
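As a purely hypothetical sketch of what a driven, common gesture set could look like in code, here’s a shared registry; every gesture name and binding below is invented, and nothing here is standardised anywhere.

```python
# Hypothetical common-gesture registry: one shared vocabulary that every
# app maps to its own handlers, so 'undo' means the same thing everywhere.

COMMON_GESTURES = {
    "swipe_left": "back",
    "swipe_right": "forward",
    "thumbs_up": "confirm",
    "open_palm_push": "select",
    "circle_counterclockwise": "undo",
}

def dispatch(gesture, handlers):
    """Translate a recognised gesture into its common action, if bound."""
    action = COMMON_GESTURES.get(gesture)
    if action in handlers:
        handlers[action]()

# An app only supplies handlers for the common actions it supports
dispatch("circle_counterclockwise", {"undo": lambda: print("undo!")})
```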

On-screen feedback needs to be part of the OS. The OS needs to be able to tell you 1) whether the hardware is in use (privacy will play a part here) and which elements of the camera/hardware are active, and 2) provide detailed on-screen feedback on the tracked object. The same applies to voice feedback.
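Here’s a small, hypothetical sketch of the kind of status snapshot an OS could surface to cover both points; every field name is invented for illustration.

```python
# Hypothetical OS-level status: which perceptual sensors are live and
# what, if anything, is currently being tracked.

from dataclasses import dataclass, field

@dataclass
class PerceptualStatus:
    depth_camera_active: bool = False
    rgb_camera_active: bool = False
    microphone_active: bool = False
    tracked_objects: list = field(default_factory=list)  # e.g. ["right_hand"]

    def summary(self):
        sensors = [name.replace("_active", "")
                   for name, on in vars(self).items()
                   if name.endswith("_active") and on]
        return (f"Sensors on: {', '.join(sensors) or 'none'} | "
                f"Tracking: {', '.join(self.tracked_objects) or 'nothing'}")

print(PerceptualStatus(depth_camera_active=True,
                       tracked_objects=["right_hand"]).summary())
```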

There’s a very important note to be made about ‘magic-ness’ too. The iPhone was said to have a magical user interface. The reality was that it was simply very responsive: the latency between a movement and the on-screen response was below the threshold at which a human stops feeling a direct, physical connection. For gaming this is critical. Hardware and SDK engineers need to work on this and create tools for testing it.
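One way to start building such a test tool, sketched below with invented hook names: timestamp a tracking sample as it arrives and again when the matching frame is presented, then report the difference. The 20 ms sleep merely stands in for real processing and rendering; in a real app the two hooks would wrap the SDK callback and the buffer swap.

```python
# Illustrative motion-to-response latency probe.

import time

latencies_ms = []

def on_tracking_sample():
    return time.perf_counter()        # moment the gesture data arrived

def on_frame_presented(t_input):
    latencies_ms.append((time.perf_counter() - t_input) * 1000.0)

for _ in range(5):
    t = on_tracking_sample()
    time.sleep(0.02)                  # stand-in for processing + render
    on_frame_presented(t)

print(f"mean motion-to-response latency: "
      f"{sum(latencies_ms) / len(latencies_ms):.1f} ms")
```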

Another issue I have to mention is that hands get in the way of the screen, because the camera is centre-mounted. The solution would be hand gestures to either side of the head or to the sides of the screen. Intel might want to think about dual side-pointing gesture camera hardware to help with this big issue. It would encourage some easy gestures too, just as many users enjoy touchscreen activity in the bottom left and right corners of a screen.

I had a buggy time testing all the software, not least because no-one had ever tested all the software together before. A crash or lock from one application meant others would fail, so we can’t pull anyone up on that issue. What we can mention is how applications using new forms of input need to guide the user slowly and solidly into using them. Training, demos, how-tos, quick gesture guides (what’s the global gesture for help?) and calibration are all part of that.

Finally, I need to mention power. Most of these apps draw a huge amount of power and require the full capability of an Ultrabook. In most cases they would kill a battery-powered Ultrabook in under 2 hours, and the heat generated is so high that you shouldn’t expect to use the Ultrabook anywhere other than on a desk. I’ve been thinking about All-in-Ones (AIOs) recently and see a better opportunity for gesture control there. The screens are solidly placed, there’s space for venting and higher-power components, and given the thickness restrictions of the Ultrabook you have a better chance of building the ‘Percie’ hardware into an AIO.

7 teams have had 7 weeks to learn about the perceptual computing hardware, the SDK, the methods and the problems, and then create an app. That exercise has given all of us a head-start, and I sincerely hope that Intel takes the lessons learnt and feeds them back into software and hardware engineering. I also hope they think about publishing a white paper or case study on those lessons, and that they take some of these apps forward by sponsoring teams to find the limits and push past them.

It’s been a tough few days working my way through the apps. Some have instant appeal; others require some understanding of what’s going on underneath to appreciate. I’ve had to consider team sizes and the way those teams have publicised their work through code, blogs and interaction with other teams. I have a short-list in my head, and over the next few days I’ll be working through a scorecard and submitting it back to Intel, who will announce the winner next week.

See below for my feedback on all the apps. Bookmark the links here for future reference. 

Important links for the Intel Ultimate Coder Challenge: Going Perceptual Competition


Puppet In Motion

The concept here is very simple to understand, and what I like is that Sixense have found an application that works best with gesture and hand control and couldn’t really be done with a keyboard and mouse. The ‘gamification’ of the puppets in a (recordable) storyboard is a great idea too. Fuzziness features heavily: the puppets jump about and even get lost sometimes, but points go to Sixense for their calibration/training mode.


Stargate Gunship

A gesture-driven shoot-em-up was always going to be fun to see. Through the Code Monkeys we’ve learnt a lot about the problems of continuous or long-duration gestures. The end result is something that works, although I find the head-peeking movement very unnatural. Fuzziness affects the game, just as it affects many of the applications here, and there’s a slight issue with providing the level of responsiveness required in a shooting game. Latency can be hidden in a flight simulator, perhaps, but not in a fast-action shoot-em-up. It’s polished, though, and it makes a great demo.


Kiwi Catapult Revenge

This is a cute game that allows a wide-ranging set of inputs. ‘Head-peek’ is implemented well and Infrared 5 seem to have avoided jumpiness, although as their main control method doesn’t use the perceptual computing hardware they didn’t have quite as much of a challenge as the other teams. Brass Monkey was highlighted well, but is that really what we’re looking for in a Percie app?



Huda

Peter O’Hanlon is one of the three single-person teams in this competition, and we have to remember that when we compare his work against the products from the professional teams. His idea, a sequencer for image edits and effects, is a really interesting one, but does it really lend itself to gesture control? I immediately noticed that I wanted on-screen feedback and an undo gesture. Some of the chosen gestures are a bit strange too: do a thumbs-up, for example, and you’ll apply a red filter. Gesture mapping can be changed, though. If Huda could be developed with a better feedback mechanism and faster response it would be a great demo.


Betray

Seduce is Eskil Steenberg’s UI layer built on Betray, the hardware abstraction layer that Eskil has also written from scratch for this competition. It’s a huge undertaking, operating at a more fundamental level than the other products in this competition. It’s difficult to grasp how wide the scope is here and how much it could help gesture user interfaces in the future; as such, it’s not the best showcase app. There’s one thing that’s really worth highlighting, though, and that’s the head-tracking. Eskil has got the response times down to near-‘magical’ levels, and that’s something all gesture-driven software should take note of. That alone is a great showcase.


Perceptucam

Take a 3D representation of the upper body and transmit it, with voice, to a second user, and a third, and a fourth. Add in a virtual 3D background and you’ve got a very interesting conferencing concept that even lets you draw on the desk and make fireworks in the air! This is another concept that puts 3D tracking in a unique position, because you cannot do this any other way. Lee Bamber is fighting fuzziness in the data and hasn’t quite got a slick application ready, but it’s a great demo. Remember too that Lee is also a single-person team.

Unfortunately our friends at Simian Squared have had to pull out of the competition but I hope they consider making further progress with their very interactive pottery application.

Next Wednesday we will be announcing the winner of the Ultimate Coder Challenge – Going Perceptual.

Over the 8 weeks of this competition the most important thing I’ve learnt is that perceptual computing is not easy, and companies like Intel will have to step away from this competition and fold the knowledge back into further silicon and software development. Perceptual computing hardware could be something you’ll see integrated in AIO PCs in 2014 but, even then, there’s a whole lot of software to be written. Don’t forget we’re looking at ground-breaking technology and ground-breaking applications.

Other judges in this competition are linked below. Check out their thoughts.

Full disclosure – We are being compensated for content related to the Ultimate Coder and to attend an Intel event. We have committed to posting at least once per week on this subject and judging the final applications. All posts are 100% written and edited by Ultrabooknews.

1 Comment For This Post

  1. Peter says:

    Chris, thanks for the post. I’m disappointed that Simian Squared had to pull out. I was really looking forward to playing with their app.
