Air gestures, facial analysis, object tracking, advanced voice recognition, touch and Ultrabooks. It’s all in the Ultimate Coder Challenge, where seven teams are building showcase applications on a Lenovo Yoga 13 that will demonstrate the sort of interaction capability that could be built into an Ultrabook in the future.
I’m involved with the judging of the Ultimate Coder event, and every week you’ll find an update from me here as I analyse the teams’ progress.
This is week 1. The teams have started their journey and have all published some interesting and useful content. Will they have enough time? Have they planned for problems? Who’s going to win? I’ve dived into the projects; my thoughts are below.
You can find all our Ultimate Coder posts here
Our Ultrabook software developer resources are here
All our Perceptual Computing posts here
Sixense
Skin bending, human deformation and skeleton creation – all part of the puppet-making process, apparently, although I’m a bit worried I won’t be able to demo this to my kids. Sixense have worked with tracking and 3D analysis before, in a big way. Their introduction post talks about their success with the Portal 2 Motion Pack and Motion Creator software, so I’m expecting big things! Basically, the team is looking to create a Puppet Platform. I’m not talking about straw men here, though, because there’s talk of multiplayer action and movie-set components. Sixense have chosen Unity as their engine and development environment and have already proved that the Intel Perceptual Computing SDK works within it. Read the Sixense week 1 post here.
Simian Squared
The two-man, London-based team known as Simian Squared have also chosen Unity to accelerate their development. Giuseppe and Rob have worked with 3D space before, although this is the first time they’ve handled it from the real-world perspective, so it’s very much the next level for the team. What are they planning? One image describes the plan…
I love this. Simple, fun and potentially educational, but real-time, deformable, spinning 3D models controlled by perceptual computing sensors? That’s a serious computing challenge, but the team has one big advantage: if they re-create the pottery scene from Ghost, they’ve won in my eyes. Simian Squared’s week 1 post is here.
Lee Bamber
Lee was a participant in the previous Ultimate Coder challenge, so he has the advantage of knowing how short the Ultimate Coder weeks can be when there’s a fixed deadline. He’s planning to write software for a ‘new kind’ of webcam experience. Instead of transmitting a video of the user, a 3D representation of the user will be sent to the client. This has bandwidth implications that could open up virtual webcam experiences to a wider audience, and if you think about what recent MPEG AVC codecs do to try to find and track objects in a scene, there’s a parallel here that could form part of live video codecs in the future. Exciting (and something Lee should patent immediately, despite you reading it here first).
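To see why the bandwidth angle is so interesting, here’s a back-of-envelope comparison. Every figure in it is an assumption of mine for illustration – none of it comes from Lee’s project – but it shows the scale of the potential saving when you send model parameters instead of pixels.

```python
# Back-of-envelope comparison: a conventional compressed video stream vs. a
# parametric 3D representation of the user. All figures are illustrative
# assumptions, not measurements from Lee's project.

VIDEO_BITRATE_BPS = 2_000_000   # assumed ~2 Mbps for a compressed 720p webcam stream

LANDMARKS   = 60                # assumed number of tracked 3D points for the head/face
FLOATS_PER  = 3                 # x, y, z per landmark
BYTES_FLOAT = 4                 # 32-bit floats
FPS         = 30                # model updates per second

# Bits per second for the uncompressed parameter stream
model_bps = LANDMARKS * FLOATS_PER * BYTES_FLOAT * FPS * 8

print(f"video: {VIDEO_BITRATE_BPS / 1e6:.1f} Mbps")            # 2.0 Mbps
print(f"model: {model_bps / 1e3:.1f} kbps")                    # 172.8 kbps
print(f"ratio: ~{VIDEO_BITRATE_BPS / model_bps:.0f}x smaller") # ~12x
```

Even without compressing the parameter stream at all, the saving is an order of magnitude under these assumptions – and landmark deltas would compress far better than pixels.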
Lee hasn’t started coding yet but plans to use DarkBASIC Professional. In his first post he gives us an insight into his hardware environment: you’ll see the development computer, the perceptual computing hardware and the Lenovo Yoga 13 demonstrated on video in Lee’s week 1 post. He also owes me 200 of his fine English pounds for getting to the end of his long blog post!
Infrared5 / Brass Monkey
They look like they’re ready to go! Infrared5 created Brass Monkey in 2010 and it has since become a business in its own right, so I’ll call them Brass Monkey from here on. Brass Monkey is a software technology that uses smartphones as controllers and a web browser as the console. Confused as to how that works? I am, but it looks cool and very sensor-related. Do they want to drop the smartphone from the equation? No, they want to add head tracking to it. Whoa! Head, shoulders, eyes and a bunch of sensors in the hand? Kiwi Catapult Revenge is the name of the project and, again, this team is using Unity as the development platform. Not only is the team combining the Perceptual Computing SDK and the Brass Monkey SDK, they’re also adding a C library for more advanced face tracking. They provide information about their choice of face tracking library, some detail about the application architecture and a lot more in their week 1 post.
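If head tracking driving the view is new to you, here’s a minimal, generic sketch of the idea: map a tracked face position to a virtual-camera offset so the scene counter-moves and produces parallax. This is purely my illustration of the technique – it is not Brass Monkey’s code, and every name and constant in it is hypothetical.

```python
# Generic head-coupled camera sketch: a tracked face position (normalised
# image coordinates) drives a virtual-camera offset. Hypothetical names and
# constants throughout; not taken from Brass Monkey's implementation.

def head_to_camera_offset(face_x, face_y, face_scale,
                          sensitivity=0.5, depth_gain=2.0):
    """face_x, face_y: face centre in [0, 1] image coordinates.
    face_scale: apparent face size, a crude proxy for distance to screen.
    Returns an (x, y, z) offset to apply to the scene camera."""
    # Re-centre so a face in the middle of the frame means no offset.
    dx = (face_x - 0.5) * sensitivity
    dy = (face_y - 0.5) * sensitivity
    # A larger face means the user leaned in: pull the camera closer.
    dz = (face_scale - 0.2) * depth_gain
    # Invert x/y so the view counter-moves against the head, giving parallax.
    return (-dx, -dy, dz)

# Example: face slightly right of centre, user leaning in.
print(head_to_camera_offset(0.65, 0.5, 0.3))  # roughly (-0.075, 0.0, 0.2)
```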
Eskil Steenberg
“…no other platform could exist without the PC,” says Eskil in his opening paragraph. He’s thrown in the ‘Amiga’ keyword, mentions a Commodore 64 in paragraph 2, and in paragraph 3 he talks about the Xerox Alto from 1973. Where’s all this going? Eskil wants to show what the PC could be in terms of user interfacing. He’s building a framework for a graphical user interface. Oh, just something small and easy then, Eskil?
There’s going to be a lot of UI design skill involved here, and to be successful, the API Eskil is working on needs to incorporate the best current knowledge of the subject in order to cover screen sizes, varying input methods and different usage scenarios. What Eskil has outlined sounds like a two-year, multi-team project! One thing I’ll be interested to read about in the following weeks is Eskil’s take on a standard, globally accepted set of gestures; it’s critical for future adoption.
Read Eskil’s week 1 post here.
Peter O’Hanlon
Peter is creating an open-source photo application called Huda, which stores photo edits as a series of command sequences that can be edited and re-organised – a kind of MIDI sequencer for photos, I imagine. What’s it got to do with perceptual computing? Peter is incorporating gestures and voice into his app, and I believe he’s right when he says that touch should be part of the experience. What’s the round-trip delay on a gesture response? It’s probably very long compared to the touch response on a screen, so there are cases where you still need touch. Accuracy is important too. (We talked about digitizers recently – another level of accuracy and response.) Peter will also be evaluating facial analysis – a step beyond gestures.
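For readers unfamiliar with the edit-as-command-sequence idea, here’s a minimal sketch of how it might look: each edit is a named, parameterised step, and the photo is rendered by replaying the list, so re-ordering or removing steps simply re-renders the result and nothing is destructive. All names here are hypothetical – this is my illustration of the concept, not Huda’s code.

```python
# Minimal non-destructive edit pipeline: edits are replayable commands.
# Hypothetical names throughout; not taken from Huda's source.

class Edit:
    def __init__(self, name, apply_fn):
        self.name = name
        self.apply_fn = apply_fn

    def apply(self, image):
        return self.apply_fn(image)

class EditSequence:
    def __init__(self):
        self.steps = []

    def add(self, edit):
        self.steps.append(edit)

    def move(self, i, j):
        """Re-order edits, like moving a clip in a sequencer."""
        self.steps.insert(j, self.steps.pop(i))

    def render(self, original):
        image = original
        for step in self.steps:   # replay every edit in order
            image = step.apply(image)
        return image

# Toy example, with a number standing in for pixel data:
seq = EditSequence()
seq.add(Edit("brighten", lambda img: img + 10))
seq.add(Edit("double-contrast", lambda img: img * 2))
print(seq.render(100))  # (100 + 10) * 2 = 220
seq.move(1, 0)          # re-order: contrast first, then brighten
print(seq.render(100))  # (100 * 2) + 10 = 210
```

The appeal of the design is that the original image is never modified: the sequence is the document, which is also what makes gesture- or voice-driven re-ordering of edits a natural fit.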
Peter is using Windows Presentation Foundation in his project, so we assume Visual Studio will be in there too. Does that give Peter a path to Windows 8 Modern UI applications, we wonder? Read Peter’s week 1 post here.
Code Monkeys
The Code Monkeys are also familiar with the time limits of the Ultimate Coder challenge, and they’re taking a similar approach again: taking a game from their portfolio, Stargate Gunship, and adding perceptual computing features to it. There’s a big challenge in retro-fitting something here. Will the existing UI need to change? What about the gameplay? Would it be easier to start from scratch? The Code Monkeys are using Unity as their development environment.
In their opening post they’ve published a video that shows the Creative perceptual computing camera and talks about ‘granular’ tracking, and it hits on an important topic: stimulation. Developers are being stimulated by the idea of perceptual computing. When do you think we’ll see the first perceptual computing sensor set in the frame of an Ultrabook?
The Code Monkeys week 1 post is here.
It’s almost worrying to see the complexity of the projects being taken on here. Seriously, there are only 7 weeks in total and the teams are taking on huge projects. The bar is being set extremely high, but if rich and engaging results come out at the end, it will be a win for everyone watching. Do stay tuned to learn about the development environments, to view coding techniques and samples and, of course, to get hold of the final products at the end.
Important links for the Intel Ultimate Coder Challenge: Going Perceptual Competition
- Intel competition Website
- Join everyone on Facebook
- Competitors list and backgrounder
- Twitter stream #ultimatecoder
- All our perceptual computing coverage can be found here.
- All posts about this and the previous software competition can be found here
Full disclosure – We are being compensated for content related to the Ultimate Coder and to attend an Intel event. We have committed to posting at least once per week on this subject and judging the final applications. All posts are 100% written and edited by Ultrabooknews.