Eiki and Calibre were showing an interesting approach to better projecting 3D imagery. They made the two projectors as coincident as physically possible, then warped the screen to give it a more natural curve when projecting. Then, using the Calibre HQView hardware, they adjusted the overall image to more perfectly fit the screen. It could have interesting applications for larger venues where you want to project 3D and give people differing views of the imagery depending on where they sit.
There were a few glasses-free 3D vendors on the floor, which might make my prediction of seeing this in 5 years a bit too generous. Sony demonstrated a 46″ prototype TV that looked amazing – not 100% there, but still a big step forward. They also showed a lenticular screen, available today, that attaches to a Blu-ray-capable laptop so you can watch your movies in 3D on the computer with no glasses. It even had head tracking to adjust the 3D to where you were looking – it's designed for a single viewer, of course.
Philips and Dolby also demonstrated glasses-free 3D on televisions as well as phones, tablets and computer monitors. The image seemed much sharper than Sony's glasses-free display – or, for that matter, any 3D display WITH glasses – but the 3D didn't seem as pronounced as on those other systems. That's not necessarily a bad thing: the subtle 3D added depth without being overwhelming, which could be more satisfying to those folks who have trouble watching more traditional 3D rendering.
The National Institute of Information and Communications Technology showed their 200″ glasses-free display. What was most incredible about this display was that you could walk from side to side and see different angles of the image – as if you were standing in front of a real object. They showed a car with its door open, and as you walked around you could see inside the door. The image wasn't overly sharp, but considering how large it was and that there were no glasses, it was incredibly impressive. In the bee example below, you could also walk around the bee to see the side that's hidden when you stand directly in front of the screen.
Elecard was showing some really impressive 3D conversion software and hardware that converted 2D imagery into great 3D content in real time. They also have tools such as StereoTrack that let you process 2D into 3D with more creative control over the process.
The Free-viewpoint Immersive Networked Experience (FINE) project is researching and developing live free-viewpoint content, which will allow remote viewers to place a virtual camera in a real live-action scene and move it freely in space and time, heightening the sense of presence and reality. I've seen several examples of this over the last few years, but it seems like this type of viewer control is closer than ever. I can imagine, in a teaching and learning environment, allowing students to explore different viewpoints of media content.
Planar was showing four of their 55″ Clarity Matrix LCD (LED backlit) displays. The monitors were so well mounted (using the EasyAxis™ mounting system) that it didn't really bother you when dragging images across the seams. The monitors were made interactive using Omnitapps 2.0 multitouch software. They offer a number of pre-built templates for things like puzzles and maps, which can speed development and deployment of interactive applications, and they have non-profit pricing. The hardware framework was by PQ Labs and can be custom made to fit many differently sized displays – including multiple displays, as seen here.
Kenziko Kontact is a companion product to the real-time 3D broadcast graphics application VizRT. It lets you create macros so someone can manipulate and annotate inside a 3D CGI environment in real time. You can create different types of icons and graphics tools, giving the broadcaster (or, in our case, the instructor or presenter) a simple-to-use interface. While this is built for broadcast, I can absolutely see applications in teaching – maybe engineering or medical instruction – being able to interact with a model using simple tools, assuming you have the developers to create the application.
The new Sony DEV-5 and DEV-3 digital recording binoculars provide full HD resolution in 2D or 3D, plus 7.1-megapixel stills. With 10x optical zoom (10x digital), they can be used for wildlife observation/recording or during sporting events. The binocs sell for just under $2000 for the DEV-5 and $1399 for the DEV-3, which has no digital zoom, eye cups or GPS tagging.
The research engineers at the National Institute of Information and Communications Technology in Japan have integrated 3D visuals, touch, sound and even smell into a multi-sensory interaction experiment. Although there is no real object, you can see it, touch it using a force-feedback pen, hear it, and even smell it, simulating real interaction. You can interactively feel the tactile softness of 3D virtual balloons and pop them, complete with an explosive sound and smell. It was quite impressive.
Here's the previous version of the demo (not the balloons, but you can get the idea).