Leap Motion

Jack showed me a cool product called Leap Motion the other day. I took a few minutes to play with it and can definitely see a wide range of applications. What is it? Basically, it's a sensor system that picks up hand gestures to control applications on your computer (or even the computer itself). While not dissimilar in concept to the PrimeSense family of technologies (e.g. Xbox Kinect and Asus Xtion), the system stands out as being focused primarily on hand gestures and PC applications rather than gaming. Many controllable apps are available through their app store, with a good portion of them educational in nature, such as an anatomy program that let me dissect a skull. The system has an SDK available, so programmers can quickly build apps around it without much ground-up work. It certainly presents a lot of potential in the education arena.

This isn't far-out tech or some obscure device you'll never see in the field, either. The unit is $80, and as of this month HP has started shipping laptops with Leap tech built in, with Asus soon to follow. Very Minority Report-ish!


MakerBot Desktop 3D Scanner


Along with our 3D printers we received a 3D scanner, and we've had a few opportunities to play with it and have learned quite a bit. At first our scans weren't really successful, and I was actually a little disappointed with the unit (a MakerBot scanner), but after some practice we got pretty good at it. You can see the original blue devil and a scanned version of what we made.

How It Works

Basically, the object you want to scan is placed on top of a turntable not all that dissimilar to a real record player. The unit connects to a computer via USB and uses bundled scanning software. A few clicks of the mouse (the options are very limited) start the scanning process. A pair of lasers kick on and off and the turntable begins rotating slowly. The reflected laser light is picked up by an onboard camera, and that data is piped back to your host machine, which builds a 3D file. The process is really simple.

Multi-Scanning

One of the cool things about this scanner is the ability to do multi-scans. There's virtually no way for the laser beam to reach every nook and cranny of the object you are examining in a single pass. The system is intelligent enough to recognize that and lets you scan the item multiple times from different angles (upside down, for example). This exposes parts of your object that may not have been visible in the first pass. Upon completing the additional scans (you can do this as many times as you want, exposing more detail), the unit will merge the scans into a single object, intelligently blending them together.

Sci-Fi vs. Reality

Now the sad part: while the scanner does a great job for a $1,500 desktop product, the result isn't perfect. There are always minor defects. The hope and dream with a 3D scanner is that you can scan something and then immediately print a perfect replica. That just won't happen. In our experience the scans could be as much as 90% accurate (an eyeball estimate), but they weren't perfect. For example, look at the horn of our Blue Devil rubber ducky and you'll notice some artifacts. Still, it's a good start if your workflow involves bringing an object into Blender or other 3D software for further refinement. It may also be good for educational applications where you need to show someone the size and shape of something but don't necessarily need a precise replica (as you would for, say, a gear in a machine); perhaps a skull or other anatomy piece. Another important thing to note: the scan you create will not have any color in it, only the physical shape.

Limitations

There are also some important limitations to note.  The scanner cannot scan certain objects:

  • nothing larger than 8 inches high or wide
  • nothing heavier than 6lbs
  • no extremely light objects (nothing white)
  • no extremely dark objects (nothing black)
  • nothing shiny (no chrome plating)
  • nothing extremely matte (e.g. foam padding)
  • nothing translucent (no glass or plastic)
  • nothing hairy (no chinchillas)

Autodesk 123D Catch


3D research is still going strong in the Media Lab. Today we reviewed a really cool application from Autodesk called 123D Catch. The concept is simple: you use your camera to take shots of a room or subject from all around, the photos are uploaded to the cloud and converted to 3D, and you get the resulting model back. You can see an example at the top of this post: a 3D Todd. While 3D Todd isn't quite as good as the real Todd, the result, obtained with free software and an iPhone, was still competitive with even $1,000 scanners.


More 3D Printing


We've had the better part of a week with the 3D printers. Our printing efforts have been conservative; much time has been spent reading documentation and using our printing material wisely (our order of filament hasn't come in yet). But we have printed a few things, including a model of the Duke Chapel, a steampunk iPhone case, and other cool stuff.

But so far the most interesting thing about the project has been interacting with colleagues. The printer is set up as a working display at ATC now. We run it a few times a day and have even attached a webcam so that people can watch it operate. The response has been great; a lot of people have stopped by and introduced themselves to the technology.

Take a look at our webcam feed or swing by the Media Lab if you have a moment – http://people.duke.edu/~dwb36/3dprint.html

First 3D Print


We spent some time with the new 3D printers today and managed to create a few small trinkets from the stock files that shipped with them. Our first impressions are very positive. It took a while to get everything unboxed, set up, and calibrated, but nevertheless it was worth the wait. Above is a nut and bolt we created. It took about 30 minutes for the system to build it and another 15 minutes or so for the unit to heat up. Setup time was less than an hour to get a single system calibrated and loaded with new filament (3D "ink").


Install FFMPEG on a Mac


For those who don't know FFmpeg, it's a project containing a command-line suite of tools that lets you convert quite literally any piece of media into any other piece of media. It includes an entire library of codecs that can be used to wrangle just about any piece of video, and sub-components that allow for nearly infinite flexibility. If you work in media you probably use VLC, a great tool that plays about anything. VLC actually uses the codec library from the FFmpeg project; that's why it works so well. Components of FFmpeg are found in many media products.
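To give a feel for what that looks like in practice, here's a minimal sketch of a typical conversion. The file names are placeholders, and the exact encoder names available (e.g. libx264, aac) depend on how your copy of FFmpeg was built:

```shell
# Hypothetical example: transcode a QuickTime movie to an H.264/AAC MP4.
# -i names the input file; -c:v and -c:a select the video and audio codecs;
# -crf 23 is a sensible quality/size tradeoff for x264; -b:a sets audio bitrate.
ffmpeg -i input.mov -c:v libx264 -crf 23 -c:a aac -b:a 128k output.mp4
```

Swap the codecs and the output extension and the same pattern covers almost any conversion FFmpeg supports.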

However, one downside to FFmpeg is that its flexible nature can make it tricky to install. Thankfully, installation has gotten easier, though the simpler methods aren't well publicized. A friend emailed over the weekend asking how to install it on a Mac. I promised a blog post in our community that might help him or others dealing with media.

I'll assume you have OS X 10.9 or newer. There are two options for an "easy install": Homebrew or MacPorts. Note: don't put Homebrew and MacPorts on the same machine; they don't like each other. Pick one or the other.

Install Xcode

  1. Install Xcode from the Mac App Store.
  2. Open Terminal, enter the following and click “accept” on the dialog box:
    xcode-select --install

Using MacPorts

  1. Get the latest MacPorts “easy installer” and install it: https://distfiles.macports.org/MacPorts/
  2. Install FFmpeg
sudo port install ffmpeg

Using Homebrew

  1. Install Homebrew with the following command:
    ruby -e "$(curl -fsSL https://raw.github.com/Homebrew/homebrew/go/install)"
  2. Install FFmpeg.  The command will be "brew install ffmpeg --ANY-OPTIONS-YOU-WANT".
Example:  brew install ffmpeg --with-fdk-aac --with-tools

A couple of quick notes. You might be asking, "What's the difference between Homebrew and MacPorts?" Well, they basically do the same thing. Homebrew is a little easier to use; MacPorts is a little more complicated but powerful (though many would argue the point). In truth, I've had a slightly easier time with MacPorts, while I seem to have to wrestle a little more with Homebrew. For instance, for this article I actually had some Homebrew bugs to fight. Conversely, the FFmpeg project actually documents and supports Homebrew.

If there’s any interest in a “How To Use FFmpeg” post please comment!

Edit:  Here’s a list of optional installs using Homebrew

--with-fdk-aac  (Enable the Fraunhofer FDK AAC library)
--with-ffplay  (Enable FFplay media player)
--with-freetype  (Build with freetype support)
--with-frei0r  (Build with frei0r support)
--with-libass  (Enable ASS/SSA subtitle format)
--with-libcaca  (Build with libcaca support)
--with-libvo-aacenc  (Enable VisualOn AAC encoder)
--with-libvorbis  (Build with libvorbis support)
--with-libvpx  (Build with libvpx support)
--with-opencore-amr  (Build with opencore-amr support)
--with-openjpeg  (Enable JPEG 2000 image format)
--with-openssl  (Enable SSL support)
--with-opus  (Build with opus support)
--with-rtmpdump  (Enable RTMP protocol)
--with-schroedinger  (Enable Dirac video format)
--with-speex  (Build with speex support)
--with-theora  (Build with theora support)
--with-tools  (Enable additional FFmpeg tools)
--without-faac  (Build without faac support)
--without-lame  (Disable MP3 encoder)
--without-x264  (Disable H.264 encoder)
--without-xvid  (Disable Xvid MPEG-4 video encoder)
--devel  (install development version 2.1.1)
--HEAD  (install HEAD version)
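One caveat on the list above: the available options change between formula versions, so it's worth asking your own installation what it supports:

```shell
# Prints the install-time options Homebrew knows about for the ffmpeg formula.
# Depending on your Homebrew version, this list may differ from the one above.
brew options ffmpeg
```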

Your Xbox Kinect is made by… Apple?


PrimeSense, the company that developed the technology (and holds the patents) behind the Xbox Kinect, was purchased by Apple three weeks ago. It will be interesting to see how this all shakes out. Up front, this isn't an Apple vs. Microsoft thing, though I'm sure some people think it might be. Microsoft has licensed those patents for years to come, so nothing will change on the Xbox front.

Still, it is interesting to speculate about how Apple will use the tech. PrimeSense has arguably the best 3D sensors on the market, and Apple has a long history of pushing the envelope with human interface and design. It will be interesting to see what they can do with this technology and how it will be incorporated into future products. Our interest in PrimeSense is as a 3D scanning technology that could be used in any 3D lab we may be working on, alongside tools such as 3D printers, laser cutters, and 3D scanners. PrimeSense-enabled gear would be right at home there, so they are a company we have been following closely.

Network Distributed Audio – Airfoil


We had an interesting use case come up. A question was posed about listening to multiple audio streams in the same room and being able to select between them. The first thing that popped into my head was "silent disco." After a little research I rediscovered a product from my past called Airfoil.

Airfoil is a nifty little app. Basically, the server installs on a PC or Mac and sends the audio from any program (or the system output itself) to its client, Airfoil Speakers. Speakers can be installed on just about any device, including various mobile devices and Windows systems. The backend connection technology is AirPlay, which has a considerable list of pros and cons in an enterprise environment.

The nice thing about Airfoil over other iOS apps that can deliver audio via AirPlay is its ability to deliver to multiple devices simultaneously and keep them in sync. The effect is sort of a broadcast. The other nice thing is its ability to capture any software, including the system audio mix. Basically, if it comes out of your speakers, Airfoil will deliver it to Speakers; you're no longer tied to a single app.


How It Works

It couldn't be easier. Basically, go to Rogue Amoeba's site and install the software. If you're on a mobile device, you can get Speakers from the App Store or Google Play. Launch Airfoil, select the audio you wish to push out, and you're done! Clients then open Speakers and are presented with a list of Airfoil sources to choose from (if there's more than one). They click "receive," and that's it! That easy!


How Much Does It Cost?

Airfoil is $25, and there is a 15% discount for education. The Airfoil Speakers app is free! So for $21.25 you could build a great distributed audio system in a classroom.

Did You Notice Any Limitations?

The only thing I saw in the software that I would like to see improved is the latency. There is a little lag, about half a second. This could be annoying in applications where you are sharing the system audio from a video, or for an application like hearing assistance.

Final Thoughts

In the consumer market this app may be useful for parties or multi-zone distributed audio in a connected home. In the enterprise it could be a great fit in a large conference room, exhibition hall, command center, or other situation where multiple presentations are happening at the same time without physical dividers between people. Basically, put in your headphones and grab the feed you want to listen to with Airfoil Speakers on your mobile device.

It's important to reiterate: this is a Mac, PC, or mobile application; it's truly cross-platform.


4K Netflix


Two months ago we were asked to begin exploring 4K streaming. My initial reaction was "well, that's not going to work" and "we're years away from 4K streaming services." I was wrong. Today, Netflix began offering a 4K test video you can watch called El Fuente. The video is available in multiple sizes, frame rates, and bitrates so you can experiment with what looks good and works on your computer. Our experience was on a newer iMac with a WQHD display. While that isn't 4K, it is four times the resolution of 720p and the best thing we have on hand. The video was a little jerky and pixelated at first, but once it had played for a minute and the adaptive streaming adjusted, we saw quite a beautiful image.