The LightBoard Is Done!

After a few weeks of rapid prototyping we’ve finished Duke’s version of the LightBoard pioneered by Michael Peshkin.

[youtube]http://www.youtube.com/watch?v=N1I4Afti6XE[/youtube]

This project gives people who teach with traditional whiteboard-style material a low-tech way to present it in a more camera-friendly format for lecture capture or studio production.  The concept is simple: a large piece of glass is placed between the presenter and the camera, and the presenter writes on it with a fluorescent day-glo marker.  The video is mirrored so the text appears correctly to the viewer.  The result is a presentation where the lecturer can visually explain the lesson without turning their back to the learners or blocking the writing with their body.

That’s the simple concept.  In actuality the engineering was more difficult and we had several project goals and improvements we wanted to make over the original Peshkin model:

  • The project needed to be portable and lightweight but still not vibrate or move
  • Our version should simplify the lighting challenges as much as possible
  • The unit should be quickly deployable since it will be set up and taken down before and after shoots
  • The design of the system should not include any more custom-fabricated parts than necessary and should consist primarily of commodity components easily sourced online
  • The final design should be documented and easily constructed by a junior technician or non-professional fabricator.  The build should not require any skilled labor such as complicated electrical work, welding, or metal cutting.

We made several simple improvements to accomplish our goals.  The structure was built entirely of 80/20 extruded aluminum, making it lightweight, simple to build, and easy to source through online retailers.  We opted for an electronic solution to mirroring the camera rather than using physical mirrors.  The system uses high-quality casters, making it easy to move.  The height of the design was also a factor, as it needed to fit through a standard commercial door.  At every turn we avoided the temptation to involve more complicated components and processes, for the sake of simplicity.
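
For the curious, here is a minimal sketch of the electronic-mirroring idea in Python with OpenCV. This is just an illustration on my part, not the production setup; in practice the flip is more likely done in the camera, the switcher, or the capture software:

```python
# A minimal sketch of the "electronic mirror" idea: grab frames from a
# camera and flip them horizontally so the writing reads correctly.
import cv2

cap = cv2.VideoCapture(0)              # hypothetical camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mirrored = cv2.flip(frame, 1)      # 1 = flip around the vertical axis
    cv2.imshow("Lightboard (mirrored)", mirrored)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```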

Regarding lighting and deployability: we are operating on the assumption that most universities have access to a 3D printer at this point.  Components that use custom designs have been published in formats that are easily 3D printable.  Those components can also be fabricated with simple carpentry in any campus wood or scene shop.

The one standout of the design is the LED mounting system.  We found that no such product existed on the market, so we designed our own and consider it the centerpiece of this version.  Holding the LEDs directly against the glass was the key to using low-cost LED strips and less power: the farther the light source is from the glass, the less illumination reaches it, by orders of magnitude, and there is also more light leakage.  The clip system we designed holds the LED strips precisely and directly to the glass, leaving little light leakage and greatly improving the efficiency of the light transmission.

We also discovered that a key to the design was keeping the lecturer's side of the board well lit while the camera side remains as dark as possible.  Using large studio lights for that makes traditional 3-point lighting difficult, since the lecturer is just two or three feet from the glass.  To overcome that problem we added additional tracks for more LEDs that can discreetly, effectively, and evenly illuminate the subject.  This also allows light tuning: the strips on the subject side can be set to 2700-3000K for skin tones, while the light penetrating the glass can be set to 6000K to better illuminate the markers.  The result is that the LED clip greatly reduces the need for additional studio fixtures.
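
To put a rough number on that falloff, here is a back-of-the-envelope point-source comparison (a simplification, since an edge-lit sheet of glass isn't a true point-source situation, but the trend holds):

```python
# Back-of-the-envelope illustration of why the LED strips are clamped
# directly to the glass edge: treating the strip as a point-ish source,
# relative illuminance falls off roughly with the square of distance.
reference_mm = 2          # strip clamped essentially against the glass
for distance_mm in (2, 10, 25, 50):
    relative = (reference_mm / distance_mm) ** 2
    print(f"{distance_mm:>3} mm from the glass -> {relative:.1%} of the light")
```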

Open Hardware

We have been diligent in documenting the build process and hope this can assist other institutions in constructing their own!  https://wiki.duke.edu/display/LIG/Lightboard

SkypeTX

We’ve discussed some of the difficulties with integrating professional gear into various soft conferencing systems before.  While there are some great solutions such as the Wirecast virtual camera driver, the Vaddio AV Bridge, Epiphan, and of course scan converters, all have their limitations since they are essentially adapting the signal.  Skype apparently has recognized the need for broadcast-grade signals and encoding with a recently announced product called SkypeTX.  The system will include HD-SDI connections, but beyond that little has been announced.  This will be an exciting product when it hits the market!

https://media.skype.com/skype-tx/

6G or 12G? The Next Format War

For those who work in broadcast or production, the Serial Digital Interface, or SDI for short, is king.  It’s the professional standard for AV connections.  SDI has been around since the 90s and has been updated along the way as the world transitioned from 480i to 480p to 720p to 1080i to 1080p, and from copper to fiber.  The spec title changed along the way too, from 259M to 292 to 424.  259 could carry about 270Mb/s of data, with 292 (for 1080i) carrying about 1.5Gb/s.  424 upped the game again to about 3Gb/s.  SMPTE has been the governing body of the standard, so these transitions have always been smooth.

With 4K on the horizon we’re now seeing two flavors of SDI.  4K and UHD video require four times the bandwidth of HD video, which itself is almost 2K.  So in theory, for 4K to work, you should need 12Gb/s; after all, 1080p needs 3Gb/s.  One answer to feeding the massive bandwidth requirements of 4K has been to gang four existing 424M SDI links (informally called 3G SDI) together to create 12G SDI.  For single-camera productions this might not be a big deal: just hook up four cables from your camera to your recorder or confidence monitor and call it a day.  But for multi-camera productions, people doing live switching, and the increasing number of engineers in the integration space who are taking advantage of the benefits of SDI, running four cables to each device quickly creates a mess.  So what to do?  Enter 6G SDI.  Rather than relying on four cables (and eight conductors) to reach 12G, 6G SDI relies on technical improvements in the spec to double the bandwidth of a single link.  But 6 is still half of 12, so how do they make up the rest of the bandwidth?  They have been cutting into the color sampling to do it.  The result is a little sacrifice in quality in exchange for improved integration and simplified engineering.
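
If you want to sanity-check those numbers, here is the rough payload arithmetic (assuming 10-bit 4:2:2 sampling and ignoring blanking and overhead):

```python
# Rough payload math behind the figures above. Assumes 10-bit 4:2:2
# sampling (about 2 samples per pixel) and ignores blanking and other
# overhead, which is why the real line rates land a bit higher
# (2.97 Gb/s for 3G, 11.88 Gb/s for 12G).
def payload_gbps(width, height, fps, bits=10, samples_per_pixel=2):
    return width * height * fps * bits * samples_per_pixel / 1e9

print(f"1080p60 ~ {payload_gbps(1920, 1080, 60):.1f} Gb/s payload -> one 3G-SDI link")
print(f"UHD 60p ~ {payload_gbps(3840, 2160, 60):.1f} Gb/s payload -> 12G, quad-link 3G, or 6G with reduced sampling")
```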

Who cares?  The issue at the moment is which spec will win, 12G or 6G SDI.  Neither has been officially ratified by SMPTE yet, but gear using 6G and 12G is already shipping.  Blackmagic, for instance, is heavily invested in 6G.  AJA, the other major production capture card manufacturer, is heavily invested in 12G.  So if you’re trying to buy into 4K right now, which standard is the most future-proof?  Time will tell.

4K Pipelines from Dot Hill

I had the opportunity to attend a great seminar from Dot Hill today about 4K workflows. They passed along a lot of good information about storage bandwidth build-outs, and I thought I would list some of my takeaways.

  1. 4K isn’t coming, it’s here.  This isn’t an exotic “theater only” technology, it’s already in the living room.  Netflix is already streaming House of Cards and Breaking Bad at 4K.
  2. For storage, they mentioned using 28TB of data for a single 4K feature-length production
  3. Probably the most controversial point, SMB/CIFS storage just isn’t going to cut it for the performance 4K production needs.
  4. They said they have spent several months exploring SMB3 as a possibility but ultimately found it still under performs clustered filesystems.
  5. When planning storage they recommend designing architecture around 4 streams per editor.  This means you need to multiply bandwidth by four (see the quick sketch after this list).
  6. They recommend Fibre Channel architecture as well and stated they have had better performance results with it than with 10-gigabit Ethernet.  They also cited a lack of 10-gigabit HBAs and drivers for Mac systems.
  7. They suggested a very interesting product from ATTO called ThunderLink that is a great Fibre Channel HBA for Thunderbolt equipped machines.
  8. Their designs seem centered around StorNext / XSAN (really the same thing) technology.
  9. They said they have noticed a functional life of about 5 years for enterprise SSDs before they start to degrade.
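
As a quick illustration of point 5, here is a back-of-the-envelope bandwidth calculation; the per-stream bitrate is an assumed example figure, so substitute your own codec and frame rate:

```python
# Quick planning sketch for point 5: size aggregate storage bandwidth at
# four simultaneous streams per editor. The per-stream rate is an assumed
# example (roughly ProRes 422 HQ at UHD 30p); plug in whatever you cut with.
STREAM_MBPS = 707        # assumed example bitrate per 4K stream, in Mb/s
STREAMS_PER_EDITOR = 4   # the seminar's rule of thumb

def required_gbps(editors):
    return editors * STREAMS_PER_EDITOR * STREAM_MBPS / 1000

for editors in (1, 4, 8):
    print(f"{editors} editor(s): ~{required_gbps(editors):.1f} Gb/s sustained")
```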

There were many other good points and general discussion related to video production storage and I would recommend the seminar to anyone considering such a project.

Maxing Out Your GoPro

GoPro has seen phenomenal success with their line of cameras, which have put low-cost POV production into the hands of every person with an active hobby. Their brand is as synonymous with POV cameras as Kleenex is with facial tissue.

Their latest model, the Hero3, saw heavy engineering improvements, and the addition of the Sony IMX117 Exmor sensor made it an absolute powerhouse. The sensor is capable of full 4K DCI at 60p (though the Hero3's engine can only handle 15fps at 4K). But the camera is still considered by many pros dabbling with prosumer equipment to be a one-trick pony, only useful for POV.

Why? The big limiting factor is the lens. To put it bluntly, the lens sucks. It's awesome for POV, where everything is effortlessly in focus with zero need for adjustment, but that's a bad thing for people wanting to shoot in situations where "POV quality" and a lack of adjustability aren't acceptable.

Enter the solution, the GoPro Hero3 Ribcage. This little system is for hardware hackers. Basically you rip the front of the camera off (lens included) and replace it with a piece of machined metal. This system allows the mounting of C or CS mount lenses and, through adapters, Nikon, MFT, or just about anything else. It also maintains compatibility with the stock M12 lens so you can always get the original look back if you so desire. You can see an example of the quality you can now get out of improved lenses (notice the depth of field in the shot). Could a Hero3 be your next high-end camera?

AirPlay & AirParrot

I'm beginning to think that much of the expensive technology that goes into an average integrated classroom could be replaced by cheap (less than $100) boxes running AirPlay.

One of the big challenges holding back this tech is the lack of mirroring support on Windows devices. There is a relatively cheap piece of software out there that does provide this functionality for non-Apple users called AirParrot. I decided to give it a whirl this weekend.

There really isn't much to report beyond the fact that it worked flawlessly, installed easily, and otherwise did everything it's supposed to do. There's an icon in the dock; simply right-click it, select the AirPlay device you want to mirror to, and off you go. There was one hiccup in that it didn't "wake" my Apple TV from sleep mode after several attempts, though I'm not sure where that issue lies. Once I clicked the remote to wake it up, everything worked as it should. Latency is low, less than half a second. Audio also works, but I had to make sure I checked that option when mirroring.

The license is $9.99 retail, but volume and educational pricing plans exist. Definitely a must-have piece of software.

Using Pro Cameras in a USB World

Oftentimes we have events, projects, or circumstances that require us to use high-end cameras with software that is really only designed for USB cams. Professional and prosumer gear almost always lacks a USB interface, so this can be a challenge. Pros typically use SDI video connections, while prosumer products usually use the HDMI standard. There are a ton of PCI/PCIe cards and capture devices out there from companies like Blackmagic, AJA, and Matrox that bring our video into the computer. Unfortunately, programs like Skype, WebEx, ScreenFlow, and the like often don't pick up on those devices as a valid input. So what's a video professional to do, give up on their pro gear and use a $59 Logitech?

One cool solution is to use Wirecast. Yes, this is a piece of software for streaming, but it has one extremely valuable and almost unknown feature: the virtual camera driver. Wirecast takes virtually any popular input device and typically streams to a streaming server or live streaming service, but it will also easily present that same stream as a "USB-ish" camera output. This lets you take your professional and prosumer video gear and use it with popular soft conferencing services like WebEx and Skype, or with rapid e-learning software like ScreenFlow or Camtasia. You get the added benefit of using Wirecast to add effects to the video, such as lower thirds, multiple cameras, and other enhancements, if you so choose. To see how this is set up, take a look at our demo video shot in the self-service, open-to-the-campus MPS Recording Studio!
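
As a side note, if you just want to confirm that a capture device is being exposed as a camera-style input at all, a quick probe like this (my own addition, not part of the Wirecast workflow) can help:

```python
# Quick probe: list which device indices the OS exposes as camera-style
# inputs, which is roughly what Skype/WebEx-type apps enumerate when they
# look for a webcam.
import cv2

for index in range(5):                  # check the first few device slots
    cap = cv2.VideoCapture(index)
    if cap.isOpened():
        ok, frame = cap.read()
        status = f"delivering {frame.shape[1]}x{frame.shape[0]}" if ok else "opened, but no frame"
        print(f"device {index}: {status}")
    else:
        print(f"device {index}: not available")
    cap.release()
```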

If your browser doesn’t embed the video, check it out here:  http://vimeo.com/87130283

[vimeo]https://vimeo.com/87130283[/vimeo]

Cheaper 4K

Blackmagic Design reduced the price of their Production Camera 4K today from $3,995 to $2,995.

http://www.blackmagicdesign.com/products/blackmagicproductioncamera4k

Simple Lighting Tweaks For Video

In our MPS recording studio I set out to improve the lighting quality on a low budget. I thought I would share my strategies and results in hopes others might benefit. Cameras do a bad job of reproducing what the eye can see: our eyes easily filter a broad spectrum of light to produce a pleasing image, but cameras aren't that smart. One of the ways we can help them out is to control the temperature and power of the light they receive.

Like most rooms, our recording space has simple office-style suspended lighting and bulbs. The generic contractor-grade bulbs are about $5 apiece and the room holds 8 of them. While common fluorescent bulbs are notoriously horrible for video work, fluorescent technology itself has been used in studios for years to great effect. The trick is to use good bulbs, the right temperatures, and the right power. In most situations contractors will just stuff whatever is cheapest into a fixture, so taking a few minutes and a few dollars to correct this can yield great results.

I did a mock-up with a visualizer, but that's more than what most people would need. From past experience I know that in most situations 3200K "warm" temperatures look the best on a variety of skin tones. I also know you need slightly more wattage the farther away the fixtures are from the subject. The front lights are about 8 feet away while the ones behind me are about 4 feet, so I made the closer ones about 25% less powerful. I replaced the front 40 watt, 4100K bulbs with 32 watt, 3500K bulbs, and the same 40 watt, 4100K bulbs in the back of the room with 20 watt, 3500K bulbs.
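
The reasoning behind giving the farther fixtures more wattage is the usual distance falloff; treated as point sources the math looks like this (real ceiling fixtures are big and diffuse, so the actual falloff is much gentler, which is why a modest wattage bump was enough here):

```python
# Point-source approximation of why the farther fixtures get more wattage:
# illuminance at the subject falls off roughly with 1 / distance^2. Treat
# this as the intuition, not photometry.
for distance_ft in (4, 8):
    relative = (4 / distance_ft) ** 2   # normalized to the 4 ft fixtures
    print(f"fixture at {distance_ft} ft -> {relative:.0%} of the light the 4 ft fixture delivers")
```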

As far as the result, here's a before and after. While it didn't make a radical difference, the improvement is definitely noticeable. You'll see the glare on my receding hairline is much smaller in the after photos. You'll also notice my skin tone is much richer and more vivid, as are the colors of my clothing. The black level is also better: the muslin behind me now barely shows up, indicating a correct black point. All in all, the cost to upgrade the room was about $80 and it took only a few minutes.

I would easily say that the improvement to the quality of the video is at least 10% and it was as easy as changing a light bulb.

Run OS X in VMware Fusion 6

I just installed VMware Fusion 6 and found this great little feature that lets you easily install OS X as a virtual machine.  For those wondering why I would want to run Apple's OS on a machine already running Apple's OS, the answer is "testing"!  Working in IT I am always testing new programs and hacking at my OS.  With our 3D printing program alone I've installed 54 pieces of software.  For years I would keep a spare Mac around just to try things out before doing something that might require the inconvenience of re-imaging my everyday work machine.  No longer: I can use VMs as a great desktop sandbox and testing environment without the expense or hassle of maintaining another piece of hardware.

You have been able to install OS X on past versions of VMware Fusion, but it was a bit inconvenient.  Fusion 6 makes it point-and-click easy; you don't even need an installation image or serial number if you're using Mavericks.  If you try this out, make sure you take advantage of VMware's snapshot feature.  It lets you save and restore a VM so you can unwind any changes.
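
If you want to script that, Fusion also includes a command-line tool called vmrun that can take and revert snapshots; here is a small sketch of how that might look (the vmrun path and .vmx path are examples, so adjust them for your setup):

```python
# Minimal wrapper around Fusion's vmrun CLI so a test VM can be snapshotted
# and rolled back from a script. Both paths below are examples.
import subprocess

VMRUN = "/Applications/VMware Fusion.app/Contents/Library/vmrun"  # typical Fusion location
VMX = "/Users/me/Documents/Virtual Machines.localized/Mavericks.vmwarevm/Mavericks.vmx"  # hypothetical VM

def vmrun(*args):
    subprocess.check_call([VMRUN, "-T", "fusion", *args])

vmrun("snapshot", VMX, "clean-install")           # save the current state
# ... install and test whatever you like ...
vmrun("revertToSnapshot", VMX, "clean-install")   # unwind the changes
vmrun("start", VMX)                               # the VM may come back powered off; start it again
```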