Geeky Goggles Let You Take a Field Trip Without Leaving Class

by Robin A. Smith

Kun Li of the Center for Instructional Technology and senior physics major Nicole Gagnon try out a virtual reality headset called Oculus Rift. Photo by Jeannine Sato.

On the last day of class, just a few yards from students playing Twister and donning sumo suits, about two dozen people try on futuristic goggles in a windowless conference room.

Behind the clunky headgear, they are immersed in their own virtual worlds.

One woman peers inside a viewer and finds herself underwater, taking a virtual scuba tour.

The sound of breathing fills her headphones and bubbles float past her field of view.

When she looks left or right the image on the screen moves too, thanks to a tiny device called an accelerometer chip — the same gadget built into most smartphones that automatically changes the screen layout from landscape to portrait as the phone moves or tilts.

She turns her head to “swim” past corals and schools of fish. Suddenly a shark lunges at her and bares its razor teeth. “Whoa!” she yelps, taking a half-step back into a table.
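
For the technically curious, the head tracking boils down to turning the phone’s orientation reading into a rotation of the rendered view. The short sketch below is a generic illustration, not code from any particular headset or app; the angle names and the assumption that the sensors already report clean yaw and pitch values are mine.

```python
import numpy as np

def view_rotation(yaw_deg, pitch_deg):
    """Build a rotation matrix from head yaw (left/right) and pitch (up/down).

    yaw_deg, pitch_deg: orientation angles, in degrees, as a phone's motion
    sensors (accelerometer/gyroscope fusion) might report them. A VR app
    applies the inverse of the head rotation to the scene, so the world
    appears to stay still while the viewer turns.
    """
    yaw, pitch = np.radians([yaw_deg, pitch_deg])
    # Rotation about the vertical (y) axis: looking left/right.
    r_yaw = np.array([[np.cos(yaw),  0, np.sin(yaw)],
                      [0,            1, 0          ],
                      [-np.sin(yaw), 0, np.cos(yaw)]])
    # Rotation about the horizontal (x) axis: looking up/down.
    r_pitch = np.array([[1, 0,              0             ],
                        [0, np.cos(pitch), -np.sin(pitch)],
                        [0, np.sin(pitch),  np.cos(pitch)]])
    return r_yaw @ r_pitch

# Example: the viewer turns her head 30 degrees; a point that was straight
# ahead now lands off to one side of her field of view.
point_ahead = np.array([0.0, 0.0, -1.0])        # one unit in front of the camera
print(view_rotation(-30, 0).T @ point_ahead)    # that point in view space
```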

A few feet away, virtual reality enthusiast Elliott Miller from Raleigh straps on something that looks like a pair of ski goggles and takes a hyperrealistic roller coaster ride.

He swivels in his office chair for a 100-degree view of the other passengers and the coaster’s corkscrews, twists and turns as he zips along at more than 60 miles per hour, in HD resolution.

“It feels pretty real. Especially when you’re going up a big drop,” Miller said.

Elliott Miller uses a virtual reality headset to take a ride on a real-life roller coaster in Sweden called the Helix. Photo by Jeannine Sato.

Duke senior Nicole Gagnon declines a ride. “I get motion sick,” she said.

Virtual reality headsets like these aren’t in use in Duke classrooms — at least not yet.

Since its beginnings in the mid-1980s, the technology has mostly been developed for the gaming industry.

“[But] with virtual reality becoming more widespread, it won’t be long before it makes it to the classroom,” said Seth Anderson from Duke’s Center for Instructional Technology.

Duke chemistry professor Amanda Hargrove and postdoc Gary Kapral have been testing out ways to use the devices in their chemistry courses.

Thanks to funding from the Duke Digital Initiative, they designed a program that shrinks students down to the size of a molecule and lets them explore proteins and nucleic acids in 3-D.

“We call this demo the ‘Molecular Jungle Gym,’” Kapral said. “You can actually go inside, say, a strand of RNA, and stand in the middle and look around.”

The pilot version uses a standard Xbox-style controller to help students understand how proteins and nucleic acids interact with each other and with other kinds of molecules — key concepts for things like drug design.

Kapral has found that students who use virtual reality show better understanding and retention than students who view the same molecules on a standard computer screen.

“The Duke immersive Virtual Environment (DiVE) facility has been doing this for a long time, but you have to physically go there,” said Elizabeth Evans of the Duke Digital Initiative. “What makes virtual reality headsets like these different is they make virtual reality not only portable but also affordable.”

Duke student Nicole Gagnon peers through a cardboard viewer that turns any smartphone into a virtual reality headset. Photo by Jeannine Sato.

Of course, “affordable” is relative. The devices Kapral and Hargrove are using cost more than $300 per headset. But for less than $20, anyone can turn a smartphone into a virtual reality headset with a simple kit like Google Cardboard, a viewer made of folded cardboard.

Critics of virtual reality technology say it’s just another form of escapism, after TV, the Internet and smartphones.

But educational technology advocates see it as a way to help students see, hear and interact with things that would otherwise be impossible, or available only to a lucky few: traveling back in time for virtual field trips to historic battlefields as cannon fire fills the air, visiting archeological sites to examine one-of-a-kind cultural artifacts from different angles, or experiencing the climate change scenarios predicted for the future.

“It’s hard to imagine what one inch versus one foot of sea level rise means unless you stand on a beach and experience it,” Evans said. “Virtual reality could let us give students experiences that are too expensive, too dangerous, or too rare to give them in real life.”

Kapral agrees: “One day students could even do chemistry experiments without worrying about blowing things up.”

Join the mailing list for virtual reality at Duke: https://lists.duke.edu/sympa/subscribe/vr2learn

In a free mobile app called SeaWorld VR, the screen displays two images side by side that the viewer’s brain turns into a 3-D image.
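
The side-by-side trick itself is simple to sketch in code. The snippet below is a minimal, hypothetical illustration — it fakes the two eye views by shifting a single flat image a few pixels left and right, whereas a real app renders the scene twice from two camera positions and corrects for the viewer’s lenses.

```python
import numpy as np

def side_by_side(scene, parallax_px=8):
    """Make a Cardboard-style stereo frame from one 2-D image (a sketch).

    scene       : H x W grayscale image (NumPy array).
    parallax_px : hypothetical horizontal offset between the two eyes' views;
                  a real viewer derives this per pixel from scene depth.

    The headset shows the left half of the screen to the left eye and the
    right half to the right eye; the brain fuses the two slightly different
    views into a sense of depth.
    """
    # Shift the scene a few pixels left and right to mimic the two eye positions.
    left = np.roll(scene, -parallax_px // 2, axis=1)
    right = np.roll(scene, parallax_px // 2, axis=1)
    # Place the two views side by side, as on the phone's screen.
    return np.hstack([left, right])

# Example with a synthetic 480x640 gradient "scene".
scene = np.tile(np.linspace(0, 255, 640), (480, 1))
frame = side_by_side(scene)
print(frame.shape)   # (480, 1280): one half of the screen per eye
```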

Lights. Camera. Action. Sharpen.

by Anika Radiya-Dixit

On Friday, April 10, while campus was abuzz with Blue Devil Days, a series of programs for newly admitted students, a group of digital image buffs gathered in the Levine Science Research Center to learn about the latest research on image and video de-blurring from Guillermo Sapiro, a professor of electrical and computer engineering in Duke’s Pratt School of Engineering who specializes in image and signal analysis. Working alongside Duke postdoctoral researcher Mauricio Delbracio, Sapiro has been developing methods to remove image blur caused by camera shake.

Sapiro’s proposed algorithm, Fourier Burst Accumulation (FBA), builds on burst photography and achieves “state-of-the-art results an order of magnitude faster, with simplicity for on-board implementation on camera phones.” As shown in the image below, the technique combines multiple images from a burst; each shot has its own random camera shake, so each is blurred slightly differently.

Professor Sapiro explains the basic principle of burst photography.

To de-blur the image, the algorithm aligns the shots — using the camera’s gyroscope data — and combines them in the Fourier domain. The result essentially keeps the best parts of each slightly blurred image, such as the ones below, by giving the sharper content greater weight when averaging the burst.

Set of images with varying degrees of linear blur.
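
The Fourier-domain weighting step lends itself to a short sketch. The code below is a minimal, single-channel illustration of the idea, assuming the burst has already been aligned; the exponent and the small stabilizing constant are illustrative choices, not values taken from the paper.

```python
import numpy as np

def fourier_burst_fusion(burst, p=7):
    """Minimal sketch of Fourier-domain burst fusion for de-blurring.

    burst : list of aligned, same-size grayscale images, each blurred a
            little differently by camera shake.
    p     : exponent controlling how strongly high-magnitude (sharper)
            Fourier coefficients are favored; the value here is an
            assumption for illustration only.
    """
    spectra = [np.fft.fft2(img) for img in burst]
    mags = [np.abs(s) for s in spectra]

    # At each frequency, weight every image by its relative spectral
    # magnitude: content that survived the blur gets the larger weight.
    total = sum(m ** p for m in mags) + 1e-12
    fused = sum((m ** p / total) * s for m, s in zip(mags, spectra))

    # Back to the image domain; the result keeps the best parts of each shot.
    return np.real(np.fft.ifft2(fused))
```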

This technique also produces phenomenal effects in video sharpening by collapsing multiple blurred frames into a single sharpened picture:

Contrast between sample frame of original video (left) with FBA sharpened video (right).

Another impressive feature of burst photography is mixed exposure: by taking multiple images at various exposure levels, as in parts (a) and (b) of the figure below, and combining them, the user can produce a single striking picture (c) with captivating special effects.

Result of FBA algorithm on combining images with various levels of exposure.
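
As a rough illustration of the mixed-exposure idea — and explicitly not Sapiro’s algorithm — a bracketed burst can be blended by weighting each pixel according to how well exposed it is:

```python
import numpy as np

def naive_exposure_fusion(exposures):
    """Generic illustration of blending a bracketed burst (not the FBA method).

    exposures : list of same-size grayscale images with values in [0, 1],
                taken at different exposure settings.

    Each pixel is weighted by how far it sits from pure black or pure white,
    so dark frames contribute the highlights and bright frames contribute
    the shadows.
    """
    weights = [np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2)) for img in exposures]
    total = sum(weights) + 1e-12
    return sum(w * img for w, img in zip(weights, exposures)) / total
```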

If you are interested in video and image processing, email Professor Sapiro or check out his lab.

Got Data? 200+ Crunch Numbers for Duke DataFest

Photos by Rita Lo; Writing by Robin Smith

While many students’ eyes were on the NCAA Tournament this weekend, a different kind of tournament was taking place at the Edge. Students from Duke and five other area schools set up camp amidst a jumble of laptops and power cords and white boards for DataFest, a 48-hour stats competition with real-world data. Now in its fourth year at Duke, the event has grown from roughly two dozen students to more than 220 participants.

Teams of two to five students had 48 hours to make sense of a single data set. The data was kept secret until the start of the competition Friday night. Consisting of visitor info from a popular comparison shopping site, it was spread across five tables and several million rows.

“The size and complexity of the data set took me by surprise,” said junior David Clancy.

For many, it was their first experience with real-world data. “In most courses, the problems are guided and it is very clear what you need to accomplish and how,” said Duke junior Tori Hall. “DataFest is much more like the real world, where you’re given data and have to find your own way to produce something meaningful.”

“I didn’t expect the challenge to be so open-ended,” said Duke junior Greg Poore. “The stakeholder literally ended their ‘pitch’ to the participants with the company’s goals and let us loose from there.”

As they began exploring the data, the Poke.R team discovered that 1 in 4 customers spend more than they planned. The team then set about finding ways of helping the company identify these “dream customers” ahead of time based on their demographics and web browsing behavior — findings that won them first place in the “best insight” category.

“On Saturday afternoon, after 24 hours of working, we found all the models we tried failed miserably,” said team member Hong Xu. “But we didn’t give up and brainstormed and discussed our problems with the VIP consultants. They gave us invaluable insights and suggestions.”

Consultants from businesses and area schools stayed on hand until midnight on both Friday and Saturday to answer questions. Finally, on Sunday afternoon the teams presented their ideas to the judges.

Seniors Matt Tyler and Justin Yu of the Type 3 Errors team combined the assigned data set with outside data on political preferences to find out if people from red or blue cities were more likely to buy eco-friendly products.

“I particularly enjoyed DataFest because it encouraged interdisciplinary collaboration, not only between members from fields such as statistics, math, and engineering, but also economics, sociology, and, in our case, political science,” Yu said.

The Bayes’ Anatomy team won the best visualization category by illustrating trends in customer preferences with a flow diagram and a network graph aimed at improving the company’s targeted advertising.

“I was just very happily surprised to win!” said team member and Duke junior Michael Lin.

Mapping Science: The Power of Visualization

By Lyndsey Garcia

Mobile Landscapes: Using Location Data from Cell Phones for Urban Analysis

We are constantly surrounded by visuals: television, advertisements and posters. Humans have long used visuals such as cartographic maps of the physical world to guide exploration and to record what we have already learned.

But as research moves into more abstract territory that is harder to interact with or visualize, the art of science mapping has emerged as a looking glass, helping us interpret data and pick out outliers, clusters and trends.

Now on display from January 12 to April 10, 2015, the exhibit Places & Spaces: Mapping Science is a fascinating example of the features and importance of science mapping.

The result of a ten-year effort, with ten new maps added each year, all one hundred maps are on display at Duke in three locations: the Edge in Bostock Library, the third floor of Gross Hall, and the second floor of Bay 11 in Smith Warehouse.

Visualizing Bible Cross-References

Science maps take abstract concepts of science and make them more visible, concrete, and tangible. The scope of the exhibit is broad, including maps of the internet, of emerging pandemics in the developing world, and even of the mood of the U.S. based on an analysis of millions of public tweets. Science mapping is not limited to the natural or technological sciences. Several maps visualize social science data, such as Visualizing Bible Cross-References, which traces connections and similarities throughout the Bible: the axis represents the books of the Bible and the arches portray connections or similar phraseology between them.
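
The arc-diagram format behind that map is easy to prototype. The sketch below uses a handful of made-up book names and connections — not the exhibit’s data — just to show how the arches are drawn over a common axis.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical stand-in data: positions of "books" along an axis and a few
# cross-references between them (not the exhibit's actual dataset).
books = ["Genesis", "Exodus", "Psalms", "Isaiah", "Matthew", "Revelation"]
links = [(0, 4), (1, 3), (2, 5), (3, 4)]   # pairs of connected books

fig, ax = plt.subplots(figsize=(8, 3))
x = np.arange(len(books))

# Draw one semicircular arc per connection -- the core of an arc diagram.
for i, j in links:
    center, radius = (x[i] + x[j]) / 2, abs(x[j] - x[i]) / 2
    theta = np.linspace(0, np.pi, 100)
    ax.plot(center + radius * np.cos(theta), radius * np.sin(theta), color="steelblue")

ax.set_xticks(x)
ax.set_xticklabels(books, rotation=45, ha="right")
ax.set_yticks([])
ax.set_title("Arc diagram: connections between books (toy data)")
plt.tight_layout()
plt.show()
```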

Angela Zoss, the exhibit ambassador who brought the project to Duke, comments, “The visualization helps at multiple phases of the research process. It helps the researcher communicate the data and understand his or her data better. When we try to summarize things with equations or summary statistics, such as the average, the mean, or the median, we could be glossing over very important patterns or trends in the data. With visualization, we can often plot every single point in space for small data sets. One might detect a pattern that would otherwise have been lost in simple summary statistics.”

The physical exhibit is important to the Places & Spaces project because the maps are physically printed: some of the details are so intricate that the maps must be viewed in person to appreciate and understand the information they portray. For example, A Chart Illustrating Some of the Relations Between the Branches of Natural Science and Technology, a hand-drawn map from 1948, shows the relationships between the branches of natural science and technology using a distance-similarity metaphor, in which objects more similar to each other are placed closer together in space.

A Chart Illustrating Some of the Relations between the Branches of Natural Science and Technology. Used by permission of the Royal Society

The maps look more like works of art in a museum than a collection of maps to interpret data. Angela Zoss explains her love of visualization: “Visual graphics can inspire emotion and excitement in people. They can encourage people to feel for information that would otherwise seem dry or intangible. The exhibit heightens those emotions even more because you see so many wonderful examples from so many different viewpoints. Every person who visualizes data is going to make a different choice in the details they want represented. Being able to see that variety gives people a better idea of how much more is possible.”

Fruit flies get their close-up shot, Nobel style

By Robin Ann Smith

Any movie that begins with an extreme close-up of the back side of a fruit fly — the same kind found feeding on over-ripe tomatoes and bananas in your kitchen — may seem like an unlikely candidate for action blockbuster of the year. But this is no typical movie.

Duke biologists Dan Kiehart and Serdar Tulu recorded this 3D close-up of a developing fly embryo using new super-resolution microscope technology developed by Eric Betzig, one of the winners of the 2014 Nobel Prize in Chemistry.

Cutting-edge microscopes available on many campuses today allow researchers to take one or two images a second, but with a new technique called lattice light-sheet microscopy — developed by Betzig and colleagues and reported in the Oct. 24, 2014, issue of Science — researchers can take more than 50 images a second, and in the specimen’s natural state, without smooshing it under a cover slip.

Kiehart and Tulu traveled to the Howard Hughes Medical Institute’s Janelia Farm research campus in Ashburn, Virginia, where the new microscope is housed, to capture the early stages of a fruit fly’s development from egg to adult in 3D.

Fruit fly embryos are smaller than a grain of rice. By zooming in on an area of the fly embryo’s back that is about 80 microns long and 80 microns wide — a mere fraction of the size of the period at the end of this sentence — the researchers were able to watch a line of muscle-like cells draw together like a purse string to close a gap in the fly embryo’s back.

The process is a crucial step in the embryo’s development into a larva. It could help researchers better understand wound healing and spina bifida in humans.

Their movie was assembled from more than 250,000 2D images taken over 100 minutes. The snapshots were transferred to a computer, where image-processing software stitched them into a 3D movie.
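
As a rough illustration of that last step (and not the actual Janelia processing pipeline), the slices recorded at each time point can be stacked into a volume and flattened into a movie frame with a maximum-intensity projection:

```python
import numpy as np

def movie_from_slices(stack):
    """Illustrative sketch of turning many 2-D snapshots into movie frames.

    stack : NumPy array shaped (time, z, y, x) -- for each time point, the
            microscope sweeps a sheet of light through the embryo and
            records a pile of 2-D slices.

    A simple way to view the 3-D data over time is a maximum-intensity
    projection: each movie frame shows, for every (y, x) pixel, the
    brightest value found along the depth (z) axis.
    """
    return stack.max(axis=1)           # shape (time, y, x): one frame per time point

# Tiny synthetic example: 5 time points, 20 slices of 64x64 pixels each.
stack = np.random.rand(5, 20, 64, 64)
frames = movie_from_slices(stack)
print(frames.shape)                    # (5, 64, 64)
```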

“This microscope gives us the highest combination of spatial and temporal resolution that we can get,” Kiehart said.

Betzig won this year’s Nobel Prize for his work on techniques that allow researchers to peer inside living cells and resolve structures smaller than 200 nanometers, or half the wavelength of light — a scale once thought impossible using traditional light microscopes.
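
That “half the wavelength” figure comes from the classical diffraction limit for a light microscope. As a back-of-the-envelope check, with blue-violet light of about 400 nanometers and a numerical aperture of roughly 1 (an assumption chosen for simplicity):

```latex
d \;\approx\; \frac{\lambda}{2\,\mathrm{NA}}
  \;\approx\; \frac{400\ \mathrm{nm}}{2 \times 1}
  \;\approx\; 200\ \mathrm{nm}
```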

Even finer atomic-scale resolution has long been possible with microscopes that use beams of electrons rather than light, but only by killing and slicing the specimen first, so living cells and the tiny structures in motion inside them couldn’t be observed.

Betzig and collaborators Wesley Legant, Kai Wang, Lin Shao and Bi-Chang Chen of Janelia Farm Research Campus all played a role in developing this newest microscope, which creates an image using a thin sheet of patterned light.

The fly movie is part of a collection of videos recorded with the new technology and published in the Oct. 24 Science paper.

One video in the paper shows specialized tubes inside cells called microtubules — roughly 2,000 times narrower than a human hair — growing and shrinking as they help one cell split into two.

Other videos reveal the motions of individual cilia in a single-celled freshwater creature called Tetrahymena, or cells of a soil-dwelling slime mold banding together to form multicellular slugs.

Kiehart and Tulu will be going back to Janelia Farm in January to use the microscope again.

“For this visit we’re going to zoom in to a smaller area to look at individual cells,” Tulu said.

“Waking up the morning of October 8 and hearing on the radio that our paper includes a Nobel Prize winner was pretty special,” Kiehart said.

CITATION: “Lattice light-sheet microscopy: Imaging molecules to embryos at high spatiotemporal resolution,” Chen, B.-C., et al. Science, October 2014. http://www.sciencemag.org/content/346/6208/1257998