Duke Research Blog

Following the people and events that make up the research community at Duke.

Category: Visualization (Page 2 of 11)

Creating Technology That Understands Human Emotions

“If you – as a human – want to know how somebody feels, what would you look for?” Professor Shaundra Daily asked the audience during an ECE seminar last week.

“Facial expressions.”
“Body Language.”
“Tone of voice.”
“They could tell you!”

Over 50 students and faculty gathered over cookies and fruit for Dr. Daily’s talk on designing applications to support personal growth. Dr. Daily is an Associate Professor in the Department of Computer and Information Science and Engineering at the University of Florida, interested in affective computing and STEM education.

Dr. Daily explaining the various types of devices used to analyze people’s feelings and emotions. For example, pressure sensors on a computer mouse helped measure the frustration of participants as they filled out an online form.

Affective Computing

The visual and auditory cues proposed above give a human clues about the emotions of another human. Can we use technology to better understand our mental state? Is it possible to develop software applications that can play a role in supporting emotional self-awareness and empathy development?

Until recently, technologists have largely ignored emotion in understanding human learning and communication processes, partly because it has been misunderstood and is hard to measure. Guided by the questions above, affective computing researchers use pattern analysis, signal processing, and machine learning to extract affective information from the signals human beings express. This work is integral to restoring a proper balance between emotion and cognition in designing technologies that address human needs.

Dr. Daily and her group of researchers used skin conductance as a measure of engagement and memory stimulation. Changes in skin conductance, a measure of sweat secretion from the sweat glands, are triggered by arousal: a nervous person produces more sweat than a sleeping or calm individual, resulting in an increase in skin conductance.
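As a rough illustration of how such a signal can be processed (a generic sketch, not Dr. Daily’s actual pipeline; the sampling rate and threshold are made-up values), a rising skin conductance trace can be flagged as a candidate arousal response with a simple rate-of-change test:

```python
import numpy as np

def arousal_events(scl, fs=4.0, rise_threshold=0.05):
    """Flag samples where skin conductance rises faster than a threshold.

    scl: skin conductance level (microsiemens), sampled at fs Hz.
    Returns a boolean mask of candidate arousal onsets.
    """
    # Short moving average to suppress sensor noise.
    kernel = np.ones(int(fs)) / int(fs)
    smooth = np.convolve(scl, kernel, mode="same")
    # Rate of change in microsiemens per second.
    rate = np.diff(smooth, prepend=smooth[0]) * fs
    return rate > rise_threshold

# Synthetic trace: 15 s of calm baseline, a 5 s rise, then a plateau.
scl = np.concatenate([np.full(60, 2.0),
                      np.linspace(2.0, 3.0, 20),
                      np.full(40, 3.0)])
events = arousal_events(scl)
```

Real skin conductance responses are noisier and slower than this toy trace, but the same idea underlies peak-detection approaches to galvanic skin response data.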

Galvactivators, devices that sense and communicate skin conductivity, are often placed on the palms, which have a high density of the eccrine sweat glands.

Applying this knowledge to the field of education, can we give a teacher physiologically-based information on student engagement during class lectures? Dr. Daily initiated Project EngageMe by placing galvactivators like the one in the picture above on the palms of students in a college classroom. Professors were able to use the results chart to reflect on different parts and types of lectures based on the responses from the class as a whole, as well as analyze specific students to better understand the effects of their teaching methods.

Project EngageMe: Screenshot of digital prototype of the reading from the galvactivator of an individual student.

The project ended up causing quite a bit of controversy, however, due to privacy concerns as well as the ambiguity of skin conductance itself. Skin conductance can increase for a variety of reasons: a student watching a funny video on Facebook might display conductance levels similar to those of an attentive student. Thus, the peaks on the graph are not necessarily correlated with events in the classroom.

Educational Research

Daily’s research blends computational learning with social and emotional learning. Her projects encourage students to develop computational thinking through reflecting on the community with digital storytelling in MIT’s Scratch, learning to use 3D printers and laser cutters, and expressing ideas using robotics and sensors attached to their body.

VENVI, Dr. Daily’s latest research project, uses dance to teach basic computational concepts. By letting users program a 3D virtual character that follows dance movements, VENVI reinforces important programming concepts such as step sequences, ‘for’ and ‘while’ loops of repeated moves, and functions with conditions that determine when the character performs the steps.
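The loop idea VENVI teaches maps directly onto conventional code. As a hypothetical sketch (VENVI itself is a visual environment, not Python), repeating a sequence of dance moves is just a ‘for’ loop:

```python
def perform(steps, times=1):
    """Repeat a dance sequence -- the core idea behind loops in VENVI."""
    routine = []
    for _ in range(times):          # a 'for' loop of repeated moves
        routine.extend(steps)
    return routine

# A two-move chorus, danced twice.
chorus = perform(["step left", "spin"], times=2)
```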

Dr. Daily and her research group observed increased interest from students in pursuing STEM fields as well as a shift in their opinion of computer science. Drawings from Dr. Daily’s Women in STEM camp completed on the first day consisted of computer scientist representations as primarily frazzled males coding in a small office, while those drawn after learning with VENVI included more females and engagement in collaborative activities.

VENVI is programming software that allows users to program a virtual character to perform a sequence of steps in a 3D virtual environment.

In human-to-human interactions, we are able to draw on our experiences to connect and empathize with each other. As robots and virtual machines take on increasing roles in our daily lives, it’s time to start designing emotionally intelligent devices that can learn to empathize with us as well.

Post by Anika Radiya-Dixit

Seeing Nano

Take pictures at more than 300,000 times magnification with electron microscopes at Duke

Sewer gnat head

An image of a sewer gnat’s head taken through a scanning electron microscope. Courtesy of Fred Nijhout.

The sewer gnat is a common nuisance around kitchen and bathroom drains that’s no bigger than a pea. But magnified thousands of times, its compound eyes and bushy antennae resemble a first place winner in a Movember mustache contest.

Sewer gnats’ larger cousins, horseflies, are known for their painful bite. Zoom in and it’s easy to see how they hold onto their furry livestock prey: the tiny hooked hairs on their feet look like Velcro.

Students in professor Fred Nijhout’s entomology class photograph these and other specimens at more than 300,000 times magnification at Duke’s Shared Material & Instrumentation Facility (SMIF).

There the insects are dried, coated in gold and palladium, and then bombarded with a beam of electrons from a scanning electron microscope, which can resolve structures tens of thousands of times smaller than the width of a human hair.

From a ladybug’s leg to a weevil’s suit of armor, the bristly, bumpy, pitted surfaces of insects are surprisingly beautiful when viewed up close.

“The students have come to treat travels across the surface of an insect as the exploration of a different planet,” Nijhout said.

Horsefly foot

The foot of a horsefly is equipped with menacing claws and Velcro-like hairs that help them hang onto fur. Photo by Valerie Tornini.

Weevil

The hard outer skeleton of a weevil looks smooth and shiny from afar, but up close it’s covered with scales and bristles. Courtesy of Fred Nijhout.

fruit fly wing

Magnified 500 times, the rippled edges of this fruit fly wing are the result of changes in the insect’s genetic code. Courtesy of Eric Spana.

You, too, can gaze at alien worlds too small to see with the naked eye. Students and instructors across campus can use the SMIF’s high-powered microscopes and other state of the art research equipment at no charge with support from the Class-Based Explorations Program.

Biologist Eric Spana’s experimental genetics class uses the microscopes to study fruit flies that carry genetic mutations that alter the shape of their wings.

Students in professor Hadley Cocks’ mechanical engineering 415L class take lessons from objects that break. A scanning electron micrograph of a cracked cymbal once used by the Duke pep band reveals grooves and ridges consistent with the wear and tear from repeated banging.

cracked cymbal

Magnified 3000 times, the surface of this broken cymbal once used by the Duke Pep Band reveals signs of fatigue cracking. Courtesy of Hadley Cocks.

These students are among more than 200 undergraduates in eight classes who benefitted from the program last year, thanks to a grant from the Donald Alstadt Foundation.

You don’t have to be a scientist, either. Historians and art conservators have used scanning electron microscopes to study the surfaces of Bronze Age pottery, the composition of ancient paints and even dust from Egyptian mummies and the Shroud of Turin.

Instructors and undergraduates are invited to find out how they could use the microscopes and other nanotech equipment in the SMIF in their teaching and research. Queries should be directed to Dr. Mark Walters, Director of SMIF, via email at mark.walters@duke.edu.

Located on Duke’s West Campus in the Fitzpatrick Building, the SMIF is a shared use facility available to Duke researchers and educators as well as external users from other universities, government laboratories or industry through a partnership called the Research Triangle Nanotechnology Network. For more info visit http://smif.pratt.duke.edu/.

Scanning electron microscope

This scanning electron microscope could easily be mistaken for equipment from a dentist’s office.

Post by Robin Smith

X-mas Under X-ray

If, like me, you just cannot wait until Christmas morning to find out what goodies are hiding in those shiny packages under the tree, we have just the solution for you: stick them in a MicroCT scanner.

A Christmas present inside a MicroCT scanner.

Our glittery package gets the X-ray treatment inside Duke’s MicroCT scanner. Credit Justin Gladman.

Micro computed-tomography (CT) scanners use X-ray beams and sophisticated visual reconstruction software to “see” into objects and create 3D images of their insides. In recent years, Duke’s MicroCT has been used to tackle some fascinating research projects, including digitizing fossils, reconstructing towers made of stars, peeking inside 3D-printed electronic devices, and creating a gorgeous 3D reconstruction of the organs and muscle tissue inside a Southeast Asian tree shrew.

A 20 minute scan revealed a devilish-looking rubber duck. Credit Justin Gladman.

But when engineer Justin Gladman offered to give us a demo of the machine last week, we both agreed there was only one object we wanted a glimpse inside: a sparkly holiday gift bag.

While securing the gift atop a small, rotating pedestal inside the device, Gladman explained how the device works. Like the big CT scanners you may have encountered at a hospital or clinic, the MicroCT uses X-rays to create a picture of the density of an object at different locations. By taking a series of these scans at different angles, a computer algorithm can then reconstruct a full 3D model of the density, revealing bones inside of animals, individual circuits inside electronics – or a present inside a box.
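The reconstruction step Gladman describes can be sketched in miniature. The toy example below simulates projections of a 2D density “slice” at many angles and recovers the object by unfiltered backprojection; it is a simplified stand-in for the real scanner’s software (actual CT uses filtered backprojection or iterative methods), and the phantom is made up:

```python
import numpy as np
from scipy.ndimage import rotate

def project(image, angles):
    # A projection at each angle sums density along parallel rays.
    return np.array([rotate(image, a, reshape=False, order=1).sum(axis=0)
                     for a in angles])

def backproject(sinogram, angles, size):
    # Smear each 1D projection back across the image plane, rotate it
    # into place, and accumulate; density piles up where the object is.
    recon = np.zeros((size, size))
    for proj, a in zip(sinogram, angles):
        smear = np.tile(proj, (size, 1))
        recon += rotate(smear, -a, reshape=False, order=1)
    return recon / len(angles)

# Made-up phantom: a dense block off-center in a 64x64 slice.
size = 64
phantom = np.zeros((size, size))
phantom[20:28, 36:44] = 1.0
angles = np.arange(0, 180, 6)        # one projection every 6 degrees
recon = backproject(project(phantom, angles), angles, size)
```

The reconstruction is blurry without filtering, but the peak density lands where the “object” sits, which is the essence of how the scanner locates a duck inside a gift bag.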

“Our machine is built to handle a lot of different specimens, from bees to mechanical parts to computer chips, so we have a little bit of a jack-of-all-trades,” Gladman said.

Within a few moments of sticking the package in the beam, a 2D image of the object in the bag appears on the screen. It looks kind of like the Stay Puft Marshmallow Man, but wait – are those horns?

Blue devil ducky in the flesh.

Gladman sets up a full 3D scan of the gift package, and after 20 minutes the contents of our holiday loot are clear. We have a blue devil rubber ducky on our hands!

Blue ducky is a fun example, but the SMIF lab always welcomes new users, Gladman says, especially students and researchers with creative new applications for the equipment. For more information on how to use Duke’s MicroCT, contact Justin Gladman or visit the Duke SMIF lab at their website, Facebook, Youtube or Instagram pages.

Kara J. Manke, PhD

Post by Kara Manke

Mapping the Brain With Stories

Dr. Alex Huth. Image courtesy of The Gallant Lab.

On October 15, I attended a presentation on “Using Stories to Understand How The Brain Represents Words,” sponsored by the Franklin Humanities Institute and Neurohumanities Research Group and presented by Dr. Alex Huth. Dr. Huth is a neuroscience postdoc who works in the Gallant Lab at UC Berkeley and was here on behalf of Dr. Jack Gallant.

Dr. Huth started off the lecture by discussing how semantic tasks activate huge swaths of the cortex, and why stories are especially rich material for studying the semantic system. The question at issue was “how the brain represents words.”

To investigate this, the Gallant Lab designed a natural language experiment. Subjects lay in an fMRI scanner and listened to ten naturally spoken narratives, 72 hours’ worth of stories in all, hearing many different words and concepts. Using an imaging technique called GE-EPI fMRI, the researchers recorded BOLD responses from the whole brain.

Dr. Huth explaining the process of obtaining the new colored models that revealed semantic “maps are consistent across subjects.”

Dr. Huth showed a scan and said, “So looking…at this volume of 3D space, which is what you get from an fMRI scan…is actually not that useful to understanding how things are related across the surface of the cortex.” This limitation led the researchers to reconstruct the cortical surface and flatten it into a 2D image that reveals what is going on across the whole brain. This approach allowed them to see where in the brain activity tracked what the subject was hearing.

The resulting model required interpreting individual voxels, which “is hard and lots of work,” said Dr. Huth. “There’s a lot of subjectivity that goes into this.” To simplify voxel interpretation, the researchers used principal components analysis to reduce the data to a low-dimensional subspace that captures the main classes of voxels. In other words, they took the data, found the important factors that were shared across subjects, and interpreted the meaning of those components. To visualize the components, the researchers sorted words into twelve different categories.
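The dimensionality reduction described here can be illustrated with a generic PCA computed via the singular value decomposition; the synthetic “voxel” data below are invented for the example and stand in for the lab’s actual model weights:

```python
import numpy as np

def principal_components(responses, k):
    """Top-k principal components of a voxel-by-feature matrix via SVD."""
    centered = responses - responses.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    # Rows of vt are components; s**2/(n-1) is the variance each explains.
    return vt[:k], (s ** 2) / (len(responses) - 1)

rng = np.random.default_rng(0)
# 500 synthetic "voxels" with weights over 30 semantic features,
# generated from 3 latent dimensions plus a little noise.
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 30))
voxels = latent @ mixing + 0.1 * rng.normal(size=(500, 30))
components, variances = principal_components(voxels, k=3)
```

Because the synthetic data really do vary along only three latent dimensions, the first three variances dwarf the rest, which is the kind of structure that makes a low-dimensional summary of thousands of voxels interpretable.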

The Four Categories of Words Sorted in an X,Y-like Axis

These categories were then further simplified into four “areas” on what might resemble an x,y axis: violent words in the top right, social perceptual words in the top left, words relating to “social” concepts in the lower left, and emotional words in the lower right. Instead of x,y axis labels there were PC labels, and the words from the study were colored based on where they appeared in the PC space.

By using this model, the Gallant Lab could identify which patches of the brain were doing different things. Small patches of color showed which “things” the brain was “doing” or “relating.” The researchers found that these complex cortical maps of semantic information were consistent across subjects.

These responses were then used to create models that could predict BOLD responses from the semantic content in stories. The result of the study was that the parietal cortex, temporal cortex, and prefrontal cortex represent the semantics of narratives.

Post by Meg Shieh

Students Mine Parking Data to Help You Find a Spot

No parking spot? No problem.

A group of students has teamed up with Duke Parking and Transportation to explore how data analysis and visualization can help make parking on campus a breeze.

As part of the Information Initiative’s Data+ program, students Mitchell Parekh (’19) and Morton Mo (’19), along with IIT student Nikhil Tank (’17), spent 10 weeks over the summer poring over parking data collected at 42 of Duke’s permitted lots.

Under the mentorship of graduate student Nicolas-Aldebrando Benelli, they identified common parking patterns across the campus, with the goal of creating a “redirection” tool that could help Duke students and employees figure out the best place to park if their preferred lot is full.

A map of parking patterns at Duke

To understand parking patterns at Duke, the team created “activity” maps, where each circle represents one of Duke’s parking lots. The size of the circle indicates the size of the lot, and the color of the circle indicates how many people entered and exited the lot within a given hour.

“We envision a mobile app where, before you head out for work, you could check your lot on your phone,” Mo said, speaking with Parekh at the Sept. 23 Visualization Friday Forum. “And if the lot is full, it would give you a pass for an alternate lot.”

Starting with parking data gathered in Fall 2013, which logged permit holders “swiping” in and out from each lot, they set out to map some basic parking habits at Duke, including how full each lot is, when people usually arrive, and how long they stay.

However, the data weren’t always very agreeable, Mo said.

“One of the things we got was a historical occupancy count, which is exactly what we wanted – the number of cars in the facility at a given time – but we were seeing negative numbers,” said Mo. “So we figured that table might not be as trustworthy as we expected it to be.”

Other unexpected features, such as “passback,” which occurs when two cars enter or exit under the same pass, also created challenges with interpreting the data.

However, with some careful approximations, the team was able to estimate the occupancy of each lot on campus at different times throughout an average weekday.
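One such approximation can be sketched directly: reconstruct a lot’s occupancy by replaying the swipe log and clamping the running count at zero, so that unmatched exits (from passback or missed swipes) can’t drive it negative. The log below is invented for the example:

```python
from datetime import datetime

def occupancy_timeline(events):
    """Replay ('in'|'out', timestamp) swipes into an occupancy estimate.

    Passback and missed swipes make raw counts drift negative, so the
    running count is clamped at zero -- a crude but useful correction.
    """
    count = 0
    timeline = []
    for direction, ts in sorted(events, key=lambda e: e[1]):
        count += 1 if direction == "in" else -1
        count = max(count, 0)  # an exit with no matching logged entry
        timeline.append((ts, count))
    return timeline

log = [
    ("in",  datetime(2013, 9, 2, 8, 0)),
    ("out", datetime(2013, 9, 2, 8, 5)),
    ("out", datetime(2013, 9, 2, 8, 6)),   # passback: same exit logged twice
    ("in",  datetime(2013, 9, 2, 9, 0)),
]
timeline = occupancy_timeline(log)
```

Clamping trades a small bias for robustness: it can undercount briefly after a passback, but it never reports the impossible negative occupancies the raw table contained.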

They then built an interactive, Matlab-based tool that would suggest up to three alternative parking locations based on the users’ location and travel time plus the utilization and physical capacity of each lot.
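A toy version of such a redirection rule (the team’s actual tool was Matlab-based; the weights and lot data here are purely illustrative) might rank non-full lots by a weighted mix of travel time and current utilization:

```python
def suggest_lots(lots, travel_minutes, k=3):
    """Rank non-full lots by a weighted mix of travel time and fullness.

    lots: name -> (occupied, capacity); travel_minutes: name -> minutes.
    The 0.6/0.4 weights are illustrative, not the team's actual scoring.
    """
    candidates = []
    for name, (occupied, capacity) in lots.items():
        utilization = occupied / capacity
        if utilization >= 1.0:
            continue                      # never redirect to a full lot
        score = 0.6 * travel_minutes[name] + 0.4 * 10 * utilization
        candidates.append((score, name))
    return [name for _, name in sorted(candidates)[:k]]

# Invented lot data: (cars currently parked, total spaces).
lots = {"Blue": (100, 100), "Green": (40, 80),
        "Gross": (70, 75), "PG4": (10, 200)}
times = {"Blue": 2, "Green": 5, "Gross": 4, "PG4": 9}
suggestions = suggest_lots(lots, times)
```

Here the full “Blue” lot is excluded outright, and the half-empty “Green” lot wins despite a slightly longer walk, because its low utilization offsets the extra minutes.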

“Duke Parking is really happy with the interface that we built, and they want us to keep working on it,” Parekh said.

“The data team worked hard on real world challenges, and provided thoughtful insights to those challenges,” said Kyle Cavanaugh, Vice President of Administration at Duke. “The team was terrific to work with and we look forward to future collaboration.”

Hectic class schedules allowing, the team hopes to continue developing their application into a more user-friendly tool. You can watch a recording of Mo and Parekh’s Sept. 23 presentation here.

The team's algorithm recommends up to three alternative lots if a commuter's preferred lot is full. In this video, suggested alternatives to the blue lot are updated throughout the day to reflect changing traffic and parking patterns. Video courtesy of Nikhil Tank.

Kara J. Manke, PhD

Post by Kara Manke

Is Durham’s Revival Pricing Some Longtime Residents Out?

When a 2015 national report on gentrification released its results for the nation’s 50 largest cities, both Charlotte and Raleigh, North Carolina’s two biggest cities, made the list.

The result was a collection of maps and tables indicating whether various neighborhoods in each city had gentrified or not, based on changes in home values and other factors from 1990 to the present.

Soon Durham residents, business owners, policy wonks and others will have easy access to similar information about their neighborhoods too, thanks to planned updates to a web-based mapping tool called Durham Neighborhood Compass.

Two Duke students are part of the effort. For ten weeks this summer, undergraduates Anna Vivian and Vinai Oddiraju worked with Neighborhood Compass Project Manager John Killeen and Duke economics Ph.D. student Olga Kozlova to explore real-world data on Durham’s changing neighborhoods as part of a summer research program called Data+.

As a first step, they looked at recent trends in the housing market and business development.

Photo by Mark Moz.

Durham real estate and businesses are booming. A student mapping project aims to identify the neighborhoods at risk of pricing longtime residents out. Photo by Mark Moz.

Call it gentrification. Call it revitalization. Whatever you call it, there’s no denying that trendy restaurants, hotels and high-end coffee shops are popping up across Durham, and home values are on the rise.

Integrating data from the Secretary of State, the Home Mortgage Disclosure Act and local home sales, the team analyzed data for all houses sold in Durham between 2010 and 2015, including list and sale prices, days on the market, and owner demographics such as race and income.

They also looked at indicators of business development, such as the number of business openings and closings per square mile.

A senior double majoring in physics and art history, Vivian brought her GIS mapping skills to the project. Junior statistics major Oddiraju brought his know-how with computer programming languages.

To come up with averages for each neighborhood or Census block group, they first converted every street address in their dataset into latitude and longitude coordinates on a map, using a process called geocoding. The team then created city-wide maps of the data using GIS mapping software.
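The aggregation step, once each sale has been geocoded and assigned to a block group, amounts to a grouped average. The sketch below assumes that assignment has already happened; the block-group IDs and prices are invented:

```python
from collections import defaultdict

def average_by_area(records):
    """Average sale price per Census block group.

    records: (block_group_id, sale_price) pairs. In the real pipeline the
    block group comes from geocoding each street address first; here it
    is taken as given.
    """
    totals = defaultdict(lambda: [0.0, 0])
    for area, price in records:
        totals[area][0] += price
        totals[area][1] += 1
    return {area: total / n for area, (total, n) in totals.items()}

# Hypothetical block-group IDs and list prices.
sales = [("370630001001", 210000), ("370630001001", 250000),
         ("370630002002", 180000)]
avg = average_by_area(sales)
```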

One of their maps shows the average listing price of homes for sale between 2014 and 2015, when housing prices in the area around Duke University’s East Campus, between Broad Street and Buchanan Boulevard, went up by $40,000 in a single year, the biggest spike in the city.

Their web app shows that more businesses opened in downtown and in south Durham than in other parts of the city.

Duke students are developing a web app that allows users to see the number of new businesses that have been opening across Durham. The data will appear in future updates to a web-based mapping tool called Durham Neighborhood Compass.

They also used a programming language called “R” to build an interactive web app that enables users to zoom in on specific neighborhoods and see the number of new businesses that opened, compare a given neighborhood to the average for Durham county as a whole, or toggle between years to see how things changed over time.

The Durham Neighborhood Compass launched in 2014. The tool uses data from local government, the Census Bureau and other state and federal agencies to monitor nearly 50 indicators related to quality of life and access to services.

When it comes to gentrification, users can already track neighborhood-by-neighborhood changes in race, household income, and the percentage of households that are paying 30 percent or more of their income for housing — more than many people can afford.

Vivian and Oddiraju expect the scripts and methods they developed will be implemented in future updates to the tool.

When they do, the team hopes users will be able to compare the average initial asking price to the final sale price to identify neighborhoods where bidding has been the highest, or see how fast properties sell once they go on the market — good indicators of how hot they are.

Visitors will also be able to compare the median income of people buying into a neighborhood to that of the people who already live there. This will help identify neighborhoods that are at risk of pricing out residents, especially renters, who have long called the city home.

Vivian and Oddiraju were among more than 60 students who shared preliminary results of their work at a poster session on Friday, July 29 in Gross Hall.

Vivian plans to continue working on the project this fall, when she hopes to comb through additional data sets they didn’t get to this summer.

“One that I’m excited about is the data on applications for renovation permits and historic tax credits,” Vivian said.

She also hopes to further develop the web app to make it possible to look at multiple variables at once. “If sale prices are rising in areas where people have also filed lots of remodeling permits, for example, that could mean that they’re flipping those houses,” Vivian said.

Data+ is sponsored by the Information Initiative at Duke, the Social Sciences Research Institute and Bass Connections. Additional funding was provided by the National Science Foundation via a grant to the departments of mathematics and statistical science.

Writing by Robin Smith; video by Sarah Spencer and Ashlyn Nuckols
