Creating Technology That Understands Human Emotions

“If you – as a human – want to know how somebody feels, for what might you look?” Professor Shaundra Daily asked the audience during an ECE seminar last week.

“Facial expressions.”
“Body Language.”
“Tone of voice.”
“They could tell you!”

Over 50 students and faculty gathered over cookies and fruit for Dr. Daily’s talk on designing applications to support personal growth. Dr. Daily is an Associate Professor in the Department of Computer and Information Science and Engineering at the University of Florida whose interests include affective computing and STEM education.

Dr. Daily explaining the various types of devices used to analyze people’s feelings and emotions. For example, pressure sensors on a computer mouse helped measure the frustration of participants as they filled out an online form.

Affective Computing

The visual and auditory cues proposed above give a human clues about the emotions of another human. Can we use technology to better understand our mental state? Is it possible to develop software applications that can play a role in supporting emotional self-awareness and empathy development?

Until recently, technologists have largely ignored emotion in understanding human learning and communication processes, partly because it has been misunderstood and is hard to measure. Pursuing the questions above, affective computing researchers use pattern analysis, signal processing, and machine learning to extract affective information from the signals human beings express. This work is integral to restoring a proper balance between emotion and cognition in designing technologies that address human needs.

Dr. Daily and her group of researchers used skin conductance as a measure of engagement and memory stimulation. Changes in skin conductance, a measure of sweat secretion from the sweat glands, are triggered by arousal. For example, a nervous person produces more sweat than a sleeping or calm individual, resulting in an increase in skin conductance.
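Skin conductance data typically arrive as a slowly varying signal punctuated by small, brief rises. Below is a minimal sketch of how one might flag those rises as candidate arousal events; the sampling rate, thresholds, and synthetic signal are illustrative assumptions, not the Daily lab’s actual processing pipeline.

```python
# A minimal sketch (not the Daily lab's pipeline) of flagging arousal events
# in a skin-conductance recording: smooth the signal, then mark peaks that
# rise sufficiently above their surroundings.
import numpy as np
from scipy.signal import find_peaks
from scipy.ndimage import uniform_filter1d

fs = 4  # assumed sample rate in Hz; wearable EDA sensors often sample at a few Hz
t = np.arange(0, 600, 1 / fs)             # ten minutes of samples
eda = 2 + 0.05 * np.random.randn(t.size)  # placeholder signal in microsiemens

smoothed = uniform_filter1d(eda, size=int(2 * fs))  # 2-second moving average
# Skin conductance responses are small, brief rises; flag peaks at least
# 0.02 uS above their neighborhood, spaced at least 5 seconds apart.
peaks, _ = find_peaks(smoothed, prominence=0.02, distance=5 * fs)

print(f"{peaks.size} candidate arousal events in {t[-1] / 60:.0f} minutes")
```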

Galvactivators, devices that sense and communicate skin conductivity, are often placed on the palms, which have a high density of the eccrine sweat glands.

Applying this knowledge to the field of education, can we give a teacher physiologically-based information on student engagement during class lectures? Dr. Daily initiated Project EngageMe by placing galvactivators like the one in the picture above on the palms of students in a college classroom. Professors were able to use the results chart to reflect on different parts and types of lectures based on the responses from the class as a whole, as well as analyze specific students to better understand the effects of their teaching methods.

Project EngageMe: Screenshot of digital prototype of the reading from the galvactivator of an individual student.

The project ended up causing quite a bit of controversy, however, due to privacy issues as well as the limits of our understanding of skin conductance. Skin conductance can increase for a variety of reasons: a student watching a funny video on Facebook might display levels of conductance similar to those of an attentive student. Thus, the readings on the graph are not necessarily correlated with events in the classroom.

Educational Research

Daily’s research blends computational learning with social and emotional learning. Her projects encourage students to develop computational thinking by reflecting on their communities through digital storytelling in MIT’s Scratch, learning to use 3D printers and laser cutters, and expressing ideas with robotics and sensors attached to their bodies.

VENVI, Dr. Daily’s latest research, uses dance to teach basic computational concepts. By letting users program a 3D virtual character that follows dance movements, VENVI reinforces important programming concepts such as step sequences, ‘for’ and ‘while’ loops of repeated moves, and functions with conditions that determine when the character performs the steps.
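VENVI itself is a visual, block-based environment, but the same ideas carry over to text-based code. Here is a rough Python analogy of the concepts mentioned above; the `step` and `chorus` functions are invented for illustration and are not VENVI’s actual interface.

```python
# A rough Python analogy for the concepts VENVI teaches: step sequences,
# loops of repeated moves, and functions whose conditions decide when the
# character performs a step.
def step(move):
    print(f"character does: {move}")

def chorus(energy):
    # A function with a condition: only jump when the routine is high-energy.
    if energy > 5:
        step("jump")
    step("spin")

routine = ["step left", "step right", "clap"]   # a step sequence
for _ in range(4):                              # a 'for' loop of repeated moves
    for move in routine:
        step(move)

beats = 8
while beats > 0:                                # a 'while' loop
    chorus(energy=beats)
    beats -= 2
```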

 

 

Dr. Daily and her research group observed increased interest from students in pursuing STEM fields, as well as a shift in their view of computer science. On the first day of Dr. Daily’s Women in STEM camp, students’ drawings depicted computer scientists mostly as frazzled men coding alone in small offices; drawings made after learning with VENVI included more women and more collaborative activity.

VENVI is programming software that lets users program a virtual character to perform a sequence of steps in a 3D virtual environment.

In human-to-human interactions, we are able to draw on our experiences to connect and empathize with each other. As robots and virtual machines take on ever larger roles in our daily lives, it’s time to start designing emotionally intelligent devices that can learn to empathize with us as well.

Post by Anika Radiya-Dixit

Science Meets Policy, and Maybe They Even Understand Each Other!

As we’ve seen many times, when complex scientific problems like stem cells, alternative energy or mental illness meet the policy world, things can get a little messy. Scientists generally don’t know much about law and policy, and very few policymakers are conversant with the specialized dialects of the sciences.

A screenshot of SciPol’s handy news page.

Add the recent rapid emergence of autonomous vehicles, artificial intelligence and gene editing, and you can see things aren’t going to get any easier!

To try to help, Duke’s Science and Society initiative has launched an ambitious policy analysis group called SciPol that aims to offer clear insight into the intersection of scientific knowledge and policymaking. Its goal is to be a key source of unbiased, high-quality information for policymakers, academics, commercial interests, nonprofits and journalists.

“We’re really hoping to bridge the gap and make science and policy accessible,” said Andrew Pericak, a contributor and editor of the service who holds a 2016 master’s in environmental management from the Nicholas School.

The program also will serve as a practical training ground for students who aspire to live and work in that rarefied space between the two realms, and will provide them with published work to help them land internships and jobs, said SciPol director Aubrey Incorvaia, a 2009 master’s graduate of the Sanford School of Public Policy.

Aubrey Incorvaia chatted with law professor Jeff Ward (center) and Science and Society fellow Thomas Williams at the kickoff event.

SciPol launched quietly in the fall with a collection of policy development briefs focused on neuroscience, genetics and genomics. Robotics and artificial intelligence coverage began at the start of January. Nanotechnology will launch later this semester and preparations are being made for energy to come online later in the year. Nearly all topics are led by a PhD in that field.

“This might be a different type of writing than you’re used to!” Pericak told a meeting of prospective undergraduate and graduate student authors at an orientation session last week.

Some courses will be making SciPol brief writing a part of their requirements, including law professor Jeff Ward’s section on the frontier of robotics law and ethics. “We’re doing a big technology push in the law school, and this is a part of it,” Ward said.

Because the research and writing are a learning exercise, briefs are published only after a rigorous process of review and editing.

A quick glance at the latest offerings shows in-depth policy analyses of aerial drones, automated vehicles, genetically modified salmon, sports concussions and dietary supplements that claim to boost brain power.

To keep up with the latest developments, the SciPol staff maintains searches on WestLaw, the Federal Register and other sources to see where science policy is happening. “But we are probably missing some things, just because the government does so much,” Pericak said.

Post by Karl Leif Bates

Rooftop Observatory Tracks Hurricane Rain and Winter Snow

Jonathan Holt replaces the protective cover over the rain gauge.

On Friday night, while most of North Carolina braced against the biting sleet and snow with hot cocoa and Netflix, a suite of research instruments stood tall above Duke’s campus, quietly gathering data on the storm.

The instruments are part of a new miniature cloud and precipitation-monitoring laboratory installed on the roof of Fitzpatrick CIEMAS by graduate student Jonathan Holt and fellow climate researchers in Ana Barros’s lab.

The team got the instruments up and running in early October, just in time for their rain gauge to register a whopping six inches of rain in six hours at the height of Hurricane Matthew — an accumulation rate comparable to that of Hurricane Katrina when it made landfall in Mississippi. Last weekend, they collected similar data on the winter storm, their Micro Rain Radar tracking the rate of snowfall throughout the night.

The rooftop is just the latest location where the Barros group is gathering precipitation data, joining sites in the Great Smokies, the Central Andes of Peru, and Southern Africa. These three instruments, with a fourth added in early January, are designed to continuously track the precipitation rate, the size and shape of raindrops or snowflakes (which climatologists collectively dub hydrometeors), and the formation and height of clouds in the air above Duke.

Ana Barros, a professor of civil and environmental engineering at Duke, says that her team uses these field observations, combined with atmospheric data from institutions like NOAA and NASA, to study how microscopic particles of dust, smoke, or other materials in the air, called aerosols, interact with water vapor to form clouds and precipitation. Understanding these interactions is a key prerequisite to building accurate weather and climate models.

“What we are trying to do here is to actually follow the lifecycle of water droplets in the air, and understand how that varies depending on weather systems, on conditions, on the climatic region and the location on the landscape,” Barros said.

A disdrometer on the roof of Fitzpatrick CIEMAS.

A laser beam passing between the two heads of the disdrometer detects the numbers and sizes of passing raindrops or snowflakes.

Besides tracking dramatic events like Matthew, Barros says the team is also interested in gathering data on light rainfall, defined as precipitation falling at a rate of less than 3 mm per hour, throughout the year. Light rainfall is a significant source of water in the region, comprising about 35 percent of annual rainfall. Studies have shown it is particularly vulnerable to climate change, because even modest bumps in temperature can cause these small water droplets to evaporate back into vapor.
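As a back-of-the-envelope illustration of where a figure like that 35 percent comes from, one could tally hourly rain-gauge totals and ask what fraction of the year’s accumulation fell below the 3 mm-per-hour cutoff. The sketch below uses randomly generated placeholder data, not the Barros lab’s observations, so the printed number is not the regional figure.

```python
# Placeholder rain-gauge bookkeeping: what share of a year's rainfall fell
# as "light rain" (under 3 mm per hour)?
import numpy as np

rng = np.random.default_rng(0)
hourly_mm = rng.exponential(scale=0.3, size=8760)     # mostly light hours
storm_hours = rng.choice(8760, size=60, replace=False)
hourly_mm[storm_hours] += rng.exponential(scale=10.0, size=60)  # a few downpours

light = hourly_mm[hourly_mm < 3.0].sum()              # the post's 3 mm/h cutoff
print(f"light-rain share of annual total: {100 * light / hourly_mm.sum():.0f}%")
```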

Eliminating this water source “is not a dramatic change,” Barros said. “But it is one of those very important changes that has implications for how we manage water, how we use water, how we design infrastructure, how we have to actually plan for the future.”

Barros says she is unaware of any similar instrument suites in North Carolina, putting their rooftop site in position to provide unique insights about the region’s climate. And unlike their mountainous field sites, instruments on the roof are less prone to being co-opted by itchy bears.

“When we can gather long term rain gauge data like this, that puts our research group in a really unique position to come up with results that no one else has, and to draw conclusions about climate change that no one else can,” Holt said. “It is fun to have a truly unique perspective into the meteorology, hydrology and weather in this place.”

Micro Rain Radar data from Hurricane Matthew and the snowstorm on Jan. 6th.

The Micro Rain Radar (MRR) shoots radio waves into the sky where they reflect off water droplets or snowflakes, revealing the size and height of clouds or precipitation. The team collected continuous MRR data during Hurricane Matthew (top) and last Friday’s snow storm (bottom), creating these colorful plots that illustrate precipitation rates during the storms.

Post by Kara Manke

Seeing Nano

Take pictures at more than 300,000 times magnification with electron microscopes at Duke

Sewer gnat head

An image of a sewer gnat’s head taken through a scanning electron microscope. Courtesy of Fred Nijhout.

The sewer gnat is a common nuisance around kitchen and bathroom drains that’s no bigger than a pea. But magnified thousands of times, its compound eyes and bushy antennae resemble a first place winner in a Movember mustache contest.

Sewer gnats’ larger cousins, horseflies, are known for their painful bite. Zoom in and it’s easy to see how they hold onto their furry livestock prey: the tiny hooked hairs on their feet look like Velcro.

Students in professor Fred Nijhout’s entomology class photograph these and other specimens at more than 300,000 times magnification at Duke’s Shared Materials Instrumentation Facility (SMIF).

There the insects are dried, coated in gold and palladium, and then bombarded with a beam of electrons from a scanning electron microscope, which can resolve structures tens of thousands of times smaller than the width of a human hair.

From a ladybug’s leg to a weevil’s suit of armor, the bristly, bumpy, pitted surfaces of insects are surprisingly beautiful when viewed up close.

“The students have come to treat travels across the surface of an insect as the exploration of a different planet,” Nijhout said.

Horsefly foot

The foot of a horsefly is equipped with menacing claws and Velcro-like hairs that help it hang onto fur. Photo by Valerie Tornini.

Weevil

The hard outer skeleton of a weevil looks smooth and shiny from afar, but up close it’s covered with scales and bristles. Courtesy of Fred Nijhout.

fruit fly wing

Magnified 500 times, the rippled edges of this fruit fly wing are the result of changes in the insect’s genetic code. Courtesy of Eric Spana.

You, too, can gaze at alien worlds too small to see with the naked eye. Students and instructors across campus can use the SMIF’s high-powered microscopes and other state-of-the-art research equipment at no charge, with support from the Class-Based Explorations Program.

Biologist Eric Spana’s experimental genetics class uses the microscopes to study fruit flies that carry genetic mutations that alter the shape of their wings.

Students in professor Hadley Cocks’ mechanical engineering 415L class take lessons from objects that break. A scanning electron micrograph of a cracked cymbal once used by the Duke pep band reveals grooves and ridges consistent with the wear and tear from repeated banging.

cracked cymbal

Magnified 3000 times, the surface of this broken cymbal once used by the Duke Pep Band reveals signs of fatigue cracking. Courtesy of Hadley Cocks.

These students are among more than 200 undergraduates in eight classes who benefitted from the program last year, thanks to a grant from the Donald Alstadt Foundation.

You don’t have to be a scientist, either. Historians and art conservators have used scanning electron microscopes to study the surfaces of Bronze Age pottery, the composition of ancient paints and even dust from Egyptian mummies and the Shroud of Turin.

Instructors and undergraduates are invited to find out how they could use the microscopes and other nanotech equipment in the SMIF in their teaching and research. Queries should be directed to Dr. Mark Walters, Director of SMIF, via email at mark.walters@duke.edu.

Located on Duke’s West Campus in the Fitzpatrick Building, the SMIF is a shared use facility available to Duke researchers and educators as well as external users from other universities, government laboratories or industry through a partnership called the Research Triangle Nanotechnology Network. For more info visit http://smif.pratt.duke.edu/.

Scanning electron microscope

This scanning electron microscope could easily be mistaken for equipment from a dentist’s office.

Post by Robin Smith

Acoustic Metamaterials: Designing Plastic to Bend Sound

I recently toured Dr. Steven Cummer’s lab in Duke Engineering to learn about metamaterials, synthetic materials used to manipulate sound and light waves.

Acoustic metamaterials recently bent an incoming sound into the shape of an A, which the researchers called an acoustic hologram.

Cummer’s graduate student Abel Xie first showed me the Sound Propagator. It was made of small pieces, stacked into a wall like Legos. These acoustic metamaterials were made of plastic and contained many winding pathways that delay sound waves and change the direction in which they propagate. By arranging the pieces in particular configurations, the researchers could design a sound field, a sort of acoustic hologram.

These metamaterials can be configured to direct a 4 kHz sound wave into the shape of a letter ‘A.’ The researchers measured the outgoing sound field using a microphone that swept back and forth over the A-shaped sound like a lawnmower, moving right, then up, then left, and so on. An arrangement of metamaterials that reshapes sound waves in this way is called a lens, because it can focus sound to one or more points much like a light-bending lens.
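For a sense of how that lawnmower-style measurement works, here is a minimal sketch of a serpentine raster scan over a grid of points. The grid size, spacing, and the `measure_amplitude` stand-in are assumptions for illustration, not the lab’s actual rig.

```python
# A sketch of a serpentine ("lawnmower") raster scan: visit a grid of points
# row by row, reversing direction on alternate rows, and record the sound
# amplitude at each point.
import numpy as np

def measure_amplitude(x, y):
    return 0.0  # placeholder for a real microphone reading at position (x, y)

nx, ny, step_cm = 40, 40, 1.0
field = np.zeros((ny, nx))

for j in range(ny):                                          # move up one row at a time
    xs = range(nx) if j % 2 == 0 else range(nx - 1, -1, -1)  # sweep right, then left
    for i in xs:
        field[j, i] = measure_amplitude(i * step_cm, j * step_cm)

# `field` now holds the scanned sound map, e.g. the A-shaped hologram.
```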

Xie then showed me a version of the acoustic metamaterials ten times smaller, which propagates ultrasonic (40 kHz) sound waves. He told me that because 40 kHz is well outside the human range of hearing, it could be a viable option for wireless, non-contact charging of devices like phones. The smaller propagator could direct inaudible sound waves to your device, where a transducer would convert the acoustic energy into electrical energy.

This structure, with a microphone in the middle, can perform the “cocktail party” trick that humans can: picking out one voice among many.

Now that the waves have been directed, how do we read them? Xie pointed me to what looked like a plastic cheesecake in the middle of the table. It was deep and beige and split into many ‘slices,’ each further divided into a unique honeycomb of varying depth and separated from its neighbors by glass panes. This structure directs incoming sound waves across each slice’s distinctive honeycomb toward a lone microphone in the middle. The microphone can recognize where a sound came from based on how the wave changed as it passed over the honeycomb pattern of a particular slice.

Xie described the microphone’s ability to determine where a sound is coming from, and to make out that specific sound, as the “cocktail party effect”: the human ability to pick out one person speaking in a noisy room. This dense plastic sound sensor can distinguish up to three different people speaking and determine where each is in relation to the microphone. He explained how the technology could be miniaturized and built into devices like the Amazon Echo to make them more efficient.
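One loose way to picture how a single microphone behind direction-dependent structures can tell where sound came from: if each slice imprints a known frequency “fingerprint” on sound passing through it, the spectrum of a recording can be matched against stored fingerprints, with the best fit taken as the direction. The sketch below is only an illustration of that matching idea, using made-up data; it is not the Cummer lab’s actual reconstruction method, which separates overlapping voices with more sophisticated signal processing.

```python
# Toy direction estimation by fingerprint matching: each slice/direction has a
# pre-measured magnitude response, and an incoming recording is assigned to the
# direction whose fingerprint best correlates with its spectrum.
import numpy as np

n_slices, n_freqs = 36, 128
rng = np.random.default_rng(1)

# Placeholder fingerprints, one per slice/direction.
fingerprints = rng.random((n_slices, n_freqs))

def estimate_direction(recording_spectrum):
    # Normalize and correlate against every fingerprint; the highest score wins.
    x = recording_spectrum / np.linalg.norm(recording_spectrum)
    f = fingerprints / np.linalg.norm(fingerprints, axis=1, keepdims=True)
    return int(np.argmax(f @ x))

incoming = fingerprints[20] + 0.1 * rng.random(n_freqs)  # sound arriving from slice 20
print("estimated slice:", estimate_direction(incoming))
```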

Dr. Cummer and Abel Xie’s research is changing the way we think about microphones and sound, and may one day improve technologies ranging from digital assistants to wireless phone charging.

Frank diLustro

Frank diLustro is a senior at the North Carolina School for Science and Math.

 

X-mas Under X-ray

If, like me, you just cannot wait until Christmas morning to find out what goodies are hiding in those shiny packages under the tree, we have just the solution for you: stick them in a MicroCT scanner.

A Christmas present inside a MicroCT scanner.

Our glittery package gets the X-ray treatment inside Duke’s MicroCT scanner. Credit Justin Gladman.

Micro computed tomography (CT) scanners use X-ray beams and sophisticated visual reconstruction software to “see” into objects and create 3D images of their insides. In recent years, Duke’s MicroCT has been used to tackle some fascinating research projects, including digitizing fossils, reconstructing towers made of stars, peeking inside 3D-printed electronic devices, and creating a gorgeous 3D reconstruction of the organs and muscle tissue inside a Southeast Asian tree shrew.

A 20-minute scan revealed a devilish-looking rubber duck. Credit Justin Gladman.

But when engineer Justin Gladman offered to give us a demo of the machine last week, we both agreed there was only one object we wanted a glimpse inside: a sparkly holiday gift bag.

While securing the gift atop a small, rotating pedestal inside the device, Gladman explained how the device works. Like the big CT scanners you may have encountered at a hospital or clinic, the MicroCT uses X-rays to create a picture of the density of an object at different locations. By taking a series of these scans at different angles, a computer algorithm can then reconstruct a full 3D model of the density, revealing bones inside of animals, individual circuits inside electronics – or a present inside a box.
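The reconstruction principle can be sketched in a few lines with scikit-image: simulate X-ray projections of a 2D test object at many angles, then invert them with filtered back projection to recover a density map. This is a toy stand-in for illustration, not the scanner’s own software.

```python
# Toy 2D CT reconstruction: forward-project a standard test object at many
# angles, then recover the density map with filtered back projection.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

image = rescale(shepp_logan_phantom(), 0.25)        # a standard CT test object
angles = np.linspace(0.0, 180.0, 180, endpoint=False)

sinogram = radon(image, theta=angles)               # one projection per angle
reconstruction = iradon(sinogram, theta=angles)     # filtered back projection

error = np.sqrt(np.mean((reconstruction - image) ** 2))
print(f"RMS reconstruction error: {error:.3f}")
```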

“Our machine is built to handle a lot of different specimens, from bees to mechanical parts to computer chips, so we have a little bit of a jack-of-all-trades,” Gladman said.

Within a few moments of sticking the package in the beam, a 2D image of the object in the bag appears on the screen. It looks kind of like the Stay Puft Marshmallow Man, but wait – are those horns?

Blue devil ducky in the flesh.

Gladman sets up a full 3D scan of the gift package, and after 20 minutes the contents of our holiday loot are clear. We have a blue devil rubber ducky on our hands!

Blue ducky is a fun example, but the SMIF lab always welcomes new users, Gladman says, especially students and researchers with creative new applications for the equipment. For more information on how to use Duke’s MicroCT, contact Justin Gladman or visit the Duke SMIF lab at their website, Facebook, Youtube or Instagram pages.

Post by Kara Manke