Monday, September 19, 2016

Expanding the User Experience

Draper technical staff member named HFES New England chapter president

CAMBRIDGE, MA – Halloween is scarily almost upon us. It is the time of year when parties involve closing your eyes and feeling around in a bowl of mystery items: brains are secretly mushy spaghetti, and eyeballs feel eerily similar to peeled grapes. Silly as it is, the activity shows how the human brain can misread its surroundings when one of the senses is taken out of play. The more types of sensory data people collect, the better they understand their environment. When a person sees and feels something at the same time, more and different neurons fire, providing fuller situational awareness.

Draper’s internal research and development (IRaD) program, Immersive Situational Awareness Wear (isaWear)™, enables users to take in more data from their surroundings and react to it more quickly. isaWear is a kit of wearables that tap into several senses at once, delivering visual, audio, thermal, and tactile cues in a coordinated way. Autonomous driving is one application of this technology. Instead of a navigation system simply showing a map, isaWear would deliver a heat pulse to your left wrist while displaying a “turn left” icon on augmented reality glasses and speaking the instruction aloud. Integrating these multimodal signals while driving can provide a richer understanding of the driving environment and, therefore, a safer experience. Draper worked with Carnegie Mellon University to conduct studies of the isaWear technology, and isaWear will be featured at Draper’s Engineering Possibilities 2016 showcase this October.
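To make the idea of coordinated data delivery concrete, the sketch below shows one way a dispatcher could fan a single navigation event out to visual, audio, and thermal channels at the same time. This is a minimal illustration in Python, not the isaWear implementation; every class, method, and parameter name here is hypothetical.

    # Hypothetical sketch of coordinated multimodal cue delivery.
    # None of these names come from the isaWear system itself.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Cue:
        """A single navigation event to be presented across several senses."""
        event: str       # e.g. "turn_left"
        urgency: float   # 0.0 (informational) .. 1.0 (immediate)

    class Presenter:
        """One sensory output channel (visual, audio, thermal, tactile)."""
        def present(self, cue: Cue) -> None:
            raise NotImplementedError

    class ARGlassesPresenter(Presenter):
        def present(self, cue: Cue) -> None:
            print(f"[visual]  showing '{cue.event}' icon on heads-up display")

    class AudioPresenter(Presenter):
        def present(self, cue: Cue) -> None:
            print(f"[audio]   speaking instruction for '{cue.event}'")

    class ThermalWristbandPresenter(Presenter):
        def present(self, cue: Cue) -> None:
            side = "left" if "left" in cue.event else "right"
            print(f"[thermal] heat pulse to {side} inner wrist (urgency {cue.urgency:.1f})")

    class CueCoordinator:
        """Fans one event out to every registered presenter so the user
        receives redundant, mutually reinforcing signals."""
        def __init__(self, presenters: List[Presenter]) -> None:
            self.presenters = presenters

        def deliver(self, cue: Cue) -> None:
            for presenter in self.presenters:
                presenter.present(cue)

    if __name__ == "__main__":
        coordinator = CueCoordinator(
            [ARGlassesPresenter(), AudioPresenter(), ThermalWristbandPresenter()]
        )
        coordinator.deliver(Cue(event="turn_left", urgency=0.7))

In this toy version, adding a new sense is just a matter of registering another presenter; the coordinator itself never changes, which mirrors the article’s description of isaWear as a kit of interchangeable wearables.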

Draper’s Human-Computer Interaction (HCI) group is expanding technology and the user experience in areas like big data, autonomy, and even biotechnology. The group applies its User-Centered Design & Engineering approach to improve user experience and capability across a range of users and missions. Its work spans from mobile applications for tactical decision-making to autonomous-vehicle designs and algorithms that support operator trust and adoption. From spacecraft to autonomous vehicles, the group designs and builds technology that makes interacting with systems more intuitive, safe, and empowering for users.

This past spring, Kelly Sprehn, a senior member of Draper’s HCI technical staff, was elected president of the New England chapter of the Human Factors and Ergonomics Society (HFES) and is now serving in that role. With regional and student chapters, the society promotes knowledge of human-systems technology. Part of Sprehn’s role as president is to serve as a role model, leading and engaging students. Draper will help sponsor the society’s largest annual event, the HFES Student Conference in April, which brings together approximately 30 students from New England programs to present their work to peers. Sprehn, along with other Draper HFES members, will also participate in an October career panel on Human Factors in the Workforce hosted by Tufts University.

The HFES International Annual Meeting in Washington, D.C., will host its annual User Experience Day as part of the conference this week. UX Day offers technical sessions and networking opportunities in user experience for all conference attendees and includes an invitation-only Leadership Development Workshop intended to bring together leading industry professionals and promising young students in the field. Sprehn is representing Draper as an industry leader in user experience on a panel featuring Microsoft, Boeing, Veritas, and others. She will discuss the importance of user experience research in product development and of bringing that research into the design process early.

Image captions: The isaWear kit’s four “presenters” convey information using visual, auditory, tactile, and thermal cues; the thermal presenter delivers pulses of heat or coolness to the inner wrists, shifting the body’s perception of thermal comfort. Augmented reality glasses with a heads-up display and 3D-audio headphones. Kelly Sprehn, president of HFES New England.
Capabilities Used
Human-Centered Solutions

Draper continues to advance the understanding and application of human-centered engineering, optimizing how people interact with systems and how well they can understand, assimilate, and convey information for critical decisions and tasks. Through its Human-Centered Solutions capability, Draper enables users to accomplish their most critical missions by seamlessly integrating technology into their workflows. This work advances human-computer interaction through emerging findings in applied psychophysiology and cognitive neuroscience. Draper has deep skills in the design, development, and deployment of systems that support cognition – for users seated at desks, on the move with mobile devices, or maneuvering in the cockpits of vehicles – and collaboration across human-human and human-autonomous teams.

Image & Data Analytics

Draper combines specific domain expertise with knowledge of how to apply the latest analytics techniques, extracting meaningful information from raw data to better understand complex, dynamic processes. Our system design approach encompasses effective organization and processing of large data sets, automated analysis using algorithms, and exploitation of results. To facilitate user interaction with these processed data sets, Draper applies advanced techniques to automate the understanding and correlation of patterns in the data. Draper’s expertise encompasses machine learning (including deep learning), information fusion from diverse and heterogeneous data sources, optimized coupling of data acquisition and analysis, and novel methods for analysis of imagery and video data.

Media Contact

Media Relations, Strategic Communication
P: 617-258-2464
C: 617-429-2883