CAMBRIDGE, MA – Halloween is scarily almost upon us. It is the time of year when parties involve closing your eyes and feeling around in a bowl of mystery items; brains are secretly mushy spaghetti, and eyeballs are eerily similar to peeled grapes. While silly, the activity shows how the human brain can misinterpret our surroundings when one of our senses is taken away. The more types of sensory data people collect, the better they understand their environment. When a person sees and feels something at the same time, more and different neurons fire, providing fuller situational awareness.
Draper’s internal research and development (IRaD) program, Immersive Situational Awareness Wear (isaWear)™, enables users to perceive more data from their surroundings and react more quickly to that information. isaWear is a group of wearables that tap into several senses at once, providing visual, audio, thermal, and tactile cues in coordinated data delivery. Autonomous driving is one application of this technology. Instead of a navigation system simply showing a map of directions, isaWear would deliver a heat pulse to your left wrist, display a “turn left” icon on augmented reality glasses, and issue a spoken instruction, all at once. Integrating multimodal signals while driving can ultimately provide a richer understanding of the driving environment, and therefore a safer experience. Draper worked with Carnegie Mellon University to conduct studies of the isaWear technology. isaWear will be featured at Draper’s upcoming Engineering Possibilities 2016 showcase this October.
Draper’s Human-Computer Interaction (HCI) group is advancing technology and user experience in areas like big data, autonomy, and even biotechnology. The group applies its User-Centered Design & Engineering approach to improve user experience and user capability across a range of users and missions. Its work spans from mobile applications for tactical decision-making to autonomous vehicles and algorithms that support operator trust and adoption. From spacecraft to autonomous vehicles, the group designs and builds technology that makes interacting with systems more intuitive, safe, and empowering for users.
This past spring, Kelly Sprehn, a senior member of Draper’s HCI technical staff, was elected president of the Human Factors and Ergonomics Society (HFES) New England chapter and is now serving in that role. With regional and student chapters, the society promotes knowledge of human systems technology. Part of Sprehn’s role as president is to serve as a role model, leading and engaging students. Draper will help sponsor the society’s largest annual event, the HFES Student Conference in April, bringing together approximately 30 students from New England programs to present their work to peers. Sprehn, along with other Draper HFES members, will also participate in the October career panel on Human Factors in the Workforce hosted by Tufts University.
The HFES International Annual Meeting in Washington, D.C., will host its annual User Experience Day as part of the conference this week. UX Day provides technical sessions and networking opportunities in user experience for all conference attendees, and hosts an invitation-only Leadership Development Workshop. The workshop is intended to bring together leading industry professionals and promising young students in the field. Sprehn is representing Draper as an industry leader in user experience on a panel featuring Microsoft, Boeing, Veritas, and others. She will discuss the importance of user experience research in product development and of incorporating that research early in the design process.