Research

My research interests include sensory perception, pattern recognition, and human-machine interaction. My current research projects focus on understanding and using sensor signals. In particular, we are investigating computational intelligence techniques for real-time sensory perception and interactive human-machine interfaces, and applying them to robotics (especially human-robot interaction), spatial referencing interfaces, gait analysis, and, more recently, eldercare and rehabilitation. We are especially interested in proactive healthcare models, which include early detection of health changes and screening tools that detect high risk of injury. For more information, see also our Center for Eldercare and Rehabilitation Technology, which includes links to our papers.

October, 2014: We have been working on two systems that screen for potential injury risk. One project, in collaboration with sports medicine physician Dr. Aaron Gray, screens athletes: Portable Inexpensive Motion Analysis System to Identify Female Athletes at High Risk of Knee ACL Tear. We are also working on game-based intervention exercises. The second project, in collaboration with music professor Dr. Paola Savvidou, develops a screening tool for piano students: Measuring the Alignment of Piano Students for Injury Prevention.
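As a rough illustration of the kind of metric a low-cost motion analysis screen can compute, the sketch below estimates a frontal-plane knee angle from hip, knee, and ankle positions (e.g., from a depth sensor during a jump landing). The joint coordinates, the plane projection, and the function name are assumptions for illustration only, not the published method of either project.

```python
# Illustrative sketch only: one common low-cost screening metric is the
# frontal-plane projection angle at the knee, computed from hip, knee,
# and ankle positions. Joint names, axes, and values below are assumed
# for illustration, not taken from the projects described above.
import numpy as np

def knee_projection_angle(hip, knee, ankle):
    """Angle (degrees) between the thigh and shank segments,
    projected onto the frontal (x-y) plane."""
    hip, knee, ankle = (np.asarray(p, dtype=float) for p in (hip, knee, ankle))
    thigh = (hip - knee)[:2]   # keep x (mediolateral) and y (vertical) only
    shank = (ankle - knee)[:2]
    cos_a = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example: joint positions in meters (x = mediolateral, y = vertical, z = depth)
print(knee_projection_angle(hip=[0.10, 1.00, 2.5],
                            knee=[0.12, 0.55, 2.5],
                            ankle=[0.08, 0.10, 2.5]))
```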

October, 2013: We started a new NSF-funded US Ignite project to develop an interactive interface to support remote physical therapy: GENI-Enabled In-Home Personalized Health Monitoring and Coaching. This is being tested with participants in private homes in Kansas City, using Google Fiber.

May, 2013: We will be running a larger study of our sensor-based health alert system in a new project funded by NIH: Intelligent Sensor System for Early Illness Alerts in Senior Housing. Sensors include our new hydraulic bed sensor for capturing pulse, respiration, and restlessness, as well as the Kinect depth camera for capturing in-home gait parameters. We will install 70 systems in Americare senior housing in mid-Missouri.
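For a sense of what an in-home gait parameter can look like, here is a minimal sketch that estimates average walking speed from a short track of 3D body positions, such as a person's centroid extracted from depth images. The frame rate, the centroid-tracking assumption, and the function name are illustrative, not the deployed system's algorithm.

```python
# Minimal sketch, not the deployed system: walking speed estimated from a
# track of 3D body positions sampled at a fixed frame rate. Assumes the
# positions are in meters and the walk is reasonably continuous.
import numpy as np

def walking_speed(positions, fps=30.0):
    """Average walking speed (m/s) over a track of (x, y, z) positions
    in meters, sampled at `fps` frames per second."""
    p = np.asarray(positions, dtype=float)
    step_lengths = np.linalg.norm(np.diff(p, axis=0), axis=1)  # per-frame displacement
    duration = (len(p) - 1) / fps
    return step_lengths.sum() / duration

# Example: a 2-second straight walk covering about 2 meters
track = [(0.033 * i, 0.9, 3.0) for i in range(61)]
print(round(walking_speed(track), 2))  # ~1.0 m/s
```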

July, 2012: We began a new collaboration in eldercare technology with Western Home Communities in Cedar Falls, IA, Inventive Health Solutions in Kansas City, and Lincoln University in Jefferson City, MO, in a new project: An In-Home Health Alert System with Remote Care Coordination, funded by the NSF as part of the US Ignite initiative.

Sept., 2010: We started a new project, Human-Driven Spatial Language for Human-Robot Interaction, funded by the HCC program at the NSF, in collaboration with Laura Carlson at Notre Dame.

Sept., 2009: We started 3 new eldercare projects:
Active Heterogeneous Sensing for Fall Detection and Fall Risk Assessment, funded by the CPS program at the NSF (investigates vision and acoustic sensing).
Technology to Automatically Detect Early Signs of Illness in Senior Housing, funded by NIH.
Technology to Automatically Detect Falls and Assess Fall Risk in Senior Housing, funded by AHRQ (investigates radar sensing).

Sept., 2008: A new project started under the NGA-funded Text to Sketch program: Natural-Language Processing Applied to Geospatial Information, led by Dr. Jim Keller.

Jan., 2008: We started a new eldercare project, Elder-Centered Recognition Technology for the Assessment of Physical Function, funded by the NSF HCC program, which is now investigating the monitoring of gait and physical movement in a multi-person environment. This is an extension of our other ongoing work using passive sensor networks to monitor the physical and cognitive health of elders for early problem identification.

Jan., 2007: Our Mizzou ADVANCE grant (funded by the NSF) has started, with the goal of helping women faculty in Science, Technology, Engineering, and Math. Programs include mentoring, climate theatre, and a STRIDE committee, as well as a research program to evaluate the effectiveness of each component.

GK-12 Fellowships for Fall, 2006 ($30K per year): I will be looking for two students (ECE or Computer Science) to fill GK-12 fellowships for the 2006-2007 academic year. Fellows work with local students in grades 6-9 to develop engineering design projects such as Lego robots. Details on the application process can be found here. See also the Project Web Site for more information.

Fall, 2005: We are continuing work on our sketch-based interfaces for controlling one or more mobile robots. In summer 2005, we collaborated with the Naval Research Lab on a demo and usability study at the AAAI conference. Here is the abstract on using sketches to control a team of robots.

Jan., 2005: We started a new project, Technology Interventions for Elders with Mobility and Cognitive Impairments, incorporating smart home technology into eldercare systems. This NSF ITR project is a multidisciplinary collaboration with nursing and with health management and informatics. All of our RA positions have been filled, but keep watching for announcements. In general, I am especially looking for students with experience and interest in pattern recognition, computational intelligence, and computer vision. If you want to become part of the team, you should take courses in these areas.

Jan., 2004: We started a project on Biologically Inspired Working Memory for Robots, funded through the NSF ITR program.

Older projects include: The Guinness Robot Project, which incorporates Using Spatial Language for Human-Robot Communication, Analyzing Sketched Route Maps, and Face Recognition with MSNN. Also, Equine Gait Analysis. Check out the projects page and additional projects: Learning Force Sensory Patterns and Skills from Demonstration, and Event-Driven Computing Projects for Software Engineering Education.