On the last day of the past semester (12/21/07), Alex Shyrokov and I paid a visit to Northeastern University’s Intelligent Human-Machine Systems Lab. This was a follow-up to the contact we established with fellow researchers last fall at the NECHFES Student Conference. We were very interested in learning about the sensored SmartWheel they developed, which has many similarities to our sensored steering wheel. Our hosts were Hua Cai and Hongjie Leng, who kindly demonstrated their system to us.
Hua Cai (NEU), Hongjie Leng (NEU), Oskar Palinko (UNH)
As can be seen in the image above, the simulator consists of a driving seat, steering wheel, pedals, and a computer screen. Both the software and the hardware were developed at Northeastern. The university also has a larger, higher-fidelity simulator in a neighboring lab, which was inaccessible at the time. The two simulators can be connected, so two drivers can interact in the same scenario.
The steering wheel has a number of sensors on its surface: blood volume pulse (infrared), skin conductance, skin temperature, and gripping force sensors. To measure these variables, drivers must hold the steering wheel at exact positions, i.e. align their fingertips with the sensor locations. The grip must also not be too hard; otherwise the data get noisy.
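A blood volume pulse sensor yields a roughly periodic waveform, so pulse rate can be estimated by detecting beats and averaging the interval between them. The sketch below illustrates the idea on a synthetic signal; it is not Northeastern's actual processing pipeline (which we did not see), and a real system would band-pass filter the raw signal first. The function name and sampling rate are my own choices.

```python
import numpy as np

def estimate_bpm(signal, fs):
    """Estimate pulse rate (beats/min) from a BVP-like waveform.

    Minimal approach: find local maxima above the signal mean and
    average the interval between consecutive peaks. Real BVP
    processing would filter out motion artifacts first.
    """
    mean = signal.mean()
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > mean
             and signal[i] > signal[i - 1]
             and signal[i] >= signal[i + 1]]
    if len(peaks) < 2:
        return 0.0
    intervals = np.diff(peaks) / fs      # seconds between beats
    return 60.0 / intervals.mean()

# Synthetic 70 beats/min pulse wave, sampled at 100 Hz for 10 s
fs = 100
t = np.arange(0, 10, 1 / fs)
bvp = np.sin(2 * np.pi * (70 / 60) * t)
print(round(estimate_bpm(bvp, fs)))     # → 70
```

On a clean synthetic sine this recovers the rate exactly; the point of the grip and fingertip-placement constraints above is precisely to keep the real waveform clean enough for this kind of peak detection to work.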
The multi-modal system even incorporates sensors in the seat belt: one for strain and one for bending. The strain sensor picks up the breathing cycle, while the bending sensor can indicate possible accidents.
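Breathing appears in the strain signal as a slow oscillation (roughly 0.1-0.5 Hz), so a breathing rate can be read off by counting cycles. The sketch below counts upward crossings of the signal mean on a synthetic trace; it is only an illustration of the principle, not the lab's actual method, and a real system would low-pass filter the signal to suppress posture shifts first.

```python
import numpy as np

def breaths_per_minute(strain, fs):
    """Estimate breathing rate from a seat-belt strain signal.

    Each upward crossing of the signal mean marks one breath
    cycle; dividing by the record length gives breaths/min.
    """
    x = strain - strain.mean()
    rising = int(np.sum((x[:-1] < 0) & (x[1:] >= 0)))
    duration_min = len(strain) / fs / 60.0
    return rising / duration_min

# Synthetic strain trace: 15 breaths/min (0.25 Hz), 60 s at 10 Hz,
# with a constant belt-tension offset of 0.5
fs = 10
t = np.arange(0, 60, 1 / fs)
strain = 0.5 - 0.1 * np.cos(2 * np.pi * 0.25 * t)
print(round(breaths_per_minute(strain, fs)))  # → 15
```

Crossing counting is crude but cheap; it works here because the breathing component dominates the filtered signal, which is exactly what a belt-mounted strain sensor is positioned to capture.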
I would like to thank our hosts for letting us explore their system and for answering all of our questions. The knowledge we exchanged will surely benefit both sides.