Jun 24

Former Civil Engineering Grad Student Receives Award From AWWA

Zwerneman_Collins AWWA Award

Pictured L-R: Andrew DeGraca, Chair of the AWWA Technical & Educational Council, M. Robin Collins, Professor and Chair of the Department of Civil Engineering, John Zwerneman, former Civil Engineering graduate student and award winner, and Jeff Nason, Chair of the AWWA Academic Achievement Award Committee.

John Zwerneman, a former Civil Engineering graduate student, was recently awarded second place for Best Thesis in the Academic Achievement Awards from the American Water Works Association (AWWA), along with a check for $1,500. The Academic Achievement Award encourages academic excellence by recognizing contributions made by students and academicians in the field of public water supply. Entries are judged by the Academic Achievement Award Committee, a panel appointed by the AWWA, on the basis of originality, practical application, value to the water supply field, potential value as a reference, and overall clarity. John’s thesis, entitled “Investigating the Effect of System Pressure on Trihalomethane Post-Treatment Diffused Aeration,” was advised by Prof. M. Robin Collins.

Jun 14

QACafe BBQ photo

Computer Science Dept. Chair, Radim Bartos, poses for a photo at the QACafe BBQ this May. Seven out of eleven QA Cafe employees are UNH alumni, including the company founder Joe McEachern.


Apr 29

Presenting Our Analysis Tool

Today I performed my final presentation for my senior project. My project, A Tool to Analyze Spoken Dialogue and Pupil Diameter on a Multi-Touch Surface, was the development of a piece of software to help researchers analyze post-experimentation data. This data came from a past experiment, described here.

The tool loads this data into a single, synchronized visualization, saving researchers from switching between applications such as MATLAB, DialogueView, and WaveSurfer to compare relationships. Please enjoy the video below of the tool in action.

This presentation represents the end of my time with Project54 as I am graduating in May. I have very much enjoyed working at Project54 and know that my experiences and knowledge will absolutely benefit me while working in industry!

Feb 16

GPS Peripheral Navigational Aid

To view instructional information, click on the photographs.

Feb 14

Instrumented Steering Wheel Project Update

The Instrumented Steering Wheel members, Adam Leone (CS) and Travis Royer (CE), have further refined the software and hardware used in this project.  The goal of the project is to add an authentication factor to vehicle security through a password tapped on an array of force sensing resistors (FSRs) and processed on an Arduino microcontroller.

The system currently stores three passwords, selected by a tap on the corresponding FSR.  The user selects an FSR and chooses to either log into that profile with a short tap or input a new password for that profile with a three-second hold.  If logging in is selected, the user enters a password attempt, and the Arduino compares it against what is stored and outputs success or failure.  If inputting a new password, the user taps the pattern three times and the Arduino determines whether the entries are within the defined error bounds.  If so, it averages the patterns, stores the averaged pattern in the profile, and outputs success.  Otherwise, it discards the new input, retaining the old password, and outputs failure.  Program flow is visually demonstrated in Figure 1:

Figure 1 - Flow chart of ISW tapped password program
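The enrollment step (tap the pattern three times, check agreement, then average) can be sketched roughly as follows. This is an illustrative standalone sketch, not the project's actual Arduino code; `enrollPattern` and the tolerance parameter are assumed names.

```cpp
#include <cassert>
#include <cstdlib>
#include <optional>
#include <vector>

// Sketch of the "input a new password" branch: the user taps the pattern
// three times; if every corresponding tap time (in ms from the first tap)
// agrees within tolMs, the three entries are averaged and returned to be
// stored. Otherwise nullopt is returned and the old password is kept.
std::optional<std::vector<long>> enrollPattern(
        const std::vector<std::vector<long>>& entries, long tolMs) {
    if (entries.size() != 3) return std::nullopt;
    size_t n = entries[0].size();
    for (const auto& e : entries)
        if (e.size() != n) return std::nullopt;       // tap counts must match
    std::vector<long> avg(n);
    for (size_t i = 0; i < n; ++i) {
        for (const auto& e : entries)
            if (std::labs(e[i] - entries[0][i]) > tolMs)
                return std::nullopt;                  // outside error bounds
        avg[i] = (entries[0][i] + entries[1][i] + entries[2][i]) / 3;
    }
    return avg;
}
```

On success, the averaged pattern would replace whatever is stored in the selected profile.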


The algorithm used to compare a password input against the stored pattern has been modified from the one described in the previous post.  When a user taps on an FSR, the Arduino now calculates and stores the time of maximum pressure, rather than using the moment the leading edge of the tap crosses the pressure threshold as the tap time.  Each tap time is measured from an absolute start time, that of the first tap.  Input taps must fall within a small window of time centered on the corresponding pattern time.  This window grows as the time from the start point increases, giving the user more flexibility toward the end of the sequence.  Pressure is compared in the same way, but with its own error rate, and every tap's pressure is compared, not just those after the first.

The algorithm is visualized in Figure 2.  Taps are first recognized when the pressure crosses the threshold (orange circle), and the maximum pressure thereafter is calculated (yellow circle).  The tap time is recorded when max pressure is reached.  When comparing times or pressures, the blue lines are the length/size of the corresponding tap in the stored pattern, and the green line is the window the input tap must fall within, based on the preset error rates.  The window grows with the magnitude of the metric.

Figure 2 - Algorithm comparing input against a stored pattern
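The comparison described above can be sketched as standalone C++ like this. Names, the `Tap` struct, and the error-rate parameters are illustrative assumptions, not the project's actual code.

```cpp
#include <cassert>
#include <cmath>
#include <cstdlib>
#include <vector>

struct Tap {
    long timeMs;   // time of maximum pressure, measured from the first tap
    int pressure;  // maximum pressure reading for this tap
};

// True if every input tap falls inside a window centered on the stored tap.
// Each window scales with the stored value, so later (larger) times and
// harder (larger) pressures get wider windows. Tap 0 defines t = 0, so only
// its pressure is checked.
bool matchPattern(const std::vector<Tap>& stored, const std::vector<Tap>& input,
                  double timeErr, double pressErr) {
    if (stored.size() != input.size()) return false;
    for (size_t i = 0; i < stored.size(); ++i) {
        if (i > 0 &&
            std::labs(input[i].timeMs - stored[i].timeMs) >
                timeErr * stored[i].timeMs)
            return false;
        if (std::abs(input[i].pressure - stored[i].pressure) >
                pressErr * stored[i].pressure)
            return false;
    }
    return true;
}
```

Because the window is `timeErr * stored[i].timeMs`, a tap one second into the pattern tolerates more drift than one a quarter-second in, matching the growing green windows in Figure 2.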


Although we haven’t started extensive testing, we have found three problems, two of them with this algorithm.  The first is the use of a linked list to store the taps.  This data structure requires input to be given in the same order every time, which rules out a password containing multiple simultaneous taps, because they may be ordered differently in the list each time.

The second problem, and the first with the algorithm itself, is the use of a tap time that marks the time elapsed from the start of input to that tap.  We chose this method rather than storing the times between taps because it forces the user to maintain the same input tempo each time, rather than being able to compress and extend the pattern to the limits of the error window.  The difficulty of maintaining the same tempo, even when a pattern is entered twice within thirty seconds, is becoming more and more apparent, and may lead us to conclude that the pressure of taps and the relative times between taps matter more than the overall tempo.

The third problem relates to the pressure comparison.  The error window is proportional to the “size” of the tap pressure, so it is easier to match a hard tap (large window) than a soft one (small window).  This might be overcome in part by comparing pressure in three increments (soft, medium, hard), but trouble arises when a tap sits at the threshold between those categories.

Jan 08

Instrumented Steering Wheel Project Demo

The Instrumented Steering Wheel Project is the senior project of Adam Leone, Computer Science, and Travis Royer, Computer Engineering.  It operates under the umbrella of the University of New Hampshire’s Project 54 with Electrical and Computer Engineering Associate Professor Andrew Kun advising.


This project seeks to develop an additional form of authentication for vehicle security, specifically in the form of a tapped password.  The goal is to implement this form of authentication and determine the achievable level of security while maintaining enough flexibility to allow for human variation in password inputs.  In order to test the efficacy of this authentication, a test rig has been built and programmed for an initial demonstration.


Arduino microcontroller, force sensing resistors, Bluetooth shield


The test rig uses force sensing resistors as input devices that register user taps.  An Arduino microcontroller collects data from the FSRs at regular intervals and analyzes the input by determining when the tapping force exceeds a predefined threshold.  The first tap marks the start of input, and each successive tap is recorded as its time from that absolute start point.


When a user inputs their tapped password, the Arduino compares it against the correct pattern and determines whether it matches within some fixed error rate.  The pattern recognition algorithm compares the tap times of the template and the input, both measured from the absolute start point, and allows input taps to vary within a fixed percentage error window.  This means initial taps have a smaller window to fall within, while taps later in the pattern have an increasingly large window of time.  Basing the times on the absolute start point forces the user to tap the sequence at roughly the same tempo; comparing the times between taps instead would allow more tempo variation because of the error window.  This tap ‘window’ based on the error rate can be seen in this graph, where the blue bar represents the expected tap time and the green bar represents the window of opportunity based on the error rate.


Visual graph shows taps at the rising edge of FSR input, comparison windows based on error rate


As currently programmed for this demonstration, the test rig accepts a password pattern, then an input pattern, outputs the comparison result, and waits for a new password pattern.  When a password pattern has been given and the Arduino finishes processing that input, the left green LED turns on.  Next, the user inputs taps that may or may not match the password.  If the input matches the password pattern within the 15% error rate currently set, the right green LED turns on and then both turn off; otherwise the left LED turns off.  The Arduino is then ready for a new password pattern.


The following videos demonstrate the preliminary capabilities of the project test rig and the concepts we are implementing.

A one-tap pattern is guaranteed to match, because with a single tap there are no times between taps to compare.

One Tap Password Pattern

A more complex pattern requires the input cadence to be similar to the password.  Input given too slowly is rejected.

Different Tempos

Different numbers of taps are rejected.

Different Number of Taps

The error rate allows simple patterns to be matched easily, but makes longer patterns more difficult to match.

Varying Difficulty
Future functionality will be added to the test rig such as the ability to store multiple users in a database and add users by normalizing a pattern they enter multiple times.  If the main project goals are accomplished, the input analysis will be enhanced to calculate the tap time based on the peak input pressure and the pattern comparison algorithm will be modified to compare pressure within a fixed error rate.

Dec 15

Peripheral Navigation Aid Prototype Design

8x8 LED Matrix Design Idea courtesy of http://www.thingiverse.com/oomlout

Our goal is to design an 8×7 LED matrix on an acrylic layout that can be easily attached and removed at the driver’s convenience.  Once the 15cm × 24cm transparent layout is complete, the LEDs will be soldered in place and wired into their respective rows and columns.  The display will show “strips” of LEDs indicating directional information.  Once completed, an Arduino keypad will be implemented to give directional information to the driver in a more efficient and timely manner.

Dec 15

7×8 LED Array Advancement

Following the design of the 8×4 LED array, an even more advanced 7×8 LED array was designed. It will act as a prototype unit for driving simulator testing. With its relatively smaller size, the array can easily be moved and tested in various spots within the simulator to determine where its placement is most beneficial for drivers. The choice of the placement will be determined by which location produces the highest PDT (percent dwell time). This will be done using the eye trackers.

To display straight, right, and left turns on the LED array, a keypad will be incorporated into this design.  This solution was proposed after we learned that the Arduino does not support commands sent directly from a computer.  The keypad shown is made specifically for Arduino and connects easily to the pins on the board.  Each type of turn will be assigned a key on the keypad; when that key is pressed, the code corresponding to that turn will run and the turn will be displayed on the 7×8 array.
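The key-to-display mapping could look roughly like the sketch below: a 7-row by 8-column frame where a lit vertical strip indicates direction. The layout, key characters, and `frameFor` name are all assumptions for illustration, not the project's actual code.

```cpp
#include <array>
#include <cassert>

// One frame of the 7x8 array: true = LED on.
using Frame = std::array<std::array<bool, 8>, 7>;

// Returns the frame for a given keypad selection at animation step `step`.
// 'L' sweeps a vertical strip right-to-left, 'R' sweeps left-to-right, and
// any other key shows a fixed center strip meaning "continue straight".
Frame frameFor(char key, int step) {
    Frame f{};  // all LEDs off
    int col;
    switch (key) {
        case 'L': col = 7 - (step % 8); break;  // strip moves toward the left
        case 'R': col = step % 8; break;        // strip moves toward the right
        default:  col = 3; break;               // straight: solid center strip
    }
    for (int row = 0; row < 7; ++row) f[row][col] = true;
    return f;
}
```

Stepping `step` on a timer would animate the strip across the array, giving the moving left/right cue with only one lit column per frame.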

– Tina Tomaszewski

Dec 14

Testing locations of 7×8 LED array

The following pictures show three possible locations that have been discussed for testing the new LED array in the simulator while using the eye tracker to obtain data.  Originally, the general consensus seemed to be to place the array above the driver’s head, allowing peripheral viewing while keeping focus on the road.  However, we are not sure that will be the best location, due to factors such as drivers having different heights and seating positions.  This suggestion is shown in the first picture.

The second and third pictures show other possible locations that we will use to test our design.  One option is to place the array where the current GPS sits; we hope that, since our design is less complex and involved than a GPS, it will be far less distracting and will still require only peripheral viewing.  Another location we wish to obtain data from involves holding the device outside the car so that it is directly in front of the driver.  With the help of the eye tracker, we will be able to determine how effective each of these locations is.  More locations will be tested as well, but these are the ones we have chosen to begin our experiments with.


Dec 06

8×4 LED Array


After completing the construction of the 3×3 array, an 8×4 array was built to give a more accurate representation of the LED array that will eventually be placed into the simulator. We observed that having more LEDs to control quickly made the coding and timing of the direction changes more complex.

This video shows the sequences for an approaching left-hand and right-hand turn, respectively. These turns are then followed by a solid vertical line, which instructs the driver to continue driving straight. The video can be viewed by clicking here: 8×4 LED Array

Now that we have successfully constructed an 8×4 LED array and learned how to control more complex directions and timing, we will be able to construct the final 7×8 array, which will be placed in the simulator and used for testing.


– Aaron Lecomte

