
Sensor Axes on the Shimmer

So I’ve decided to log what I’m doing on a daily basis on this website. I’ve been using Evernote and a pen-and-paper system for this, but there are way too many things going on right now, and Evernote’s changes to their premium pricing are a bit annoying.

Background

Our research deals with detecting eating from wrist movement. Devices worn on the wrist capture acceleration and rotation data, which we use to detect eating.

Unintended Acceleration

Previous work done by the group used acceleration from an iPhone – linear acceleration. Most sensors, like the Shimmer, instead report the force experienced by the sensor – acceleration including gravity – as highlighted in the Google tech talk below. We were able to modify the Shimmer code to log orientation information (as quaternions), which lets us estimate gravity and then subtract it from the raw acceleration to recover linear acceleration.
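As a rough illustration, here is a minimal sketch of that subtraction, assuming the quaternion rotates the sensor frame into the world frame and that world gravity points along +Z; the actual conventions depend on the Shimmer’s AHRS output.

```python
import numpy as np

# Assumed world gravity vector: +Z up, so a stationary accelerometer
# reads +9.81 m/s^2 on the world Z axis.
GRAVITY_WORLD = np.array([0.0, 0.0, 9.81])

def quat_to_rotmat(q):
    """Rotation matrix for a unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def linear_acceleration(accel_sensor, q):
    """Remove the gravity estimate from one raw accelerometer sample.

    accel_sensor: (x, y, z) reading in the sensor frame, in m/s^2
    q: orientation quaternion rotating the sensor frame into the world frame
    """
    R = quat_to_rotmat(q)            # sensor -> world
    g_sensor = R.T @ GRAVITY_WORLD   # express world gravity in the sensor frame
    return np.asarray(accel_sensor) - g_sensor
```

Note that the whole calculation hinges on the sensor frame being what we think it is, which is exactly where the axis problem below comes in.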

Implementation

We’ve used a short piece of code that I wrote (MarkerParser) to read data from custom-programmed Shimmer devices, and have collected data from ~400 participants. These devices record acceleration and angular velocity across all 3 axes (X, Y, Z). Most devices (iPhone, Actigraph) follow the right-hand rule for axis direction and rotation: if your right thumb points along positive X, the direction your fingers curl is positive rotation about X.

[Image: righthandrule]
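A quick way to sanity-check handedness in code: a right-handed frame satisfies X × Y = +Z, and a single flipped axis breaks that. A small hypothetical check:

```python
import numpy as np

# Unit vectors of a candidate sensor frame. In a right-handed frame,
# X x Y must equal +Z; flipping any single axis makes it left-handed.
x = np.array([1.0, 0.0, 0.0])
y = np.array([0.0, 1.0, 0.0])
z = np.array([0.0, 0.0, 1.0])

print(np.allclose(np.cross(x, y), z))    # True: right-handed

z_flipped = -z                            # one sign-flipped axis
print(np.allclose(np.cross(x, y), z_flipped))  # False: handedness broken
```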

Shimmer has a manual that provides axis information on page 21. This was assumed to be correct when MarkerParser was created.

[Image: shimmeraxis1]

However, I realized later (and again in the last week) that the documented axes did not match the data. There are multiple Shimmer software packages, and multiple sensors inside the Shimmer, each of which reports axes differently.

[Image: shimmeraxis2]

While the eating detection algorithms are largely insensitive to axis orientation and direction, the step that calculates linear acceleration is not, which means we need to establish the correct axis information before moving forward.
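Once the correct mapping is known, applying it is a small fixed transform. A hypothetical sketch – the actual permutation and signs must come from the verification described below, not from this example:

```python
import numpy as np

# Hypothetical axis correction: a signed permutation mapping the axes the
# Shimmer actually reports onto the right-handed frame MarkerParser assumes.
# The real permutation and signs must be determined empirically.
AXIS_MAP = np.array([
    [0, -1, 0],   # corrected X = -reported Y  (example values only)
    [1,  0, 0],   # corrected Y = +reported X
    [0,  0, 1],   # corrected Z = +reported Z
])

def remap_sample(sample):
    """Apply the axis correction to one (x, y, z) sample."""
    return AXIS_MAP @ np.asarray(sample)
```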

The problem

I had adjusted for an axis change in MarkerParser, and then again in ShimmerView (the visualizer). However, since the original source of axis information was incorrect, I cannot say for certain that every stage of the pipeline is producing correct results.

Solution / Steps

Completed:

I have recorded a video of the Shimmer while it logged data, and adjusted the CSV file for a time-sync error between the device clock and the video. Files are stored locally in Eating Detection\Actigraph Study\AHRS TESTING\BiasTesting

My next step is to output gravity, instead of linear acceleration, to the data files, so that ShimmerView visualizes gravity directly. A mismatch between expected and visualized gravity will show that the axes need to be fixed in ShimmerView and then in AHRS Test. Data already exists for 400+ participants, and it will be time-consuming to change those files. An events file corresponding to the video already exists in the BiasTesting folder.

A file with gravity data is ready.
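For that file, this is the kind of sanity check I have in mind; a minimal sketch, where the file name and column names are hypothetical stand-ins for the actual MarkerParser output schema:

```python
import numpy as np
import pandas as pd

# Hypothetical file and column names; the real schema comes from the
# MarkerParser output in the BiasTesting folder.
df = pd.read_csv("gravity_output.csv")
g = df[["grav_x", "grav_y", "grav_z"]].to_numpy()

# Check 1: the gravity estimate should stay near 9.81 m/s^2 at all times.
mag = np.linalg.norm(g, axis=1)
print(f"gravity magnitude: mean {mag.mean():.2f}, std {mag.std():.2f}")

# Check 2: while the device sits flat on the table in the video, gravity
# should load a single axis; the sign of that axis reveals any flip.
print("mean gravity vector:", g.mean(axis=0))
```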

To Do:

Compare the CSV from Consensys, the YouTube video, and the ShimmerView output to check that all three are consistent.
