Despite the high precision of the biplanar fluoroscopy system, digitized data contain errors compounded from several potential sources:
1. Undistortion relies on accurately determining the centroids of the grid points. Poor image quality and/or imperfect flattening of the grid can introduce small errors.
2. Camera calibration provides a "best estimate" of camera position based on digitized clicks on the image and the "true" measured distances between points on the calibration object. Digitizing error and/or offsets in the calibration object point distances add further uncertainty.
3. Digitizing, whether automated or by hand, can also add error.
4. 3D reconstruction between the two cameras may add error as well.
Even very small errors in individual marker positions can cause larger, more noticeable errors in rigid object orientation. This is clearly visible as "jittery bones" when raw data are used to drive animations.
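To see why, consider a back-of-the-envelope sketch (the marker spacing and noise level here are hypothetical, chosen only for illustration): two markers 20 mm apart, each digitized with 0.2 mm of noise, define a segment whose apparent angle wobbles noticeably from frame to frame.

```matlab
% Hypothetical illustration (not CTX data): sub-millimeter marker noise
% produces degree-level jitter in the orientation of a 20 mm segment.
nFrames = 100;
noise   = 0.2;                                        % mm of digitizing noise
p1 = zeros(nFrames,2)         + noise*randn(nFrames,2);
p2 = repmat([20 0],nFrames,1) + noise*randn(nFrames,2);
ang = atan2d(p2(:,2)-p1(:,2), p2(:,1)-p1(:,1));       % apparent angle per frame
std(ang)                                              % on the order of a degree
```

A fraction of a millimeter per marker thus becomes a visible orientation wobble, which is exactly the "jittery bones" effect described above.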
Two Smoothing Steps
1. "Smart" smoothing - uses "measured" intermarker distances from CT data to constrain the digitized data
2. Standard smoothing - a Butterworth filter to remove high-frequency noise from the bone orientations
"Smart" smoothing - basic instructions
What you need:
- CT Marker locator coordinates (from the "Create a Setup Scene" step)
- xyzpts file from DLTdataviewer
- smoothData.m, rigidFilter2.m, and rigidOrientation2.m (included in the CTXmatlab tools download)
What to do:
- Open MATLAB and type smoothData
- You will be asked for the [prefix]xyzpts file. Select it.
- You will then be asked for the CT Marker coordinate file (from the "Create a Setup Scene" step). Select it.
- Enter the number of bones
- Select the markers associated with bone 1 and click OK
- Repeat for all remaining bones
- Save the output file (default is [prefix]xyzpointsBones.csv)
What you get:
- The .csv file contains 6 columns per bone: 3 translations and 3 rotations
- This format can be read into Maya using imRg.
- You will probably get a less jittery but not perfect animation using only this smoothing step. Continue to Standard smoothing for best results.
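Conceptually, constraining digitized data with CT intermarker distances amounts to fitting the rigid CT marker set to each digitized frame with a best-fit rotation and translation. The internals of rigidFilter2.m and rigidOrientation2.m are not shown here, so treat the following SVD-based (Kabsch) alignment as an assumed sketch of the idea, not the actual tool code:

```matlab
% Assumed sketch of the rigid-body constraint idea (not the actual
% rigidFilter2.m code). ct and frame are n-by-3 marker coordinate sets:
% ct from the CT scan, frame from one frame of digitized video data.
ctC = ct    - mean(ct,1);                  % center both marker sets
frC = frame - mean(frame,1);
[U,~,V] = svd(frC' * ctC);                 % best-fit rotation via SVD
R = U * diag([1 1 sign(det(U*V'))]) * V';  % guard against reflections
t = mean(frame,1)' - R*mean(ct,1)';        % best-fit translation
% R and t correspond to the bone's 3 rotations and 3 translations for this
% frame; R*ct' + t reconstructs marker positions with CT geometry enforced.
```

Because the CT marker geometry is rigid by construction, per-marker digitizing noise can only perturb the fitted pose, not the intermarker distances.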
Standard smoothing - basic instructions
- In MATLAB, import your [prefix]xyzpointsBones.csv bones file
- you will have 3 variables, colheaders, data, and textdata
- type: plot(data)
- click the Insert Legend button
- pick a data column with some cyclic curves (the legend will allow you to match the column number with the color of the line)
- type: plot(data(:,4)) % the number 4 being an example of the data column chosen
- There is probably still a bit of high-frequency noise on the curve - we'll use a Butterworth filter to smooth it out a touch more
- use the following function: dataSmooth = tybutter(data,f,g) %f = cutoff frequency and g = recording frequency
- example: dataSmooth = tybutter(data(:,4),50,250)
- type: hold on
- type: plot(dataSmooth,'r-')
- Look at Figure 1. It should show a red curve illustrating the effect of the chosen filter cutoff against the original data
- Generally the filter cutoff falls between 10 and 80; try several cutoff frequencies until you find one that removes the high-frequency jitter without changing the general shape of the curve
- Once you have a filter that you like, apply it to all of the data
- example: dataSmooth = tybutter(data,50,250)
- dataSmooth should have the same dimensions as data
- Save your new data using csvWithHeaders
- example: csvWithHeaders('fileName.csv', dataSmooth, colheaders)
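For reference, a zero-lag low-pass filter of this kind is typically built from MATLAB's butter and filtfilt. The sketch below is a guess at what a tool like tybutter might do internally (the function name butterSketch and the filter order are assumptions, not the real tybutter code):

```matlab
% Plausible sketch of a tybutter-style filter (assumed, not the real code):
% a zero-phase low-pass Butterworth filter.
function y = butterSketch(x, cutoff, rate)
    % Normalize the cutoff by the Nyquist frequency (half the recording rate)
    [b, a] = butter(2, cutoff/(rate/2), 'low');
    % filtfilt runs the filter forward and then backward, so the smoothed
    % curve stays aligned in time with the raw data (no phase lag)
    y = filtfilt(b, a, x);
end
```

Called as dataSmooth = butterSketch(data, 50, 250), this preserves the input dimensions and operates on each column, matching the behavior described in the steps above.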