
ATLAS OCT

Handheld OCT Scanner
Dr. Stephen Allen Boppart

Tunaidi Ansari, Brian Baker, Casey Lewis, and Nickalus Zielinski

Table of Contents

Introduction
The Project
Design Process
    Function Decomposition
    Concept Evaluations
        Measure Parameter
        Calculate Position
        Display Image
        Store Data
        Combine Image and Position Data
        Sweep Light over Area
        Communicate Data
        Power Sensor
        Initiate Scan
    Configuration Designs
As-Built Documentation
    Probe
    Position Tracking Hardware
        Bluetooth Dongle
        Original IMU
        Wiimote
        New Atomic IMU
    Input Software
        IMUReader
        WiimoteReader
        MATLAB – atlasoct_main.m
        MATLAB – reader.m
    Integration Software
        MATLAB – calibrator.m
        MATLAB – calibrator2.m
        MATLAB – integration.m – going from measurements to positions
        MATLAB – turner.m – rotating the position data
        MATLAB – revive.m – generate a 3-d representation of the sample
        MATLAB – slicer.m – create a series of usable images
        ImageJ – MedNuc-OrtView plugin – Putting it all together
Plan/Project Critique
    Probe
    Position Tracking Hardware and Input Software
    Integration Code / Software
    Putting It All Together – Combining the OCT and position data
Conclusion
Acknowledgements
Appendix 1: Atlas OCT MOU
Appendix 2: Atlas OCT Project Plan
Appendix 3: Atlas OCT Budget

Introduction
The following report is a detailed documentation of the Bioengineering senior design project by Atlas OCT, a team composed of Tunaidi Ansari, Brian Baker, Casey Lewis, and Nickalus Zielinski. The report covers the work of the Atlas OCT team over the Fall 2008 and Spring 2009 semesters under the direction of the instructor, Dr. Michael Haney, and the client, Dr. Stephen Boppart of the Biophotonics Imaging Laboratory at the Beckman Institute for Advanced Science and Technology. The goal of the team's design project was to create a functional prototype of a hand-held OCT scanner capable of tracking its position while collecting OCT data, and of combining the two data sets to produce two-dimensional cross-sectional scans or three-dimensional volume scans. This report contains an explanation of the project and its goals, documentation of the design phase, a description of the completed prototype as built, and a discussion and evaluation of the product and the process by which it was created.


The Project
On October 7th, 2008, Dr. Stephen Boppart presented to the team two possible projects involving optical coherence tomography (OCT) technology. The project the team chose to develop was titled "Gyroscopic Three-Dimensional Scanner for Optical Coherence Tomography" and involved several scientific disciplines, including OCT, optics, software engineering, and electronics, as well as special packaging concerns, to ensure that the device could be operated in a hand-held manner. The design's goal was a portable, modular OCT imaging system: a compact system that could be used in a clinical setting, a user-friendly software interface, and a probe that could attach to modular beam delivery devices. The hand-held nature of the probe would allow for scanning in a transverse or a three-dimensional manner, creating either cross-sectional images or 3-D volume scans.

One initial solution provided by the client at the conception of the design team was a previously purchased USB mouse which contained gyroscopes, as well as the traditional optoelectronic sensor in the form of an LED (light emitting diode) that tracks movement over a surface in two dimensions using an optical flow estimation algorithm. Although this route received a strong recommendation from the client from the very onset of the project, the team did not choose this method; the reasons for this decision are discussed at length later in the report.

From the descriptions provided and several additional meetings with the client and his research team, the design team developed a problem statement, a mission statement, and a list of project objectives. Additionally, the team developed a project plan spanning the two semesters allocated to this project, as well as a budget.

The problem statement is an attempt to identify the problem at hand, the desired end state of the project, the obstacles preventing this from being reached, and how a successful solution to the problem will be identified. The problem statement developed by Atlas OCT is as follows: To obtain Optical Coherence Tomography (OCT) scans from a handheld device, position tracking data must be integrated with OCT data to reconstruct a two dimensional image. Such a device must consistently and accurately track its position so that a computer may assemble images from the data. Currently, there is no such device that incorporates position tracking to create images from a manual scanner.

The mission statement is designed to identify what the team will be doing, for whom (the client), and how this will be accomplished. The mission statement developed by Atlas OCT is as follows: Atlas OCT will provide Dr. Stephen Boppart and the Biophotonics Imaging Laboratory at the Beckman Institute for Advanced Science and Technology with a physical handheld OCT scanner that tracks its position, with supporting software to assemble two dimensional images from the collected data. If time permits, generation of three dimensional images and publishing of research shall be attempted. This will be achieved through regular meetings with Dr. Stephen Boppart and the senior design team, and interaction with other members of the Biophotonics Imaging Laboratory.

Finally, a listing of the project objectives was developed, including the project's goals and outcomes, as well as any "deliverables", including plans, drawings, devices, reports, reviews, and presentations. The Atlas OCT team decided upon two project objectives for the client: 1) a prototype device that collects OCT data associated with its physical position, and 2) software to operate the device and construct images from the data, along with its source code. Additional objectives were associated with the senior design class, including weekly progress reports and meetings, design reviews, presentations, and reports.

The three aforementioned items (problem statement, mission statement, and project objectives) were used to form the body of a memorandum of understanding (MOU) between the client and the team. The MOU was designed to serve as an informal contract between the two parties, which could be modified to suit the project's goals and the client's expectations, identifying and clarifying key areas of interest, including resources and intellectual property, as well as serving as a mechanism to resolve disputes between the client and the team, should they arise. The MOU drafted and signed by team Atlas OCT and Dr. Stephen Boppart can be viewed in Appendix 1.

In order to stay on task and accomplish all of the determined goals on time, a project plan was put into place. The first step taken in preparing the project plan was to determine what steps needed to be taken and in what order, along with their interdependencies. The main areas of focus were researching different technologies and alternatives, gathering the materials, getting each individual task to work, and then integrating all the different functions of our probe. After this was accomplished, we set out to do some testing on the prototype, starting with phantoms and then moving to live subjects. As the project moved forward and setbacks were encountered, the project plan was continually updated and referred to in order to keep the project on task. The project plan can be found in Appendix 2.

A budget was also developed for the Atlas OCT project. As initially stated, the Bioengineering department was willing to fund up to $500 for a single project, with a soft cap at the $400 level. Our final budget included the following: $200 for a system to measure and track motion, $13 for a communication device, $100 for 2 hours of machine shop work, $30 for the SM1V05 OCT attachment (spacer), $10 for a case, $40 for a backup motion detection device, and $40 for various shipping costs. The estimated costs totaled $436.50, and the actual costs totaled $401.01. A copy of the final Atlas OCT budget is available in Appendix 3.


Design Process
Function Decomposition
Our project was broken down into the following verb-noun objectives:

- Measure Parameter – Some type of data must be acquired to calculate position and orientation.
- Calculate Position – The measured data must be converted to position and orientation.
- Display Image – The final result must be viewable.
- Store Data – The data must reside somewhere, whether it is kept on the probe and downloaded all at once or streamed to the computer and stored there.
- Combine Image and Position Data – OCT and position data must be used together to produce the final output image.
- Sweep Light over Area – Different methods were available for broader data collection.
- Communicate Data – The measured data must somehow get from the probe to the computer.
- Power Sensor – The measurement unit must receive energy to perform its function.
- Initiate Scan – There must be a way of telling the system to begin taking data.

Concept Evaluations
While some of the decomposed function verb-noun pairs interacted, each category of concepts could, for the most part, be evaluated separately. Numerical analysis can be found in the Concept Evaluations Excel sheet on the CD.

Measure Parameter

Measure Parameter was easily the most complex and influential concept to evaluate. Several options for obtaining data that could be used to calculate position were considered: a combination of accelerometers and gyroscopes, a camera on the probe looking at external visual markers (either on the ceiling or infra-red LEDs), an optical mouse, an articulated arm with angle sensors, a gyroscopic mouse, and laser positioning systems. These concepts were evaluated in terms of their availability, cost, size, and especially ease of implementation, ease of use, and estimated resolution. We looked for a solution that could track position accurately but fit in a compact form factor. Figures were found showing low resolutions for camera solutions, and optical mice and gyro mice simply did not have enough sensors to track position and orientation in 3D space. The laser positioning system and the articulated arm both seemed bulky and counter to the design input requirement of a handheld probe, and the articulated arm was also found to be too expensive to pursue, although its resolution was very high. In the end, a combination of accelerometers and gyroscopes was chosen.


Calculate Position

Different languages were considered for doing the math required to turn the measured parameter data into position data. Specifically, MATLAB, Python, and microcontroller code on the probe were considered. Because of the team's familiarity with MATLAB, and the heavy use of MATLAB within the Biophotonics lab, it was easily the strongest option.

Display Image

Displaying the image could be done either using ImageJ or by outputting raw A-Scan data. The benefits of visualization in ImageJ were found to outweigh the simplicity of dumping A-Scans.

Store Data

We considered storing the data on the probe and then downloading it to the computer in one lump, and also considered streaming the data from the probe continuously to the computer. It would be much easier not to worry about storage on the probe, and because of the way our IMU functioned, it was also simpler to implement a streaming connection; this was the method chosen for design and implementation.

Combine Image and Position Data

For this category, we were mostly interested in determining whether to try using only our position data to align A-Scans, or whether to also try using Adeel's code, which examines A-Scans to determine where they should be aligned. It would be easier to use only our own, but collaboration could result in better resolution. It was decided to combine with Adeel's code if time allowed.

Sweep Light over Area

Because OCT probes take a one-dimensional depth scan, acquiring a three-dimensional volume of data requires many passes over a given area. One way this can be accomplished is by attaching a mirror to a galvanometer and rapidly sweeping the OCT beam through an angle. Because our product will track position on its own, it could also be manually waved over the entirety of the surface. It would be easier for the user to have a galvo do the work, but in the end it was found that time constraints would probably rule out designing a galvo assembly for the end of the probe, and manual movement was chosen.

Communicate Data

Various methods of sending parameter data from the probe to the computer were considered. Concepts evaluated included a physical cable, two speeds of generic radio frequency, Bluetooth, and Zigbee/Xbee protocols. These were considered in terms of cost, ease of implementation, ease of use, and data transfer rate. Bluetooth performed well in all categories – it is a solid standard in wireless communication. A physical cable also seemed to be a strong option. The IMU module we chose ended up having Bluetooth, so its availability made it an easy choice.

Power Sensor

The sensor could be powered by a cord, a standard 9V battery, or a variety of rechargeable options. Rechargeable options would require time spent developing over-charge protection circuits and would be costly. A 9V battery would need to be replaced often. In the end, considering that the scanner would require a fiber optic cable for OCT data anyway, a cord was found to be the best option.

Initiate Scan

Different methods could be used to start taking a scan. A computer interface could begin measurement, or different input types on the probe could trigger data collection. Considering cost, ease of use, and ease of implementation, it was found that manipulating an input on the computer while handling the probe would prove to be cumbersome; some initiation mechanism on the probe would be best.

Configuration Designs
The OCT scanning probe is essentially the only prototype of our project that needed configuration designs. Our client suggested two specific configurations, which we accepted as his requests.

Figure 1 - The schematic representation of the first configuration design (labeled components: SM fiber, collimating lens, focusing lens, and a machined hole for placement of the IMU).


The first involved a plastic enclosure shaped like a flashlight, as shown in Figure 1. In the middle of the conical casing there would be a machined opening with dimensions large enough to fit the IMU and a protective covering. A lens would be situated on either side of this opening. A collimating lens would be installed on the back end of the opening in order to direct the OCT beam in a straight line towards the front. On the front end of the opening, a focusing lens would be installed in order to receive the collimated OCT beam and focus it into a point for scanning. There would be a hole at the front of the flashlight-like enclosure to allow the focused beam to pass through. At the back end of the enclosure, a single-mode optical fiber would connect the system to the OCT machine, carrying the OCT light to the probe to be collimated by the collimating lens.

Figure 2 - The schematic representation of the second configuration design (labeled components: SM fiber, IMU, collimating lens, focusing lens, and spacer).

The second design embodies some of the same concepts but is a little more complex. Rather than a large enclosure as illustrated in the first configuration, we would use a smaller tube-like casing with a much smaller diameter, as shown in Figure 2. This is because the IMU would not be placed in the interior of the casing; only the lenses would be inside. As previously explained, the collimating lens would be at the back end of the tube, while the focusing lens would be placed near the front of the tube. In addition, the single-mode optical fiber would be used in the same manner as in the first configuration. The IMU would be placed inside its own box-like casing and attached to the back of the probe so as not to interfere with the user's handling when scanning images. This design also takes into account the need for an appropriate distance of separation between the focusing lens and the intended sample. As such, a spacer, with a rounded tip and a hole for passing light, is incorporated into the system as well.


As-Built Documentation
Probe
We adopted the second of our configuration designs and modified it to better adapt to our needs as shown in Figure 3.

Figure 3 - The schematic representation of the as-built probe (labeled components: IMU, SM fiber, outer threading, SM1L20 tubes, SM1V05, collimating lens: f = 11 mm, focusing lens: f = 30 mm).

Rather than just a single conical case, the body of our probe consists of four detachable components, all of which are tube-like encasings. The first two, purchased from Thorlabs, are SM1L20 pieces. These tubes have a one-inch diameter and are two inches long. Each end of a piece is threaded, one end possessing outer threading and the other possessing inner threading; the two SM1L20 pieces are attached together by this combination of outer and inner threading. Next is the SM1V05 piece, also purchased from Thorlabs. This tube-like encasing also has a one-inch diameter, but its length is only one inch. Like the SM1L20, it also possesses inner and outer threading, and it is attached to the inner threading of the outermost SM1L20 piece. The SM1V05 tube comes with an adjustment ring, shown in Figure 4a, that holds the fourth component, the spacer, in place at a set distance away from the SM1V05 piece. This spacer is formed from Delrin, a material with properties similar to those of the Thorlabs pieces, and was machined by the Materials Research Laboratory at the University of Illinois at Urbana-Champaign. Shown in Figure 4b, this spacer, possessing a total of 12 mm of inner threading, is connected to the SM1V05 piece by the same threading mechanism. The free end of the spacer is flat rather than rounded, sits 10 mm away from the threading, and has a 3 mm diameter hole at its center, so that the probe will be held still and perpendicular to the sample.
Figure 4 - a) A photograph of the SM1V05 with the adjustable ring support. b) The schematic representation of the Delrin spacer, showing the outer and inner threading, the 12 mm of inner threading, the 10 mm distance to the flat tip, and the 3 mm diameter hole.

Next, we have the lenses, which were obtained from the Biophotonics Imaging Laboratory. These two lenses fit perfectly into the tubes from Thorlabs. The first lens is the collimating lens, with a focal length of 11 mm; it is fitted inside the back of the first SM1L20 piece. The other lens is the focusing lens, with a focal length of 30 mm; it is fitted inside the front of the SM1V05 piece before the spacer is attached. The single-mode optical fiber, with its connector, is added to the end of the probe, right before the collimating lens. The other end of this fiber is connected to the OCT machine.

Finally, the IMU, model number SEN-08190, was purchased from Sparkfun Electronics. The IMU is secured into its own casing, a 3" x 3" x 1" plastic enclosure with a cover purchased from POLYCASE. This is accomplished by drilling four holes into the bottom of the enclosure, inserting standoffs through those holes, and stabilizing the IMU onto the standoffs with screws. A power cable is clipped onto the IMU through another hole drilled through one of the sides of the case. The IMU is then mounted onto the body of the probe, specifically on top of the SM1V05 piece. This differs from the second configuration design because we realized that the closer the IMU is to the front of the probe, the better its motion sensitivity. User handling is not compromised, as holding the case of the IMU while scanning actually increases stability.

The alternative plan involved purchasing a Nintendo Wiimote from GameStop. The Wiimote would take the place of the IMU and would not need an enclosure; it would be strapped onto the body of the probe with rubber bands.


Both the IMU and the Nintendo Wiimote have a built-in Bluetooth module, which is our method of communication. A Bluetooth dongle is installed on a Dell desktop computer, where software takes over for data acquisition.

Position Tracking Hardware
Initially, an Inertial Measurement Unit (IMU) from Sparkfun was used for obtaining data to translate into position. However, because of hardware problems, this later had to be swapped out for a Wiimote. The original IMU had more precision and more degrees of freedom, but the hardware failure necessitated a trade of accuracy and data for availability. Software is available for the use of both – a different IMU will be available in the future, and much of the previous IMU software can be reused with a few changes. The following paragraphs describe each hardware unit in more detail.

Bluetooth Dongle

A USB Bluetooth dongle was acquired for communicating with the original IMU. Because the Wiimote also uses Bluetooth, the same dongle was used for those communications. Included with the Bluetooth dongle was a mini-CD containing device drivers and the Bluesoleil software suite, which handles communication with Bluetooth devices. Before using the original IMU or Wiimote, this software must be set up.

Original IMU

Figure 5 – Original IMU mounted in its case, with power connector at left

The original IMU (Part #SEN-08190, IMU 6 Degrees of Freedom - v2 with Bluetooth from sparkfun.com), pictured in Figure 5, is a hardware module that integrates a Freescale MMA7260Q triple-axis accelerometer with three perpendicularly mounted iMEMS gyroscopes and a BlueSMiRF unit for wireless communication, all tied together with a PIC16F88 microcontroller. The Bluetooth module exposes a virtual COM/serial port which can be used to change settings on the IMU as well as to read acceleration and angular velocity data.


The sensitivity of the IMU is determined by the analog-to-digital converter (ADC) on the PIC microcontroller. The MMA7260Q outputs three analog values corresponding to accelerations in the orthogonal x, y, and z directions on a scale of 0 to 3.3 V; the PIC samples these values at 10-bit resolution on a 5 V scale. With the accelerometer set to its most sensitive setting, +/- 1.5 g, this results in an acceleration resolution of [3 g / 3.3 V] * [5 V / 1024 ticks] = 0.00444 g/tick = 0.0435 m/s^2/tick.

To set up the original IMU, first supply power. A power source greater than 5 V must be supplied (6 V is what we used; 9 V would be the maximum suggested). A power adapter cable was created to fit the IMU power jack; the other end has banana plugs that can be inserted into any standard lab power supply. Additionally, an AC to 6 VDC transformer was fitted with banana plugs so the IMU can be powered from a standard wall outlet. Next, make sure the "DEBUG/RF" switch on the IMU is in the RF position. The DEBUG option is used to connect the IMU to a computer with a physical cable and to download new firmware; more details are available in the datasheet (see CD). Next, turn the power switch to the "On" position. Some LEDs will blink rapidly as the IMU starts up, and then a red LED will blink steadily.

To connect a computer to the IMU, open the Bluesoleil software and search for devices while the IMU is on. The 'Firefly' device that appears is the IMU. Double-click the Firefly option to connect to it, and then click the serial port icon to start using the serial port service. The Bluetooth icon and serial port icon should both turn green. The simplest and most straightforward way to see if communication with the IMU is working is to try communicating using HyperTerminal. Open HyperTerminal (usually in Programs->Accessories->Communications) and select the virtual COM port that Bluesoleil provides (usually around COM5-8 or so). The settings for the port are as follows:
Bits per second: 57600
Data bits: 8
Parity: None
Stop bits: 1
Flow control: None

Once connected, press a key (such as 'x'), and the menu for the IMU will appear. Pressing Ctrl+G starts the data stream, and pressing the space bar stops the data stream and returns to the menus. By navigating the menu options, you can change the accelerometer sensitivity, change the data output type, and turn data channels on and off. The 1.5 g accelerometer setting is the most sensitive, and binary output is used by our software. Data channels can be turned on or off, but to be read correctly by our programs, the IMUReader settings file must correspond to these data channel settings. The IMUReader program is discussed in more detail later.
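As an illustration of the resolution figures given earlier in this subsection, the following MATLAB sketch converts raw 10-bit accelerometer readings into physical units. It is not the team's code; the example tick values and the zero-g offset are hypothetical, and in practice the offset comes from the stationary calibration described later.

% Illustrative only: convert raw original-IMU accelerometer ticks to m/s^2
raw_ticks   = [340 352 371];              % hypothetical raw X-axis ADC samples
g_per_tick  = (3 / 3.3) * (5 / 1024);     % ~0.00444 g per tick, as derived above
zero_g_tick = 338;                        % hypothetical zero-g offset; normally
                                          % taken from a stationary calibration
accel_g     = (raw_ticks - zero_g_tick) * g_per_tick;   % acceleration in g
accel_mps2  = accel_g * 9.81;             % acceleration in m/s^2 (~0.0435 m/s^2 per tick)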


Figure 6 – Bad IMU data. The IMU was moved forward from a stationary position and then stopped. A sine-like curve would be expected to demonstrate acceleration followed by deceleration. However, the positive acceleration portion consistently produced the curve seen above.

Late in testing, it was determined that the acceleration values read from the IMU were incorrect, indicating a faulty accelerometer; one such reading is presented in Figure 6. The IMU was therefore replaced with a Wiimote.

Wiimote

The Wiimote replaced the IMU late in testing, when the IMU was determined to be faulty. The Wiimote was chosen primarily for its availability, and also for its ease of use – software libraries are readily available and it communicates with other devices using Bluetooth, much like the IMU, although it exposes an HID interface instead of a serial port.

Figure 7 – The Wiimote pictured above the OCT probe

The Wiimote is a controller for the Nintendo Wii game console (pictured in Figure 7). It is unlike traditional controllers in several ways. The most important innovation for our project is the inclusion of a three-axis accelerometer, which was used to obtain data in lieu of a functional IMU.


The data obtained from the Wiimote does not measure up to the data from the IMU in two ways. First, the accelerometer is not as precise; it provides 255 values over a 6 g range, or 6 g / 255 ticks = 0.0235 g/tick, compared to 0.00444 g/tick from the IMU. Additionally, the Wiimote does not have any method of obtaining rotational data. However, its availability was invaluable.

To connect the Wiimote to a computer, open the Wiimote's battery case and press the small red synchronization button. The LEDs on the Wiimote should begin blinking. Next, on the computer, search for Bluetooth devices, and connect to the 'Nintendo RVL-CNT-01' HID device when it appears. The most straightforward way to test the connection is to use the WiimoteTest program included with the WiimoteLib software library, from http://www.brianpeek.com/blog/pages/wiimotelib.aspx (see CD). This program will display the state of all the buttons, accelerometer data, and IR camera data; in addition, LEDs and rumble can be turned on and off. The WiimoteReader can also be used to view accelerometer data over time and to test the connection. The WiimoteReader is discussed in more detail later.

New Atomic IMU

Because the original IMU item from Sparkfun has been discontinued, the replacement module is a slightly different version called the Atomic IMU. The Atomic IMU is quite similar to the original IMU; however, there are important changes, many of which can be found in the data sheet included on the CD. The Atomic IMU does not have a Bluetooth connection; it needs to be connected to the computer via a serial/DB9 cable. However, the Atomic IMU UART connection only operates at 0/3.3 V logic levels, which must be shifted to the RS-232 standard using a converter chip such as the MAX3221. The Atomic IMU can also be set up through HyperTerminal, but there are differences in how this is done; more details can be found in its datasheet. The baud rate on the new IMU is higher; it runs at 115200 baud instead of 57600 baud. The order of data inputs in the data stream is also different on the Atomic IMU: while it still bookends frames with "A" and "Z" characters, it sends a "count" parameter, and it sends acceleration values before pitch values. No voltage references or temperatures are measured or sent. The IMUReader program is built to handle data from the Atomic IMU, but as this hardware is not currently on hand, it could not be tested. A description of how to use the IMUReader to read Atomic IMU data is found later in the Input Software IMUReader section.

Input Software
Data is read from either the IMU or the Wiimote using custom VB.NET software and output to a text file that MATLAB can read. Originally, an attempt was made to read serial data from the IMU directly into MATLAB; however, MATLAB had faulty methods for working with the virtual COM port, so another program was needed. The IMUReader and WiimoteReader programs were coded and built in VB.NET using Visual Studio 2008. IMUReader and WiimoteReader both output data in the same format, so the rest of the MATLAB code works the same regardless of which input device is used. Input Software binaries and source code are included on the CD.

IMUReader

Before using the IMUReader, the settings.ini file must be set to match the IMU's channel settings. Using HyperTerminal, different output channels can be enabled or disabled on the IMU as described in the Position Tracking Hardware section. To read data in properly, the IMUReader must know which channels are enabled and disabled. Simply open settings.ini in a text editor, and set the enabled values to "true" and the disabled values to "false". For example, the line
pitchvoltage=false

tells the IMUReader to not read in pitch voltage. The line
yaw=true

tells the IMUReader that the IMU will be sending yaw rate data.

Figure 8 – The IMUReader being used to view data. The data shown is random test data.

Once settings.ini matches the IMU settings, the IMUReader program can be run. When run without any command line options, the program lets you view data by graphing acceleration and rotation velocity data in real time, as seen in Figure 8. Once opened, specify the COM port to connect to in the 'COM Port:' box and click the "…" button to connect. Once connected, the data stream can be started and stopped using the "Start Stream" and "Stop Stream" buttons. Pressing "start test" dumps test data into the graphs. Finally, the number of bytes in the serial port buffer is displayed below the start/stop stream buttons. This can be used to tell whether graphed data is lagging behind real-time actions.

Figure 9 – The IMUReader reading in a certain number of frames and saving them to output.txt for MATLAB.

When the IMUReader program is run with command line options, as it is in the MATLAB code, the IMUReader will open, automatically connect to the IMU, read a specified number of frames, save the data to output.txt, and close. It opens a small window displaying its current status as seen in Figure 9. The command line option format is as follows:
IMUReader.exe <COM Port> <# frames to read> [“1” for atomic IMU]

The last command line parameter is optional; when set to 1, it reads data in the format sent from the new Atomic IMU. For example, the line
IMUReader.exe 5 1000

will read in 1000 frames of data off of COM5 from the original IMU. The line
IMUReader.exe 2 2000 1

will read 2000 frames of data off of COM2 from the new Atomic IMU. The output.txt file contains blocks of data for each frame. Each line holds one piece of data in the following order:
Frame 1 – Time frame arrived (in ms)
Frame 1 – X Acceleration (raw)
Frame 1 – Y Acceleration (raw)
Frame 1 – Z Acceleration (raw)
Frame 1 – Pitch rate (raw)
Frame 1 – Roll rate (raw)
Frame 1 – Yaw rate (raw)
Frame 1 – Pitch Voltage (raw)
Frame 1 – Roll Voltage (raw)
Frame 1 – Yaw Voltage (raw)
Frame 1 – Pitch Temperature (raw)
Frame 1 – Roll Temperature (raw)
Frame 1 – Yaw Temperature (raw)
Frame 1 – Battery Voltage (raw)
Frame 2 – Time frame arrived (in ms)
Frame 2 – X Acceleration (raw)
Frame 2 – Y Acceleration (raw)
etc.


Any values that are not sent by the IMU are written as zero. The time values are important because the Bluetooth connection sometimes drops bytes. The IMU sends the ASCII character "A" before a new frame and the ASCII character "Z" at the end of a frame, so the IMUReader can tell when a frame has dropped a byte. Such frames are simply discarded and not included in the output.

WiimoteReader

The WiimoteReader program is very similar to the IMUReader. In fact, they share a common code base – the WiimoteReader was made from a stripped-down version of the IMUReader, with the serial port portions removed and replaced with code to interface with the Wiimote. Both programs open a viewer that graphs data in real time when executed without command-line options, and both show a status window and write an output.txt file – in the exact same format – when run with certain command line options.

Figure 10 – The WiimoteReader being used to view data. The data shown is random test data.

When run without command line options, the form pictured in Figure 10 appears. It is comparable to the IMUReader viewer, but without the pitch, roll, and yaw graphs, since the Wiimote does not contain gyroscopes. With the Wiimote synchronized to the PC, simply click the 'Connect' button to start graphing Wiimote acceleration data. Alternatively, 'start test' can dump random test values into the graphs. The command line options for the WiimoteReader are as follows:
WiimoteReader.exe <COM Port (deprecated)> <# frames to read> [“1” to wait for trigger press]

The COM Port value must be supplied but is not used. The second option specifies the number of frames to request. Finally, the last optional command line parameter may be specified to require a trigger press on the Wiimote before data is input. If this option is not specified, frames will be acquired immediately after connection to the Wiimote. If it is set to 1, LEDs 2 through 4 will oscillate until the trigger (the B button) is pressed. At that point, frames will be read in. For example,
WiimoteReader.exe 27 500

will immediately read in 500 frames from the Wiimote, and
WiimoteReader.exe 99 750 1

will read 750 frames from the Wiimote beginning with a trigger press. Once the specified number of frames has been read, the Wiimote will vibrate briefly to signal the end of data collection.

MATLAB – atlasoct_main.m – the main file that runs everything else

The atlasoct_main.m file is the main function that calls the rest of the functions. It starts by clearing out any data from previous runs; after this, it takes new calibration data from the IMU or Wiimote, and then it starts taking in the acceleration data. Through the rest of the functions, its final output is a series of files that can be opened as a stack in ImageJ and viewed with the MedNuc-OrtView plugin.

MATLAB – reader.m – reading IMUReader/WiimoteReader output

The reader.m file simply opens output.txt, as written by either the IMUReader or the WiimoteReader, and enters the data it contains into the time, acceleration, and pitch vectors.
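As a rough illustration of what this parsing step involves (this is not the project's reader.m), the sketch below loads an output.txt in the format listed in the IMUReader section, assuming all 14 channels per frame are present (disabled channels are written as zero), and splits it into time, acceleration, and angular-rate vectors. The variable names are hypothetical.

% Illustrative sketch of a reader.m-style parse of output.txt
raw   = load('output.txt');          % one raw numeric value per line
raw   = reshape(raw, 14, []).';      % one row per frame, 14 columns per frame
t     = raw(:, 1) / 1000;            % arrival time, converted from ms to s
accel = raw(:, 2:4);                 % raw X, Y, Z acceleration
rates = raw(:, 5:7);                 % raw pitch, roll, yaw rates
% columns 8-14 (voltages, temperatures, battery) are not used here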

Integration Software
As mentioned previously, the IMU's base form of data output is accelerations in the x, y, and z directions, and angular velocities for pitch, yaw, and roll. In order to convert accelerations into position, a double integration must be performed, and for the angular rates, a single integration is required. This code was also written in MATLAB.

MATLAB – calibrator.m – simple calibration

The calibrator.m file is used to find the resting-state values of the accelerometers and gyros automatically. It reads in a number of frames from either the IMU or the Wiimote, calls reader.m, averages the values, and saves them to 'calibration.txt'. These values are later read in by integration.m and subtracted from the measured values to get zero-centered accelerations and angular velocities.

MATLAB – calibrator2.m – advanced calibration

The second calibration function (calibrator2.m) similarly runs the reader.m program to collect a set of data. It then finds the maximum and minimum values for each of the accelerations, and uses these values to create a filter for the acceleration data. This is accomplished by setting any values collected in the subsequent data collection phases to zero if they fall within the maximum and minimum values obtained during a stationary calibration. There is also a mechanism by which this range can be expanded to include, for example, all values within +/- 3 times the maximum acceleration obtained during calibration. This is used to ensure that minor vibrations omnipresent within the device are not used during the integration phase, in an effort to reduce drift. These values are also written to a calibration text file, which is read by the main integration function. Each calibration function also contains a limit on the range of values allowed for the calibration to qualify as successful; if the range of values taken during the calibration is too large, the code will prompt the user to take another calibration, to ensure that only valid stationary data is collected.

MATLAB – integration.m – going from measurements to positions

The integration function reads all calibration data from the two text files and subtracts the base-level calibration values from each of the three accelerations. The integrations are then performed on the datasets using the built-in MATLAB function cumtrapz, which uses the trapezoidal approximation to estimate the cumulative integral over time. These integrations generate velocities, which are currently unused, and displacement values for each of the three axes. The function can then plot any of the collected data for visual interpretation. Overall, the integration portion of the code performs quite well. Several tests were performed in which the function was given a known input acceleration profile which, when integrated correctly, would match known velocity and displacement profiles. In each of these test cases, the integration function generated the correct velocity and displacement values. Additionally, in several calibration tests, the second calibration function greatly reduced the degree of noise in the collected data, substantially reducing (albeit not completely removing) drift.

MATLAB – turner.m – rotating the position data

The turner function takes the set of x,y coordinates output by the integration function. Based on this data, a linear regression is run to determine the equation of the best-fit line. The slope of this line is then used to determine the angle by which the data should be rotated, so that instead of having a line that goes off at a large angle away from the axis, the data tends to follow the x axis, with a net y value of 0. This is useful in case the data is taken along a line which is not perfectly parallel to the x or y axis of the device. Not only does this give data that is easier to work with, it also reduces the size of the matrices that MATLAB is required to work with, thus avoiding out-of-memory errors.

MATLAB – revive.m – generate a 3-d representation of the sample

Using the different A-scans, each of which is a one-dimensional array of intensity values, along with the position data, revive.m creates a 3-d matrix which is a virtual representation of the sample that has been scanned. Based upon the position the probe is in when each A-scan is taken, revive.m fills in the 3-d matrix with the appropriate intensity values at the appropriate locations. In the event that the probe has taken multiple A-scans at the same location, the intensities are averaged. The desired precision of the variation in the y-axis – that is, the number of slices along the x-z plane – can be set with a single variable within revive.m.
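The following MATLAB fragment is a compressed sketch of the calibration, integration, and rotation steps described above; it is not the team's calibrator.m, calibrator2.m, integration.m, or turner.m, and the variable names are hypothetical. It assumes t (an Nx1 vector of times in seconds) and accel (an Nx3 matrix of accelerations in m/s^2) come from a reader.m-style parse, and that accel_cal holds samples recorded while the probe was held stationary.

% Sketch only: calibration offsets, stationary dead-band, double integration,
% and rotation of the x-y path onto the x axis
cal  = mean(accel_cal, 1);                               % resting-state offsets (calibrator.m idea)
dev  = abs(accel_cal - repmat(cal, size(accel_cal,1), 1));
band = 3 * max(dev, [], 1);                              % expanded noise band (calibrator2.m idea)

a = accel - repmat(cal, size(accel,1), 1);               % zero-centered accelerations
a(abs(a) <= repmat(band, size(a,1), 1)) = 0;             % suppress stationary noise to limit drift

v = cumtrapz(t, a);                                      % first integration: velocity per axis
x = cumtrapz(t, v);                                      % second integration: displacement per axis

p     = polyfit(x(:,1), x(:,2), 1);                      % best-fit line through the x-y path (turner.m idea)
theta = atan(p(1));                                      % angle of the scan path
R     = [cos(theta) sin(theta); -sin(theta) cos(theta)]; % rotate by -theta
xy    = (R * x(:,1:2).').';                              % path rotated to follow the x axis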


MATLAB – slicer.m – create a series of usable images

In the last part of the software, the slicer.m file takes the 3-d matrix of intensities output by the revive.m function and creates a stack of images representing the OCT scan of the sample. The program asks the user for a numeric input specifying which dimension should be sliced along, with a value of 1, 2, or 3, corresponding to slices in the x-y, y-z, or x-z planes. The series of slices is stored as a series of TIFF files in the "images" folder, with a naming convention of the dimension sliced followed by the slice number, such as first4.tif or third1024.tif.

ImageJ – MedNuc-OrtView plugin – Putting it all together

The series of images can then be viewed in ImageJ, a freely available software program from the NIH. Using the MedNuc-OrtView plugin, which is also freely available, the images are displayed in a style similar to that of an MRI or CAT scan, showing orthogonal slices that can be moved through in any direction. The user simply needs to open ImageJ and select Import… Image Sequence under the File menu. After selecting the first image in the series, i.e., first1.tif or second1.tif, ImageJ will import all of the corresponding images as a stack, which can then be opened in OrtView.
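As a rough sketch of the volume-building and slicing steps described above (not the team's revive.m or slicer.m), the fragment below bins A-scans into lateral positions, averages A-scans that land in the same bin, and writes the result as a TIFF that ImageJ can import. For brevity it bins along only one lateral axis, producing a single cross-section rather than the full 3-d matrix; ascans, xpos, and the bin count are hypothetical names and values.

% Sketch only: place A-scans by position, average duplicates, write a TIFF
depth  = size(ascans, 1);                        % ascans: depth x N matrix of A-scan intensities
nbins  = 256;                                    % lateral bins along the scan axis
edges  = linspace(min(xpos), max(xpos), nbins + 1);
bin    = discretize(xpos, edges);                % lateral bin for each A-scan position
vol    = zeros(depth, nbins);
counts = zeros(1, nbins);
for k = 1:numel(xpos)                            % accumulate, then average per bin
    vol(:, bin(k)) = vol(:, bin(k)) + ascans(:, k);
    counts(bin(k)) = counts(bin(k)) + 1;
end
vol = vol ./ repmat(max(counts, 1), depth, 1);   % average; avoid divide-by-zero in empty bins

if ~exist('images', 'dir'), mkdir('images'); end % slicer.m idea: write slices for ImageJ
rng = max(vol(:)) - min(vol(:));
img = uint8(255 * (vol - min(vol(:))) / max(rng, eps));
imwrite(img, fullfile('images', 'first1.tif'));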


Plan/Project Critique
Probe
We are proud of the probe casing and believe that it was correctly assembled. The spacer in particular, even just the concept of a spacer, was an essential piece of the puzzle, and the ring-like adjuster served its role well, securing the spacer in a stable fashion at whatever distance is needed for optimal functionality. What should have been examined further is a better way of attaching the IMU case onto the body of the probe, or perhaps decreasing the size of the IMU and its case. Nevertheless, the probe was more than satisfactory in its physical form. We believe the configuration design for the probe was well planned, and that the final modifications to the design improved upon the original plan.

Position Tracking Hardware and Input Software
One obvious problem area in this project was the IMU hardware. Numerous unexpected delays were caused by both hardware and software problems. Originally, reading data off the virtual serial port did not work. While MATLAB's built-in serial functions were part of the problem, working in another language would not necessarily have solved these problems – the first VB.NET libraries tried could send but not receive data from the virtual port. Eventually, another library was found that could successfully obtain information, but not without significant wasted time and effort.

Faulty accelerometer data also caused many problems. It took a lot of time in testing to diagnose the problem, and when it was found, it was too late to get a whole new IMU – availability took precedence over resolution in the decision to continue the project using a Wiimote. Because of this, the true resolution of the project design cannot be measured until a new IMU is received. The Wiimote, while a nice interim replacement, has significant shortcomings. Its wide availability and ease of use were invaluable when the IMU fault was discovered, but the Wiimote not only has significantly less resolution (by nearly an order of magnitude on the accelerometers), it also has no way of tracking orientation (unless zero lateral translation is assumed). This made tests with the Wiimote even more inaccurate – the slightest change in orientation during testing could add large amounts of drift to what was already fairly insensitive data.

These delays would have been longer, and might have crippled the project, had the problems not been diagnosed when they were. Coding diagnostic software, even when an immediate need was not obvious, always proved to give fantastic returns on the time invested. The 'viewer' forms in the IMUReader and WiimoteReader were used just as much as (if not more often than) the data output modes. Finding out what was wrong when the IMU was giving bad accelerometer data hinged entirely on being able to see the IMU's raw data graphed over time before it ever entered MATLAB. Future Senior Design teams would be well advised to create easy-to-use tools to test individual stages of their projects.

Integration Code / Software
It should be noted that the original code for integrating the data from the IMU was significantly more complex. As the IMU could be rotated during the collection of data, it was necessary to track the inertial reference frame as the probe was rotated, in order to accurately track position and subtract out the gravity vector. This was accomplished through the use of an equation using the axial rotation and acceleration values to subtract out the Coriolis, centrifugal, and Euler “fictitious” accelerations from the collected acceleration values.
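For context, the fragment below is a much-simplified sketch of the reference-frame idea described above, not the original code: it integrates the gyro rates once to get approximate orientation angles, rotates each body-frame acceleration sample into the world frame, and subtracts gravity. The Coriolis, centrifugal, and Euler terms mentioned above are omitted, and the mapping of pitch/roll/yaw onto the x/y/z axes is an assumed convention. It assumes t (Nx1, s), accel (Nx3, m/s^2, body frame), and rates (Nx3, rad/s, ordered pitch, roll, yaw).

% Simplified sketch: gravity subtraction in a rotating reference frame
ang = cumtrapz(t, rates);                 % integrated angles (single integration of gyro rates)
g   = [0 0 9.81];                         % gravity expressed in the world frame
a_w = zeros(size(accel));
for k = 1:size(accel, 1)
    pitch = ang(k,1); roll = ang(k,2); yaw = ang(k,3);
    Rx = [1 0 0; 0 cos(roll) -sin(roll); 0 sin(roll) cos(roll)];      % roll about x (assumed axis)
    Ry = [cos(pitch) 0 sin(pitch); 0 1 0; -sin(pitch) 0 cos(pitch)];  % pitch about y (assumed axis)
    Rz = [cos(yaw) -sin(yaw) 0; sin(yaw) cos(yaw) 0; 0 0 1];          % yaw about z
    R  = Rz * Ry * Rx;                                                % body-to-world rotation
    a_w(k,:) = (R * accel(k,:).').' - g;                              % gravity-free world-frame acceleration
end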

Putting It All Together – Combining the OCT and position data
Working with Adeel on the initial understanding of his MATLAB code gave us a very strong start towards making our code that combines the position data and the OCT data. Using some variations on functions that were already created by the Biophotonics group, we were able to manipulate the OCT output files into a form that was usable for our purposes.

Working on the position data itself was a bit more difficult. Due to the difficulties with obtaining useful data out of the IMU, actually putting the code to the test was postponed for quite a while. However, using pseudo-data, we were able to successfully run through the ending steps of our program: we accurately placed each A-scan into its position in the 3-d matrix and sliced through it to produce an image stack usable in ImageJ.

Not everything went perfectly smoothly the first time. Once data was being acquired by the IMU, countless hours of work were put into the code in order to increase the signal-to-noise ratio. A Kalman filter was beyond the scope of our knowledge, but some rudimentary filters, as described above for the calibrator2.m file, were applied and were able to block out some of the minor noise. Regardless of the level or type of filtering that was implemented, the bad data being output by the IMU could not be overcome.

In future work on this part of the code, there are a few things that could be addressed. Firstly, a Kalman filter would greatly increase the reliability of the data being used. Also, a less memory-intensive way to store the data should be investigated; if this were implemented, out-of-memory errors could be avoided, and processing of the data would be faster and smoother. Fiducial markers could also significantly improve the reliability of the data and could help account for drift in the accelerometer data. Overall, however, this section of the code flowed fairly smoothly and did not encounter too many errors or obstacles.


Conclusion
In conclusion, the Atlas OCT team succeeded in providing a prototype device that demonstrates proof of concept for a hand-held OCT system capable of tracking its position and yielding both two-dimensional and three-dimensional OCT scans. Although several factors interfered with its completion over the course of the project, including the failure of the IMU and the limitations of the Wiimote, the team still feels that, overall, the objectives of the project were achieved. The groundwork has also been laid for further improvements to the design, including implementing a Kalman filter and revisiting the possibility of three-dimensional scanning. Not only was a great amount accomplished, but a great amount was learned.


Acknowledgements
The Atlas OCT team would like to acknowledge the following individuals for their assistance throughout the duration of this project:

- Dr. Michael Haney, for the never-ending guidance and support.
- Dr. Stephen Boppart, for opening his laboratory to the team and providing us with a great wealth of knowledge and support.
- Dr. Utkarsh Sharma, for taking the time to work with the team, for the materials provided, for help constructing the probe, and for assisting with troubleshooting the IMU and Wiimote.
- Adeel Ahmad, for providing us with his Matlab code and sample cam files, as well as additional troubleshooting assistance.
- Dr. Haohua Tu, for the assistance and advice.
- Dr. Brad Sutton, for assistance with the imaging problems encountered.
- The Bioengineering department, for funding this endeavor, as well as for the years of knowledge they have instilled in us.


Appendix 1: Atlas OCT MOU

MEMORANDUM OF UNDERSTANDING
BETWEEN

ATLAS OCT

BIOE 498 Senior Design University of Illinois Urbana Champaign
AND

Dr. Stephen A. Boppart Biophotonics Imaging Laboratory Beckman Institute for Advanced Science and Technology


MEMORANDUM OF UNDERSTANDING
BETWEEN
ATLAS OCT
BIOE 498 Senior Design
University of Illinois at Urbana-Champaign
AND
Dr. Stephen A. Boppart
Biophotonics Imaging Laboratory
Beckman Institute for Advanced Science and Technology

Tunaidi Ansari, Brian Baker, Casey Lewis, and Nickalus Zielinski, currently enrolled in BIOE 498 Senior Design at the University of Illinois at Urbana-Champaign, hereinafter referred to as ATLAS OCT, and Dr. Stephen A. Boppart of the Biophotonics Imaging Laboratory at Beckman Institute for Advanced Science and Technology, hereinafter referred to as the Client, being convinced that design activities by ATLAS OCT for and with the Client, hereinafter referred to as the Parties, would be mutually beneficial, hereby conclude this Memorandum of Understanding, hereinafter referred to as the Memorandum. The instructors and teaching staff of BIOE 498 Senior Design at the University of Illinois at Urbana-Champaign, hereinafter referred to as the Instructors, are identified as individuals holding an interest in the educational outcomes of this endeavor, but are not signatories to the Memorandum.

ARTICLE 1: PROBLEM STATEMENT

To obtain Optical Coherence Tomography (OCT) scans from a handheld device, position tracking data must be integrated with OCT data to reconstruct a two dimensional image. Such a device must consistently and accurately track its position so that a computer may assemble images from the data. Currently, there is no such device that incorporates position tracking to create images from a manual scanner.

ARTICLE 2: MISSION STATEMENTS

ATLAS OCT will provide Dr. Stephen Boppart and the Biophotonics Imaging Laboratory at the Beckman Institute for Advanced Science and Technology with a physical handheld OCT scanner that tracks its position, with supporting software to assemble two dimensional images from the collected data. If time permits, generation of three dimensional images and publishing of research shall be attempted. This will be achieved through regular meetings with Dr. Stephen Boppart, Dr. Haohua Tu, and the senior design team, and interaction with other members of the Biophotonics Imaging Laboratory.

ARTICLE 3: PROJECT OBJECTIVES

Section 1: Client/Team Objectives

- A prototype device that collects OCT data associated with its physical position
- Software to operate the device and construct images from the data and its source code

Section 2: BIOE 498 Senior Design Objectives

The following items shall be provided by ATLAS OCT to the Instructors on a regular (indicated) basis, beginning in the Spring 2009 semester. They are listed here for the benefit of the Client.

- weekly progress reports from ATLAS OCT, either typed (paper) or transmitted via email to the Instructors
  o with copies (cc'ed) to the Client and Dr. Haohua Tu
- weekly progress meetings by ATLAS OCT with the Instructors
  o to which the Client and Dr. Tu are welcome to attend

The following items shall be provided by ATLAS OCT to the Instructors on a one-time basis, as indicated. They are listed here for the benefit of the Client.

- Concepts Presentation, January 2009
- Mid-term Design Review, prior to Spring Break, 2009
- Final Presentation, May 2009
- Vendor-fair table display and/or demo, during the Final Presentation session, including at least one of
  o poster + business card
  o brochure
  o demo CD
  o free sample (i.e. something that a "buyer" could take home)
- Two (2) copies of the Final Report, one for the Instructors, and one for the Client, due before Finals Week, May 2009, including but not limited to:
  o Problem, Mission, Plan
  o Design Input
  o Design Output
  o Reviewers comments; Team responses
  o As-built documentation
  o Verification Statement
  o Validation Testing Results
  o Plan/Project critique
    - This MOU, and all Amendments, included as an Appendix

ARTICLE 4: INTELLECTUAL PROPERTY

The Parties have equal rights to the intellectual property arising from the Project Objectives described in Article 3. Intellectual property includes but is not limited to inventions, technical data and software. Intellectual property created by collaboration between the Client and ATLAS OCT shall be the joint intellectual property of both parties. Commercialization of intellectual property jointly developed or created by ATLAS OCT and the Client, if pursued, shall be pursued jointly. Both Parties may jointly publish results, as appropriate, arising from the Design Activities. Should one Party wish to publish independently, the other Party shall be asked to give prior written consent. All publications and all intellectual property developed under this Memorandum using UIUC funds are subject to UIUC policies and procedures and the UIUC contract(s) providing these funds with respect to copyright issues, as well as to the policies and procedures of the Office of Technology Management of the University of Illinois with respect to inventions.

ARTICLE 5: RESOURCES

Section 1: Client Resources and Rules

In support of and consideration for this effort, the Client agrees to allow ATLAS OCT access to the Biophotonics Imaging Laboratory and its equipment under the condition that a graduate student or research scientist oversees the laboratory while in session.

Section 2: BIOE 498 Resources and Rules

The BIOE 498 Senior Design Course has limited resources available in 3116 DCL, which include desktop computers and printers. Teams will have on-going access to these resources under the conditions that they follow posted Rules, and demonstrate professional laboratory etiquette.


Section 3: Finances

The Department of Bioengineering at the University of Illinois has a limited budget available for BIOE 498 Senior Design, and may be able to supply up to, but not to exceed, $500 for this project. All funding from this source will be subject to the policies and procedures of the Department of Bioengineering. Additional funding for this project may be provided by the Client, at the discretion of the Client. No funding or reimbursement will be provided to ATLAS OCT without a documented Project Plan and Budget, approved by both the Client and the Instructors.

ARTICLE 6: RESOLUTION OF DISPUTES

In the event that questions arise about the interpretation of the provisions of the Memorandum or that problems occur on matters not prescribed herein, both Parties shall consult with each other and the Instructors, and reach a mutually acceptable solution.

ARTICLE 7: AMENDMENTS

Amendments to this Memorandum shall be subject to mutual written agreement by both Parties.

ARTICLE 8: DURATION

The Memorandum shall take effect when it has been duly signed by the designated representatives of both Parties and shall remain in force until Commencement, 17 May 2009, unless otherwise redefined by an amendment to this Memorandum. Either Party may terminate this Memorandum by giving two (2) weeks notice to the other Party.


____________________________ (Signed) Dr. Stephen A. Boppart Head of Biophotonics Imaging Laboratory Beckman Institute for Advanced Science and Technology

____________________________ (Signed) Tunaidi Ansari Student BIOE 498 Senior Design Date: _______________________

____________________________ (Signed) Brian Baker Student BIOE 498 Senior Design Date: _______________________

____________________________ (Signed) Casey Lewis Student BIOE 498 Senior Design Date: _______________________

____________________________ (Signed) Nickalus Zielinski Student BIOE 498 Senior Design Date: _______________________


Appendix 2: Atlas OCT Project Plan


Appendix 3: Atlas OCT Budget

Atlas OCT Budget (Final)

ITEM                                            | ESTIMATED COST | ACTUAL COST | SOURCE
IMU 6 DoF v2 with Bluetooth® Capability/ADXRS40 | $199.95        | $199.95     | http://www.sparkfun.com/commerce/product_info.php?products_id=8190
Bluetooth® USB Module                           | $12.95         | $12.95      | http://www.sparkfun.com/commerce/product_info.php?products_id=150
Shipping                                        | $15.00         | $11.21      | http://www.sparkfun.com/
2 hours (Approximate) Machine Shop Work         | $100.00        | $82.50      | http://www.scs.uiuc.edu/shop/
OCT attachment SM1V05                           | $29.60         | $29.60      | http://www.thorlabs.com/thorProduct.cfm?partNumber=SM1V05
Shipping                                        | $10.00         | $7.26       | http://www.thorlabs.com/
Polycase Models                                 | $10.00         | $4.58       | http://www.polycase.com/
Shipping                                        | $15.00         | $8.98       | http://www.polycase.com/
Wiimote                                         | $40.00         | $39.99      | Campus Gamestop
AA Batteries                                    | $4.00          | $3.99       | Campus Gamestop
Total:                                          | $436.50        | $401.01     |

