Nonlinear Volume Image Registration

Sponsored by: Kettering Medical Center and Wright Laboratories

Faculty: Ardy Goshtasby, Wright State University; Martin Satter, Wallace-Kettering Neuroscience Institute

Students: Lijun Ding, Hai Pan


Abstract.

Registration of 3-D (volumetric) images is being studied. The approach taken consists of four steps:

  1. Determining image structures (surfaces) by edge detection or image segmentation methods.
  2. Determining correspondence between structure points in the images by distance measures.
  3. Examining the correspondences and eliminating potential mismatches by filtering.
  4. Determining a locally-adaptive transformation from the correspondences in order to deform one image to overlay the other.

The main contribution of this work is the formulation of a transformation function that uses corresponding points on two surfaces to register entire volumetric images. The developed methodology and software will be used to estimate brain shift during surgery, to register head-and-neck images, and to register whole-body images for diagnosis and treatment planning.
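
As a concrete illustration of steps 2 and 3 above, the sketch below matches two sets of 3-D surface points by nearest-neighbor distance and then filters out likely mismatches. This is a minimal sketch in Python; the matching criterion and the median-distance rejection rule are assumptions for illustration, not the project's actual code.

    # Minimal sketch of steps 2 and 3: nearest-neighbor correspondence
    # between two 3-D point sets, followed by a simple distance-based
    # mismatch filter.  Thresholds here are illustrative assumptions.
    import numpy as np
    from scipy.spatial import cKDTree

    def match_and_filter(fixed_pts, moving_pts, reject_factor=3.0):
        """fixed_pts: (N, 3), moving_pts: (M, 3) surface-point coordinates."""
        tree = cKDTree(fixed_pts)
        dist, idx = tree.query(moving_pts)   # closest fixed point per moving point
        pairs = np.column_stack([idx, np.arange(len(moving_pts))])

        # Reject pairs whose distance is far above the median distance,
        # a crude guard against mismatches (step 3).
        keep = dist < reject_factor * np.median(dist)
        return pairs[keep], dist[keep]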


Results.

If the images do not have nonlinear geometric differences, they can be registered by a transformation of the Cartesian coordinate system. Consider, for example, the following MR and PET images of a patient taken a short time apart.


Before registration
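
A transformation of the Cartesian coordinate system amounts to a rotation plus a translation of the voxel grid. The sketch below shows how such a rigid transformation can be applied to a volume with SciPy; the rotation axis, angle, and translation are arbitrary illustration values, not parameters taken from our program.

    # Sketch: resampling a volume under a rigid (rotation + translation)
    # transformation of the Cartesian coordinate system.
    import numpy as np
    from scipy.ndimage import affine_transform

    def rigid_resample(volume, angle_deg, translation):
        """Rotate `volume` about its first axis by `angle_deg`, then translate."""
        a = np.deg2rad(angle_deg)
        R = np.array([[1.0, 0.0, 0.0],
                      [0.0, np.cos(a), -np.sin(a)],
                      [0.0, np.sin(a), np.cos(a)]])
        # affine_transform maps output coordinates to input coordinates, so
        # we pass the inverse rotation (the transpose, since R is orthogonal)
        # and rotate about the volume center.
        center = (np.array(volume.shape) - 1) / 2.0
        offset = center - R.T @ (center + np.asarray(translation, float))
        return affine_transform(volume, R.T, offset=offset, order=1)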

If the images can be registered by a simple transformation of the Cartesian coordinate system, we first align them approximately by hand, obtaining the result below.


Approximate registration

Then, using an iterative template-matching process, we refine the alignment to obtain an accurate registration, as shown below.


Accurate registration
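
The template-matching step itself is not detailed on this page. The sketch below shows one plausible form of it: an exhaustive search over small integer shifts for the translation that maximizes normalized cross-correlation. For multimodal MR-PET pairs a similarity measure such as mutual information is more common; normalized cross-correlation is used here only to keep the sketch short.

    # Hedged sketch of refinement by template matching: search a small
    # neighborhood of integer offsets for the shift of `moving` that best
    # matches `fixed` under normalized cross-correlation.
    import numpy as np

    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def refine_translation(fixed, moving, radius=3):
        """Return the integer (dz, dy, dx) shift that best aligns the volumes."""
        best, best_shift = -np.inf, (0, 0, 0)
        for dz in range(-radius, radius + 1):
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    shifted = np.roll(moving, (dz, dy, dx), axis=(0, 1, 2))
                    score = ncc(fixed, shifted)
                    if score > best:
                        best, best_shift = score, (dz, dy, dx)
        return best_shift, best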

Here is another way to evaluate the registration result.


Highlighted registration
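
One common way to produce such a check display is a checkerboard composite that interleaves tiles of the two registered images, so any residual misalignment shows up as broken edges at tile boundaries. The sketch below assumes this technique; the highlighted display above may be produced differently.

    # Checkerboard composite of two registered images: interleave square
    # tiles so misalignments appear as breaks at tile boundaries.
    import numpy as np

    def checkerboard(img_a, img_b, block=16):
        """Interleave `block`-sized tiles of two same-shaped 2-D images."""
        yy, xx = np.indices(img_a.shape)
        mask = ((yy // block + xx // block) % 2).astype(bool)
        return np.where(mask, img_a, img_b)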

Our program currently achieves this in a couple of minutes.

If the images have nonlinear geometric differences, such as brain images of different patients, a nonlinear transformation function is needed to register the images. Consider the two MR brain images shown below.


Images of two different brains before registration

If we manually select a number of corresponding points in the images and register the images with thin-plate splines, we obtain the following result.


Images registered using thin-plate splines.
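
For reference, a compact 2-D thin-plate spline fit is sketched below. The kernel U(r) = r^2 log r is the standard 2-D thin-plate basis (the 3-D case uses U(r) = r); this is the textbook formulation, not our production code.

    # Compact 2-D thin-plate spline mapping fitted to corresponding points.
    import numpy as np

    def tps_fit(src, dst):
        """src, dst: (N, 2) corresponding points.  Returns a mapping src -> dst."""
        n = len(src)
        d = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=-1)
        K = np.where(d > 0, d**2 * np.log(d + 1e-12), 0.0)   # U(r) = r^2 log r
        P = np.hstack([np.ones((n, 1)), src])
        A = np.zeros((n + 3, n + 3))
        A[:n, :n] = K
        A[:n, n:] = P
        A[n:, :n] = P.T
        b = np.zeros((n + 3, 2))
        b[:n] = dst
        coeffs = np.linalg.solve(A, b)       # spline weights plus affine part

        def transform(pts):
            d = np.linalg.norm(pts[:, None, :] - src[None, :, :], axis=-1)
            U = np.where(d > 0, d**2 * np.log(d + 1e-12), 0.0)
            Pq = np.hstack([np.ones((len(pts), 1)), pts])
            return U @ coeffs[:n] + Pq @ coeffs[n:]
        return transform

tps_fit(src, dst) returns a function that maps arbitrary point sets, so resampling the moving image at the transformed grid coordinates produces the registered image.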

The same registration is shown below with image edges overlaid.


Images registered using thin-plate splines, with image edges shown.

The images register better than with rigid alignment, but close inspection reveals some errors. We have developed an edge-based registration method that deforms one image locally so that corresponding edges in the two images align. Using edges obtained at a very coarse resolution, we obtain the following result.


Coarse registration by our method.

The same registration is shown below with image edges overlaid.


Coarse registration by our method, with image edges shown.
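
A hedged sketch of the coarse step follows: smoothing heavily before edge detection keeps only the large-scale edges, and nearest neighbors between the two edge sets give tentative correspondences. The detector, smoothing scale, and threshold here are illustrative assumptions, not our implementation.

    # Coarse-scale edge points and tentative edge correspondences.
    import numpy as np
    from scipy.ndimage import gaussian_filter, sobel
    from scipy.spatial import cKDTree

    def coarse_edge_points(image, sigma=6.0, keep_fraction=0.05):
        """Return (N, 2) coordinates of the strongest coarse-scale edge pixels."""
        smooth = gaussian_filter(image.astype(float), sigma)
        mag = np.hypot(sobel(smooth, axis=0), sobel(smooth, axis=1))
        thresh = np.quantile(mag, 1.0 - keep_fraction)
        return np.argwhere(mag >= thresh)

    def edge_correspondences(fixed, moving, sigma=6.0):
        fp = coarse_edge_points(fixed, sigma)
        mp = coarse_edge_points(moving, sigma)
        dist, idx = cKDTree(fp).query(mp)    # tentative matches, to be filtered
        return mp, fp[idx], dist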

Using finer edges in the registration process, we obtain the following result.


Fine registration by our method.

The same registration is shown below with image edges overlaid.


Fine registration by our method, with edges shown.
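
The coarse-to-fine schedule can be sketched as a loop that repeats the edge-matching step with progressively smaller smoothing, refitting the deformation each round. The sketch below reuses edge_correspondences and tps_fit from the earlier sketches, stands in for our locally-adaptive transformation with a plain thin-plate spline, and assumes an arbitrary sigma schedule.

    # Coarse-to-fine registration loop (reuses edge_correspondences and
    # tps_fit from the sketches above).
    import numpy as np
    from scipy.ndimage import map_coordinates

    def warp_image(image, transform):
        """Resample `image` under a coordinate mapping (output -> input)."""
        coords = np.indices(image.shape).reshape(2, -1).T.astype(float)
        src = transform(coords)              # where each output pixel samples
        return map_coordinates(image, src.T, order=1).reshape(image.shape)

    def coarse_to_fine_register(fixed, moving, sigmas=(8.0, 4.0, 2.0)):
        warped = moving
        for sigma in sigmas:
            mp, fp, _ = edge_correspondences(fixed, warped, sigma=sigma)
            # Map fixed-image coordinates to moving-image coordinates.
            transform = tps_fit(fp.astype(float), mp.astype(float))
            warped = warp_image(warped, transform)
        return warped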

Here is another set of examples showing registration of images taken before and after surgery.


Images before registration.


Images registered using our linear registration method.


Images aligned using our linear registration method with edges shown.


Images registered using coarse edges.


Coarse registration with image edges shown.


Registration using fine edges.


Fine registration with edges shown.

To summarize, the steps of our algorithm are: 1) align the images globally using thin-plate splines; 2) register the images using only coarse deformations, by matching coarse image edges; 3) accurately register the images by aligning high-resolution image edges. The key to our method is a transformation function that can use a very large number of correspondences (the image edges) to register the images accurately.
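
A dense thin-plate spline solve scales poorly when the correspondences number in the thousands, as they do when whole edge maps are matched. One locally-adaptive alternative, sketched below in the spirit of weighted-mean methods, averages the control-point displacements with Gaussian weights so each point is influenced mainly by nearby correspondences. This illustrates the idea only; it is not necessarily our exact transformation function.

    # Locally adaptive transformation from many correspondences: each
    # point moves by a Gaussian-weighted mean of the control-point
    # displacements, so no global dense system has to be solved.
    import numpy as np

    def weighted_mean_transform(src, dst, sigma=20.0):
        """src, dst: (N, D) corresponding points.  Returns a mapping src -> dst."""
        disp = dst - src                     # control-point displacements

        def transform(pts):
            d2 = ((pts[:, None, :] - src[None, :, :]) ** 2).sum(-1)
            w = np.exp(-d2 / (2.0 * sigma**2))
            w /= w.sum(axis=1, keepdims=True) + 1e-12
            return pts + w @ disp            # locally averaged displacement
        return transform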

This study is still in progress.



For more information contact A. Goshtasby (agoshtas@cs.wright.edu).

Last modified: 10/12/2001.