OpenAlex · Updated hourly · Last updated: 17 Mar 2026, 06:44

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Adaptive Stochastic Gradient Descent Optimisation for Image Registration

2008 · 392 citations · International Journal of Computer Vision · Open Access

Citations: 392 · Authors: 4 · Year: 2008

Abstract

We present a stochastic gradient descent optimisation method for image registration with adaptive step size prediction. The method is based on the theoretical work by Plakhov and Cruz (J. Math. Sci. 120(1):964–973, 2004). Our main methodological contribution is the derivation of an image-driven mechanism to select proper values for the most important free parameters of the method. The selection mechanism employs general characteristics of the cost functions that commonly occur in intensity-based image registration. Also, the theoretical convergence conditions of the optimisation method are taken into account. The proposed adaptive stochastic gradient descent (ASGD) method is compared to a standard, non-adaptive Robbins-Monro (RM) algorithm. Both ASGD and RM employ a stochastic subsampling technique to accelerate the optimisation process. Registration experiments were performed on 3D CT and MR data of the head, lungs, and prostate, using various similarity measures and transformation models. The results indicate that ASGD is robust to these variations in the registration framework and is less sensitive to the settings of the user-defined parameters than RM. The main disadvantage of RM is the need for a predetermined step size function. The ASGD method provides a solution for that issue.
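The abstract describes the core idea, adapting the step size of a subsampled gradient descent, without giving formulas. The sketch below illustrates that general mechanism on a toy least-squares problem: an internal "time" variable controls a decaying step size, and it advances based on the agreement between successive stochastic gradients. The sigmoid shape and all constants (`a`, `big_a`, `alpha`, `f_min`, `f_max`, `omega`), as well as the `stochastic_gradient` stand-in for a subsampled image-similarity gradient, are illustrative assumptions, not the paper's derived, image-driven parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem standing in for an intensity-based similarity cost:
# minimise 0.5 * ||A x - b||^2 over x, using subsampled (stochastic) gradients.
A = rng.standard_normal((1000, 5))
x_true = rng.standard_normal(5)
b = A @ x_true

def stochastic_gradient(x, batch=64):
    """Cost gradient estimated on a random subsample of rows,
    mimicking the voxel subsampling used to accelerate registration."""
    idx = rng.integers(0, A.shape[0], size=batch)
    As = A[idx]
    return As.T @ (As @ x - b[idx]) / batch

def asgd(x0, iters=500, a=1.0, big_a=20.0, alpha=0.602,
         f_min=-0.5, f_max=1.0, omega=1.0):
    """Adaptive step-size SGD sketch (illustrative constants).
    The step size is gamma_k = a / (t_k + big_a)**alpha, where the
    internal 'time' t_k advances quickly when successive stochastic
    gradients disagree (oscillation near a minimum -> shrink the step)
    and slowly, or even backwards, when they agree (keep taking
    large steps)."""
    x = np.asarray(x0, dtype=float).copy()
    t = 0.0
    g_prev = np.zeros_like(x)
    for _ in range(iters):
        g = stochastic_gradient(x)
        s = -float(g @ g_prev)  # > 0 when successive gradients disagree
        # Bounded sigmoid mapping s to a time increment in (f_min, f_max);
        # the clip only guards against floating-point overflow in exp.
        e = np.exp(np.clip(-s / omega, -500.0, 500.0))
        f = f_min + (f_max - f_min) / (1.0 - (f_max / f_min) * e)
        t = max(t + f, 0.0)
        gamma = a / (t + big_a) ** alpha
        x -= gamma * g
        g_prev = g
    return x
```

Running `asgd(np.zeros(5))` recovers `x_true` on this toy problem; the point of the design is that the step-size decay is driven by the observed gradients rather than by a predetermined schedule, which is the issue with plain Robbins-Monro that the paper addresses.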

Topics

Medical Image Segmentation Techniques · Stochastic Gradient Optimization Techniques · Advanced Neural Network Applications