3D Multi-Field Multi-Scale Features From Range Data In Spacecraft Proximity Operations
Date
2012-07-16
Abstract
A fundamental problem in spacecraft proximity operations is the determination of the six-degree-of-freedom relative navigation solution between the observer reference frame and a reference frame tied to a proximal body. In the most unconstrained case, the proximal body may be uncontrolled, and the observer spacecraft has no a priori information on the body. A spacecraft in this scenario must simultaneously map the generally poorly known body being observed and safely navigate relative to it. Simultaneous localization and mapping (SLAM) is a difficult problem that has been the focus of research in recent years. The most promising approaches extract local features in 2D or 3D measurements and track them in subsequent observations by matching a descriptor. These methods exist for both active sensors, such as Light Detection and Ranging (LIDAR) or laser RADAR (LADAR), and passive sensors, such as CCD and CMOS camera systems.
This dissertation presents a method for fusing time-of-flight (ToF) range data inherent to scanning LIDAR systems with the passive light-field measurements of optical systems, extracting features that exploit information from each sensor, and solving the unique SLAM problem inherent to spacecraft proximity operations. Scale-space analysis is extended to unstructured 3D point clouds by means of an approximation to the Laplace-Beltrami operator, which computes the scale space on a manifold embedded in 3D object space using Gaussian convolutions based on a geodesic distance weighting. The construction of the scale space is shown to be equivalent both to the application of the diffusion equation to the surface data and to the surface evolution process that results from mean curvature flow. Geometric features are localized in regions of high spatial curvature or large diffusion displacements at multiple scales.
The extracted interest points are associated with a local multi-field descriptor constructed from measured data in the object space. Defining features in object space instead of image space is shown to be an important step, making possible the simultaneous consideration of co-registered texture and the associated geometry. These descriptors, known as Multi-Field Diffusion Flow Signatures, encode the shape and multi-texture information of local neighborhoods in textured range data. Multi-Field Diffusion Flow Signatures display utility in difficult space scenarios, including high-contrast and saturating lighting conditions, bland and repeating textures, and non-Lambertian surfaces.
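The geodesic-distance-weighted Gaussian smoothing central to the scale-space construction above can be illustrated with a minimal sketch. Geodesic distances are approximated here by shortest paths on a k-nearest-neighbour graph, a common stand-in for true manifold distances; the function names (`knn_graph`, `geodesic_distances`, `diffuse`) and parameters (`k`, `sigma`) are illustrative assumptions, not the dissertation's implementation.

```python
# Sketch: one diffusion (smoothing) step on an unstructured 3D point
# cloud, weighting neighbours by a Gaussian of approximate geodesic
# distance rather than Euclidean distance, so smoothing follows the
# surface instead of cutting across it.
import heapq
import numpy as np

def knn_graph(points, k=6):
    """Adjacency list of the k-NN graph with Euclidean edge lengths."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    graph = []
    for i in range(len(points)):
        nbrs = np.argsort(d[i])[1:k + 1]  # skip the point itself
        graph.append([(int(j), d[i, j]) for j in nbrs])
    return graph

def geodesic_distances(graph, src):
    """Dijkstra shortest-path distances from src over the k-NN graph."""
    dist = [np.inf] * len(graph)
    dist[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        du, u = heapq.heappop(heap)
        if du > dist[u]:
            continue
        for v, w in graph[u]:
            if du + w < dist[v]:
                dist[v] = du + w
                heapq.heappush(heap, (dist[v], v))
    return np.array(dist)

def diffuse(points, sigma):
    """One Gaussian smoothing step with geodesic-distance weights,
    approximating heat diffusion on the sampled surface."""
    graph = knn_graph(points)
    out = np.empty_like(points)
    for i in range(len(points)):
        g = geodesic_distances(graph, i)
        w = np.exp(-g**2 / (2.0 * sigma**2))
        w[~np.isfinite(g)] = 0.0  # unreachable points get zero weight
        out[i] = (w[:, None] * points).sum(axis=0) / w.sum()
    return out
```

Repeating `diffuse` with increasing `sigma` yields a stack of progressively smoother surfaces, which is the scale-space stack in which curvature extrema and diffusion displacements would be detected.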
The effectiveness and utility of Multi-Field Multi-Scale (MFMS) Features described by Multi-Field Diffusion Flow Signatures are evaluated using real data from proximity-operation experiments performed at the Land, Air and Space Robotics (LASR) Laboratory at Texas A&M University.
Keywords
Scale Space, Image Processing, LIDAR, Range Sensing, Sensor Fusion, Feature Detection, Relative-Navigation, Vision