Monocular Vision-Based Guidance and Navigation

dc.contributor.advisor: Majji, Manoranjan
dc.contributor.committeeMember: Junkins, John
dc.contributor.committeeMember: Xiong, Zixiang
dc.creator: Verras, Andrew A
dc.date.accessioned: 2022-01-27T22:10:13Z
dc.date.available: 2023-08-01T06:41:29Z
dc.date.created: 2021-08
dc.date.issued: 2021-07-14
dc.date.submitted: August 2021
dc.date.updated: 2022-01-27T22:10:13Z
dc.description.abstract: This thesis develops mathematical methods for utilizing monocular camera measurements for the guidance and navigation of various aerospace vehicles. For each of the three projects discussed, a nonlinear estimation algorithm is developed and then tested in the field, in the laboratory, and/or in simulation. The first project is an Entry, Descent, and Landing (EDL) application in the presence of a priori unknown terrain. The proposed algorithm extracts features from the image and integrates the camera measurements with onboard inertial sensors to orient a landing vehicle with respect to the terrain. The filter is tested using model terrain in a laboratory setting as well as simulated terrain. The second project concerns automated aerial refueling and develops a filter that uses images of a known target, along with inertial sensors and GPS, to provide accurate estimates of the relative position and orientation of two airborne vehicles. The target vehicle is marked by LED beacons, which allow for fast image processing. Results are obtained from field testing using two automobiles and from laboratory testing using robotic platforms from the Land, Air, and Space Robotics (LASR) Laboratory at Texas A&M University. Finally, the third project addresses space debris removal. The target is a depleted rocket body in Earth orbit; it has a known shape and size but no onboard sensors or beacons. The computer vision software detects the image projection of the target's rocket nozzle and fits an ellipse to it. The parameters of this ellipse are used to estimate the target's position and orientation, which then guides an onboard capture system to secure the target. This process is tested using robots from the LASR lab. (An illustrative sketch of the ellipse-fitting measurement step appears after the metadata record below.)
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/1969.1/195240
dc.language.iso: en
dc.subject: Computer Vision
dc.subject: Guidance
dc.subject: Navigation
dc.subject: Kalman Filter
dc.subject: EDL
dc.subject: Proximity Operations
dc.subject: TRN
dc.title: Monocular Vision-Based Guidance and Navigation
dc.type: Thesis
dc.type.material: text
local.embargo.terms: 2023-08-01
local.etdauthor.orcid: 0000-0001-8152-9050
thesis.degree.department: Aerospace Engineering
thesis.degree.discipline: Aerospace Engineering
thesis.degree.grantor: Texas A&M University
thesis.degree.level: Masters
thesis.degree.name: Master of Science
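
The abstract's third project estimates relative pose from the elliptical image projection of the target's rocket nozzle. The Python sketch below shows, purely as an illustration, one way such an ellipse measurement could be extracted with standard OpenCV calls; the function name fit_nozzle_ellipse, the Otsu thresholding step, the largest-contour heuristic, and the placeholder file nozzle_frame.png are assumptions, not the thesis's actual software.

import cv2
import numpy as np

def fit_nozzle_ellipse(gray):
    """Return (cx, cy, ax1, ax2, angle_deg) for the dominant ellipse in a grayscale frame."""
    # Binarize the frame; the nozzle rim stands out against the dark interior.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Extract closed contours from the binary image.
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    # Keep contours with enough points for a conic fit and take the largest.
    candidates = [c for c in contours if len(c) >= 5]
    if not candidates:
        return None
    rim = max(candidates, key=cv2.contourArea)
    # cv2.fitEllipse returns the center, the two axis lengths, and the
    # orientation in image coordinates (pixels and degrees).
    (cx, cy), (ax1, ax2), angle_deg = cv2.fitEllipse(rim)
    return np.array([cx, cy, ax1, ax2, angle_deg])

if __name__ == "__main__":
    # Placeholder input path for a single camera frame.
    frame = cv2.imread("nozzle_frame.png", cv2.IMREAD_GRAYSCALE)
    if frame is not None:
        print(fit_nozzle_ellipse(frame))

In a pipeline of the kind the abstract describes, the five fitted ellipse parameters would serve as the camera measurement feeding a nonlinear filter that estimates the target's relative position and orientation.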

Files

Original bundle (3 files)
VERRAS-THESIS-2021.pdf (12.56 MB, Adobe Portable Document Format)
CopyrightandAvailabilityApprovalForm_1.pdf (1.12 MB, Adobe Portable Document Format)
CopyrightandAvailabilityApprovalForm.pdf (1.12 MB, Adobe Portable Document Format)

License bundle (1 file)
LICENSE.txt (6.31 KB, Plain Text)