
dc.contributor.advisor: Nowka, Kevin
dc.creator: Gadepally, Krishna Chaitanya
dc.date.accessioned: 2023-09-19T19:02:46Z
dc.date.available: 2023-09-19T19:02:46Z
dc.date.created: 2023-05
dc.date.issued: 2023-05-02
dc.date.submitted: May 2023
dc.identifier.uri: https://hdl.handle.net/1969.1/199115
dc.description.abstract: Computer vision and image processing algorithms perform well only under strong assumptions about their inputs, and unexpected results can propagate to downstream pipeline modules and reduce the effectiveness of the overall solution. To mitigate such effects, this work uses a predictor framework trained jointly with a hardness predictor network: performance on images predicted to have lower "hardness" levels is guaranteed to be better, which improves the efficiency of modules that rely on computer vision algorithms. The analysis also gives a clearer picture of why neural networks perform well on certain inputs; images with less complex patterns produced relatively stronger activations in the initial convolutional layers. This study further examined how well a convolutional neural network performed when irrelevant convolutional layers were eliminated for a given image during training, validation, and testing, which improved the efficiency of CNNs. In many image-based machine learning applications, predictive modeling of complex time-dependent processes is error-prone because early data are absent; predictive agriculture is a frequent example. Cotton crop parameters must be monitored from early growth until harvest, and because cotton yield is strongly correlated with the management of growth parameters during a cultivation season, researchers are interested in forecasting models that predict canopy and vegetative indices. This study applied a multi-layer stacked LSTM model to cotton plant canopy and vegetative attributes obtained from unmanned aerial vehicle survey images from the 2020 growing season. The weights of the final few layers of the trained LSTM model were then fine-tuned on canopy and vegetative index data from the 2021 cultivation year to predict those indices; deep transfer learning improved the accuracy of the UAS-derived canopy and vegetative index predictions. A minimal sketch of this fine-tuning step is given after the record below.
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.subject: Hardness Predictor
dc.subject: Transfer Learning
dc.title: Techniques and Methods for Improved Effectiveness and Accuracy in Computer Vision Applications
dc.type: Thesis
thesis.degree.department: Electrical and Computer Engineering
thesis.degree.discipline: Electrical Engineering
thesis.degree.grantor: Texas A&M University
thesis.degree.name: Master of Science
thesis.degree.level: Masters
dc.contributor.committeeMember: Song, Dezhen
dc.contributor.committeeMember: Duffield, Nick
dc.type.material: text
dc.date.updated: 2023-09-19T19:02:47Z
local.etdauthor.orcid: 0000-0002-5462-7488
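
The abstract describes a stacked LSTM trained on 2020-season UAS-derived canopy and vegetative index series, with only the final few layers fine-tuned on 2021-season data. The sketch below illustrates that transfer-learning step in Keras; the window length, feature count, layer sizes, optimizer settings, and the placeholder random arrays are all assumptions for illustration, not the data or configuration actually used in the thesis.

    # Hedged sketch only: shapes, layer sizes, and hyperparameters are illustrative
    # assumptions, and the random arrays stand in for UAS-derived index series.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    TIMESTEPS, N_FEATURES = 10, 4   # e.g., window of past surveys x (canopy cover, NDVI, ...)

    def build_stacked_lstm():
        # Multi-layer (stacked) LSTM that maps a window of past canopy/vegetative
        # observations to the indices at the next time step.
        model = models.Sequential([
            layers.Input(shape=(TIMESTEPS, N_FEATURES)),
            layers.LSTM(64, return_sequences=True),   # lower stacked LSTM layer
            layers.LSTM(32),                          # upper stacked LSTM layer
            layers.Dense(16, activation="relu"),
            layers.Dense(N_FEATURES),                 # predicted indices at t+1
        ])
        model.compile(optimizer="adam", loss="mse")
        return model

    # 1) Train on the 2020 growing-season series (placeholder arrays).
    x_2020 = np.random.rand(256, TIMESTEPS, N_FEATURES).astype("float32")
    y_2020 = np.random.rand(256, N_FEATURES).astype("float32")
    model = build_stacked_lstm()
    model.fit(x_2020, y_2020, epochs=5, batch_size=32, verbose=0)

    # 2) Transfer to 2021: freeze the earlier layers, fine-tune only the final few.
    for layer in model.layers[:-2]:
        layer.trainable = False                       # keep learned temporal features fixed
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="mse")

    x_2021 = np.random.rand(64, TIMESTEPS, N_FEATURES).astype("float32")
    y_2021 = np.random.rand(64, N_FEATURES).astype("float32")
    model.fit(x_2021, y_2021, epochs=5, batch_size=16, verbose=0)

    pred_2021 = model.predict(x_2021[:1])             # forecast next canopy/vegetative indices

Freezing all but the last layers and recompiling with a smaller learning rate is one common way to realize the "fine-tune the final few layers" step described in the abstract when the second season provides far less data than the first.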

