Using Neural Networks to Maximize a Submodular Function
Abstract
Optimization plays a central role in machine learning. The majority of machine learning problems can be reduced to optimization problems, and the ability to solve such classes of optimization problems becomes the goal of anyone who wishes to dive deep into machine learning.
In continuous spaces, both convex and concave functions can be optimized efficiently and effectively thanks to their convexity and concavity; optimization of discrete set functions, by contrast, is less well charted and worth exploring. The most appealing feature of a submodular set function is its natural diminishing returns property: adding an element to a smaller set yields at least as much gain as adding it to a larger superset. This property makes the class of submodular functions a good fit for many real-world machine learning optimization problems whose objectives share the same characteristic. Submodular maximization also extends widely to data mining problems closely related to submodularity, for example maximizing the spread of influence in social networks.
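The diminishing returns property described above can be sketched concretely. The following minimal Python example (not from the thesis; the set-cover objective and the specific ground sets are illustrative assumptions) uses the classic coverage function f(S) = |union of the sets indexed by S|, which is submodular, and checks that the marginal gain of an element shrinks as the base set grows:

```python
def coverage(S, sets):
    """f(S) = size of the union of the chosen sets (a classic submodular function)."""
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)

def marginal_gain(S, x, sets):
    """f(S ∪ {x}) - f(S): how much adding element x helps on top of S."""
    return coverage(S | {x}, sets) - coverage(S, sets)

# Hypothetical ground sets chosen purely for illustration.
sets = {
    0: {1, 2, 3},
    1: {3, 4},
    2: {4, 5, 6},
}

A = {0}        # smaller set
B = {0, 1}     # superset of A
x = 2          # candidate element to add

# Diminishing returns: the gain of x on the smaller set A is at least
# as large as its gain on the superset B.
gain_A = marginal_gain(A, x, sets)  # covers {4,5,6} anew -> gain 3
gain_B = marginal_gain(B, x, sets)  # 4 already covered via set 1 -> gain 2
assert gain_A >= gain_B
```

This inequality, f(A ∪ {x}) − f(A) ≥ f(B ∪ {x}) − f(B) for all A ⊆ B and x ∉ B, is exactly the diminishing returns characterization of submodularity that the objectives discussed in the thesis share.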
In this thesis, we introduce a neural network model designed specifically to maximize a submodular set function. This synthetic submodular set function represents a family of functions with certain properties, which will be discussed later in the thesis. The model has a fundamental structure that can be altered, or reused as a component of a new model, for other set function maximization problems, whether submodular or not necessarily submodular. Empirical results from testing support the reliability of this model.
Citation
Hu, Wenxiu (2020). Using Neural Networks to Maximize a Submodular Function. Master's thesis, Texas A&M University. Available electronically from https://hdl.handle.net/1969.1/191684.