Intelligent Optimization and Control for Adaptive Cellular Networks
Abstract
When streaming media over a cellular network with limited resources, simple scheduling algorithms perform poorly because they do not prioritize individual clients effectively. Intelligent network sharing is therefore needed to maximize the average quality of experience (QoE). We propose applying the general idea of RAN intelligent control (RIC) to scheduling at the radio access network (RAN), at both dense and sparse timescales, to enable such sharing. We formulate the problem of designing intelligent policies as a constrained Markov decision process. We observe that, given the scheduling decision on what resources to allocate to a client, the evolution of that client's state is independent of the other clients. Hence, we may treat the problem as a collection of single-client problems that share a joint resource constraint but are otherwise unrelated. We develop reinforcement-learning-based policies that determine the resource allocation to clients in two settings, sparse-reward sparse-control (SRSC) and dense-reward dense-control (DRDC), and show that significant performance improvements over vanilla and state-of-the-art policies are possible in both settings.
Citation
Cheng, Ching Wen (2021). Intelligent Optimization and Control for Adaptive Cellular Networks. Master's thesis, Texas A&M University. Available electronically from https://hdl.handle.net/1969.1/196346.