The economic average cost Brownian control problem

Academic Article

Abstract

  • In this paper we introduce and solve a generalization of the classic average cost Brownian control problem in which a system manager dynamically controls the drift rate of a diffusion process X. At each instant, the system manager chooses the drift rate from a pair {u, v} of available rates and can invoke instantaneous controls either to keep X from falling or to keep it from rising. The objective is to minimize the long-run average cost consisting of holding or delay costs, processing costs, costs for invoking instantaneous controls, and fixed costs for changing the drift rate. We provide necessary and sufficient conditions on the cost parameters to ensure the problem admits a finite optimal solution. When it does, a simple control band policy specifying economic buffer sizes (α, Ω) and up to two switching points is optimal. The controller should invoke instantaneous controls to keep X in the interval (α, Ω). A policy with no switching points relies on a single drift rate exclusively. When there is no cost to change the drift rate, a policy with a single switching point s indicates that the controller should change to the slower drift rate when X exceeds s and use the faster drift rate otherwise. When there is a cost to change the drift rate, a policy with two switching points s < S indicates that the controller should maintain the faster drift rate until X exceeds S and maintain the slower drift rate until X falls below s.
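
The following Python sketch is a minimal illustration of the policy structure described in the abstract; it is not taken from the paper. It discretizes the controlled diffusion with an Euler scheme, approximates the instantaneous controls by clipping X to a hypothetical buffer interval [α, Ω], and switches between two assumed drift rates with hysteresis at assumed switching points s < S. Which of u and v is the faster rate, and all numerical parameter values, are assumptions made purely for illustration.

```python
import numpy as np

def simulate_control_band_policy(
    u=1.0,         # assumed "faster" drift rate (X accumulates more quickly)
    v=-0.5,        # assumed "slower" drift rate
    sigma=1.0,     # diffusion coefficient (not specified in the abstract)
    alpha=0.0,     # lower buffer boundary (hypothetical value)
    omega=5.0,     # upper buffer boundary (hypothetical value)
    s=1.5,         # lower switching point (hypothetical value)
    S=3.5,         # upper switching point (hypothetical value)
    dt=1e-3,       # Euler step size
    horizon=10.0,  # simulated time horizon
    seed=0,
):
    """Euler-discretized sketch of the two-switching-point control band
    policy described in the abstract: instantaneous controls keep X in
    [alpha, omega], and the drift rate is switched with hysteresis at the
    points s < S (faster rate kept until X exceeds S, slower rate kept
    until X falls below s)."""
    rng = np.random.default_rng(seed)
    n_steps = int(horizon / dt)
    x = 0.5 * (alpha + omega)   # arbitrary initial state
    drift = u                   # start on the faster rate
    switches = 0
    path = [x]
    for _ in range(n_steps):
        # hysteresis switching between the two drift rates
        if drift == u and x > S:
            drift, switches = v, switches + 1
        elif drift == v and x < s:
            drift, switches = u, switches + 1
        # Euler step for the controlled diffusion
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        # instantaneous controls approximated by clipping to [alpha, omega]
        x = min(max(x, alpha), omega)
        path.append(x)
    return np.array(path), switches

path, n_switches = simulate_control_band_policy()
print(f"final state {path[-1]:.3f}, drift switches: {n_switches}")
```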
  • Authors

  • Ormeci Matoglu, Melda
  • Vande Vate, John H.
  • Yu, Haiyue
  • Publication Date

  • March 2019
  • Start Page

  • 300
  • End Page

  • 337
  • Volume

  • 51
  • Issue

  • 1