Markov Game/Stochastic Game

Introduced in (, ), the following definition follows Chapter 2 of (, a).

A Markov game is a tuple \((N, S, \mathbf{A}, \mathbf{R}, T)\) where:

- \(N\) is the number of agents;
- \(S\) is the set of states;
- \(\mathbf{A} = A_1 \times \dots \times A_N\) is the joint action space, where \(A_i\) is the action set of agent \(i\);
- \(\mathbf{R} = (R_1, \dots, R_N)\), where \(R_i : S \times \mathbf{A} \to \mathbb{R}\) is the reward function of agent \(i\);
- \(T : S \times \mathbf{A} \to \Delta(S)\) is the transition function, mapping a state and a joint action to a probability distribution over next states.

Notice that a Markov Decision Process (MDP) is a Markov game with a single agent (\(N = 1\)); a concrete sketch of the tuple follows below.
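
To make the tuple concrete, here is a minimal Python sketch of a two-agent Markov game. The =MarkovGame= class, the two dummy states, and the matching-actions payoff are illustrative assumptions for this note, not anything taken from the references above.

#+begin_src python
# Minimal sketch of the tuple (N, S, A, R, T): names, states, and payoffs
# below are illustrative assumptions, not taken from any reference above.
import random
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

State = str
JointAction = Tuple[int, ...]   # one action index per agent


@dataclass
class MarkovGame:
    n_agents: int                                                    # N
    states: List[State]                                              # S
    actions: List[List[int]]                                         # A_i for each agent i
    rewards: Callable[[State, JointAction], Tuple[float, ...]]       # R: S x A -> R^N
    transition: Callable[[State, JointAction], Dict[State, float]]   # T: S x A -> Delta(S)

    def step(self, s: State, a: JointAction) -> Tuple[Tuple[float, ...], State]:
        """Return the reward vector and a next state sampled from T(s, a)."""
        r = self.rewards(s, a)
        dist = self.transition(s, a)
        s_next = random.choices(list(dist), weights=list(dist.values()))[0]
        return r, s_next


# Illustrative two-agent, two-state, zero-sum game: agent 0 gains when the
# joint action matches, and matching actions drift the system toward "s1".
def rewards(s: State, a: JointAction) -> Tuple[float, ...]:
    win = 1.0 if a[0] == a[1] else -1.0
    return (win, -win)          # zero-sum: agent 1's reward is the negation


def transition(s: State, a: JointAction) -> Dict[State, float]:
    return {"s1": 0.9, "s0": 0.1} if a[0] == a[1] else {"s0": 0.9, "s1": 0.1}


game = MarkovGame(
    n_agents=2,
    states=["s0", "s1"],
    actions=[[0, 1], [0, 1]],
    rewards=rewards,
    transition=transition,
)

print(game.step("s0", (0, 1)))  # e.g. ((-1.0, 1.0), 's0')
#+end_src

Setting =n_agents= to 1 (with a single action set and a single reward function) collapses this sketch to an ordinary MDP, matching the remark above.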
