Markov Game/Stochastic Game

Introduced in (??, ????); the following definition is taken from Chapter 2 of (??, a).

A Markov game is a tuple \((N, S, \mathbf{A}, \mathbf{R}, T)\) where:

- \(N\) is the number of agents;
- \(S\) is the set of states;
- \(\mathbf{A} = A_1 \times \dots \times A_N\) is the joint action space, where \(A_i\) is the action space of agent \(i\);
- \(\mathbf{R} = (R_1, \dots, R_N)\), where \(R_i : S \times \mathbf{A} \to \mathbb{R}\) is the reward function of agent \(i\);
- \(T : S \times \mathbf{A} \to \Delta(S)\) is the transition function, mapping a state and a joint action to a probability distribution over next states.

Notice that a Markov Decision Process (MDP) is a Markov game with a single agent (\(N = 1\)).

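As an illustration, here is a minimal sketch of this structure as code. It assumes a finite Markov game with sampled transitions; the names (MarkovGame, step, and so on) are illustrative and not taken from any particular library.

#+begin_src python
# Minimal sketch of a finite Markov game (N, S, A, R, T).
# All names here are illustrative, not from any particular library.
from dataclasses import dataclass
from typing import Callable, Dict, Sequence, Tuple
import random

State = int
JointAction = Tuple[int, ...]  # one action index per agent


@dataclass
class MarkovGame:
    n_agents: int                                                    # N
    states: Sequence[State]                                          # S
    actions: Sequence[Sequence[int]]                                 # A_1, ..., A_N
    rewards: Callable[[State, JointAction], Tuple[float, ...]]       # (R_1, ..., R_N)
    transition: Callable[[State, JointAction], Dict[State, float]]   # T(s' | s, a)

    def step(self, s: State, a: JointAction) -> Tuple[State, Tuple[float, ...]]:
        """Sample a next state from T(. | s, a) and return it with the reward vector."""
        dist = self.transition(s, a)
        next_s = random.choices(list(dist.keys()), weights=list(dist.values()))[0]
        return next_s, self.rewards(s, a)
#+end_src

With n_agents set to 1 (a single action set and a single reward function), this reduces to an ordinary MDP, matching the remark above.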