rllib.env

Full path: schola.rllib.env

RLlib Environment Implementations for Schola/Unreal Engine.

This module provides two environment classes for interfacing Unreal Engine with RLlib:

  1. RayEnv: Single-environment implementation (inherits from BaseRayEnv, MultiAgentEnv)
  • Automatically selected when num_envs == 1
  • Returns MultiAgentDict format
  • Compatible with gymnasium wrappers
  • Validates that only one environment is created
  2. RayVecEnv: Vectorized multi-environment implementation (inherits from BaseRayEnv, VectorMultiAgentEnv)
  • Automatically selected when num_envs > 1
  • Returns List[MultiAgentDict] format
  • NOT compatible with gymnasium wrappers
  • Supports multiple parallel environments

Both classes inherit from BaseRayEnv, which provides shared functionality for protocol/simulator management, space initialization, and common properties.

Use the make_ray_env() factory function to automatically select the appropriate class based on the number of environments reported by the protocol.
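The selection rule can be sketched as follows. This is a minimal illustration of the rule stated above, not Schola's actual implementation: the stub classes, the num_envs argument, and the constructor signatures here are simplified assumptions (the real classes take a protocol and simulator).

```python
# Sketch of the make_ray_env() selection rule described above.
# Stub classes stand in for schola.rllib.env's RayEnv / RayVecEnv;
# the num_envs constructor argument is a simplified assumption.

class RayEnv:
    """Stand-in for the single-environment class (num_envs == 1)."""
    def __init__(self, num_envs):
        # Mirrors the documented validation that only one environment exists.
        if num_envs != 1:
            raise ValueError("RayEnv requires exactly one environment")
        self.num_envs = num_envs

class RayVecEnv:
    """Stand-in for the vectorized class (num_envs > 1)."""
    def __init__(self, num_envs):
        self.num_envs = num_envs

def make_ray_env(num_envs):
    # Pick the single-env class for one environment, the vectorized
    # class otherwise, following the rule stated in the docs above.
    cls = RayEnv if num_envs == 1 else RayVecEnv
    return cls(num_envs)

single = make_ray_env(1)
vectorized = make_ray_env(4)
print(type(single).__name__)      # RayEnv
print(type(vectorized).__name__)  # RayVecEnv
```

Keeping the choice inside a factory means training scripts never branch on num_envs themselves, and the incompatibility of RayVecEnv with gymnasium wrappers stays contained to the vectorized path.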

Functions

Item                                          Description
sorted_multi_agent_space(multi_agent_space)   Sorts the spaces in a multi-agent space alphabetically by agent ID.
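What alphabetical sorting by agent ID looks like can be shown with a small sketch. A plain dict stands in for a gymnasium.spaces.Dict here; the real sorted_multi_agent_space operates on actual space objects, so this only illustrates the key ordering.

```python
# Illustration of sorting a multi-agent space mapping by agent ID.
# A plain dict stands in for gymnasium.spaces.Dict; this sketch only
# demonstrates the alphabetical ordering, not the real space handling.

def sorted_multi_agent_space(multi_agent_space):
    # Rebuild the mapping with agent IDs in alphabetical order.
    # Python dicts preserve insertion order (3.7+), so iteration over
    # the result visits agents alphabetically.
    return {agent_id: multi_agent_space[agent_id]
            for agent_id in sorted(multi_agent_space)}

spaces = {"agent_2": "Box(3,)", "agent_0": "Discrete(4)", "agent_1": "Box(2,)"}
print(list(sorted_multi_agent_space(spaces)))  # ['agent_0', 'agent_1', 'agent_2']
```

A deterministic agent ordering matters when per-agent data is flattened into arrays, since both sides of the protocol must agree on which slot belongs to which agent.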

Classes

Item                                          Description
BaseRayEnv(protocol, simulator[, verbosity])  Abstract base class for Schola RLlib environments.
RayEnv(protocol, simulator[, verbosity])      Schola's single-environment implementation of MultiAgentEnv for Unreal Engine.
RayVecEnv(*args, **kwargs)                    Schola's vectorized implementation of VectorMultiAgentEnv for Unreal Engine.