Logging Experiments¶
As there are many potentially interesting metrics involved in the analysis of DAC methods, DACBench includes functionality to track and store them.
To log information on an environment, you need a logger object:
from dacbench.logger import Logger
from pathlib import Path
logger = Logger(experiment_name="example", output_path=Path("your/path"))
If you want to use any of our tracking wrappers, you can then create a logging module for them:
from dacbench.wrappers import PerformanceTrackingWrapper
performance_logger = logger.add_module(PerformanceTrackingWrapper)
env = PerformanceTrackingWrapper(env, logger=performance_logger)
logger.set_env(env)
Now the logger will store information in the specified directory in .jsonl files. By adding more wrappers, you will also be provided with more information. The stored data can then be loaded into pandas dataframes:
from dacbench.logger import load_logs, log2dataframe
logs = load_logs("your/path/PerformanceTrackingWrapper.jsonl")
df = log2dataframe(logs)
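Each line of a .jsonl log file is one JSON object describing a step. As a minimal sketch of what such a record looks like and how to parse one manually with the standard library (the record content here is illustrative, not actual DACBench output):

```python
import json

# One illustrative line of a .jsonl log: per-step metadata plus a
# logged metric holding a list of values
line = '{"instance": 0, "episode": 0, "step": 1, "performance": {"values": [12.5]}}'

record = json.loads(line)
print(record["episode"], record["performance"]["values"])
```

`load_logs` and `log2dataframe` wrap exactly this kind of line-by-line parsing and reshape the result for analysis.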
Logger helper.
- class dacbench.logger.AbstractLogger(experiment_name: str, output_path: Path, step_write_frequency: int | None = None, episode_write_frequency: int = 1)[source]¶
Bases:
object
Logger interface. The logger classes provide a way of writing structured logs as jsonl files and also help to track information like current episode, step, time … In the jsonl log file each row corresponds to a step.
- property additional_info¶
Log additional info.
- abstract close() None [source]¶
Makes sure that all remaining buffered entries are written to file and the file is closed.
- is_of_valid_type(value: Any) bool [source]¶
Checks if the value is of any of the logger’s valid types.
- Parameters:
value – value to check
- Returns:
bool
- abstract log(key: str, value) None [source]¶
Writes the value to the list of values and saves the current time for the key.
- Parameters:
key (str) – key to log
value – the value must be of a type that is json serializable. Currently only {str, int, float, bool, np.number} and recursive types of those are supported.
- abstract log_dict(data)[source]¶
Alternative to log if more than one value should be logged at once.
- Parameters:
data (dict) – a dict of key-value pairs such that each value is a valid value for log
- abstract log_space(key: str, value: ndarray | dict, space_info=None)[source]¶
Special for logging gym.spaces.
Currently three types are supported:
- Numbers: e.g. samples from Discrete
- Fixed-length arrays like MultiDiscrete or Box
- Dict: assuming each key has a fixed-length array
- Parameters:
key – see log
value – see log
space_info – a list of column names. The length of this list must equal the resulting number of columns.
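To illustrate the contract between a fixed-length array value and `space_info`, here is a hypothetical sketch of how column names and array entries line up (the names `lr`, `momentum`, `decay` are invented for illustration; this is not the library implementation):

```python
# Hypothetical sketch: a fixed-length array sample (e.g. from a Box space)
# paired with space_info column names. The number of names must equal
# the resulting number of columns.
value = [0.1, 0.7, 0.2]                   # e.g. a Box sample
space_info = ["lr", "momentum", "decay"]  # one column name per entry

assert len(space_info) == len(value)
columns = dict(zip(space_info, value))
print(columns)
```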
- abstract next_step() None [source]¶
Call at the end of the step. Updates the internal state and dumps the information of the last step into a json.
- set_env(env: AbstractEnv) None [source]¶
Needed to infer automatically logged information like the instance id.
- Parameters:
env (AbstractEnv) – env to log
- class dacbench.logger.Logger(experiment_name: str, output_path: Path, step_write_frequency: int | None = None, episode_write_frequency: int = 1)[source]¶
Bases:
AbstractLogger
A logger that manages the creation of the module loggers. To get a ModuleLogger for your module (e.g. a wrapper), call module_logger = Logger(…).add_module("my_wrapper"). From then on, module_logger.log(…) or logger.log(…, module="my_wrapper") can be used to log. The logger module takes care of updating information like episode and step in the subloggers. To indicate the end of the episode or the next step to the loggers, simply call logger.next_episode() or logger.next_step().
- add_agent(agent: AbstractDACBenchAgent)[source]¶
Writes information about the agent.
- Parameters:
agent (AbstractDACBenchAgent) – the agent object to add
- add_benchmark(benchmark: AbstractBenchmark) None [source]¶
Add benchmark to logger.
- Parameters:
benchmark (AbstractBenchmark) – the benchmark object to add
- add_module(module: str | type) ModuleLogger [source]¶
Creates a sub-logger. For more details see class level documentation.
- Parameters:
module (str or type) – The module name or Wrapper-Type to create a sub-logger for
- Returns:
ModuleLogger
- close()[source]¶
Makes sure that all remaining entries (from all subloggers) are written to files and the files are closed.
- log(key, value, module)[source]¶
Log a key-value pair to module.
- Parameters:
key (str | int) – key to log
value – value to log
module – module to log to
- log_dict(data, module)[source]¶
Log a data dict to module.
- Parameters:
data (dict) – data to log
module – module to log to
- log_space(key, value, module, space_info=None)[source]¶
Log a key-value pair to module with optional info.
- Parameters:
key (str | int) – key to log
value – value to log
module – module to log to
space_info – additional log info
- next_step()[source]¶
Call at the end of the step. Updates the internal state of all subloggers and dumps the information of the last step into a json.
- set_env(env: AbstractEnv) None [source]¶
Writes information about the environment.
- Parameters:
env (AbstractEnv) – the env object to track
- class dacbench.logger.ModuleLogger(output_path: Path, experiment_name: str, module: str, step_write_frequency: int | None = None, episode_write_frequency: int = 1)[source]¶
Bases:
AbstractLogger
A logger for handling logging of one module. e.g. a wrapper or toplevel general logging.
Don’t create manually; use Logger to manage ModuleLoggers.
- __del__()[source]¶
Makes sure that all remaining buffered entries are written to file and the file is closed.
- close()[source]¶
Makes sure that all remaining buffered entries are written to file and the file is closed.
- get_logfile() Path [source]¶
Get logfile name.
- Returns:
pathlib.Path – the path to the log file of this logger
- log(key: str, value: dict | list | tuple | str | int | float | bool) None [source]¶
Writes the value to the list of values and saves the current time for the key.
- Parameters:
key (str) – key to log
value – the value must be of a type that is json serializable. Currently only {str, int, float, bool, np.number} and recursive types of those are supported.
- log_dict(data: dict) None [source]¶
Alternative to log if more than one value should be logged at once.
- Parameters:
data (dict) – a dict of key-value pairs such that each value is a valid value for log
- log_space(key, value, space_info=None)[source]¶
Special for logging gym.spaces.
Currently three types are supported:
- Numbers: e.g. samples from Discrete
- Fixed-length arrays like MultiDiscrete or Box
- Dict: assuming each key has a fixed-length array
- Parameters:
key – see log
value – see log
space_info – a list of column names. The length of this list must equal the resulting number of columns.
- next_episode()[source]¶
Writes buffered logs to file. Invoke manually if you want to load logs during a run.
- next_step()[source]¶
Call at the end of the step. Updates the internal state and dumps the information of the last step into a json.
- reset_episode() None [source]¶
Resets the episode and step. Be aware that this can lead to ambiguous keys if no instance, seed, or other identifying additional info is set.
- dacbench.logger.flatten_log_entry(log_entry: dict) list[dict] [source]¶
Transforms a log entry from:

{'step': 0, 'episode': 2, 'some_value': {'values': [34, 45]}}

to:

[{'step': 0, 'episode': 2, 'value': 34}, {'step': 0, 'episode': 2, 'value': 45}]
- Parameters:
log_entry (Dict) – A log entry
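The transformation above can be sketched in plain Python (a simplified re-implementation for illustration, not the library code):

```python
def flatten_entry(log_entry: dict) -> list[dict]:
    # Keep the scalar metadata (step, episode, ...) on every row
    meta = {k: v for k, v in log_entry.items() if not isinstance(v, dict)}
    rows = []
    # Each nested value dict contributes one row per logged value
    for value_dict in (v for v in log_entry.values() if isinstance(v, dict)):
        for value in value_dict["values"]:
            rows.append({**meta, "value": value})
    return rows

entry = {"step": 0, "episode": 2, "some_value": {"values": [34, 45]}}
print(flatten_entry(entry))
```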
- dacbench.logger.list_to_tuple(list_: list) tuple [source]¶
Recursively transforms a list of lists into tuples of tuples.
- Parameters:
list_ – (nested) list
- Returns:
tuple – the (nested) list as nested tuples
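The recursion is straightforward; a minimal sketch of the same idea (illustrative, not the library code):

```python
def to_tuple(list_):
    # Recursively convert nested lists into nested tuples;
    # non-list elements are passed through unchanged
    return tuple(to_tuple(item) if isinstance(item, list) else item
                 for item in list_)

print(to_tuple([1, [2, [3, 4]], 5]))
```

This is useful e.g. to make logged list values hashable.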
- dacbench.logger.load_logs(log_file: Path) list[dict] [source]¶
Loads the logs from a jsonl file written by any logger. The result is the list of dicts in the format:

{'instance': 0, 'episode': 0, 'step': 1, 'example_log_val': {'values': [val1, val2, … valn], 'episode': [ep1, ep2, …, epn], 'step': [step1, step2, …, stepn]}}
- Parameters:
log_file (pathlib.Path) – The path to the log file
- Returns:
list[dict] – the parsed log entries
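Conceptually this is line-by-line JSON parsing; a minimal sketch of the same behavior with the standard library (a simplified stand-in, not the library code):

```python
import json
import tempfile
from pathlib import Path

def read_jsonl(log_file: Path) -> list[dict]:
    # One JSON object per line -> list of dicts
    with open(log_file) as f:
        return [json.loads(line) for line in f]

# Write and read back a tiny example file
with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "example.jsonl"
    path.write_text('{"step": 0}\n{"step": 1}\n')
    logs = read_jsonl(path)
print(logs)
```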
- dacbench.logger.log2dataframe(logs: list[dict], drop_columns: list[str] | None = None) DataFrame [source]¶
Converts a list of log entries to a pandas dataframe. Usually used in combination with load_logs.
- Parameters:
logs (List) – List of log entries
wide (bool) – wide=False (default) produces a long dataframe with columns (episode, step, time, name, value); wide=True returns a wide dataframe with columns (episode, step, time, name_1, name_2, …), where name_n is NaN if the variable name_n was not logged at that (episode, step, time)
drop_columns (List[str]) – list of column names to drop (before reshaping the long dataframe); mostly used in combination with wide=True to reduce NaN values
- Returns:
dataframe
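The long-format reshaping can be sketched without pandas: each nested value dict becomes one (name, value) row per logged value, alongside the entry's scalar metadata. A simplified, illustrative stand-in (not the library code):

```python
def to_long_rows(logs: list[dict]) -> list[dict]:
    rows = []
    for entry in logs:
        # Scalar metadata (episode, step, ...) is repeated on every row
        meta = {k: v for k, v in entry.items() if not isinstance(v, dict)}
        for name, value_dict in ((k, v) for k, v in entry.items()
                                 if isinstance(v, dict)):
            for value in value_dict["values"]:
                rows.append({**meta, "name": name, "value": value})
    return rows

logs = [{"episode": 0, "step": 1, "reward": {"values": [0.5, 0.6]}}]
print(to_long_rows(logs))
```

The actual function builds a pandas DataFrame from rows of this shape.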
- dacbench.logger.split(predicate: Callable, iterable: Iterable) tuple[list, list] [source]¶
Splits the iterable into two lists depending on the result of the predicate.
- Parameters:
predicate (Callable) – a function taking an element of the iterable and returning True or False
iterable (Iterable) – the iterable to split
- Returns:
(positives, negatives) – two lists
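The behavior can be sketched in a few lines (an illustrative re-implementation, not the library code):

```python
from typing import Callable, Iterable

def split_by(predicate: Callable, iterable: Iterable) -> tuple[list, list]:
    # Route each element into positives or negatives based on the predicate
    positives, negatives = [], []
    for item in iterable:
        (positives if predicate(item) else negatives).append(item)
    return positives, negatives

evens, odds = split_by(lambda x: x % 2 == 0, range(6))
print(evens, odds)
```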