3. Quickstart

pconfigs is a generic library for configuring complex systems. Here we’ll create a hypothetical trainer for a machine learning model.

3.1. Set up AI agent rules.

Install pconfigs rules for your AI coding agent:

Then ask the agent to “Create a new runnable pconfigs example module and config file.”

3.2. Configure a class.

Create source/modules/trainer.py within your project.

from __future__ import annotations            # Required for forward references in type hints.
from pconfigs import pconfig, pconfiged, pdefaults

# Trainer class
@pconfiged(runnable=True)                     # runnable=True facilitates a main().
class Trainer:
    config: TrainerConfig                     # A class' config is always called 'config'.

    def __init__(self):                       # Constructor takes no arguments.
        pass

    def main(self, *args, **kwargs):                
        print(self.config.message)            # Config is automatically available.
        print(f"Lr: {self.config.lr}")


# Trainer config class.
@pconfig(constructs=Trainer)                  # We can use this config to construct Trainer.
class TrainerConfig:
    message: str                              # All config parameters are typehinted.
    lr: float


# Config defaults.
pdefaults += TrainerConfig(                   # Set defaults separately from type 
    message="Hello, World!",                  #  ..definitions (avoids clutter).
    lr=1e-4,
)
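If you want intuition for the pattern before installing anything, the same shape can be approximated with a plain dataclass. This is only an illustrative sketch, not the pconfigs API: defaults live inline rather than in pdefaults, and the config is passed to the constructor rather than injected automatically.

```python
from dataclasses import dataclass


@dataclass
class TrainerConfig:
    message: str = "Hello, World!"    # Defaults inline here, unlike pdefaults.
    lr: float = 1e-4


class Trainer:
    def __init__(self, config: TrainerConfig):
        self.config = config          # pconfigs wires this up automatically instead.

    def main(self):
        print(self.config.message)
        print(f"Lr: {self.config.lr}")


Trainer(TrainerConfig(lr=1e-3)).main()
# Hello, World!
# Lr: 0.001
```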

3.3. Create a config file.

Make source/pconfig/experiment_1.py:

from source.modules.trainer import TrainerConfig

config = TrainerConfig(                       # Set just the learning rate. Use defaults
    lr=1e-3,                                  #  ..for the other config parameters.
)                                   

3.4. Run the configured system.

$ python -m  pconfigs.run  source.pconfig.experiment_1.config
             ------------  --------------------------- ------
                runner               dotpath           attribute

This command constructs Trainer with the config in experiment_1.py and runs Trainer.main(). Specifically, it

  1. reads the TrainerConfig instance in experiment_1.py,

  2. looks up the constructable type associated with TrainerConfig (it’s Trainer),

  3. checks that Trainer is runnable (it is, because runnable=True),

  4. constructs Trainer with experiment_1.config, and

  5. calls main().

The output is:

Hello, World!
Lr: 0.001
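Under the hood, steps 1 and 2 amount to importing the dotpath as a module and reading the named attribute from it. A minimal sketch of that resolution using only the standard library (the function name `resolve` is hypothetical, not part of pconfigs):

```python
import importlib


def resolve(target: str):
    """Resolve 'pkg.module.attr' by importing the module and reading
    the attribute -- mirroring the dotpath/attribute split above."""
    dotpath, attr = target.rsplit(".", 1)      # e.g. "source.pconfig.experiment_1", "config"
    module = importlib.import_module(dotpath)
    return getattr(module, attr)


# Demonstrated on a standard-library target:
print(resolve("math.pi"))   # -> 3.141592653589793
```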

This works equally well with machine learning launchers, for example distributed training with torchrun:

$ torchrun --nproc_per_node=1 -m pconfigs.run source.pconfig.experiment_1.config

Runnable configs encapsulate all system parameters in a single executable command that itself takes no arguments. This encapsulation facilitates reproducible experiments.

3.5. Create a revised config file.

A second feature of pconfigs is that you can automatically copy parameters between configs.

Make source/pconfig/experiment_2.py:

from source.modules.trainer import TrainerConfig
from source.pconfig.experiment_1 import config as base 

config = TrainerConfig(
    base,                                    # Copy all parameters from the base experiment.
    message="Experiment 2.",                 # ..except change the message parameter.
)

This automatic copying capability simplifies creating follow-up experiments. (See also Construction.)
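Passing a base config positionally to copy its parameters is analogous to `dataclasses.replace` from the standard library. A sketch of the same copy-then-override idea with plain dataclasses (again, not the pconfigs API):

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class TrainerConfig:
    message: str
    lr: float


base = TrainerConfig(message="Hello, World!", lr=1e-3)
exp2 = replace(base, message="Experiment 2.")  # Copy every field, override one.

print(exp2.message, exp2.lr)   # Experiment 2. 0.001
```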

$ python -m  pconfigs.run  source.pconfig.experiment_2.config

prints

Experiment 2.
Lr: 0.001

3.6. But wait, there’s more!

This quickstart covers the essentials: defining, running, copying, and printing configs. For more details:

  1. Constructing configs (see Construction)

  2. Running configs (see Runnables)

  3. Printing configs (see Printing)

Several other important pconfigs features are not covered here:

  1. Computed properties with @pproperty (see Properties)

  2. Environment variables with @penv (see Environments)

  3. Testing with pconfigs.test (see Testing)

  4. External library configuration with @pconfiged(mock=True) (see External Libraries)

  5. Interpretable enum representations with @penum (see Enums)