
torch._logging
==============

PyTorch has a configurable logging system, where different components can be given different log level settings. For instance, one component's log messages can be completely disabled, while another component's log messages can be set to maximum verbosity.

.. warning:: This feature is a prototype and may have compatibility-breaking changes in the future.

.. warning:: This feature has not yet been expanded to control the log messages of all components in PyTorch.

There are two ways to configure the logging system: through the environment variable ``TORCH_LOGS`` or the Python API ``torch._logging.set_logs``.

.. automodule:: torch._logging
.. currentmodule:: torch._logging

.. autosummary::
    :toctree: generated
    :nosignatures:

    set_logs
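As a usage sketch of the Python API (assuming a torch build that exposes ``torch._logging.set_logs``; the import guard lets the snippet degrade gracefully when torch is unavailable):

```python
import logging

# Sketch: components are passed as keyword arguments holding logging levels,
# artifacts as booleans. Guarded so the snippet also runs without torch.
try:
    from torch._logging import set_logs

    # Roughly equivalent to TORCH_LOGS="+dynamo,aot_graphs":
    set_logs(dynamo=logging.DEBUG, aot_graphs=True)
    applied = True
except ImportError:
    applied = False

print("set_logs applied:", applied)
```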

The environment variable ``TORCH_LOGS`` is a comma-separated list of ``[+-]<component>`` pairs, where ``<component>`` is one of the components listed below. The ``+`` prefix decreases the log level of the component, displaying more log messages, while the ``-`` prefix increases the log level and displays fewer. If a component is not specified in ``TORCH_LOGS``, its default setting applies.

In addition to components, there are artifacts. An artifact is a specific piece of debug information associated with a component; it is either displayed or not displayed, so prefixing an artifact with ``+`` or ``-`` is a no-op. Since an artifact is associated with a component, enabling that component will typically also enable the artifact, unless the artifact is registered as ``off_by_default``. This option is specified in ``_registrations.py`` for artifacts so spammy that they should only be displayed when explicitly enabled.

The following components and artifacts are configurable through the ``TORCH_LOGS`` environment variable (see ``torch._logging.set_logs`` for the Python API):
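To illustrate how the prefixes map onto standard ``logging`` levels, here is a minimal, hypothetical sketch (not PyTorch's actual parser) that translates a ``TORCH_LOGS``-style string into per-component levels:

```python
import logging

def parse_torch_logs(value):
    """Translate a TORCH_LOGS-style string into per-component log levels.

    Illustrative sketch only: "+name" lowers the level to DEBUG (more
    output), "-name" raises it to ERROR (less output), and a bare name
    sets INFO. Unlisted components keep their default (logging.WARN).
    """
    levels = {}
    for entry in value.split(","):
        entry = entry.strip()
        if not entry:
            continue
        if entry.startswith("+"):
            levels[entry[1:]] = logging.DEBUG
        elif entry.startswith("-"):
            levels[entry[1:]] = logging.ERROR
        else:
            levels[entry] = logging.INFO
    return levels

print(parse_torch_logs("+dynamo,aot"))
```

Note how this reproduces the first example below: ``+dynamo`` yields ``logging.DEBUG`` and a bare ``aot`` yields ``logging.INFO``.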

Components:

``all``
    A special component that configures the default log level of all components. Default: ``logging.WARN``

``dynamo``
    The log level for the TorchDynamo component. Default: ``logging.WARN``

``aot``
    The log level for the AOTAutograd component. Default: ``logging.WARN``

``inductor``
    The log level for the TorchInductor component. Default: ``logging.WARN``

``your.custom.module``
    The log level for an arbitrary unregistered module. Provide the fully qualified name and the module will be enabled. Default: ``logging.WARN``

Artifacts:

``bytecode``
    Whether to emit the original and generated bytecode from TorchDynamo. Default: ``False``

``aot_graphs``
    Whether to emit the graphs generated by AOTAutograd. Default: ``False``

``aot_joint_graph``
    Whether to emit the joint forward-backward graph generated by AOTAutograd. Default: ``False``

``ddp_graphs``
    Whether to emit the graphs generated by DDPOptimizer. Default: ``False``

``graph``
    Whether to emit the graph captured by TorchDynamo in tabular format. Default: ``False``

``graph_code``
    Whether to emit the Python source of the graph captured by TorchDynamo. Default: ``False``

``guards``
    Whether to emit the guards generated by TorchDynamo for each compiled function. Default: ``False``

``recompiles``
    Whether to emit a guard failure reason and message every time TorchDynamo recompiles a function. Default: ``False``

``output_code``
    Whether to emit the TorchInductor output code. Default: ``False``

``schedule``
    Whether to emit the TorchInductor schedule. Default: ``False``
Examples:

* ``TORCH_LOGS="+dynamo,aot"`` sets the log level of TorchDynamo to ``logging.DEBUG`` and AOTAutograd to ``logging.INFO``.

* ``TORCH_LOGS="-dynamo,+inductor"`` sets the log level of TorchDynamo to ``logging.ERROR`` and TorchInductor to ``logging.DEBUG``.

* ``TORCH_LOGS="aot_graphs"`` enables the ``aot_graphs`` artifact.

* ``TORCH_LOGS="+dynamo,schedule"`` sets the log level of TorchDynamo to ``logging.DEBUG`` and enables the ``schedule`` artifact.

* ``TORCH_LOGS="+some.random.module,schedule"`` sets the log level of ``some.random.module`` to ``logging.DEBUG`` and enables the ``schedule`` artifact.
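Because ``TORCH_LOGS`` is an ordinary environment variable, it can also be set from within Python. A sketch (set the variable before the first compilation so the logging system picks it up):

```python
import os

# Enable DEBUG logging for TorchDynamo plus the TorchInductor "schedule"
# artifact, mirroring the example above. Setting the variable before torch
# reads it is the safe ordering.
os.environ["TORCH_LOGS"] = "+dynamo,schedule"
print(os.environ["TORCH_LOGS"])
```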