fix linting
lausannel committed Sep 2, 2024 (parent 8968543, commit 9452202)
Showing 1 changed file with 4 additions and 5 deletions: torch_xla/experimental/spmd_fully_sharded_data_parallel.py
@@ -97,11 +97,10 @@ def __init__(
     self._auto_wrap(auto_wrap_kwargs, fsdp_kwargs)

     _materialize_module(
-        module,
-        None,
-        [],
-        deferred_init_check_fn=lambda k: not isinstance(
-            k, SpmdFullyShardedDataParallel))
+        module,
+        None, [],
+        deferred_init_check_fn=lambda k: not isinstance(
+            k, SpmdFullyShardedDataParallel))

     # Let's move the module to xla device in case it's not moved
     # by the caller already.
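
The hunk is formatting-only: None and [] are packed onto a single line and the surrounding argument lines are re-wrapped, while the _materialize_module call itself is unchanged. Assuming the "fix linting" commit message refers to yapf (the formatter PyTorch/XLA's contributing guide points to), a change like this can usually be reproduced by running the formatter over the edited region. Below is a minimal sketch using yapf's FormatCode API; the style_config value and the resulting layout are assumptions, since the repository's actual yapf configuration is not shown on this page.

    # Minimal sketch: reformat the changed call site with yapf.
    # Assumptions: yapf is installed and is the linter the commit message
    # refers to; the 'yapf' predefined style stands in for the repository's
    # real style config, so the exact output may differ from the committed
    # layout.
    from yapf.yapflib.yapf_api import FormatCode

    snippet = """\
    _materialize_module(
        module,
        None,
        [],
        deferred_init_check_fn=lambda k: not isinstance(
            k, SpmdFullyShardedDataParallel))
    """

    # FormatCode returns the reformatted source plus a flag saying whether
    # anything changed.
    formatted, changed = FormatCode(snippet, style_config='yapf')
    print(changed)
    print(formatted)

Running the formatter over the touched region (or the whole file) and committing the result is the usual way a follow-up lint commit like this one is produced.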
