Configure ruff and black for linting #52
base: main
Conversation
Force-pushed from dcbb3e1 to 235d2a1
@benjaminaubin rebased
Force-pushed from 4dc10d5 to e943461
@benjaminaubin Rebased again 😁 I also enabled import sorting via Ruff (ref. #59).
Force-pushed from 6bd67ab to 445ae99
Force-pushed from f85b1e8 to 46f6ec0
Rebased once again.
Rebased 🍂
Force-pushed from a0ff858 to 638bb54
Rebased on 059d8e9.
@@ -402,7 +402,6 @@ def log_img(self, pl_module, batch, batch_idx, split="train"):
             # batch_idx > 5 and
             self.max_images > 0
         ):
-            logger = type(pl_module.logger)
Unused local.
@@ -648,7 +647,7 @@ def init_wandb(save_dir, opt, config, group_name, name_str):

        ckpt_resume_path = opt.resume_from_checkpoint

-        if not "devices" in trainer_config and trainer_config["accelerator"] != "gpu":
+        if "devices" not in trainer_config and trainer_config["accelerator"] != "gpu":
x not in y is idiomatic.
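For illustration (not part of the diff): the two spellings are equivalent, since not x in y parses as not (x in y), but the explicit membership operator states the intent directly and is what PEP 8 (and Ruff's pycodestyle rule E713) recommends.

trainer_config = {"accelerator": "gpu"}

# `not x in y` is parsed as `not (x in y)`, so both forms always agree...
assert (not "devices" in trainer_config) == ("devices" not in trainer_config)

# ...but the explicit operator reads as a single membership test.
if "devices" not in trainer_config:
    print("no explicit device count configured")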
@@ -691,7 +690,7 @@ def init_wandb(save_dir, opt, config, group_name, name_str):
        # TODO change once leaving "swiffer" config directory
        try:
            group_name = nowname.split(now)[-1].split("-")[1]
-        except:
+        except Exception:
A bare except: is dangerous, as it also catches SystemExit and KeyboardInterrupt.
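A self-contained illustration of why the narrower clause matters: KeyboardInterrupt and SystemExit derive from BaseException rather than Exception, so a bare except (which behaves like except BaseException) swallows Ctrl-C and sys.exit(), while except Exception lets them propagate.

# KeyboardInterrupt and SystemExit live outside the Exception hierarchy.
print(issubclass(KeyboardInterrupt, Exception))      # False
print(issubclass(SystemExit, Exception))             # False
print(issubclass(KeyboardInterrupt, BaseException))  # True

try:
    raise KeyboardInterrupt  # stands in for the user pressing Ctrl-C
except Exception:
    print("never reached: except Exception does not swallow the interrupt")
except BaseException:
    print("a bare except: behaves like this clause and would swallow it")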
print(f"sigmas after discretization, before pruning img2img: ", sigmas) | ||
print("sigmas after discretization, before pruning img2img: ", sigmas) | ||
sigmas = torch.flip(sigmas, (0,)) | ||
sigmas = sigmas[: max(int(self.strength * len(sigmas)), 1)] | ||
print("prune index:", max(int(self.strength * len(sigmas)), 1)) | ||
sigmas = torch.flip(sigmas, (0,)) | ||
print(f"sigmas after pruning: ", sigmas) | ||
print("sigmas after pruning: ", sigmas) |
Unused f prefixes (the strings contain no placeholders).
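For context (an illustrative snippet, not code from the repository): an f prefix on a string that contains no {} placeholders does nothing, and Ruff's pyflakes rule F541 flags it.

sigmas = [14.6, 7.9, 3.1]

# The f prefix is pointless here: there is nothing to interpolate.
print(f"sigmas after pruning: ", sigmas)  # flagged as F541 (f-string without placeholders)
print("sigmas after pruning: ", sigmas)   # identical output with a plain string literal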
-def get_interactive_image() -> Image.Image:
-    image = st.file_uploader("Input", type=["jpg", "JPEG", "png"])
-    if image is not None:
-        image = Image.open(image)
-        if not image.mode == "RGB":
-            image = image.convert("RGB")
-        return image
Duplicated function.
@@ -87,7 +87,7 @@ def benchmark_torch_function_in_microseconds(f, *args, **kwargs):
    ) as prof:
        with record_function("Default detailed stats"):
            for _ in range(25):
-                o = F.scaled_dot_product_attention(query, key, value)
+                _o = F.scaled_dot_product_attention(query, key, value)
Unused local, mark it as such.
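As an aside (a hypothetical snippet, not code from the repository): a leading underscore is the conventional way to mark a value that is assigned but deliberately never read, and Ruff's unused-variable check (F841) skips names matching its dummy-variable pattern, which covers underscore-prefixed names by default.

import time


def time_call(fn, iterations=25):
    """Call fn repeatedly and return the average wall-clock time per call."""
    start = time.perf_counter()
    for _ in range(iterations):
        # The result itself is never read; the underscore prefix tells both
        # readers and the linter that this is intentional.
        _out = fn()
    return (time.perf_counter() - start) / iterations


print(time_call(lambda: sum(range(10_000))))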
-with autocast(device) as precision_scope:
+with autocast(device):
Unused locals.
-print(key, [len(l) for l in batch[key]])
+print(key, [len(lst) for lst in batch[key]])
l is easily confused with I and 1.
-denoiser = lambda input, sigma, c: self.denoiser(
-    self.model, input, sigma, c, **kwargs
-)
+def denoiser(input, sigma, c):
+    return self.denoiser(self.model, input, sigma, c, **kwargs)
Don't use a lambda where a def will do. (That said, this is really just partial()...)
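For reference, a self-contained sketch of the partial() spelling the comment alludes to; the toy denoise function and its constants are invented here purely to show that binding the leading argument and the keyword arguments up front yields the same callable as the def.

from functools import partial


def denoise(model, x, sigma, cond, s_churn=0.0):
    """Toy stand-in for self.denoiser(self.model, input, sigma, c, **kwargs)."""
    return model * (x - sigma) + cond + s_churn


# def-based wrapper, shaped like the one in the diff.
def denoiser(x, sigma, cond):
    return denoise(2.0, x, sigma, cond, s_churn=0.1)


# partial-based wrapper: fix `model` and the keyword arguments once,
# leaving (x, sigma, cond) for the caller.
denoiser_via_partial = partial(denoise, 2.0, s_churn=0.1)

assert denoiser(1.0, 0.5, 3.0) == denoiser_via_partial(1.0, 0.5, 3.0)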
-# from .diffusionmodules.util import mixed_checkpoint as checkpoint
|
Dead code.
-        # compute in_ch_mult, block_in and curr_res at lowest res
-        in_ch_mult = (1,) + tuple(ch_mult)
+        # compute block_in and curr_res at lowest res
Unused variable.
This PR configures linting with the Ruff linter (alongside the Black formatter already in use in the repository) so the checks are easy to run via pre-commit. It also adds a GitHub Actions workflow that runs those checks for every pushed commit.
It then fixes the issues reported by Ruff's default rule set (with small adjustments; see pyproject.toml).
This PR includes #146 and #147 because they would otherwise cause lint errors.