Refactor Deadline Submission JobInfo #49

Open
wants to merge 143 commits into base: develop
Conversation


@kalisp kalisp commented Oct 18, 2024

Changelog Description

This PR tries to make DL submissions easier, mostly the handling of the JobInfo segment. Collection of values is moved out of the submission phase to much earlier, where values can be filled from default values in Settings or overridden via a selected list of fields by the artist in the Publisher UI.

Moving to a universal collector should help with potential validation of the filled values.
Profiles in Settings should allow more control over values (like priority, limit_groups) based on task types, hosts, etc.
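
As a rough illustration only (the settings path and profile keys below are assumptions, not the exact code in this PR), the collector could resolve such profiles roughly like this:

from ayon_core.lib import filter_profiles  # profile-matching helper from ayon-core

def get_jobinfo_defaults(project_settings, host_name, task_type):
    """Pick JobInfo defaults from the first Settings profile matching the context."""
    # exact settings layout and profile keys are illustrative assumptions
    profiles = (
        project_settings["deadline"]["publish"]["CollectJobInfo"]["profiles"]
    )
    matching_profile = filter_profiles(
        profiles,
        {"host_names": host_name, "task_types": task_type},
    )
    # fall back to empty defaults when no profile matches
    return matching_profile or {}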

(Screenshots: jobinfo_settings, jobinfo_publisher)

Additional info

Must be used together with ynput/ayon-core#958!

All DCCs need to be tested:

  • AfterEffects
  • Blender
  • CelAction
  • Fusion
  • Harmony
  • Houdini
  • Max
  • Maya
  • Nuke
  • Unreal

Testing notes:

  1. do regular DL publishing
  2. test that nothing breaks
  3. update ayon+settings://deadline/publish/CollectJobInfo with new defaults and check if they propagate to the Publisher
  4. update values in the Publisher and check if they propagate to DL

These settings should be the base of generic JobInfo values for Deadline submission. They should contain variables previously contained in the Submit* Settings.
Some of them should be exposed in the Publisher UI as artist overrides.
It was decided that dataclasses should be used instead of attrs. This moves DeadlineJobInfo, which is a full mapping of JobInfo, from the abstract submitters to the collectors, to limit the need for a new class and the remapping that would otherwise be necessary later.
Added a new argument to the old get_job_info (which should probably be renamed) to pass in the base of the prepared object, to be enhanced with DCC specific fields.
Empty strings override None defaults, which might cause issues (it definitely does for job_delay).
The 'deadline' dictionary wasn't used at all; it contained a large DeadlineJobInfo which just enlarged the metadata json unnecessarily.
It handles EnvironmentKey* type fields.
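
A minimal sketch of the intended shape (class and field names assumed, not the final implementation): the collector builds a dataclass-based DeadlineJobInfo, and the submitter's get_job_info receives it as a base to enhance with DCC specific fields.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DeadlineJobInfo:
    # full mapping of Deadline JobInfo keys, pre-filled by the collector
    Priority: int = 50
    ChunkSize: int = 1
    Group: str = ""
    Frames: str = ""
    # None by default; an empty string from Settings must not override it
    JobDelay: Optional[str] = None

class ExampleSubmitDeadline:  # hypothetical submitter plugin
    def get_job_info(self, job_info=None):
        """Enhance the pre-collected JobInfo with DCC specific fields."""
        job_info = job_info or DeadlineJobInfo()
        job_info.Group = "dcc_specific_group"  # illustrative override
        return job_info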
@kalisp kalisp self-assigned this Oct 18, 2024

@BigRoy BigRoy left a comment


I like where this is going! :)

client/ayon_deadline/lib.py (outdated review comments, resolved)
client/ayon_deadline/abstract_submit_deadline.py (outdated review comments, resolved)
Comment on lines 62 to 80
chunk_size: int = SettingsField(999, title="Frames per Task")
priority: int = SettingsField(50, title="Priority")
group: str = SettingsField("", title="Group")
limit_groups: list[LimitGroupsSubmodel] = SettingsField(
    default_factory=list,
    title="Limit Groups",
)
concurrent_tasks: int = SettingsField(
    1, title="Number of concurrent tasks")
department: str = SettingsField("", title="Department")
use_gpu: bool = SettingsField("", title="Use GPU")
job_delay: str = SettingsField(
    "", title="Delay job",
    placeholder="dd:hh:mm:ss"
)
use_published: bool = SettingsField(True, title="Use Published scene")
asset_dependencies: bool = SettingsField(True, title="Use Asset dependencies")
workfile_dependency: bool = SettingsField(True, title="Workfile Dependency")
multiprocess: bool = SettingsField(False, title="Multiprocess")
Contributor


It would be very nice to have a description argument set for each of these.
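
For illustration, something like this (the description texts below are placeholders):

chunk_size: int = SettingsField(
    999,
    title="Frames per Task",
    description="Number of frames rendered per Deadline task (chunk size).",
)
priority: int = SettingsField(
    50,
    title="Priority",
    description="Default priority of the submitted job.",
)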

Comment on lines 289 to 291
enabled: bool = SettingsField(True)
optional: bool = SettingsField(title="Optional")
active: bool = SettingsField(title="Active")
Contributor


With all the other attributes removed from these models, we can now make one EnableStateModel class that we use for all of these, making this file much smaller.
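
Roughly something like this (a sketch of the suggestion; defaults and the reusing field name are assumed):

from ayon_server.settings import BaseSettingsModel, SettingsField

class EnableStateModel(BaseSettingsModel):
    enabled: bool = SettingsField(True)
    optional: bool = SettingsField(True, title="Optional")
    active: bool = SettingsField(True, title="Active")

# every plugin model that only exposes these toggles could then reuse it, e.g.:
some_publish_plugin: EnableStateModel = SettingsField(
    default_factory=EnableStateModel,
    title="Some publish plugin",  # placeholder plugin name
)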


@kalisp kalisp Nov 11, 2024


The question is whether these jobs should actually be optional at all, because if they were disabled, the submit publish job would fail.

Contributor


The question is whether these jobs should actually be optional at all, because if they were disabled, the submit publish job would fail.

I guess the idea would have been that it could allow disabling just the farm publishing support for an instance - but the creators definitely are not configured that way. So I agree - it may make no sense for these to be optional, or even disabled, currently.

dln_job_info.add_render_job_env_var()

# already collected explicit values for rendered Frames
if not dln_job_info.Frames:
Contributor


I actually think this is a place where we should not be setting this in a Deadline-specific way, but have it accessible on the instance even outside of the job info. So Frames would be built up from instance.data["frames"] instead, so that it could turn out to be non-Deadline-specific. However, it's not critical for now and we can do that later. Might be worth a TODO in code or an issue after this PR.
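
A minimal sketch of that idea (a non-Deadline-specific source for the frame list; key name assumed):

# collected earlier by a generic, non-Deadline collector
frames = instance.data.get("frames")
if frames and not dln_job_info.Frames:
    dln_job_info.Frames = frames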

server/settings/publish_plugins.py (outdated review comments, resolved)
@BigRoy BigRoy added the type: enhancement Improvement of existing functionality or minor addition label Oct 21, 2024
Some DCC might not collect that in time

Co-authored-by: Mustafa Jafar <[email protected]>

@MustafaJafar MustafaJafar left a comment


It works pretty well in Houdini.
Publishing cache doesn't error any more. Cache jobs don't error any more.
Publishing renders works fine with different targets. Render jobs work fine.
(there's still one bug mentioned in #49 (comment) where export job attributes override render job attributes. I'll try to debug it on my side.)


@MustafaJafar
Contributor

Tested Nuke. It works.
Maybe we can update the labels of the jobs to differentiate between bake and render, but that's a separate issue, I guess.


@moonyuet moonyuet left a comment


Tested with 3dsMax. Single-camera submission is working, but the multi-camera one is not.

DEBUG: Using Redshift...published scene wont be used..
INFO: Using D:\sh_boss_proj\interior_lenny\lenny_bedroom\asset\publish\workfile\workfileGeneric\v042\il_asset_workfileGeneric_v042.max for render/export.
DEBUG: Submitting 3dsMax render..
DEBUG: Submitting jobs for multiple cameras..
Traceback (most recent call last):
  File "C:\Users\Kayla\AppData\Local\Ynput\AYON\dependency_packages\ayon_2402141620_windows.zip\dependencies\pyblish\plugin.py", line 527, in __explicit_process
    runner(*args)
  File "D:\ayon-addon_template\ayon-deadline\client\ayon_deadline\abstract_submit_deadline.py", line 116, in process
    job_id = self.process_submission()
  File "D:\ayon-addon_template\ayon-deadline\client\ayon_deadline\plugins\publish\max\submit_max_deadline.py", line 96, in process_submission
    payload = self._use_published_name_for_multiples(
  File "D:\ayon-addon_template\ayon-deadline\client\ayon_deadline\plugins\publish\max\submit_max_deadline.py", line 286, in _use_published_name_for_multiples
    job_info = self.get_job_info_through_camera(cam)
  File "D:\ayon-addon_template\ayon-deadline\client\ayon_deadline\plugins\publish\max\submit_max_deadline.py", line 197, in get_job_info_through_camera
    job_info.OutputDirectory += os.path.dirname(filepath)
TypeError: unsupported operand type(s) for +=: 'NoneType' and 'str'

Comment on lines +50 to +53
# do not add expected files for multiCamera
if instance.data.get("multiCamera"):
    job_info.OutputDirectory = None
    job_info.OutputFilename = None

@moonyuet moonyuet Nov 15, 2024


Suggested change
-    # do not add expected files for multiCamera
-    if instance.data.get("multiCamera"):
-        job_info.OutputDirectory = None
-        job_info.OutputFilename = None
+    # do not add expected files for multiCamera
+    # if instance.data.get("multiCamera"):
+    #     job_info.OutputDirectory = None
+    #     job_info.OutputFilename = None

As 3dsMax supports multi-camera submission by scene in the current workflow, maybe we can comment this out for now and come back to it when we refactor the multi-camera submission by cameras in the future.

This submits the renders correctly after commenting it out.

Member Author


Or maybe resetting it only to ''? It seems to have an issue with concatenating None + str.


@moonyuet moonyuet Nov 16, 2024


Or maybe resetting it only to ''? It seems to have an issue with concatenating None + str.

Still not working.
It instead raises AttributeError: 'str' object has no attribute 'serialize' in submit_max_deadline.py.

DEBUG: Using Redshift...published scene wont be used..
INFO: Using D:\sh_boss_proj\interior_lenny\lenny_bedroom\asset\publish\workfile\workfileGeneric\v045\il_asset_workfileGeneric_v045.max for render/export.
DEBUG: Submitting 3dsMax render..
DEBUG: Submitting jobs for multiple cameras..
DEBUG: Executing [GET] settings?project_name=interior_lenny&variant=test_thumbnail-prod-04&bundle_name=test_thumbnail-prod-04&site_id=lumpy-fervent-newt
DEBUG: http://localhost:5000 "GET /api/settings?project_name=interior_lenny&variant=test_thumbnail-prod-04&bundle_name=test_thumbnail-prod-04&site_id=lumpy-fervent-newt HTTP/1.1" 200 113377
DEBUG: Response <RestApiResponse [200]>
Traceback (most recent call last):
  File "C:\Users\Kayla\AppData\Local\Ynput\AYON\dependency_packages\ayon_2402141620_windows.zip\dependencies\pyblish\plugin.py", line 527, in __explicit_process
    runner(*args)
  File "D:\ayon-addon_template\ayon-deadline\client\ayon_deadline\abstract_submit_deadline.py", line 116, in process
    job_id = self.process_submission()
  File "D:\ayon-addon_template\ayon-deadline\client\ayon_deadline\plugins\publish\max\submit_max_deadline.py", line 101, in process_submission
    self.assemble_payload(job_info, plugin_info),
  File "D:\ayon-addon_template\ayon-deadline\client\ayon_deadline\abstract_submit_deadline.py", line 301, in assemble_payload
    "JobInfo": job.serialize(),
  File "D:\ayon-addon_template\ayon-deadline\client\ayon_deadline\lib.py", line 505, in serialize
    serialized.update(attribute.serialize())
AttributeError: 'str' object has no attribute 'serialize'

Contributor


Yes - makes sense. You can't assign directly to it like that because you'd be replacing the attribute completely.

You should be doing:

Suggested change
-    # do not add expected files for multiCamera
-    if instance.data.get("multiCamera"):
-        job_info.OutputDirectory = None
-        job_info.OutputFilename = None
+    # do not add expected files for multiCamera
+    if instance.data.get("multiCamera"):
+        job_info.OutputDirectory.clear()
+        job_info.OutputFilename.clear()

To just clear the dict.

Member


Yes - makes sense. You can't assign directly to it like that because you'd be replacing the attribute completely.

You should be doing:

To just clear the dict.

Tested with this suggestion; it successfully renders the multi-camera submissions and also cleans up the duplicated job info in terms of OutputDirectory and OutputFilename.

Labels
type: enhancement Improvement of existing functionality or minor addition
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Render product publish: use same Deadline options as other DCCs
ChunkSize in Houdini is ignored
5 participants