Hello, I have trained a LoRA model for FLUX locally and am trying to use it with the flux-general model from the Python client. Inference works fine without the LoRA, but I run into problems when I try to upload my LoRA file using fal_client.upload_file from the Python client. Is there anything I am missing here? Is there a file size limit for uploads? Thanks in advance.
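For context, a minimal sketch of the upload-then-reference flow I am attempting, assuming the flux-general endpoint takes a "loras" list of {"path", "scale"} entries (the argument names and the file path here are illustrative):

```python
import fal_client

# Upload the locally trained LoRA weights; upload_file returns a hosted URL
# that can be passed to the endpoint instead of a local file path.
lora_url = fal_client.upload_file("path/to/my_flux_lora.safetensors")

# Run inference with the LoRA attached. The "loras" argument shape
# ({"path": ..., "scale": ...}) is an assumption based on the
# flux-general endpoint's parameters.
result = fal_client.subscribe(
    "fal-ai/flux-general",
    arguments={
        "prompt": "a photo in my custom style",
        "loras": [{"path": lora_url, "scale": 1.0}],
    },
)
print(result)
```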