Am I doing something wrong, or is the --conf-path / -c option not supported when running a PySpark script with mrjob spark-submit? For example, I have an EMR runner configuration in custom.conf, and I run the script with:
mrjob spark-submit -r emr -c custom.conf main.py
In the logs, it clearly says:
Looking for configs in /home/<my_home>/.mrjob.conf
Looking for configs in /etc/mrjob.conf
No configs found; falling back on auto-configuration
Why does it only check those two locations, and not consider the passed argument?
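As a possible workaround while -c is being ignored, the config path can be supplied through the environment instead. This sketch assumes mrjob's documented config search order, in which the MRJOB_CONF environment variable is consulted before falling back to ~/.mrjob.conf and /etc/mrjob.conf; the paths here are illustrative, not verified against this setup:

```shell
# Point mrjob at the config file via the environment instead of -c.
# (Assumption: mrjob checks $MRJOB_CONF before ~/.mrjob.conf and /etc/mrjob.conf.)
export MRJOB_CONF="$PWD/custom.conf"
echo "mrjob will look for configs in: $MRJOB_CONF"

# With the variable set, the original command can be run without -c:
#   mrjob spark-submit -r emr main.py
```

If the logs then show "Looking for configs in .../custom.conf", the config file itself is fine and the problem is specifically that mrjob spark-submit is not forwarding the -c argument to the config loader.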