optimized_hydrogen_plant cause get_weather_data error #22
Comments
Hi, thank you for sharing the issue you're facing. Hopefully we can solve this quickly and easily! A couple of notes: when the rule runs successfully, it should download a bunch of files into a "temp" folder and then, at the end, output the desired, complete .nc file in the "Cutouts" folder. Unfortunately, your runs are getting interrupted by an error, so you've been left with only the temp files and not the final file. Could you share a few details to get me up to speed with what is going on on your side? I have just run this myself, so it's not an issue with the code.
This issue could be with the package versions that we have in our current environment file. I had been testing the new cdsapi and updated package versions locally without updating the repo. Could you try checking your CDS API URL and key, and updating atlite to the latest release?
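A minimal way to verify that the CDS API credentials are picked up, before re-running the snakemake rule, is sketched below; it assumes the standard ~/.cdsapirc configuration (or the CDSAPI_URL/CDSAPI_KEY environment variables) and is illustrative rather than part of the repo.
import cdsapi

# Client() reads the URL and key from ~/.cdsapirc or from the CDSAPI_URL /
# CDSAPI_KEY environment variables, and raises if the configuration is
# missing or incomplete, which makes it a quick credentials sanity check.
client = cdsapi.Client()
print("CDS API endpoint:", client.url)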
Hi @samiyhanaqvi, thanks for your time. So far, I have followed the instructions you stated. I have checked the URL and token; everything seems fine. I have also updated atlite to 0.2.14, but I am still facing the same issue. Please kindly have a look below at the installed packages of the environment:
mamba list
packages in environment at /home/bachirou/miniconda3/envs/geoh:
Name            Version   Build        Channel
_libgcc_mutex   0.1       conda_forge  conda-forge
To confirm whether the issue is from atlite, I ran get_weather_data.py from the previous version without snakemake. The script created the temp folder and some temporary .nc files inside it, and was able to combine those files into one single country .nc file in Cutouts. Maybe we need to look at whether the issue is related to snakemake.
mamba activate geoh2s
get_weather_data 1
Select jobs to execute...
[Wed Oct 30 22:23:07 2024]
INFO:atlite.cutout:Building new cutout Cutouts/NE_2023.nc
RuleException:
Hi, I need to look through the packages and see if there are any compatibility issues there. I wanted to reply to clear up that there is not an issue with snakemake; snakemake has just notified us of the error that happens when you run the get_weather_data.py file. Just to make sure we're on the same page: did you update atlite in your old environment and then run the old get_weather_data.py file, and it worked? You should have two environments, and if it's working with one and not the other, I suspect that there is an issue with the environment and the package versions. It would be helpful if you could run the following for both environments. (It will produce a file with the file name given at the end; make sure to change that for each environment.)
Could you then attach both to your reply? I can then compare them and see if the issue is there.
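Once both environment files have been exported, a quick way to compare the pinned versions is sketched below; the file names env_old.yml and env_new.yml are placeholders for whatever the export command produced, and PyYAML is assumed to be available.
import yaml

def conda_pins(path):
    # Parse an exported conda/mamba environment file and return {package: version}
    # from dependency entries of the form "name=version=build".
    with open(path) as f:
        env = yaml.safe_load(f)
    pins = {}
    for dep in env.get("dependencies", []):
        if isinstance(dep, str) and "=" in dep:
            name, version = dep.split("=")[:2]
            pins[name] = version
    return pins

old, new = conda_pins("env_old.yml"), conda_pins("env_new.yml")
for pkg in sorted(set(old) | set(new)):
    if old.get(pkg) != new.get(pkg):
        print(f"{pkg}: {old.get(pkg)} -> {new.get(pkg)}")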
Hi @samiyhanaqvi, thanks for clarifying regarding snakemake. The old environment was requesting to update some packages, so I updated it and ran env2; the exported .yaml file is attached below. I have also tested something that may help resolve the issue: I copied get_weather_data.py and weather_parameters.xlsx from the previous version's folder into the new version (env2 folder) and ran python get_weather_data.py in env2, and it downloaded NE_2023.nc successfully.
I then ran snakemake -j 1 Cutouts/country_2022.nc again; snakemake checked that all the required files had been downloaded into the temp folder and Cutouts, and confirmed that this step completed successfully.
Does the new version with snakemake work with your new environment (the one in the attached file)?
The new version with snakemake does not work with the new environment when running the get_weather_data.py rule.
I think something might have gone wrong with your environment, as it seems you have updated packages, which could then cause some incompatibility. In its current state, the environment should work; it may display some update suggestions as warnings, but you should ignore those, as they still need to be looked into to ensure that updating will not cause any issues with other packages. It seems best to make sure you've got the most up-to-date version of the repo. I know this might seem counter-intuitive, but I have just checked the environment on my machine and it works with the get_weather_data.py rule. This all seems to be an environment issue.
Again, I copied the data_year.nc downloaded from the previous version instead of using the snakemake get_weather_data rule, because when using the latter the algorithm downloads the .nc files into the temp folder without combining them into one single file in the Cutouts folder. Below is the snapshot:
A probable cause could be the location of the temp file path, which may need to be specified more explicitly (the # TEMPDIR DEFINITION IS NEW TO FIX ERROR # line in get_weather_data.py).
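For reference, the line shown in the tracebacks below is cutout.prepare(tmpdir="temp"). If the relative temp path were the problem, one possible variation (a sketch under assumptions, not a confirmed fix; the bounding box and year are placeholders, since the real script derives them from the hexagon file and configuration) would be to pass an absolute, pre-created directory instead:
import os
import atlite

# Hypothetical stand-in for the cutout built in get_weather_data.py; the actual
# script computes the x/y/time arguments rather than hard-coding them.
cutout = atlite.Cutout(path="Cutouts/NE_2023.nc", module="era5",
                       x=slice(0.0, 16.0), y=slice(11.5, 23.5), time="2023")

tmpdir = os.path.abspath("temp")
os.makedirs(tmpdir, exist_ok=True)  # make sure the directory exists up front
cutout.prepare(tmpdir=tmpdir)       # same call as in the traceback, with an absolute path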
First error (hex_lcoh):
/GeoH2-main_new_NE$ snakemake -j 1 Resources/hex_lcoh_NE_2023.geojson
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 1 (use --cores to define parallelism)
Rules claiming more threads will be scaled down.
Job stats:
job count
get_weather_data 1
optimize_hydrogen_plant 1
total 2
Select jobs to execute...
[Fri Oct 25 06:17:08 2024]
rule get_weather_data:
input: Data/hexagons_with_country_NE.geojson
output: Cutouts/NE_2023.nc
jobid: 1
reason: Missing output files: Cutouts/NE_2023.nc
wildcards: country=NE, weather_year=2023
resources: tmpdir=/tmp
INFO:atlite.cutout:Building new cutout Cutouts/NE_2023.nc
INFO:atlite.data:Storing temporary files in temp
INFO:atlite.data:Calculating and writing with module era5:
Traceback (most recent call last):
File "/home/bachirou/GeoH2-main_new_NE/.snakemake/scripts/tmpa7157ab8.get_weather_data.py", line 58, in
cutout.prepare(tmpdir="temp") # TEMPDIR DEFINITION IS NEW TO FIX ERROR # /home/bachirou/GeoH2-main_new_NE/temp/ C:\Users\hp\AppData\Local\Temp snakemake -j [NUMBER OF CORES TO BE USED] Resources/hex_transport_[COUNTRY ISO CODE].geojson # C:\Users\hp\AppData\Local\Temp\tmpaynbmwwm
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/data.py", line 98, in wrapper
res = func(*args, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/data.py", line 164, in cutout_prepare
ds = get_features(cutout, module, missing_features, tmpdir=tmpdir)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/data.py", line 46, in get_features
datasets = compute(*datasets)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/dask/base.py", line 662, in compute
results = schedule(dsk, keys, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/datasets/era5.py", line 352, in get_data
"area": _area(coords),
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/datasets/era5.py", line 227, in _area
x0, x1 = coords["x"].min().item(), coords["x"].max().item()
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/_aggregations.py", line 1581, in min
return self.reduce(
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/dataarray.py", line 3834, in reduce
var = self.variable.reduce(func, dim, axis, keep_attrs, keepdims, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/variable.py", line 1666, in reduce
result = super().reduce(
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/namedarray/core.py", line 912, in reduce
data = func(self.data, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 447, in f
return func(values, axis=axis, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/nanops.py", line 71, in nanmin
return nputils.nanmin(a, axis=axis)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/nputils.py", line 232, in f
result = bn_func(values, axis=axis, **kwargs)
ValueError: numpy.nanmin raises on a.size==0 and axis=None; So Bottleneck too.
[Fri Oct 25 06:17:10 2024]
Error in rule get_weather_data:
jobid: 1
input: Data/hexagons_with_country_NE.geojson
output: Cutouts/NE_2023.nc
RuleException:
CalledProcessError in file /home/bachirou/GeoH2-main_new_NE/Snakefile, line 41:
Command 'set -euo pipefail; /home/bachirou/miniconda3/envs/geoh2s/bin/python3.10 /home/bachirou/GeoH2-main_new_NE/.snakemake/scripts/tmpa7157ab8.get_weather_data.py' returned non-zero exit status 1.
File "/home/bachirou/GeoH2-main_new_NE/Snakefile", line 41, in __rule_get_weather_data
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/concurrent/futures/thread.py", line 58, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: .snakemake/log/2024-10-25T061707.897314.snakemake.log
Second error (get_weather_data):
/GeoH2-main_new_NE$ snakemake -j 1 Cutouts/NE_2023.nc
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 1 (use --cores to define parallelism)
Rules claiming more threads will be scaled down.
Job stats:
job count
get_weather_data 1
total 1
Select jobs to execute...
[Fri Oct 25 06:18:27 2024]
rule get_weather_data:
input: Data/hexagons_with_country_NE.geojson
output: Cutouts/NE_2023.nc
jobid: 0
reason: Missing output files: Cutouts/NE_2023.nc
wildcards: country=NE, weather_year=2023
resources: tmpdir=/tmp
INFO:atlite.cutout:Building new cutout Cutouts/NE_2023.nc
INFO:atlite.data:Storing temporary files in temp
INFO:atlite.data:Calculating and writing with module era5:
Traceback (most recent call last):
File "/home/bachirou/GeoH2-main_new_NE/.snakemake/scripts/tmpey999h_0.get_weather_data.py", line 58, in
cutout.prepare(tmpdir="temp") # TEMPDIR DEFINITION IS NEW TO FIX ERROR #
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/data.py", line 98, in wrapper
res = func(*args, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/data.py", line 164, in cutout_prepare
ds = get_features(cutout, module, missing_features, tmpdir=tmpdir)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/data.py", line 46, in get_features
datasets = compute(*datasets)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/dask/base.py", line 662, in compute
results = schedule(dsk, keys, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/datasets/era5.py", line 352, in get_data
"area": _area(coords),
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/atlite/datasets/era5.py", line 227, in _area
x0, x1 = coords["x"].min().item(), coords["x"].max().item()
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/_aggregations.py", line 1581, in min
return self.reduce(
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/dataarray.py", line 3834, in reduce
var = self.variable.reduce(func, dim, axis, keep_attrs, keepdims, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/variable.py", line 1666, in reduce
result = super().reduce(
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/namedarray/core.py", line 912, in reduce
data = func(self.data, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/duck_array_ops.py", line 447, in f
return func(values, axis=axis, **kwargs)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/nanops.py", line 71, in nanmin
return nputils.nanmin(a, axis=axis)
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/site-packages/xarray/core/nputils.py", line 232, in f
result = bn_func(values, axis=axis, **kwargs)
ValueError: numpy.nanmin raises on a.size==0 and axis=None; So Bottleneck too.
[Fri Oct 25 06:18:29 2024]
Error in rule get_weather_data:
jobid: 0
input: Data/hexagons_with_country_NE.geojson
output: Cutouts/NE_2023.nc
RuleException:
CalledProcessError in file /home/bachirou/GeoH2-main_new_NE/Snakefile, line 41:
Command 'set -euo pipefail; /home/bachirou/miniconda3/envs/geoh2s/bin/python3.10 /home/bachirou/GeoH2-main_new_NE/.snakemake/scripts/tmpey999h_0.get_weather_data.py' returned non-zero exit status 1.
File "/home/bachirou/GeoH2-main_new_NE/Snakefile", line 41, in __rule_get_weather_data
File "/home/bachirou/miniconda3/envs/geoh2s/lib/python3.10/concurrent/futures/thread.py", line 58, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: .snakemake/log/2024-10-25T061827.048691.snakemake.log
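The ValueError at the bottom of both logs is raised inside atlite's _area(coords) helper when the cutout's x coordinate array is empty, i.e. the requested bounding box contains no ERA5 grid points. A quick diagnostic is sketched below; it is not part of the repo, and the exact Cutout arguments used in get_weather_data.py may differ.
import atlite
import geopandas as gpd

# Check that the hexagon file gives a sensible bounding box.
hexagons = gpd.read_file("Data/hexagons_with_country_NE.geojson")
minx, miny, maxx, maxy = hexagons.total_bounds
print("hexagon bounds:", minx, miny, maxx, maxy)

# Build a cutout over that box and count the grid points on each axis.
cutout = atlite.Cutout(path="Cutouts/NE_2023.nc", module="era5",
                       x=slice(minx, maxx), y=slice(miny, maxy), time="2023")
# If either size is 0, prepare() fails with the same nanmin error as in the logs.
print("grid points:", cutout.coords["x"].size, cutout.coords["y"].size)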