3 changes: 3 additions & 0 deletions .github/workflows/update_project_labels.yaml
@@ -223,6 +223,9 @@ jobs:
       - name: LM4
         if: contains(github.event.pull_request.labels.*.name, 'LM4')
         run: echo "SUBCOMPONENTS=${SUBCOMPONENTS} LM4" >> $GITHUB_ENV
+      - name: FB
+        if: contains(github.event.pull_request.labels.*.name, 'FB')
+        run: echo "SUBCOMPONENTS=${SUBCOMPONENTS} FB" >> $GITHUB_ENV
       - name: Update subcomponents text
         if: ${{ env.SUBCOMPONENTS != '' }}
         uses: nipe0324/update-project-v2-item-field@v2.0.2
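Each per-label step in this workflow appends its label name to a space-separated SUBCOMPONENTS value by writing to $GITHUB_ENV; the new FB step follows the same pattern as the existing LM4 step. A rough standalone Python sketch of the accumulated result (hypothetical, not part of the repository; only the two labels visible in this hunk are listed):

    # Hypothetical illustration: mimic how the per-label steps build SUBCOMPONENTS.
    SUBCOMPONENT_LABELS = ["LM4", "FB"]

    def build_subcomponents(pr_labels):
        """Return the space-separated subcomponent string for a set of PR labels."""
        subcomponents = ""
        for label in SUBCOMPONENT_LABELS:
            if label in pr_labels:
                subcomponents = f"{subcomponents} {label}".strip()
        return subcomponents

    print(build_subcomponents(["FB", "documentation"]))  # prints "FB"
    print(build_subcomponents(["LM4", "FB"]))            # prints "LM4 FB"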
8 changes: 4 additions & 4 deletions doc/UsersGuide/source/BuildingAndRunning.rst
@@ -105,7 +105,7 @@ the data required to run the WM RTs are already available at the following ``DIS

 Within ``DISKNM``, input data for the UFS WM is located at the following locations:

-* **INPUTDATA_ROOT**: ``${DISKNM}/NEMSfv3gfs/input-data-20250507``
+* **INPUTDATA_ROOT**: ``${DISKNM}/NEMSfv3gfs/input-data-20251015``
 * **INPUTDATA_ROOT_WW3** ``${INPUTDATA_ROOT}/WW3_input_data_20250807``
 * **INPUTDATA_ROOT_BMIC**: ``${DISKNM}/NEMSfv3gfs/BM_IC-20220207``
 * **INPUTDATA_LM4**: ``${INPUTDATA_ROOT}/LM4_input_data``
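The locations in the list above are all derived from DISKNM and INPUTDATA_ROOT. A hypothetical Python sketch (not part of rt.sh) that builds the same paths and reports whether they exist, with DISKNM as a placeholder value:

    import os

    # DISKNM is a placeholder here; the real value is set per platform.
    DISKNM = "/path/to/DISKNM"

    paths = {
        "INPUTDATA_ROOT": os.path.join(DISKNM, "NEMSfv3gfs", "input-data-20251015"),
        "INPUTDATA_ROOT_BMIC": os.path.join(DISKNM, "NEMSfv3gfs", "BM_IC-20220207"),
    }
    paths["INPUTDATA_ROOT_WW3"] = os.path.join(paths["INPUTDATA_ROOT"], "WW3_input_data_20250807")
    paths["INPUTDATA_LM4"] = os.path.join(paths["INPUTDATA_ROOT"], "LM4_input_data")

    for name, path in paths.items():
        status = "found" if os.path.isdir(path) else "missing"
        print(f"{name}: {path} ({status})")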
@@ -117,10 +117,10 @@ The regression testing script (``rt.sh``) has certain default data directories (
 The corresponding data is publicly available in the data bucket. To view the data, users can visit https://noaa-ufs-regtests-pds.s3.amazonaws.com/index.html.
 Users can download the data and update the ``rt.sh`` script to point to the appropriate locations in order to run RTs on their own system:

-* ``INPUTDATA_ROOT``: https://noaa-ufs-regtests-pds.s3.amazonaws.com/index.html#input-data-20250507/
-* ``INPUTDATA_ROOT_WW3``: https://noaa-ufs-regtests-pds.s3.amazonaws.com/index.html#input-data-20250507/WW3_input_data_20250807/
+* ``INPUTDATA_ROOT``: https://noaa-ufs-regtests-pds.s3.amazonaws.com/index.html#input-data-20251015
+* ``INPUTDATA_ROOT_WW3``: https://noaa-ufs-regtests-pds.s3.amazonaws.com/index.html#input-data-20251015/WW3_input_data_20250807/
 * ``INPUTDATA_ROOT_BMIC``: https://noaa-ufs-regtests-pds.s3.amazonaws.com/index.html#BM_IC-20220207/
-* ``INPUTDATA_LM4``: https://noaa-ufs-regtests-pds.s3.amazonaws.com/index.html#input-data-20250507/LM4_input_data/
+* ``INPUTDATA_LM4``: https://noaa-ufs-regtests-pds.s3.amazonaws.com/index.html#input-data-20251015/LM4_input_data/

 To download data, users must select the files they want from the bucket and download them either in their browser, via a ``wget`` command, or through the AWS CLI.
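The index page and wget are convenient for individual files; because the bucket is public, the same data should also be retrievable with anonymous (unsigned) requests through the AWS CLI or SDK. A minimal Python sketch, assuming boto3 is installed; the commented-out download uses a placeholder key, not a real file name:

    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config

    # The bucket is public, so unsigned (anonymous) requests should be sufficient.
    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

    # List a few objects under the new input-data prefix.
    resp = s3.list_objects_v2(Bucket="noaa-ufs-regtests-pds",
                              Prefix="input-data-20251015/",
                              MaxKeys=10)
    for obj in resp.get("Contents", []):
        print(obj["Key"], obj["Size"])

    # Download a single object; the key below is a placeholder, not a real file name.
    # s3.download_file("noaa-ufs-regtests-pds", "input-data-20251015/<key>", "local_copy")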
2 changes: 1 addition & 1 deletion doc/UsersGuide/source/conf.py
@@ -93,7 +93,7 @@
 # Ignore anchor tags for links that show Not Found even when they exist.
 linkcheck_anchors_ignore = [r"L\d*",
                             r"BM_IC-20220207",
-                            r"input-data-20250507*",
+                            r"input-data-20251015*",
                             ]
 # Ignore working links that cause a linkcheck 403 error.
 linkcheck_ignore = [r'https://agupubs\.onlinelibrary\.wiley\.com/doi/10\.1029/2020MS002260',
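Sphinx treats each entry in linkcheck_anchors_ignore as a regular expression matched against the anchor part of a link, so the updated pattern should keep linkcheck from flagging the new #input-data-20251015... anchors used in BuildingAndRunning.rst. A rough standalone sketch of that matching (not Sphinx's own code):

    import re

    # Patterns from the updated linkcheck_anchors_ignore value.
    patterns = [re.compile(p) for p in (r"L\d*",
                                        r"BM_IC-20220207",
                                        r"input-data-20251015*")]

    # Anchors used by the S3 index links in BuildingAndRunning.rst.
    anchors = [
        "input-data-20251015",
        "input-data-20251015/WW3_input_data_20250807/",
        "input-data-20251015/LM4_input_data/",
        "BM_IC-20220207/",
    ]

    for anchor in anchors:
        ignored = any(p.match(anchor) for p in patterns)
        print(f"{anchor}: {'skipped by linkcheck' if ignored else 'checked'}")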