Merged

20 commits
1c14c21
Fix: use platform-specific separator for AIIDA_PATH in config directory detection (#6935)
lainme Jul 21, 2025
5c70887
CLI: Fix output of `verdi node show` in dump README files (#6971)
GeigerJ2 Aug 13, 2025
aaf1c1b
Don't break on `sqlite_zip` profile deletion despite original aiida archive file missing (#6929)
GeigerJ2 Aug 22, 2025
a0473a3
Fix OperationalError for `add_nodes` with PSQL backend (#6991)
GeigerJ2 Sep 22, 2025
1db1601
CLI: Drop non-unique `-p` option for `dump` endpoints (#7043)
GeigerJ2 Oct 6, 2025
14df4e5
CLI: Drop non-unique `-n` option for archive import (#7044)
GeigerJ2 Oct 6, 2025
67e7693
🐛 Fix PSQL OpErr on archive creation by batching (#6993)
GeigerJ2 Oct 15, 2025
2b9cc86
Additional logging during `dump` operations (#7046)
GeigerJ2 Oct 31, 2025
191e470
Fix wrongly formatted `verdi code show` output (#7073)
GeigerJ2 Nov 4, 2025
3a4aee8
Adding ./ in front of a portable code binary (#7080)
giovannipizzi Nov 4, 2025
7927ca1
Fix: `verdi code list` for codes without computer (#7081)
giovannipizzi Nov 4, 2025
fcd34b1
CI: Fix install-with-conda job (#7103)
danielhollas Nov 21, 2025
e4a1b09
async_ssh: Use async semaphore instead of manual locking (#7018)
danielhollas Nov 21, 2025
35a18ac
Fix QB `IN` clause to avoid parameter limits (#6998)
GeigerJ2 Nov 24, 2025
fd1ed83
Fix race condition in `JobsList` (#7061)
khsrali Dec 2, 2025
f0132dd
Fix stuck progress bar on "Add nodes" of archive import (#7118)
GeigerJ2 Dec 4, 2025
edae491
Fix `UnboundLocalError` in `ZipfileBackendRepository` (#7129)
t-reents Dec 5, 2025
19364c1
Fix RST code snippets in QB `smarter_in` docstring (#7146)
GeigerJ2 Dec 10, 2025
4005693
CI fixes
GeigerJ2 Dec 10, 2025
2befa6b
Release v2.7.2
GeigerJ2 Dec 10, 2025
8 changes: 8 additions & 0 deletions .github/workflows/benchmark-config.json
@@ -43,6 +43,14 @@
"xAxis": "id",
"backgroundFill": false,
"yAxisFormat": "logarithmic"
},
"large-archive": {
"header": "large-archive",
"description": "Comparison of import/export of large archives.",
"single_chart": true,
"xAxis": "id",
"backgroundFill": false,
"yAxisFormat": "logarithmic"
}
}
}
36 changes: 27 additions & 9 deletions .github/workflows/test-install.yml
@@ -131,15 +131,6 @@ jobs:
with:
channels: conda-forge

# Use mamba because conda is running out of memory
# see https://github.com/conda-incubator/setup-miniconda/issues/274
- run: |
conda install -n base conda-libmamba-solver
conda config --set solver libmamba

# Temporary workaround: https://github.com/mamba-org/mamba/issues/488
- run: rm /usr/share/miniconda/pkgs/cache/*.json

- name: Test installation
id: test_installation
continue-on-error: ${{ matrix.optional }}
@@ -153,6 +144,18 @@
echo "::warning ::Failed conda installation for
Python ${{ matrix.python-version }}."

- name: Slack notification
if: >-
failure() &&
github.event_name != 'pull_request' &&
env.SLACK_WEBHOOK != null
uses: ./.github/actions/slack-notification
env:
SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
with:
title: Installation via Conda failed on `aiida-core/main`
message: '`test-install.yml:install-with-conda` GHA workflow for Python ${{ matrix.python-version }} failed.'

tests:

needs: [install-with-pip]
@@ -210,3 +213,18 @@ jobs:
AIIDA_WARN_v3: 1
run: |
pytest -n auto --db-backend psql -m 'not nightly' tests/

- name: Slack notification
# Run this step if any of the previous steps fail.
# Don't run on PRs, the failure is clearly visible in GitHub UI.
# Run only when the `secrets.SLACK_WEBHOOK` is available, which is not the case for forks.
if: >-
failure() &&
github.event_name != 'pull_request' &&
env.SLACK_WEBHOOK != null
uses: ./.github/actions/slack-notification
env:
SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
with:
title: Install tests of `aiida-core/main` failed
message: '`test-install.yml` GHA workflow for Python ${{ matrix.python-version }} failed.'
37 changes: 37 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,42 @@
# Changelog

## v2.7.2 - 2025-12-10

### Fixes

#### CLI
- CLI: Fix `verdi code list` for codes without computer (#7081) [[27b52da2f]](https://github.com/aiidateam/aiida-core/commit/27b52da2f6dc948af0fab8e71d7089e532548640)
- CLI: Fix wrongly formatted `verdi code show` output (#7073) [[32742c0e0]](https://github.com/aiidateam/aiida-core/commit/32742c0e0b60b53b54a42a5dd3b2b04d3449d2de)
- CLI: Additional logging during `dump` operations (#7046) [[fc00f5dec]](https://github.com/aiidateam/aiida-core/commit/fc00f5deccff2cb030a45f622d5c2cd49f3b0ff6)
- CLI: Drop non-unique `-n` option for archive import (#7044) [[3a7d440e9]](https://github.com/aiidateam/aiida-core/commit/3a7d440e9086d5127b4705034e8c3bda8c323a60)
- CLI: Drop non-unique `-p` option for `dump` endpoints (#7043) [[019172c2d]](https://github.com/aiidateam/aiida-core/commit/019172c2d96c6b2761e865e61ba9a09d6cbb6750)
- CLI: Fix output of `verdi node show` in dump README files (#6971) [[5e4da5b4d]](https://github.com/aiidateam/aiida-core/commit/5e4da5b4dfd00b6ae270e020b8446887b1754230)

#### Storage
- Fix RST code snippets in QB `smarter_in` docstring (#7146) [[cc0bb483d]](https://github.com/aiidateam/aiida-core/commit/cc0bb483d125c81e893ca085161d275015e1d87f)
- Fix QB `IN` clause to avoid parameter limits (#6998) [[8d562b44e]](https://github.com/aiidateam/aiida-core/commit/8d562b44ee37bc8ef9840f1806c48702a70b638e)
- Fix PSQL OperationalError on archive creation by batching (#6993) [[cfbbd687f]](https://github.com/aiidateam/aiida-core/commit/cfbbd687f4e4416a880dce186f9e198fd20c9164)
- Fix OperationalError for `add_nodes` with PSQL backend (#6991) [[9bccdc816]](https://github.com/aiidateam/aiida-core/commit/9bccdc816e859823f8347c540c6e914b43432d43)

#### Archive
- Fix `UnboundLocalError` in `ZipfileBackendRepository` (#7129) [[166d06c25]](https://github.com/aiidateam/aiida-core/commit/166d06c2532309aa500a949b757773abb470e5ca)
- Fix stuck progress bar on "Add nodes" of archive import (#7118) [[4e54cd476]](https://github.com/aiidateam/aiida-core/commit/4e54cd476dd2f089bcb501e9b74ca4b1e8d4f988)
- Don't break on `sqlite_zip` profile deletion despite original aiida archive file missing (#6929) [[274ce6717]](https://github.com/aiidateam/aiida-core/commit/274ce6717d3e7e184e5e058a4bfeab9cd808604a)

#### Engine
- Adding ./ in front of a portable code binary (#7080) [[9256f2fdd]](https://github.com/aiidateam/aiida-core/commit/9256f2fddd71abecf1e649065ef7dca5c67d061c)
- Fix race condition in `JobsList` (#7061) [[e79f0a44c]](https://github.com/aiidateam/aiida-core/commit/e79f0a44c4c816323c24510dfa3462ecd5a9f5ae)

#### Transport
- async_ssh: Use async semaphore instead of manual locking (#7018) [[ad5cafdb1]](https://github.com/aiidateam/aiida-core/commit/ad5cafdb10ff3889e53c3a7473cadc566b3ecb78)

#### Configuration
- Fix: use platform-specific separator for AIIDA_PATH in config directory detection (#6935) [[32d515a6b]](https://github.com/aiidateam/aiida-core/commit/32d515a6bd0fef4ae4abe99ed5a288ac438b7096)

### Devops
- CI: Fix install-with-conda job (#7103) [[0dcd10ac4]](https://github.com/aiidateam/aiida-core/commit/0dcd10ac4a83d65338aeefdf7a158f7a60b15fd2)


## v2.7.1 - 2025-07-16

### Fixes
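Reviewer note: the `IN`-clause fix (#6998) in the v2.7.2 changelog above is easiest to picture as chunking the bound parameters. A minimal sketch of the idea using the public `QueryBuilder` API; the helper name and chunk size are illustrative, not the actual internals:

```python
# Sketch: avoid oversized IN clauses by querying PKs in chunks and merging.
from aiida import orm  # requires a loaded AiiDA profile

def query_pks_in_chunks(pks, chunk_size=900):
    """Return the subset of ``pks`` that exist, queried in chunks (hypothetical helper)."""
    pks = list(pks)
    found = []
    for start in range(0, len(pks), chunk_size):
        chunk = pks[start : start + chunk_size]
        builder = orm.QueryBuilder()
        builder.append(orm.Node, filters={'id': {'in': chunk}}, project=['id'])
        found.extend(builder.all(flat=True))
    return found
```

Keeping each chunk well under typical backend parameter limits is what avoids the `OperationalError` the changelog entries refer to.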
7 changes: 7 additions & 0 deletions docs/source/nitpick-exceptions
@@ -119,8 +119,12 @@ py:obj aiida.engine.processes.functions.N
py:obj aiida.engine.processes.functions.R_co
py:class P
py:class N
py:class R
py:class T
py:class aiida.engine.processes.functions.N
py:class aiida.engine.processes.functions.R_co
py:class aiida.common.utils.T
py:class aiida.common.utils.R

### third-party packages
# Note: These exceptions are needed if
@@ -256,9 +260,12 @@ py:class importlib_metadata.EntryPoints
py:class Command

py:class BooleanClauseList
py:class ColumnElement
py:class SQLCompiler
py:class sqlalchemy.orm.decl_api.Model
py:class sqlalchemy.sql.elements.ColumnElement
py:class sqlalchemy.orm.attributes.InstrumentedAttribute
py:class InstrumentedAttribute

py:class packaging.version.Version
py:exc seekpath.hpkot.EdgeCaseWarning
2 changes: 1 addition & 1 deletion src/aiida/__init__.py
@@ -27,7 +27,7 @@
'For further information please visit http://www.aiida.net/. All rights reserved.'
)
__license__ = 'MIT license, see LICENSE.txt file.'
__version__ = '2.7.1'
__version__ = '2.7.2'
__authors__ = 'The AiiDA team.'
__paper__ = (
'S. P. Huber et al., "AiiDA 1.0, a scalable computational infrastructure for automated reproducible workflows and '
8 changes: 6 additions & 2 deletions src/aiida/cmdline/commands/cmd_archive.py
@@ -24,6 +24,7 @@
from aiida.common.exceptions import CorruptStorage, IncompatibleStorageSchema, UnreachableStorage
from aiida.common.links import GraphTraversalRules
from aiida.common.log import AIIDA_LOGGER
from aiida.common.utils import DEFAULT_BATCH_SIZE

EXTRAS_MODE_EXISTING = ['keep_existing', 'update_existing', 'mirror', 'none']
EXTRAS_MODE_NEW = ['import', 'none']
@@ -128,7 +129,11 @@ def inspect(ctx, archive, version, meta_data, database):
)
@click.option('--compress', default=6, show_default=True, type=int, help='Level of compression to use (0-9).')
@click.option(
'-b', '--batch-size', default=1000, type=int, help='Stream database rows in batches, to reduce memory usage.'
'-b',
'--batch-size',
default=DEFAULT_BATCH_SIZE,
type=int,
help='Stream database rows in batches, to reduce memory usage.',
)
@click.option(
'--test-run',
@@ -321,7 +326,6 @@ class ExtrasImportCode(Enum):
'mirror: import all extras and remove any existing extras that are not present in the archive. ',
)
@click.option(
'-n',
'--extras-mode-new',
type=click.Choice(EXTRAS_MODE_NEW),
default='import',
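Reviewer note: the `--batch-size` option above bounds memory by streaming database rows in fixed-size batches. A generic sketch of that pattern, not the actual archive-writer code path; the batch-size value is an assumption here, the real one comes from `aiida.common.utils.DEFAULT_BATCH_SIZE` as imported in the diff:

```python
# Sketch: materialize at most one batch of rows at a time.
from itertools import islice

DEFAULT_BATCH_SIZE = 1000  # assumed value for illustration only

def stream_in_batches(rows, batch_size=DEFAULT_BATCH_SIZE):
    """Yield lists of at most ``batch_size`` items from any iterable of DB rows."""
    iterator = iter(rows)
    while batch := list(islice(iterator, batch_size)):
        yield batch

# Usage: each iteration sees one bounded batch instead of the full result set.
for batch in stream_in_batches(range(10_500)):
    pass  # stand-in for writing the batch to the archive
```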
105 changes: 98 additions & 7 deletions src/aiida/cmdline/commands/cmd_code.py
@@ -219,16 +219,48 @@ def code_duplicate(ctx, code, non_interactive, **kwargs):
def show(code):
"""Display detailed information for a code."""
from aiida.cmdline import is_verbose
from aiida.common.pydantic import get_metadata

table = []

# These are excluded from the CLI, so we add them manually
table.append(['PK', code.pk])
table.append(['UUID', code.uuid])
table.append(['Type', code.entry_point.name])
for key in code.Model.model_fields.keys():
try:
table.append([key.capitalize().replace('_', ' '), getattr(code, key)])
except AttributeError:

for field_name, field_info in code.Model.model_fields.items():
# Skip fields excluded from CLI
if get_metadata(
field_info,
key='exclude_from_cli',
default=False,
):
continue

# Skip fields that are not stored in the attributes column
# NOTE: this also catches e.g., filepath_files for PortableCode, which is actually a "misuse"
# of the is_attribute metadata flag, as there it is flagging that the field is not stored at all!
# TODO (edan-bainglass) consider improving this by introducing a new metadata flag or reworking PortableCode
# TODO see also Dict and InstalledCode for other potential misuses of is_attribute
if not get_metadata(
field_info,
key='is_attribute',
default=True,
):
continue

value = getattr(code, field_name)

# Special handling for computer field to show additional info
if field_name == 'computer':
value = f'{value.label} ({value.hostname}), pk: {value.pk}'

# Use the field's title as display name.
# This allows for custom titles (class-cased by default from Pydantic).
display_name = field_info.title

table.append([display_name, value])

if is_verbose():
table.append(['Calculations', len(code.base.links.get_outgoing().all())])

@@ -359,7 +391,7 @@ def relabel(code, label):
)
@options.ALL(help='Include hidden codes.')
@options.ALL_USERS(help='Include codes from all users.')
@options.PROJECT(type=click.Choice(VALID_PROJECTIONS.keys()), default=['full_label', 'pk', 'entry_point'])
@options.PROJECT(type=click.Choice(list(VALID_PROJECTIONS.keys())), default=['full_label', 'pk', 'entry_point'])
@options.RAW()
@click.option('-o', '--show-owner', 'show_owner', is_flag=True, default=False, help='Show owners of codes.')
@with_dbenv()
@@ -398,19 +430,40 @@ def code_list(computer, default_calc_job_plugin, all_entries, all_users, raw, sh

query = orm.QueryBuilder()
query.append(orm.Code, tag='code', project=projections.get('code', None), filters=filters.get('code', None))
# Above, a join on the computer is appended to project its label. However, this implicitly requires a computer
# to be set. If no computer is set for a code node (i.e. for PortableCode), the code would be excluded from the
# results. Therefore, later a second query will be run to get all codes without a computer.
query.append(
orm.Computer,
tag='computer',
with_node='code',
project=projections.get('computer', None),
filters=filters.get('computer', None),
filters=filters.get('computer', None), # Possibly filter on computers, if required on the command line
)
query.append(
orm.User, tag='user', with_node='code', project=projections.get('user', None), filters=filters.get('user', None)
)
query.order_by({'code': {'id': 'asc'}})
tot_num_results = query.count()

if computer is None:
# Run a second query to get codes without a computer.
# This needs to be done only if no explicit filter on a computer is set.
query_nocomp = orm.QueryBuilder()
code_filters = {'dbcomputer_id': {'==': None}} # Filter all those without a computer
code_filters.update(filters.get('code', {}))
query_nocomp.append(orm.Code, tag='code', project=projections.get('code', None), filters=code_filters)
query_nocomp.append(
orm.User,
tag='user',
with_node='code',
project=projections.get('user', None),
filters=filters.get('user', None),
)
query_nocomp.order_by({'code': {'id': 'asc'}})
tot_num_results += query_nocomp.count()

if not query.count():
if tot_num_results == 0:
echo.echo_report('No codes found matching the specified criteria.')
return

@@ -429,6 +482,44 @@
row.append('@'.join(str(result[entity][projection]) for entity, projection in VALID_PROJECTIONS[key]))
table.append(row)

# If there is no computer filter, add also the results from the second query
if computer is None:
valid_projections_no_computer = {}
for k, v in VALID_PROJECTIONS.items():
if k != 'computer':
valid_projections_no_computer[k] = [_ for _ in v if _[0] != 'computer']

for result in query_nocomp.iterdict():
row = []
for key in project:
if key == 'entry_point':
node_type = result['code']['node_type']
entry_point = load_node_class(node_type).entry_point
row.append(entry_point.name)
else:
row.append(
'@'.join(
str(result[entity][projection]) for entity, projection in valid_projections_no_computer[key]
)
)
table.append(row)

# The `table` list now contains all results, both from the query with computer and the one without.
# However, the order might not be correct anymore. Therefore, sort the table based on ascending `full_label`.
# If 'full_label' projection is not requested, fall back on PK, and if that is also not requested,
# do not sort.
try:
index_full_label = headers.index('Full label') # Note new capitalization after string replacement earlier
table.sort(key=lambda x: x[index_full_label])
except ValueError:
# No 'full_label' in projections: fall back on PK
try:
index_pk = headers.index('Pk') # Note capitalization after string replacement earlier
table.sort(key=lambda x: x[index_pk])
except ValueError:
# No 'full_label' nor 'pk' in projections: do not sort
pass

echo_tabulate(table, headers=headers, tablefmt=table_format)

if not raw:
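Reviewer note: distilled, the new field selection in `verdi code show` is a two-flag filter over the pydantic model fields. A simplified restatement using the same `get_metadata` helper the diff imports (the display-name fallback mirrors the old behaviour):

```python
from aiida.common.pydantic import get_metadata

def visible_code_fields(model_cls):
    """Yield ``(field_name, display_name)`` pairs that ``verdi code show`` would print."""
    for name, info in model_cls.model_fields.items():
        # Skip fields explicitly excluded from the CLI.
        if get_metadata(info, key='exclude_from_cli', default=False):
            continue
        # Skip fields not stored in the attributes column (see the TODOs above).
        if not get_metadata(info, key='is_attribute', default=True):
            continue
        yield name, info.title or name.capitalize().replace('_', ' ')
```

For example, `list(visible_code_fields(orm.InstalledCode.Model))` would enumerate the rows the command renders before the verbose extras.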
1 change: 0 additions & 1 deletion src/aiida/cmdline/params/options/main.py
@@ -766,7 +766,6 @@ def set_log_level(ctx, _param, value):
)

PATH = OverridableOption(
'-p',
'--path',
type=click.Path(path_type=pathlib.Path),
show_default=False,
2 changes: 1 addition & 1 deletion src/aiida/common/pydantic.py
@@ -75,7 +75,7 @@ class Model(BaseModel):
:param model_to_orm: Optional callable to convert the value of a field from a model instance to an ORM instance.
:param exclude_to_orm: When set to ``True``, this field value will not be passed to the ORM entity constructor
through ``Entity.from_model``.
:param exclude_to_orm: When set to ``True``, this field value will not be exposed on the CLI command that is
:param exclude_from_cli: When set to ``True``, this field value will not be exposed on the CLI command that is
dynamically generated to create a new instance.
:param is_attribute: Whether the field is stored as an attribute.
:param is_subscriptable: Whether the field can be indexed like a list or dictionary.
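Reviewer note: this docstring fix matters because the flag name is exactly what consumers look up, as in the `verdi code show` change above. A hedged round-trip sketch, assuming `MetadataField` accepts the keywords documented here; the model and its fields are hypothetical:

```python
from pydantic import BaseModel

from aiida.common.pydantic import MetadataField, get_metadata

class ExampleModel(BaseModel):
    label: str = MetadataField('', description='Shown on the generated CLI.')
    token: str = MetadataField('', description='Hidden.', exclude_from_cli=True)  # hypothetical field

# Reading the flag back uses the corrected name documented above.
info = ExampleModel.model_fields['token']
assert get_metadata(info, key='exclude_from_cli', default=False)
```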