Support for Python 3.13 #6600

Status: Draft. Wants to merge 27 commits into base: main.

Commits (27):
3ff310f
add py313 support
agoscinski Jan 17, 2025
6648421
Implement `__iter__` for `DbSearchResultsIterator`
agoscinski Jan 23, 2025
81b3156
introduce circus install from testpypi my release hack
agoscinski Jan 20, 2025
87bc5ad
debug ci
agoscinski Jan 20, 2025
63a4b81
rmove debug
agoscinski Jan 24, 2025
124177b
debug ci
agoscinski Jan 24, 2025
070b6af
debug: rm breakpoint
agoscinski Jan 24, 2025
0efcfb6
move upterm debug before ssh
agoscinski Jan 24, 2025
4feb104
increase banner timeout
agoscinski Jan 24, 2025
048d3a6
run only test_ssh
agoscinski Jan 27, 2025
7d59d0e
add time timeout to 120 mins
agoscinski Jan 27, 2025
60da561
comment out all other ssh test that might interfere
agoscinski Jan 27, 2025
8859180
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jan 27, 2025
af8971d
disable test_all_plugins tests
agoscinski Jan 27, 2025
8d5325b
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jan 27, 2025
ef4d9a5
set ulimit
agoscinski Jan 27, 2025
f3c4c23
ulimit cannot be increased, checking number of open file descriptors …
agoscinski Jan 27, 2025
f8639e7
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jan 27, 2025
a282c67
raise beforehand
agoscinski Jan 27, 2025
d4f9706
try without xdist
agoscinski Jan 27, 2025
c426d3d
remove -s option and raise value error to only raise when banner error
agoscinski Jan 28, 2025
cbff24e
add open files info
agoscinski Jan 28, 2025
c09a120
log the open fds per function and per module
agoscinski Jan 28, 2025
fd80914
introduce before and after logging
agoscinski Jan 29, 2025
3d3d772
add log session, add log before
agoscinski Jan 29, 2025
d4f34d6
wip make container singleton to reuse resources
agoscinski Jan 29, 2025
17bb529
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jan 29, 2025
36 changes: 33 additions & 3 deletions .github/workflows/ci-code.yml
@@ -21,12 +21,12 @@ jobs:
tests:

runs-on: ubuntu-24.04
- timeout-minutes: 45
+ timeout-minutes: 180

strategy:
fail-fast: false
matrix:
- python-version: ['3.9', '3.12']
+ python-version: ['3.9', '3.13']
database-backend: [psql]
include:
- python-version: '3.9'
@@ -67,17 +67,43 @@ jobs:
with:
python-version: ${{ matrix.python-version }}

# uncomment for debug purposes
#- name: Setup upterm session
# env:
# AIIDA_TEST_PROFILE: test_aiida
# AIIDA_WARN_v3: 1
# uses: lhotari/action-upterm@v1

- name: Setup environment
run: .github/workflows/setup.sh

- name: ulimit
run: |
ulimit -a

- name: Run test suite
env:
AIIDA_TEST_PROFILE: test_aiida
AIIDA_WARN_v3: 1
# NOTE1: Python 3.12 has a performance regression when running with code coverage
# so run code coverage only for python 3.9.
run: |
- pytest -n auto --db-backend ${{ matrix.database-backend }} -m 'not nightly' tests/ ${{ matrix.python-version == '3.9' && '--cov aiida' || '' }}
+ pytest --db-backend ${{ matrix.database-backend }} -m 'not nightly' tests/ ${{ matrix.python-version == '3.9' && '--cov aiida' || '' }}
- name: Cat module open fds
if: ${{ always() }}
env:
AIIDA_TEST_PROFILE: test_aiida
AIIDA_WARN_v3: 1
run: |
cat module_open_fds.log

- name: Cat function open fds
if: ${{ always() }}
env:
AIIDA_TEST_PROFILE: test_aiida
AIIDA_WARN_v3: 1
run: |
cat function_open_fds.log

- name: Upload coverage report
if: matrix.python-version == 3.9 && github.repository == 'aiidateam/aiida-core'
@@ -107,6 +133,10 @@ jobs:
- name: Setup SSH on localhost
run: .github/workflows/setup_ssh.sh

# uncomment for debug purposes
#- name: Setup upterm session
# uses: lhotari/action-upterm@v1

- name: Run test suite
env:
AIIDA_WARN_v3: 0
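The workflow gains a `ulimit -a` step because the Python 3.13 runs were exhausting open file descriptors. The same limits can be inspected from within the test process using only the standard library; this is a minimal sketch, not part of the patch:

```python
import resource

# Query the soft and hard limits on open file descriptors (RLIMIT_NOFILE):
# the same numbers the `ulimit -a` workflow step reports.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f'open files: soft limit {soft}, hard limit {hard}')

# An unprivileged process may raise its soft limit up to the hard limit via
# resource.setrlimit, but not the hard limit itself, which matches the
# commit note that ulimit "cannot be increased" on the runner.
```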
2 changes: 1 addition & 1 deletion .github/workflows/test-install.yml
@@ -111,7 +111,7 @@ jobs:
fail-fast: false
matrix:

- python-version: ['3.9', '3.10', '3.11', '3.12']
+ python-version: ['3.9', '3.10', '3.11', '3.12', '3.13']

# Not being able to install with conda on a specific Python version is
# not sufficient to fail the run, but something we want to be aware of.
10 changes: 9 additions & 1 deletion pyproject.toml
@@ -21,7 +21,7 @@ dependencies = [
'alembic~=1.2',
'archive-path~=0.4.2',
"asyncssh~=2.19.0",
- 'circus~=0.18.0',
+ 'circus~=0.19.0',
'click-spinner~=0.1.8',
'click~=8.1',
'disk-objectstore~=1.2',
@@ -521,3 +521,11 @@ commands = molecule {posargs:test}
# .github/actions/install-aiida-core/action.yml
# .readthedocs.yml
required-version = ">=0.5.21"

[[tool.uv.index]]
explicit = true
name = "testpypi"
url = "https://test.pypi.org/simple/"

[tool.uv.sources]
circus = {index = "testpypi"}
58 changes: 53 additions & 5 deletions src/aiida/storage/psql_dos/backend.py
@@ -182,20 +182,66 @@ def get_session(self) -> Session:
return self._session_factory()

def close(self) -> None:
if self._session_factory is None:
return # the instance is already closed, and so this is a no-op
# close the connection
import os

import psutil

# does not create a ref
from sqlalchemy.orm.session import _sessions

# this seems to make the gc active to delete sessions; unclear why
list(_sessions.values())

def list_open_fds():
process = psutil.Process(os.getpid())
return process.open_files()

breakpoint()
# sessions = [sess for sess in _sessions.values()]
# if self._session_factory is None:
# return # the instance is already closed, and so this is a no-op
## close the connection

## somehow this closes it but it should be equivalent to
# engines = []
# for sess in sessions:
# engines.append(sess.bind)
# for sess in sessions:
# #sess.flush()
# #sess.expunge_all()
# sess.close()

# somehow it is important to close the session before the engine is first time disposed
engine = self._session_factory.bind
if engine is not None:
engine.dispose() # type: ignore[union-attr]
from sqlalchemy.orm.session import close_all_sessions

close_all_sessions()

self._session_factory.expunge_all()
# gc.collect()
# self._session_factory.session_factory.close_all()
# gc.collect()
# self._session_factory.remove()
# gc.collect()
self._session_factory.close()
# gc.collect()
# self._session_factory.close_all()
# gc.collect()
self._session_factory = None

# for engine in engines:
# engine.dispose()
if engine is not None:
engine.dispose() # type: ignore[union-attr]

# Without this, sqlalchemy keeps a weakref to a session
# in sqlalchemy.orm.session._sessions
gc.collect()
# from aiida.manage import get_manager
# get_manager().reset_profile_storage()
# with self.migrator_context(self._profile) as migrator:
# migrator.get_container().close()
breakpoint()

def _clear(self) -> None:
from aiida.storage.psql_dos.models.settings import DbSetting
@@ -277,6 +323,7 @@ def transaction(self) -> Iterator[Session]:
with session.begin_nested() as savepoint:
yield session
savepoint.commit()
session.close()

@property
def in_transaction(self) -> bool:
@@ -367,6 +414,7 @@ def delete(self, delete_database_user: bool = False) -> None:

if repository.exists():
shutil.rmtree(repository)
breakpoint()
LOGGER.report(f'Deleted repository at `{repository}`.')

if postgres.db_exists(config['database_name']):
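The reworked `close()` found that the ordering matters: all tracked sessions must be closed before the engine is first disposed, and a garbage collection pass is needed so the weak references in `sqlalchemy.orm.session._sessions` are cleared. A minimal self-contained sketch of that teardown ordering against an in-memory SQLite engine (an illustration of the technique, not aiida-core's actual storage backend):

```python
import gc

from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm.session import close_all_sessions

# Open an in-memory SQLite engine and one ORM session.
engine = create_engine('sqlite://')
session = sessionmaker(bind=engine)()
session.execute(text('SELECT 1'))

# Close every tracked session *before* disposing the engine, mirroring the
# ordering the patch found necessary, then drop our reference and collect
# garbage so sqlalchemy's weak-reference session registry empties out.
close_all_sessions()
engine.dispose()
del session
gc.collect()
```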
10 changes: 9 additions & 1 deletion src/aiida/storage/psql_dos/migrator.py
@@ -51,6 +51,8 @@

REPOSITORY_UUID_KEY = 'repository|uuid'

_CONTAINERS = {}


class PsqlDosMigrator:
"""Class for validating and migrating `psql_dos` storage instances.
@@ -179,7 +181,12 @@ def get_container(self) -> 'Container':

from .backend import get_filepath_container

- return Container(get_filepath_container(self.profile))
+ # TODO: needs updating for each profile
+ breakpoint()
+ global _CONTAINERS
+ if _CONTAINERS.get(self.profile, None) is None:
+     _CONTAINERS[self.profile] = Container(get_filepath_container(self.profile))
+ return _CONTAINERS[self.profile]

def get_repository_uuid(self) -> str:
"""Return the UUID of the repository.
@@ -206,6 +213,7 @@ def initialise(self, reset: bool = False) -> bool:
tests having run.
:returns: ``True`` if the storage was initialised by the function call, ``False`` if it was already initialised.
"""
breakpoint()
if reset:
self.reset_repository()
self.reset_database()
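The patch caches one container per profile in a module-level dict so that repeated `get_container()` calls reuse the same object (and its open file handles) instead of creating a new one each time. A stdlib-only sketch of this memoization pattern, with a stand-in `Container` class and a hypothetical path; the real code uses `disk_objectstore.Container`:

```python
_CONTAINERS = {}


class Container:
    """Stand-in for disk_objectstore.Container; counts instantiations."""

    instances = 0

    def __init__(self, path):
        self.path = path
        Container.instances += 1


def get_container(profile: str) -> Container:
    # Create the container only on the first request for this profile;
    # later calls return the cached instance instead of opening new
    # file handles for the same storage path.
    if profile not in _CONTAINERS:
        _CONTAINERS[profile] = Container(f'/tmp/{profile}/container')
    return _CONTAINERS[profile]


a = get_container('test_aiida')
b = get_container('test_aiida')
assert a is b and Container.instances == 1
```

The trade-off, as the TODO in the diff notes, is that the cache must be keyed (and invalidated) per profile, and the same logic currently lives in both the migrator and the sqlite_dos backend.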
3 changes: 3 additions & 0 deletions src/aiida/storage/psql_dos/utils.py
@@ -50,8 +50,11 @@ def create_sqlalchemy_engine(config: PsqlConfig):
port=config['database_port'],
name=config['database_name'],
)
from sqlalchemy.pool import NullPool

return create_engine(
engine_url,
poolclass=NullPool,
json_serializer=json.dumps,
json_deserializer=json.loads,
**config.get('engine_kwargs', {}),
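Switching the engine to `NullPool` makes SQLAlchemy open a DBAPI connection on demand and close it as soon as it is released, instead of keeping checked-in connections (and their file descriptors) alive in a pool. A minimal sketch of the effect, using an in-memory SQLite engine rather than the PostgreSQL configuration the patched function builds:

```python
from sqlalchemy import create_engine, text
from sqlalchemy.pool import NullPool

# With NullPool, every connection is opened on demand and closed on
# release, so no pooled descriptors linger between uses.
engine = create_engine('sqlite://', poolclass=NullPool)

with engine.connect() as connection:
    value = connection.execute(text('SELECT 1')).scalar()

engine.dispose()
```

The cost is a new connection handshake per use, which is why pooling is the default; here the leak of open descriptors across many test profiles appears to outweigh that.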
9 changes: 7 additions & 2 deletions src/aiida/storage/sqlite_dos/backend.py
@@ -54,6 +54,7 @@
ALEMBIC_REL_PATH = 'migrations'

REPOSITORY_UUID_KEY = 'repository|uuid'
_CONTAINERS = {}


class SqliteDosMigrator(PsqlDosMigrator):
@@ -84,8 +85,12 @@ def get_container(self) -> Container:

:returns: The disk-object store container configured for the repository path of the current profile.
"""
- filepath_container = Path(self.profile.storage_config['filepath']) / FILENAME_CONTAINER
- return Container(str(filepath_container))
+ # TODO: this logic is duplicated in the migrator
+ global _CONTAINERS
+ if _CONTAINERS.get(self.profile, None) is None:
+     filepath_container = Path(self.profile.storage_config['filepath']) / FILENAME_CONTAINER
+     _CONTAINERS[self.profile] = Container(str(filepath_container))
+ return _CONTAINERS[self.profile]

def initialise_database(self) -> None:
"""Initialise the database.
6 changes: 5 additions & 1 deletion src/aiida/storage/sqlite_zip/utils.py
@@ -81,7 +81,11 @@ def register_json_contains(dbapi_connection, _):

def create_sqla_engine(path: Union[str, Path], *, enforce_foreign_keys: bool = True, **kwargs) -> Engine:
"""Create a new engine instance."""
- engine = create_engine(f'sqlite:///{path}', json_serializer=json.dumps, json_deserializer=json.loads, **kwargs)
+ from sqlalchemy.pool import NullPool
+
+ engine = create_engine(
+     f'sqlite:///{path}', poolclass=NullPool, json_serializer=json.dumps, json_deserializer=json.loads, **kwargs
+ )
event.listen(engine, 'connect', sqlite_case_sensitive_like)
if enforce_foreign_keys:
event.listen(engine, 'connect', sqlite_enforce_foreign_keys)
3 changes: 3 additions & 0 deletions src/aiida/tools/dbimporters/baseclasses.py
@@ -96,6 +96,9 @@ def __init__(self, results, increment=1):
self._position = 0
self._increment = increment

def __iter__(self):
return self

def __next__(self):
"""Return the next entry in the iterator."""
pos = self._position
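A class that defines `__next__` but not `__iter__` satisfies only half of the iterator protocol: `for` loops and `iter()` reject it. Returning `self` from `__iter__` completes the protocol. A self-contained sketch with the same shape as `DbSearchResultsIterator` (the `results`/`increment` names mirror the diff; the class itself is a simplified stand-in):

```python
class ResultsIterator:
    """Iterate over a list of results with a configurable step."""

    def __init__(self, results, increment=1):
        self._results = results
        self._position = 0
        self._increment = increment

    def __iter__(self):
        # Returning self makes the object usable in for-loops and iter();
        # without this method, list(...) below raises TypeError.
        return self

    def __next__(self):
        pos = self._position
        if pos >= len(self._results):
            raise StopIteration
        self._position += self._increment
        return self._results[pos]


items = list(ResultsIterator(['a', 'b', 'c']))
```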
29 changes: 28 additions & 1 deletion src/aiida/tools/pytest_fixtures/configuration.py
@@ -161,6 +161,7 @@ def reset_storage():
pass

manager.get_profile_storage()._clear()
breakpoint()
manager.reset_profile()

User(email=profile.default_user_email or email).store()
@@ -208,8 +209,34 @@ def aiida_profile_clean(aiida_profile):

:returns :class:`~aiida.manage.configuration.profile.Profile`: The loaded temporary profile.
"""
import os

import psutil

def list_open_fds():
process = psutil.Process(os.getpid())
return process.open_files()

# import tracemalloc
# tracemalloc.start()
## ... start your application ...

# snapshot1 = tracemalloc.take_snapshot()
## ... call the function leaking memory ...
# for _ in range(100):
# aiida_profile.reset_storage()
breakpoint()
aiida_profile.reset_storage()
yield aiida_profile
breakpoint()
# yield aiida_profile

# snapshot2 = tracemalloc.take_snapshot()

# top_stats = snapshot2.compare_to(snapshot1, 'lineno')

# print("[ Top 10 differences ]")
# for stat in top_stats[:10]:
# print(stat)


@pytest.fixture(scope='class')
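The commented-out scaffolding in the fixture uses `tracemalloc` to compare heap snapshots taken before and after the suspected leak. A working stdlib sketch of that technique, with a deliberately leaky function standing in for `reset_storage()`:

```python
import tracemalloc


def leaky(store, n):
    # Simulated leak: allocations survive the call via the shared list.
    store.extend(bytearray(1024) for _ in range(n))


store = []
tracemalloc.start()
snapshot1 = tracemalloc.take_snapshot()

for _ in range(100):
    leaky(store, 10)

snapshot2 = tracemalloc.take_snapshot()
top_stats = snapshot2.compare_to(snapshot1, 'lineno')

# The biggest positive size_diff points at the allocation inside leaky().
for stat in top_stats[:10]:
    print(stat)
tracemalloc.stop()
```

Note that `tracemalloc` only tracks Python heap allocations; leaked file descriptors (the other symptom being chased in this PR) need `psutil.Process().open_files()` or a look at `/proc/self/fd` instead.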
1 change: 1 addition & 0 deletions src/aiida/transports/plugins/ssh.py
@@ -496,6 +496,7 @@ def open(self):
connection_arguments['sock'] = self._proxy

try:
connection_arguments['banner_timeout'] = 200
self._client.connect(self._machine, **connection_arguments)
except Exception as exc:
self.logger.error(
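The one-line fix injects `banner_timeout=200` into the keyword arguments passed to paramiko's `SSHClient.connect`. Paramiko raises "Error reading SSH protocol banner" when the server does not present its banner within the default 15 seconds, which a heavily loaded CI runner can exceed. A sketch of the argument plumbing only; no connection is made and the values are hypothetical:

```python
# Keyword arguments to be splatted into paramiko's SSHClient.connect(...).
connection_arguments = {'username': 'runner', 'port': 22}

# banner_timeout is the number of seconds connect() waits for the server's
# SSH banner before giving up; 200 s is generous enough for slow runners.
connection_arguments['banner_timeout'] = 200

# client = paramiko.SSHClient()
# client.connect('localhost', **connection_arguments)
```

Hard-coding the value inside the `try` block works for debugging, but exposing it as a transport option would be the more durable fix.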