✨ Add exporter code to storage #7218

Open · wants to merge 97 commits into base: master

Changes from 60 commits

Commits (97)
a630375
added new accepted folder path
Feb 12, 2025
1503482
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 12, 2025
ebe1b37
initial implementation
Feb 12, 2025
27831e9
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 12, 2025
d33f717
removed
Feb 12, 2025
12440f7
refactor
Feb 14, 2025
74611a9
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 14, 2025
edbc905
removed uneccessary
Feb 14, 2025
a7d1dbc
refactor
Feb 14, 2025
e884e54
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 14, 2025
a1b8822
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 14, 2025
650350e
remove duplicate import
Feb 14, 2025
fcf8d34
remove unused
Feb 14, 2025
a4d6b01
update comments
Feb 14, 2025
442c700
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 17, 2025
58f0b85
refactor
Feb 17, 2025
d8f37a9
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 17, 2025
a9feeb5
repalce with archiver
Feb 17, 2025
88fb792
rename
Feb 17, 2025
7b66adf
extended tests
Feb 17, 2025
81810c3
added progress_callbacks
Feb 17, 2025
72b9e2b
ensure progress
Feb 17, 2025
3eafcc1
remove else
Feb 17, 2025
3b0a073
added base task to use
Feb 18, 2025
7359f6d
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 19, 2025
1dd6de2
extended base messages
Feb 19, 2025
b3ee27a
added user_notifications base
Feb 19, 2025
fc59ec1
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Feb 19, 2025
dbf3f29
updated openapispecs
Feb 19, 2025
411f9ee
add progress sending via rabbitmq
Feb 19, 2025
412bb0f
added cleanup fixtures
Feb 19, 2025
981cde9
mypy
Feb 19, 2025
e2b7e6f
fixed test
Feb 19, 2025
ded5d00
fixed missing field
Feb 19, 2025
f6a77cb
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 19, 2025
7ced8e7
fixed
Mar 19, 2025
e5859f6
fixed export task
Mar 19, 2025
ed60ba7
fixed task
Mar 19, 2025
1c0d5ab
connected export job
Mar 19, 2025
f3ca256
purged unused
Mar 19, 2025
1cd7bbd
removed
Mar 20, 2025
949cb09
refactor
Mar 20, 2025
0c46c25
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 20, 2025
335b550
pylint
Mar 20, 2025
53b257a
refactor progress
Mar 20, 2025
3a2b858
rename
Mar 20, 2025
c13ffb1
refactor
Mar 20, 2025
4abd21a
fixed test
Mar 20, 2025
85fb767
fixed tests
Mar 20, 2025
16dd061
rename
Mar 20, 2025
39fe1a7
fixed test
Mar 20, 2025
87ba884
refactor
Mar 20, 2025
a796af3
refactor
Mar 20, 2025
5db89cb
minor
Mar 20, 2025
65f6817
moved modules
Mar 20, 2025
1e7aa1c
revert change
Mar 20, 2025
9031e8d
refactored tests
Mar 20, 2025
f49232a
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 20, 2025
3fdf056
rephrase message
Mar 20, 2025
2766d20
updated spec
Mar 20, 2025
2f72676
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 21, 2025
1572713
using shared value
Mar 21, 2025
3d7373e
refactor messages
Mar 21, 2025
7d239dc
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 24, 2025
7622e15
enforce UUID pattern on exports
Mar 24, 2025
1413afb
refactor
Mar 24, 2025
c17b21e
removed exception
Mar 24, 2025
3fde9ad
rename
Mar 24, 2025
899b4fa
renamed
Mar 24, 2025
596c3ac
limit permissions
Mar 24, 2025
0cbc148
rename
Mar 24, 2025
02208c5
feedback
Mar 24, 2025
e313bc6
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 25, 2025
7af7f46
added note
Mar 25, 2025
5de596d
added note
Mar 25, 2025
b091c3e
using progress bar
Mar 25, 2025
92c6090
using existing fixture
Mar 25, 2025
bb38d8b
fixed regex
Mar 25, 2025
8badfd2
fixed tests
Mar 25, 2025
552de63
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 25, 2025
afa7858
fixeed specs
Mar 25, 2025
938c6ae
Merge remote-tracking branch 'upstream/master' into pr-osparc-s3-zip-…
Mar 28, 2025
c6f48f1
fixed
Mar 28, 2025
1aa630e
fixed test
Mar 28, 2025
9f021e3
revert interfaces
Mar 28, 2025
5632add
revert changes
Mar 28, 2025
37c05ba
refactor
Mar 28, 2025
d3437c4
refactor
Mar 28, 2025
585ba54
restructured imports
Mar 28, 2025
4bae784
refactir
Mar 28, 2025
9f2d99f
fixed tests
Mar 28, 2025
8cbfeb1
not required
Mar 28, 2025
3094f87
limit to simcore_s3 only
Mar 28, 2025
b5c68e0
interface cleanup
Mar 28, 2025
9a589c0
aligned interface
Mar 28, 2025
4185157
fixed imports
Mar 28, 2025
d91899e
updated interface and tests
Mar 28, 2025
5 changes: 4 additions & 1 deletion packages/models-library/src/models_library/basic_regex.py
@@ -4,6 +4,7 @@

SEE tests_basic_regex.py for examples
"""

# TODO: for every pattern we should have a formatter function
# NOTE: some sites to manualy check ideas
# https://regex101.com/
@@ -45,7 +46,9 @@
)

# Storage basic file ID
SIMCORE_S3_FILE_ID_RE = rf"^(api|({UUID_RE_BASE}))\/({UUID_RE_BASE})\/(.+)$"
SIMCORE_S3_FILE_ID_RE = (
rf"^(api\/{UUID_RE_BASE}|exports\/\d+|{UUID_RE_BASE}\/{UUID_RE_BASE})\/(.+)$"
)
SIMCORE_S3_DIRECTORY_ID_RE = rf"^({UUID_RE_BASE})\/({UUID_RE_BASE})\/(.+)\/$"

# S3 - AWS bucket names [https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucketnamingrules.html]
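As a quick reference (not part of the PR), a minimal standalone sketch of what the updated `SIMCORE_S3_FILE_ID_RE` accepts; the UUID alternation is copied from the expanded pattern in the `services/storage/openapi.json` hunks below, and the sample keys mirror the new parametrized test in `test_project_nodes_io.py`:

```python
import re

# Assumed to match models_library's UUID_RE_BASE (taken from the expanded
# pattern in openapi.json within this PR).
UUID_RE_BASE = (
    r"[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}"
    r"-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}"
)
SIMCORE_S3_FILE_ID_RE = (
    rf"^(api\/{UUID_RE_BASE}|exports\/\d+|{UUID_RE_BASE}\/{UUID_RE_BASE})\/(.+)$"
)

_uuid = "00000000-0000-0000-0000-000000000000"
for key in (
    f"api/{_uuid}/some-file.png",           # api-originated file
    "exports/42/archive.zip",               # new: per-user export objects
    f"{_uuid}/{_uuid}/outputs/result.csv",  # project/node file
    f"api/{_uuid}",                         # rejected: no object name after the prefix
):
    print(key, "accepted" if re.match(SIMCORE_S3_FILE_ID_RE, key) else "rejected")
```

The notable addition is the `exports\/\d+` branch, which lets export archives live under a numeric (user-id) prefix rather than a project/node UUID pair.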
24 changes: 12 additions & 12 deletions packages/models-library/src/models_library/rabbitmq_messages.py
@@ -95,9 +95,9 @@ class ProgressType(StrAutoEnum):


class ProgressMessageMixin(RabbitMessageBase):
channel_name: Literal[
channel_name: Literal["simcore.services.progress.v2"] = (
"simcore.services.progress.v2"
] = "simcore.services.progress.v2"
)
progress_type: ProgressType = (
ProgressType.COMPUTATION_RUNNING
) # NOTE: backwards compatible
@@ -118,9 +118,9 @@ def routing_key(self) -> str | None:


class InstrumentationRabbitMessage(RabbitMessageBase, NodeMessageBase):
channel_name: Literal[
channel_name: Literal["simcore.services.instrumentation"] = (
"simcore.services.instrumentation"
] = "simcore.services.instrumentation"
)
metrics: str
service_uuid: NodeID
service_type: str
@@ -210,9 +210,9 @@ def routing_key(self) -> str | None:


class RabbitResourceTrackingStartedMessage(RabbitResourceTrackingBaseMessage):
message_type: Literal[
message_type: Literal[RabbitResourceTrackingMessageType.TRACKING_STARTED] = (
RabbitResourceTrackingMessageType.TRACKING_STARTED
] = RabbitResourceTrackingMessageType.TRACKING_STARTED
)

wallet_id: WalletID | None
wallet_name: str | None
@@ -250,9 +250,9 @@ class RabbitResourceTrackingStartedMessage(RabbitResourceTrackingBaseMessage):


class RabbitResourceTrackingHeartbeatMessage(RabbitResourceTrackingBaseMessage):
message_type: Literal[
message_type: Literal[RabbitResourceTrackingMessageType.TRACKING_HEARTBEAT] = (
RabbitResourceTrackingMessageType.TRACKING_HEARTBEAT
] = RabbitResourceTrackingMessageType.TRACKING_HEARTBEAT
)


class SimcorePlatformStatus(StrAutoEnum):
@@ -261,9 +261,9 @@ class SimcorePlatformStatus(StrAutoEnum):


class RabbitResourceTrackingStoppedMessage(RabbitResourceTrackingBaseMessage):
message_type: Literal[
message_type: Literal[RabbitResourceTrackingMessageType.TRACKING_STOPPED] = (
RabbitResourceTrackingMessageType.TRACKING_STOPPED
] = RabbitResourceTrackingMessageType.TRACKING_STOPPED
)

simcore_platform_status: SimcorePlatformStatus = Field(
...,
@@ -297,9 +297,9 @@ class CreditsLimit(IntEnum):


class WalletCreditsLimitReachedMessage(RabbitMessageBase):
channel_name: Literal[
channel_name: Literal["io.simcore.service.wallets-credit-limit-reached"] = (
"io.simcore.service.wallets-credit-limit-reached"
] = "io.simcore.service.wallets-credit-limit-reached"
)
created_at: datetime.datetime = Field(
default_factory=lambda: arrow.utcnow().datetime,
description="message creation datetime",
20 changes: 20 additions & 0 deletions packages/models-library/tests/test_project_nodes_io.py
@@ -11,7 +11,9 @@
DatCoreFileLink,
SimCoreFileLink,
SimcoreS3DirectoryID,
SimcoreS3FileID,
)
from models_library.users import UserID
from pydantic import TypeAdapter, ValidationError


@@ -180,3 +182,21 @@ def test_simcore_s3_directory_get_parent():
SimcoreS3DirectoryID._get_parent( # noqa SLF001
"/hello/object/", parent_index=4
)


USER_ID_0: UserID = 0


@pytest.mark.parametrize(
"object_key",
[
f"api/{UUID_0}/some-random-file.png",
f"exports/{USER_ID_0}/some-random-file.png",
f"{UUID_0}/{UUID_0}/some-random-file.png",
f"api/{UUID_0}/some-path/some-random-file.png",
f"exports/{USER_ID_0}/some-path/some-random-file.png",
f"{UUID_0}/{UUID_0}/some-path/some-random-file.png",
],
)
def test_simcore_s3_file_id_accepted_patterns(object_key: str):
assert str(TypeAdapter(SimcoreS3FileID).validate_python(object_key)) == object_key
6 changes: 2 additions & 4 deletions packages/models-library/tests/test_rabbit_messages.py
@@ -1,5 +1,3 @@
from typing import Union

import pytest
from faker import Faker
from models_library.progress_bar import ProgressReport
@@ -41,6 +39,6 @@
)
async def test_raw_message_parsing(raw_data: str, class_type: type):
result = TypeAdapter(
Union[ProgressRabbitMessageNode, ProgressRabbitMessageProject]
ProgressRabbitMessageNode | ProgressRabbitMessageProject
).validate_json(raw_data)
assert type(result) == class_type
assert type(result) is class_type
8 changes: 1 addition & 7 deletions packages/notifications-library/tests/email/conftest.py
@@ -13,13 +13,7 @@ def app_environment(
env_devel_dict: EnvVarsDict,
external_envfile_dict: EnvVarsDict,
) -> EnvVarsDict:
return setenvs_from_dict(
monkeypatch,
{
**env_devel_dict,
**external_envfile_dict,
},
)
return setenvs_from_dict(monkeypatch, {**env_devel_dict, **external_envfile_dict})


@pytest.fixture
Changed file (path not shown):
@@ -1,8 +1,7 @@
from typing import Final

from pydantic import ByteSize, NonNegativeFloat, NonNegativeInt
from pydantic import ByteSize, NonNegativeInt

_UNIT_MULTIPLIER: Final[NonNegativeFloat] = 1024.0
TQDM_FILE_OPTIONS: Final[dict] = {
"unit": "byte",
"unit_scale": True,
20 changes: 10 additions & 10 deletions services/storage/openapi.json
@@ -332,7 +332,7 @@
"anyOf": [
{
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
},
{
"type": "string",
@@ -430,7 +430,7 @@
"anyOf": [
{
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
},
{
"type": "string",
@@ -509,7 +509,7 @@
"anyOf": [
{
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
},
{
"type": "string",
@@ -643,7 +643,7 @@
"anyOf": [
{
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
},
{
"type": "string",
@@ -707,7 +707,7 @@
"anyOf": [
{
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
},
{
"type": "string",
@@ -771,7 +771,7 @@
"anyOf": [
{
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
},
{
"type": "string",
@@ -852,7 +852,7 @@
"anyOf": [
{
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
},
{
"type": "string",
@@ -923,7 +923,7 @@
"anyOf": [
{
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
},
{
"type": "string",
@@ -2078,7 +2078,7 @@
"anyOf": [
{
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$"
},
{
"type": "string",
@@ -2629,7 +2629,7 @@
"properties": {
"link_id": {
"type": "string",
"pattern": "^(api|([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}))\\/([0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$",
"pattern": "^(api\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}|exports\\/\\d+|[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12}\\/[0-9a-fA-F]{8}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{4}-?[0-9a-fA-F]{12})\\/(.+)$",
"title": "Link Id"
}
},
1 change: 1 addition & 0 deletions services/storage/requirements/_test.in
@@ -29,4 +29,5 @@ python-dotenv
respx
simcore-service-storage-sdk @ git+https://github.com/ITISFoundation/osparc-simcore.git@cfdf4f86d844ebb362f4f39e9c6571d561b72897#subdirectory=services/storage/client-sdk/python # to test backwards compatibility against deprecated client-sdk (used still in old versions of simcore-sdk)
sqlalchemy[mypy] # adds Mypy / Pep-484 Support for ORM Mappings SEE https://docs.sqlalchemy.org/en/20/orm/extensions/mypy.html
types_tqdm
types-aiofiles
5 changes: 5 additions & 0 deletions services/storage/requirements/_test.txt
@@ -402,6 +402,10 @@ termcolor==2.5.0
# via pytest-sugar
types-aiofiles==24.1.0.20241221
# via -r requirements/_test.in
types-requests==2.32.0.20241016
# via types-tqdm
types-tqdm==4.67.0.20241221
# via -r requirements/_test.in
typing-extensions==4.12.2
# via
# -c requirements/_base.txt
@@ -429,6 +433,7 @@ urllib3==2.3.0
# requests
# responses
# simcore-service-storage-sdk
# types-requests
vine==5.1.0
# via
# -c requirements/_base.txt
New file (path not shown):
@@ -0,0 +1,40 @@
import logging
from typing import cast

from celery import Task # type: ignore[import-untyped]
from models_library.progress_bar import ProgressReport
from models_library.projects_nodes_io import StorageFileID
from models_library.users import UserID

from ...dsm import get_dsm_provider
from ...modules.celery.utils import get_celery_worker_client, get_fastapi_app
from ...simcore_s3_dsm import SimcoreS3DataManager
from ._progress_utils import get_tqdm_progress, set_tqdm_absolute_progress

_logger = logging.getLogger(__name__)


async def data_export(
task: Task,
*,
user_id: UserID,
paths_to_export: list[StorageFileID],
) -> StorageFileID:
_logger.info("Exporting (for user='%s') selection: %s", user_id, paths_to_export)

dsm = cast(
SimcoreS3DataManager,
get_dsm_provider(get_fastapi_app(task.app)).get(
SimcoreS3DataManager.get_location_id()
),
)

with get_tqdm_progress(total=1, description=f"{task.name}") as pbar:

async def _progress_cb(report: ProgressReport) -> None:
set_tqdm_absolute_progress(pbar, report)
await get_celery_worker_client(task.app).set_task_progress(task, report)

return await dsm.create_s3_export(
user_id, paths_to_export, progress_cb=_progress_cb
)
New file (path not shown):
@@ -0,0 +1,24 @@
from typing import Final

from models_library.progress_bar import ProgressReport
from pydantic import NonNegativeFloat
from tqdm import tqdm

TQDM_EXPORT_OPTIONS: Final[dict] = {
"unit": "",
"unit_scale": True,
"unit_divisor": 1024,
"colour": "yellow",
"miniters": 1,
"ncols": 100,
}


def get_tqdm_progress(total: NonNegativeFloat, *, description: str) -> tqdm:
return tqdm(**TQDM_EXPORT_OPTIONS, total=total, desc=description)


def set_tqdm_absolute_progress(pbar: tqdm, report: ProgressReport) -> None:
"""used when the progress does not come in chunk by chunk but as the total current value"""
pbar.n = report.actual_value
pbar.refresh()
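For context, a minimal standalone sketch (plain `tqdm`, not the helper above) of the absolute-progress pattern that `set_tqdm_absolute_progress` implements: the bar position is overwritten with the report's current value instead of being advanced by deltas.

```python
from tqdm import tqdm

# Each progress report carries the current total of completed work
# (ProgressReport.actual_value in the helper above), so the bar position is
# set directly and redrawn instead of calling pbar.update(delta).
with tqdm(total=3, desc="export-demo", unit="", ncols=100) as pbar:
    for current_value in (1.0, 2.5, 3.0):
        pbar.n = current_value
        pbar.refresh()
```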
Changed file (path not shown):
@@ -5,7 +5,7 @@

from ...modules.celery._celery_types import register_celery_types
from ...modules.celery._task import define_task
from ...modules.celery.tasks import export_data
from ._data_export import data_export
from ._paths import compute_path_size

_logger = logging.getLogger(__name__)
@@ -18,5 +18,5 @@ def setup_worker_tasks(app: Celery) -> None:
logging.INFO,
msg="Storage setup Worker Tasks",
):
define_task(app, export_data)
define_task(app, data_export)
define_task(app, compute_path_size)
Changed file (path not shown):
@@ -18,6 +18,7 @@
from ...modules.celery import get_celery_client
from ...modules.datcore_adapter.datcore_adapter_exceptions import DatcoreAdapterError
from ...simcore_s3_dsm import SimcoreS3DataManager
from .._worker_tasks._data_export import data_export

router = RPCRouter()

@@ -54,9 +55,10 @@ async def start_data_export(

try:
task_uuid = await get_celery_client(app).send_task(
"export_data",
data_export.__name__,
task_context=job_id_data.model_dump(),
files=data_export_start.file_and_folder_ids, # ANE: adapt here your signature
user_id=job_id_data.user_id,
paths_to_export=data_export_start.file_and_folder_ids,
)
except CeleryError as exc:
raise JobSchedulerError(exc=f"{exc}") from exc