Flex "upload" function so users can define local system instead of S3 #382

Open · alanwilter opened this issue Dec 3, 2021 · 4 comments

@alanwilter (Collaborator)

In order to allow users to run a customised version of PGB locally, we need to address the way patient files are uploaded.

Currently, everything goes to our S3 bucket:

```
aws s3 ls --profile phenopolis s3://phenopolis-website-uploads
    PRE PH00000001/
    PRE PH00008258/
    PRE PH00008268/
    PRE PH00008628/
    PRE PH00008629/
    PRE vcf/
```

The upload itself is essentially done in React (frontend) via Uppy, with some support endpoints in views/upload.py (but these only manage files; they do not perform the upload itself).
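For context, the file management those endpoints perform boils down to ordinary S3 API calls. Below is a minimal boto3 sketch (illustrative only, not the actual views/upload.py code) that mirrors the aws s3 ls listing above; the profile name is simply reused from the CLI example.

```python
# Illustrative sketch, not the real views/upload.py: list the top-level
# prefixes of the upload bucket, mirroring `aws s3 ls` above.
import boto3

session = boto3.Session(profile_name="phenopolis")  # profile reused from the CLI example
s3 = session.client("s3")

resp = s3.list_objects_v2(Bucket="phenopolis-website-uploads", Delimiter="/")
for prefix in resp.get("CommonPrefixes", []):
    print("PRE", prefix["Prefix"])
```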

Uppy seems to be very powerful and offers several upload options, including "local device".

We need, somehow, to support this "local device" option. However, whether it should be configured via the frontend or in, e.g., the public.env file is still an open question.

@alanwilter added the enhancement, help wanted, Frontend and Python API labels on Dec 3, 2021
@IsmailM (Member) commented Dec 12, 2021

I would argue that the easiest option is to run a local S3-compatible storage system (e.g. Minio); then everything would remain compatible with minimal changes.

With docker-compose it should be super simple to get Minio running and mount a local hard-drive path into it.

See https://docs.min.io/docs/deploy-minio-on-docker-compose.html - they provide an exemplar docker-compose file.
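If that route is taken, the Python side would mostly amount to pointing the existing boto3 client at the local endpoint. A rough sketch, assuming a Minio container reachable on localhost:9000 with its default credentials (all values below are placeholders, not project configuration):

```python
# Sketch only: a boto3 client pointed at a local Minio instance instead of AWS S3.
# Endpoint, credentials and region are illustrative defaults, not the real config.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",  # local Minio, e.g. started via docker-compose
    aws_access_key_id="minioadmin",        # Minio default root user (placeholder)
    aws_secret_access_key="minioadmin",
    region_name="eu-west-2",
)
s3.create_bucket(Bucket="phenopolis-website-uploads")
print(s3.list_buckets()["Buckets"])
```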

@alanwilter self-assigned this on Jan 11, 2022
@alanwilter (Collaborator, Author)

I'll work on it.

@alanwilter (Collaborator, Author)

See #386

@alanwilter (Collaborator, Author)

@IsmailM

I tried reworking upload.py to get rid of the two clients (s3_client1 and s3_client2) and use a single one with endpoint_url="http://minio-server:9000", but I couldn't get it to work.

What happens is that the Uppy (the JS module) actions for presign (upload) and for download request localhost (delete and listing work fine against minio-server).

If I try, e.g., to download a file, the browser is redirected to e.g.

http://minio-server:9000/phenopolis-website-uploads/PH00008258/PH00008258_mytest2.vcf?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=minio%2F20220212%2Feu-west-2%2Fs3%2Faws4_request&X-Amz-Date=20220212T103411Z&X-Amz-Expires=300&X-Amz-SignedHeaders=host&X-Amz-Signature=91e96208259aa285dcb63a03d5882b9f2a09efdb16db2573ef61d5cb9e0c0f4d

and that won't work from the host, since the minio-server hostname only resolves inside the Docker network.

Uploading fails too, as I said. I initially thought it was a CORS issue, but I don't think it is, and your suggested addition to nginx.conf didn't work anyway.

So I'm sticking with the two-s3_clients approach, with some simplification. It works with Wasabi as well.
Please see my last commit in #386.
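For reference, the two-client split mentioned above roughly looks like this (names, URLs and credentials are illustrative; the real code is in #386): one client talks to Minio over the internal Docker network for server-side operations, while the other signs URLs against the host-facing address so the links handed to the browser actually resolve.

```python
# Rough sketch of the two-s3_clients approach (illustrative values, not the code in #386).
import boto3

# Server-side operations (listing, deleting): uses the internal Docker hostname,
# which only resolves inside the compose network.
s3_internal = boto3.client(
    "s3",
    endpoint_url="http://minio-server:9000",
    aws_access_key_id="minio",             # placeholder credentials
    aws_secret_access_key="minio-secret",
)

# Presigning only: uses the host-facing address, so that the signed URLs
# (which embed the endpoint host) are reachable from the user's browser.
s3_presign = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",
    aws_access_key_id="minio",
    aws_secret_access_key="minio-secret",
)

url = s3_presign.generate_presigned_url(
    "get_object",
    Params={"Bucket": "phenopolis-website-uploads", "Key": "PH00008258/PH00008258_mytest2.vcf"},
    ExpiresIn=300,
)
```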
