SNOW-719759: Is there a subset of requirements, including Python Pandas, that would fit as a Lambda layer in AWS (<= 50 MB)? #1384
Comments
@sfc-gh-aling You may have more information.
@ArashMehraban We are actively working on unlocking this functionality, with a current ETA of June 2023. Unfortunately, there is no workaround that fits at the moment.
@sfc-gh-achandrasekaran Thanks for the reply! By "this functionality", do you mean a list of Snowflake requirements that, together with pandas and numpy, comes in under 50 MB zipped, so it can be used in an AWS Lambda layer?
We cannot create a subset of the requirements for the connector. That being said, we are actively working on reducing the size of the Python connector overall to support scenarios like yours. Lambda has an unzipped file limit of 250 MB (as opposed to the 50 MB limit for zipped files). Does that work for you in the meantime?
Reducing the size of the connector by June will work for me! Meanwhile, I have found ways to reduce the size of numpy/pandas as well, by deleting the docs and other extras that ship with those packages.
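For anyone trying the same trimming approach, here is a minimal sketch of that kind of cleanup, assuming the layer contents live in a `python/` build directory and that none of the pruned paths are imported at runtime (both assumptions, so verify against your own dependencies):

```python
# Hypothetical cleanup sketch: prune tests, docs, and bytecode from a
# Lambda layer build directory to shrink the zipped size. The directory
# names below are assumptions; confirm nothing you import lives there.
import shutil
from pathlib import Path

LAYER_DIR = Path("python")  # hypothetical layer build directory
PRUNE_DIRS = {"tests", "test", "docs", "__pycache__"}  # assumed safe to drop

# Walk deepest paths first so directories are removed before their parents.
for path in sorted(LAYER_DIR.rglob("*"), reverse=True):
    if path.is_dir() and path.name in PRUNE_DIRS:
        shutil.rmtree(path, ignore_errors=True)
    elif path.suffix in {".pyc", ".pyo"}:
        path.unlink(missing_ok=True)
```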
Hi all, we have released a new preview version of the connector with a reduced size, built on nanoarrow, which you can read about in this blog post: https://medium.com/snowflake/supercharging-the-snowflake-python-connector-with-nanoarrow-8388cb57eeba Do let us know your feedback. Note that this is still in preview, so we don't recommend using it in production. Thanks!
Hi all, we're thrilled to announce that snowflake-connector-python 3.5.0 is released, which removes the hard dependency on pyarrow and reduces the package size: https://pypi.org/project/snowflake-connector-python/3.5.0/. Please give it a try!
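For reference, a minimal Lambda handler sketch against the public snowflake-connector-python API, just to smoke-test a layer once it is deployed; reading credentials from environment variables is an illustrative assumption, not a recommendation:

```python
# Minimal smoke test of the connector from inside a Lambda handler.
# Credential sourcing via environment variables is an assumption here.
import os

import snowflake.connector


def handler(event, context):
    with snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT CURRENT_VERSION()")
            return {"version": cur.fetchone()[0]}
```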
Hey guys, I'm experiencing this as well. Obviously it's been a while since 3.5.0; I'm now using the latest, 3.12.3. I'm using the AWS-recommended method for creating a binary package to deploy as a layer: pip install. I can see that the dependency on pyarrow is gone, but the total uncompressed size is still far too large. Any recommendations?
Is there a subset of these requirements: https://github.com/snowflakedb/snowflake-connector-python/blob/main/tested_requirements/requirements_38.reqs that could include Python Pandas (and NumPy, as a prerequisite for Pandas) for which the build would be less than 50 MB zipped, so it could be used as a layer in AWS? The requirements listed above build a 33 MB zipped folder, and Pandas/NumPy add another 25 MB. Together they exceed the 50 MB allowed for a Lambda layer in AWS.
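A quick way to check a built layer against both limits (50 MB zipped, 250 MB unzipped) is sketched below with the standard library; the `python/` build directory and output file name are assumptions for illustration:

```python
# Zip a layer build directory and report its size against the AWS
# Lambda limits: 50 MB for the uploaded zip, 250 MB unzipped.
import zipfile
from pathlib import Path

LAYER_DIR = Path("python")   # hypothetical layer build directory
ARCHIVE = Path("layer.zip")  # hypothetical output archive

with zipfile.ZipFile(ARCHIVE, "w", zipfile.ZIP_DEFLATED) as zf:
    for f in LAYER_DIR.rglob("*"):
        if f.is_file():
            # Keep the "python/..." prefix AWS expects inside the layer zip.
            zf.write(f, f.relative_to(LAYER_DIR.parent))

unzipped = sum(f.stat().st_size for f in LAYER_DIR.rglob("*") if f.is_file())
print(f"zipped:   {ARCHIVE.stat().st_size / 2**20:.1f} MB (limit 50 MB)")
print(f"unzipped: {unzipped / 2**20:.1f} MB (limit 250 MB)")
```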