SNOW-1858341: limit size of data responses #220
Comments
+1
Hi, thank you for submitting this issue. We'll take a look at how this could be handled.

In the meantime, as a possible workaround, a setting could help mitigate the issue. You are probably already aware of and using something similar as a workaround, but still leaving it here in case anyone new stumbles onto this issue.
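As an illustration of such an application-side mitigation, here is a minimal sketch. Note that `collectRowsWithCap` and the byte budget are hypothetical names and values, not part of snowflake-sdk; only `streamResult`/`streamRows` in the commented wiring are documented driver features.

```javascript
// Hypothetical helper: accumulates rows while tracking an approximate
// in-memory size, and throws once a configurable byte budget is exceeded.
function collectRowsWithCap(maxBytes) {
  let usedBytes = 0;
  const kept = [];
  return {
    push(row) {
      // Rough size estimate: byte length of the JSON encoding of the row.
      const rowBytes = Buffer.byteLength(JSON.stringify(row), 'utf8');
      if (usedBytes + rowBytes > maxBytes) {
        throw new Error(
          'Result set exceeds the configured memory budget of ' + maxBytes + ' bytes'
        );
      }
      usedBytes += rowBytes;
      kept.push(row);
    },
    rows() { return kept; },
    usedBytes() { return usedBytes; }
  };
}

// Wiring sketch using the driver's streamResult/streamRows API
// (connection would come from snowflake.createConnection):
//
// connection.execute({
//   sqlText: 'SELECT * FROM LargeDataSet',
//   streamResult: true,
//   complete: function (err, stmt) {
//     const guard = collectRowsWithCap(1024 * 1024); // 1 MB budget (arbitrary)
//     stmt.streamRows()
//       .on('data', function (row) { guard.push(row); })
//       .on('error', function (e) { console.error(e.message); })
//       .on('end', function () { console.log(guard.rows().length + ' rows kept'); });
//   }
// });
```

This only caps what the application retains; the driver still parses each row, so it mitigates rather than replaces the requested built-in limit.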
Seeking help @mx2323 @ghenkhaus @sfc-gh-dszmolka @sfc-gh-jfan and others.

```javascript
// Load the Snowflake Node.js driver.
var snowflake = require('snowflake-sdk');

// Create a Connection object that we can use later to connect.
var connection = snowflake.createConnection({
  account: "MY_SF_ACCOUNT",
  database: "MY_DB",
  schema: "MY_SCHEMA",
  warehouse: "MY_WH",
  username: "MY_USER",
  password: "MY_PWD"
});

// Try to connect to Snowflake, and check whether the connection was successful.
connection.connect(
  function (err, conn) {
    if (err) {
      console.error('Unable to connect: ' + err.message);
    } else {
      console.log('Successfully connected to Snowflake.');
      // Optional: store the connection ID.
      var connection_ID = conn.getId();
    }
  }
);

var statement = connection.execute({
  sqlText: "SELECT * FROM LargeDataSet LIMIT 100",
  // sqlText: "SELECT * FROM LargeDataSet", -- fails with "Request to S3/Blob failed"
  complete: function (err, stmt, rows) {
    if (err) {
      console.error('Failed to execute statement due to the following error: ' + err.message);
    } else {
      console.log('Successfully executed statement: ' + stmt.getSqlText());
    }
  }
});
```

We are observing this while upgrading the driver. Is there a resolution for fetching large data sets?
Hi @bhaskarbanerjee, the issue you're seeing is not related to the original one, which is a feature/improvement request for something that doesn't exist yet. Let's keep this issue for what it was originally intended for: tracking the original improvement request.

Since small result sets work for you and only bigger ones have problems fetching, I would suspect that the host you're running snowflake-sdk on cannot reach the Snowflake internal stage (= S3 bucket) on which large query results are temporarily stored. To fix this, I recommend running the SnowCD connectivity diagnostic tool. If you confirmed nothing blocks the connectivity to the stage and it still doesn't work, kindly open a new issue here or open a Snowflake Support case and we can help further.
Thanks @sfc-gh-dszmolka, let me try that. But if it is a server-side problem, then why do versions v1.6.* through v1.8.0 work like a charm for large data sets of 6-7 MB?
Verified. We have 2 VPCs listed there and both are set to type = 'STAGE'. @sfc-gh-dszmolka
Ran the snowcd tool and
Currently with the Node.js snowflake-sdk, if you run a
SELECT * FROM LARGE_TABLE
the Node.js Snowflake client can crash the entire process if even one row in the result set is too large.

We would like to be able to set an option that limits the amount of data held in memory for each request; the SDK could either throw an exception or gracefully return some subset of the received data.

Without this, we are UNABLE to prevent any user from overloading the calling Node process and causing OOM errors.

Note: this is still an issue even with streaming rows, because a single row may still be too large.
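The single-large-row caveat above can be sketched with a small per-row guard. This is a hypothetical helper, not SDK API: even when rows arrive one at a time through a stream, each row must be fully materialized by the driver before the application sees it, so a per-row size check is the only application-side defense, and it limits retention rather than parsing.

```javascript
// Hypothetical per-row guard: rejects any individual row whose serialized
// size exceeds a limit -- the failure mode streaming alone cannot prevent.
function assertRowWithinLimit(row, maxRowBytes) {
  const rowBytes = Buffer.byteLength(JSON.stringify(row), 'utf8');
  if (rowBytes > maxRowBytes) {
    throw new Error('Row of ' + rowBytes + ' bytes exceeds limit of ' + maxRowBytes);
  }
  return row;
}
```

A built-in option as requested in this issue could apply the same check inside the driver, before the row is handed to user code.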