Monday 10 April 2017

How do I configure a BigQuery query job to allow large results in Node.js?

I am trying to let my Node app build and run queries whose results are then saved to a table in my BQ project.

All of this works just fine, until my results exceed the size threshold. Then I get the usual 'Results too large to return' error.

I have read the Node.js API documentation here, and attempted to set the destination and allowLargeResults options in the job configuration, but they seem to be ignored.

How can I use Node.js to run my query and write the results to a specified table that allows large results?

Below is the function I am using.

const fs = require('fs');
const BigQuery = require('@google-cloud/bigquery');
const bigquery = BigQuery();

function getData(file, outfile, email, callback){
    // Write the CSV header line for the output file.
    fs.writeFile('public/downloads/' + outfile, 'email_sha256' + '\n', function(err){
        if (err) console.log(err);
    });
    // Drop the 4-character file extension to get the table name.
    const tableName = outfile.substring(0, outfile.length - 4);
    console.log('getData function started');
    const sql = 'SELECT email_sha256 FROM temp.{table} cid JOIN etl.customer email ON cid.customer_id = email.customer_id GROUP BY email_sha256';
    const sql2 = sql.replace('{table}', tableName);
    console.log(tableName);
    const options = {
        destination: 'nf_hashed',   // this appears to be ignored
        query: sql2,
        timeoutMs: 10000000,
        useLegacySql: false,
        defaultDataset: 'temp'
    };
    console.log('Starting Query');
    bigquery.query(options);
}
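For reference, here is a sketch of the job-based approach I have been trying, based on my reading of the docs (untested, and the method name is my guess; I believe it is createQueryJob in newer client releases and startQuery in older ones). One thing I suspect matters: destination may need to be a Table object rather than a string. The temp dataset and nf_hashed table names are the ones from my code above.

```javascript
// Pure helper: substitute the table name into the query template.
function buildSql(tableName) {
    const template = 'SELECT email_sha256 FROM temp.{table} cid ' +
        'JOIN etl.customer email ON cid.customer_id = email.customer_id ' +
        'GROUP BY email_sha256';
    return template.replace('{table}', tableName);
}

// Sketch only: assumes the @google-cloud/bigquery package and default
// application credentials. createQueryJob was named startQuery in older
// client versions -- adjust to whichever your installed version exposes.
async function runLargeQuery(tableName) {
    const BigQuery = require('@google-cloud/bigquery');
    const bigquery = BigQuery();

    // destination must be a Table object, not a plain string.
    const destination = bigquery.dataset('temp').table('nf_hashed');

    const [job] = await bigquery.createQueryJob({
        query: buildSql(tableName),
        destination: destination,
        allowLargeResults: true, // only meaningful with legacy SQL
        useLegacySql: false      // standard SQL has no result-size cap when a destination is set
    });

    // Wait for the job to finish and fetch the rows.
    const [rows] = await job.getQueryResults();
    return rows;
}
```

If this is roughly the right shape, I would expect the results to land in temp.nf_hashed regardless of size, but I have not been able to confirm that.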

Let me know if there's anything I can do to make the question clearer.

Thanks for any help.



via nFrain
