Friday 21 April 2017

Trying to insert data into BigQuery fails from a Container Engine pod

I have a simple Node.js application that tries to insert some data into BigQuery. It uses Google's gcloud Node.js client library.

The BigQuery client is created like this, according to the documentation:

// Assumption: these two requires match the libraries used below.
const google = require('googleapis');
const BigQuery = require('@google-cloud/bigquery');

google.auth.getApplicationDefault(function(err, authClient) {
  if (err) {
    return cb(err); // cb: callback of the surrounding function
  }
  // Hand the Application Default Credentials client to BigQuery.
  let bq = BigQuery({
    auth: authClient,
    projectId: "my-project"
  });
  let dataset = bq.dataset("my-dataset");
  let table = dataset.table("my-table");
  // ...the insert shown below happens here, inside this callback
});

With that client I try to insert data into BigQuery:

table.insert(someRows).then(...)

This fails, because the BigQuery client returns a 403 telling me that the authentication is missing the required scopes. The documentation tells me to use the following snippet:

// Wrap the default credential in a scoped one where required.
if (authClient.createScopedRequired &&
    authClient.createScopedRequired()) {
  authClient = authClient.createScoped([
    "https://www.googleapis.com/auth/bigquery",
    "https://www.googleapis.com/auth/bigquery.insertdata",
    "https://www.googleapis.com/auth/cloud-platform"
  ]);
}

This didn't work either: the condition is never true, so the scoped client is never created. I dropped the check and called createScoped() unconditionally, but the 403 remains.
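For reference, here is roughly how I combined the two snippets, with the scopes applied unconditionally (a sketch using the same placeholder project, dataset, and table names as above):

const google = require('googleapis');
const BigQuery = require('@google-cloud/bigquery');

google.auth.getApplicationDefault(function(err, authClient) {
  if (err) {
    return cb(err);
  }
  // Always wrap the credential in the BigQuery scopes.
  authClient = authClient.createScoped([
    "https://www.googleapis.com/auth/bigquery",
    "https://www.googleapis.com/auth/bigquery.insertdata",
    "https://www.googleapis.com/auth/cloud-platform"
  ]);
  let table = BigQuery({ auth: authClient, projectId: "my-project" })
    .dataset("my-dataset")
    .table("my-table");
  table.insert(someRows)
    .then(() => console.log("insert succeeded"))
    .catch(cb); // still fails with the 403 described above
});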

What am I missing here? Why are the scopes always wrong regardless of the authClient configuration? Has anybody found a way to get this or a similar gcloud client library (like Datastore) working with the described authentication scheme on a Container Engine pod?

The only working solution I found so far is to create a JSON keyfile and provide that to the BigQuery client, but I'd rather create the credentials on the fly than keep them next to the code.
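For completeness, the keyfile variant that does work looks roughly like this (the keyfile path is just a hypothetical example; on Container Engine it could be mounted from a Kubernetes Secret):

const BigQuery = require('@google-cloud/bigquery');

// keyFilename points at a service account JSON key; the path below is
// an assumed Secret mount, not something from my actual setup.
const bq = BigQuery({
  projectId: "my-project",
  keyFilename: "/var/secrets/google/keyfile.json"
});
bq.dataset("my-dataset")
  .table("my-table")
  .insert(someRows)
  .then(() => console.log("insert succeeded"));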

Side note: the Node service works flawlessly without providing the auth option to BigQuery when running on a Compute Engine VM, because there Google negotiates the authentication automatically.
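On a Compute Engine VM the whole thing reduces to a sketch like this (no auth option at all; the library resolves Application Default Credentials from the metadata server on its own):

const BigQuery = require('@google-cloud/bigquery');

// No auth option: credentials come from the VM's metadata server.
const bq = BigQuery({ projectId: "my-project" });
bq.dataset("my-dataset")
  .table("my-table")
  .insert(someRows)
  .then(() => console.log("insert succeeded"));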



via Holger Adam
