jsonEncode to BigQuery

I'm having some difficulty pushing data to BigQuery and am after some assistance (I'm sure it's a basic fix).

  • I'm sending three things to a BigQuery data table, namely SerNo, timestamp, and a JSON string.
  • I am using the jsonEncode node to encode the payload to {{data.body2}}.
  • I then insert the result into the BigQuery node as follows:
    Screen Shot 2023-03-23 at 11.20.46 PM

I am receiving the following error from Google: "parse error at line 1, column 2: syntax error while parsing object key - invalid literal; last read: '{&'; expected string literal"

Screen Shot 2023-03-23 at 11.22.30 PM

Not sure what I am doing wrong. If, instead of using the handlebar insert {{data.body2}} in the BigQuery table data JSON template, I just paste the result of {{data.body2}} directly from the debug console, it seems to work fine. An example of the table data JSON template:

Screen Shot 2023-03-23 at 11.25.43 PM
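In text form it is roughly along these lines (field names and payload paths simplified here):

{
  "SerNo": "{{data.SerNo}}",
  "timestamp": "{{data.timestamp}}",
  "data": "{{data.body2}}"
}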

The result in Google:

So it seems that using {{data.body2}} in the JSON template is messing with the JSON structure?

Hi @rwild_node, can you post the entirety of a successful and an unsuccessful Debug log here? Or, if you’d prefer, you can DM it to me.


I think this is the problem …

data.body2 is stored on your payload as a stringified JSON object - the string {"foo": "bar"}.

You are then trying to construct a JSON template where you are placing that string value inside of double quotes -

{
  "myValue": "{{data.body2}}"
}

When this renders, it becomes …

{
  "myValue": "{"foo": "bar"}"
}

… this is not valid JSON.

The reason it works when you copy the value from the debug log is that we are escaping those quotes - stringifying the already stringified JSON …

{
  "myValue": "{\"foo\": \"bar\"}"
}

… which is valid JSON.


So what you want to do instead is this …

{
  "myValue": {{jsonEncode data.body2}}
}

The {{jsonEncode}} helper will properly escape the values in the JSON string for you. Note the lack of quotes around the helper; since the end value is a string, the helper will take care of adding those for you.
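For example, if data.body2 holds the string {"foo": "bar"} from above, that template renders as the valid, escaped JSON shown earlier:

{
  "myValue": "{\"foo\": \"bar\"}"
}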


That was it, thanks @Dylan_Schuster

@Dylan_Schuster, it looks like I am also running into a problem on the query side with the JSON field type in BigQuery. I'm not sure if this is underlying in the GCP node or if it is my poor syntax. Example of the query on the GCP platform:

Current workflow GCP node JSON template config to mirror the above query:

{
  "query": "SELECT * FROM [knowsy-warm-store.knowsy_production_warm.data_record_oem] LIMIT 100"
}

Resulting error:

Screen Shot 2023-03-24 at 11.45.11 AM

Detail: “Querying tables with INTERVAL or JSON type is not supported in Legacy SQL: 640499537116:knowsy_production_warm.data_record_oem.”

I spent a while looking into this and verified the issue. The underlying library we use to make requests to BigQuery defaults to using Legacy SQL for the Jobs: Query action, and as the error message says, that is not allowed when querying a table that has JSON field types.

I’m filing a feature request to allow for choosing Legacy SQL or Google SQL as the query syntax. I’m hoping that makes our May release. In the meantime, you’ll have to work around this by either using a GCP: Function Node to invoke a function on your infrastructure that then returns results from this table, or use an HTTP Node to make the request to the jobs.query method.
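If you go the HTTP Node route, the request to jobs.query would look roughly like this (project ID and access token are placeholders; note that with useLegacySql set to false, the table is referenced with backticks rather than the legacy bracket syntax):

POST https://bigquery.googleapis.com/bigquery/v2/projects/YOUR_PROJECT_ID/queries
Authorization: Bearer YOUR_ACCESS_TOKEN
Content-Type: application/json

{
  "query": "SELECT * FROM `knowsy-warm-store.knowsy_production_warm.data_record_oem` LIMIT 100",
  "useLegacySql": false
}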

Thanks for bringing this to our attention. I will let you know when we expose that new option in the BigQuery Node.

@rwild_node, our developers looked into this and figured out that you can resolve this issue right now without any platform changes.

You need to add "useLegacySql": false to your query request, like so:

{
  "query": "SELECT * FROM `knowsy-warm-store.knowsy_production_warm.data_record_oem` LIMIT 100",
  "useLegacySql": false
}

I tested this on my end and was able to make a query to a table with a JSON-type column.


Thanks for the follow-up, @Dylan_Schuster, appreciate the solution here!