How can we connect a device to Google Cloud IoT Core using MQTT?

I want to understand how I can create an integration using MQTT with Google Cloud IoT Core to publish telemetry data from the device.

Hi there!

You really have two main options:

  1. Google Pub/Sub - trigger a Losant Workflow when you receive a message.
  2. Webhooks - trigger a Google Cloud Function whenever your device reports. That function can then make an HTTP request to Losant.

Hello, we are trying to create a GCP integration according to the instructions, but we are failing to do so. Attached are screenshots that show our GCP Pub/Sub and Losant integration pages. The Account Key JSON Template is identical to the one we use in Google Pub/Sub applications that work. As the screenshot shows, it fails to connect. In addition, as soon as we add a / to the topic, it is considered invalid. What are we doing wrong?
Thank you.


Hi @Gambit_Support,

Just so I’m clear… you have two problems here:

  1. The GCP integration will not connect when no topics are listed, and…
  2. When you add a Topic, it is considered invalid

Do I have that correct?

In the case of #1, I would move your Account Key JSON into a Global and use that Global as the value of the JSON template. For example, create a JSON Global whose value is your entire Account Key, then reference that Global in your Integration’s Account Key JSON Template field. Here’s what that would look like if you had a Global named gcpAccountKey:

[screenshot: the Integration’s Account Key JSON Template field referencing the gcpAccountKey Global]
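In template terms, that field would contain nothing but a reference to the Global. A minimal sketch, assuming a JSON-type Global named gcpAccountKey (the jsonEncode helper serializes the Global’s object value back into a JSON string):

```
{{jsonEncode globals.gcpAccountKey}}
```

If you instead store the key as a string Global, referencing {{globals.gcpAccountKey}} directly should work as well.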

For #2, the topic field takes only the portion of the full topic name after .../topics/. So, in your case (thank you for including the screenshot), your topic would just be telemetry.
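To illustrate (with myproject standing in for your actual GCP project ID):

```
Full Pub/Sub topic name:  projects/myproject/topics/telemetry
Enter in Losant:          telemetry
```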

Let me know if this helps.

Thank you,
Heath

Thank you. It was #2, the topic name. A simple rewording in your dialog to say it’s the trailing part of the topic would prevent others from making the same mistake.

@Gambit_Support,

> A simple rewording in your dialog to say it’s the trailing part of the topic would prevent others from making the same mistake.

I’d be happy to make this update. Could you point me in the direction of the exact documentation that you are referring to?

Thank you,
Heath

In my first screenshot under “TOPICS”: “Enter as many topics as you would like. Messages sent to these topics will cause workflows triggering off of this integration to fire, and these messages will count towards your payload limits.” Maybe add something like: “Enter the trailing part of the Google topic, e.g., for projects/myproject/topics/mytopic you would enter mytopic.”

@Gambit_Support,

Thank you! I’ll get some updated text in there.

Glad we were able to get you sorted out. Let me know if you have any other questions!

Heath

We have figured out the GCP integration. The aim is to create a dashboard that shows the number of devices, the total number of messages/sec, etc., and improve monitoring from there. We are collecting messages from devices publishing their serial number and data to GCP; the attached workflow then collects them into a data table keyed by serial number. The concern is the size of the table. Your documentation says there is a limit of 1000. We would have tens of thousands of unique serial numbers, starting at 30000. Is there another way to achieve at least the above?

@Gambit_Support,

Just so I understand the flow of data here… You have devices in Losant that are sending messages to GCP through an Integration, is this correct?

First, I’d like to point something out:

> We are collecting messages from devices publishing their serial number and data to GCP; the attached workflow then collects them into a data table keyed by serial number.

The first item under When to not use Data Tables is “Storing time series data.” Collecting MQTT messages from devices and putting them in a data table is time series data and is not recommended.

You should instead add the serial number to the device with a Device Tag and the message should be a Device Attribute.
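As a sketch, that device model might look like this (the tag key, attribute name, and serial value are illustrative, not prescriptive):

```
Device tag:        serialNumber = 30001    (unique per device; your serials start at 30000)
Device attribute:  temperature  (Number, reported as Device State)
```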

> The concern is the size of the table. Your documentation says there is a limit of 1000. We would have tens of thousands of unique serial numbers, starting at 30000.

Could you send me a link to the limit you are referring to? Data Tables are limited by the total number of data tables in your Losant Organization and by the total data table data stored in your organization. Are you referring to the Device limit?

Please let me know if this helps.

Heath

As documented in your Pub/Sub documentation, the devices connect to GCP. Your integration is a subscriber, which gets a copy of the payloads, including the data I want to collect. We don’t want time series data; we only want the latest temp for each serial number (an associative array keyed by serial number). I thought I saw 1000 somewhere, but the limit is documented at

as a “hard-limited resource”. I cannot find this number anywhere in my dashboard.

Found the 1 GB Table Storage limit. That should be sufficient for the 30k entries.

@Gambit_Support,

Just so I am on the same page, you will have 30,000 unique rows keyed by serial number that you intend to overwrite repeatedly?

If this is the case, I must advise that this is not an appropriate use case for Data Tables. What you are describing is precisely a set of Devices, each with a temperature Attribute whose value is stored as Device State. You can get the last value reported by a device with a Gauge Query Node.

Using device state is also the only way to use dashboard blocks like the Time Series Graph Block and the Gauge Block.

I would like to understand more about your use case and why Devices, Device Attributes, and Device State will not work for you. Any additional information is very beneficial.

Thank you,
Heath

Ok. How do devices get created dynamically on receipt of GCP messages? This video shows what we want to achieve, but with the data table approach. There is one virtual device which maintains statistics to display in the dashboard. It has limitations related to rate limiting; the video shows the demo throttled as much as we can manage so as not to cause rate limit errors in Losant.

I appreciate the video. It was helpful in understanding specifically what you’re trying to accomplish.

Looks like you’ve got two main goals:

  1. Total count of devices.
  2. Displaying messages per second from the GCP integration.

Let’s approach both individually.

1. Total count of devices

While a Data Table does technically work at the moment, the underlying technology is optimized for write-light / read-heavy workloads. This means it works best with relatively static data that is read often. In your scenario, rows are written on every message from PubSub, which is the reverse of what the technology is optimized for. This is an architecture whose performance will degrade quickly as your number of devices grows. I would not recommend this approach.

The best way to get a count of devices is to have devices to count. Your PubSub messages contain unique serial numbers that identify devices. When a message is received, use a Device: Get Node to find the Losant device tagged with that serial. If no device is found, use a Device: Create Node to create it dynamically (with the serial number as a device tag).
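One possible shape for that workflow (the node names are Losant’s; the tag key serialNumber and payload path data.serialNumber are assumptions about how your payload is structured):

```
Integration Trigger (PubSub message)
  → Device: Get     find device by tag: serialNumber = {{data.serialNumber}}
  → Conditional     device found?
      false → Device: Create   name/tag it with {{data.serialNumber}}
      true  → continue (record state, etc.)
```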

If you want the devices to go away automatically, you could also use the Device: Inactive Trigger followed by the Device: Delete Node.

To display the number of devices on a dashboard, I’d recommend a workflow on a 1-minute Timer Trigger. That workflow uses the Device: Get Node to obtain the total number of devices and records that to an attribute on your stats virtual device. This way, the total number of devices is updated every minute, and since it’s stored on an attribute, you also get historical numbers.
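Structurally, that could look like the following (the attribute name deviceCount is an assumption):

```
Timer Trigger (every 1 minute)
  → Device: Get     query all devices carrying a serialNumber tag
  → Device: State   stats virtual device, attribute deviceCount = number of devices returned
```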

2. Messages per second

The first thing I’d recommend is to look at GCP Cloud Metrics, which collects a fair number of useful stats for PubSub that are accessible via an API. The data you require may already be available without recalculating it within Losant.

If the metrics you require are available, I’d recommend a 1 minute timer pulling those metrics and recording them to your stats virtual device.

If the metrics are not available, I would recommend using Workflow Storage to count the total number of incoming messages. On every PubSub message, use the Storage: Set Value Node and the Increment operation to add 1 to a value in workflow storage.

Then, under a 1-minute timer trigger, I would recommend using the Clear operation, which returns the current value and resets the counter back to 0. The returned value can be recorded to your stats virtual device; this attribute then represents the total number of messages received each minute. You can use the Indicator Block with a template and the divide helper to convert messages per minute to messages per second:

```
{{divide value-0 60}}
```
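Putting the counting pattern together, the two workflows might be shaped like this (the storage key messageCount and attribute name messagesPerMinute are assumptions):

```
Integration Trigger (PubSub message)
  → Storage: Set Value   key: messageCount, operation: Increment

Timer Trigger (every 1 minute)
  → Storage: Set Value   key: messageCount, operation: Clear  (returns the value it cleared)
  → Device: State        stats virtual device, attribute: messagesPerMinute
```

For example, if 120 messages arrived in the last minute, the Indicator Block template above would render 120 / 60 = 2 messages per second.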

Thank you for the help. What we are struggling with right now is the Conditional block after the Device: Get node. Your documentation shows the use case, but the screenshot does not show how to test for null. We tried {{data.myresult}} === null and five other guesses, fortunately quickly thanks to the simulator. It would be nice if that were documented on the Device: Get and Conditional pages. In general, we are missing coding templates for common scenarios.

I would expect {{data.myResult}} === null to take the true path in a Conditional Node if the value at that path is indeed null (but not undefined, false, or 0).
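To make that concrete, here is how I would expect a few sample payloads (illustrative only) to evaluate against the expression {{data.myResult}} === null:

```
{ "data": { "myResult": null } }    → true path
{ "data": { "myResult": 0 } }       → false path
{ "data": { "myResult": false } }   → false path
{ "data": {} }                      → false path (undefined is not null)
```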

So I took a look at your workflow and, at least currently, you’re not using a Conditional Node but a Switch Node, and placing that value in the “Switch Template” field would indeed produce some unexpected results.

The Switch Node works like a JavaScript switch statement. Given a variable, execute one of many different paths based on the value of that variable.

If you were indeed using a Conditional Node previously, could you provide the exact configuration you applied to the node, the payload that was evaluated against the conditional statement, and your expected result?