Webhook Throttling Question

We have successfully connected our Azure IoT Hub to the Losant IoT Platform using a webhook that sends device telemetry messages to Losant. From the documentation we noticed that “Webhook requests are limited to 100 calls in a 10-second window”. We are currently using one device for testing and have already reached this limit under certain circumstances. For our application it is critical not to discard any device messages.

Messages from the device are first processed in an Azure Function and then sent to Losant. If a large number of messages is sent from one or more devices at once, or if there is a backlog of messages for any reason, the Azure Function scales out to meet the increased demand. When a large number of messages is sent to Losant we encounter the “Over rate limit, request throttled” error described in the documentation.
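
For context, the forwarding step in the Function is essentially just an HTTP POST of each telemetry message to the webhook URL, along the lines of this simplified sketch (the URL and field names are placeholders, and the real function has additional error handling):

```typescript
// Simplified sketch of the forwarding step, not the actual Function code.
// The webhook URL below is a placeholder for the unique URL Losant assigns.
const WEBHOOK_URL = "https://example.invalid/losant-webhook";

interface Telemetry {
  deviceId: string;
  payload: Record<string, unknown>;
  timestamp: string;
}

// Called once per message read from IoT Hub. When the Function scales out,
// many of these POSTs run concurrently, which is how we exceed the
// 100-calls-per-10-second webhook limit.
async function forwardToLosant(message: Telemetry): Promise<void> {
  const res = await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(message),
  });
  if (!res.ok) {
    // When throttled, the response reports “Over rate limit, request throttled”.
    throw new Error(`Webhook POST failed: ${res.status} ${await res.text()}`);
  }
}
```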

This webhook throttling limit seems low given that all device messages from an IoT Hub are sent to a single webhook. Even if we used multiple webhooks, the limit could still be reached under certain circumstances with a single device.

We could send batches of messages every 10 seconds, but this would add latency and could create a backlog, so it is not desirable.

Is there something we are missing regarding the use of webhooks? Is there a better recommended way for sending messages from Azure IoT Hub to Losant?

Hello @Marlex2,

Welcome to the Losant Forums, we’re happy you’re here!

I would really like to get more information about your use case. Would you be able to tell me what kind of data you are sending and why you are sending it that fast?

The 100 calls in a 10-second window is a safety limit we have in place, and it can be adjusted for paying customers, but only within reason. There is another limit that will come into play here that is not adjustable, and that is the device state limit.

Again, any additional information you can provide around your use case would be great.

Thank you,
Heath

Hi @Heath, thank you for the quick reply!

For this use case we have RFID antennas that detect RFID tags. Every time a tag is detected, a message is sent to the Azure IoT Hub; the message includes the antenna ID, tag ID, GPS coordinates, etc. Messages are sent from the end device as fast as possible. Sometimes we have simultaneous detections from multiple antennas; some logs are aggregated at the source, but multiple messages may still be sent at the same time. Occasionally there may be a backlog in the Azure IoT Hub, which would cause a large number of messages to be sent to Losant at once.
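
For illustration, an individual detection message looks roughly like this (the field names and values here are made-up placeholders):

```typescript
// Made-up example of a single RFID detection message sent to IoT Hub.
const exampleDetection = {
  antennaId: "antenna-07",
  tagId: "E200-3412-0123-4567",
  gps: { lat: 47.6205, lon: -122.3493 },
  detectedAt: "2023-05-04T12:34:56.789Z",
};
```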

In this application the messages are not traditional time-series data points (like temperature or humidity); log messages can occur at any time and cannot be discarded.

It looks like we will have to send batches of at most 100 messages every 10 seconds; these 100 messages could come from a single device or from any number of devices. What do you think about this approach? Do you see it being a problem with the device state limit?
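
If we go this route, one way to implement it (interpreting a batch as a single webhook call carrying a JSON array of up to 100 messages, which a Losant workflow would then iterate over; that workflow side is an assumption on our part) would look roughly like this sketch:

```typescript
// Sketch of batching: flush at most 100 queued messages per 10-second window
// as one webhook call. Placeholder URL; in practice the messages arrive from
// IoT Hub via the Azure Function.
const WEBHOOK_URL = "https://example.invalid/losant-webhook";
const MAX_BATCH = 100;
const WINDOW_MS = 10_000;

const queue: unknown[] = [];

function enqueue(message: unknown): void {
  queue.push(message); // messages wait here instead of being dropped
}

async function flush(): Promise<void> {
  if (queue.length === 0) return;
  const batch = queue.splice(0, MAX_BATCH); // take up to 100, leave the rest queued
  await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(batch), // one webhook call carrying many messages
  });
}

// One call per window stays far below the 100-calls-per-10-second limit,
// but a sustained rate above 10 messages/second would grow the queue.
setInterval(() => { void flush(); }, WINDOW_MS);
```

One call per 10-second window stays well under the webhook limit, but as noted above, a sustained rate above 10 messages per second would grow the queue and add latency.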

Hey @Marlex2,

Your use case is certainly valid, and sounds awesome!

“do you see this being a problem with the device state limit?”

This definitely depends on how you will be associating devices in your Losant application. The webhook rate limit and the device state limit are both per-instance limits. What I mean by that is that each webhook you have gets its own limit of 100 calls in a 10-second window, and each device state topic gets its own limit of 30 calls in a 15-second window.

So, in the case where one of your devices sends all 100 calls in a 10-second window, yes, you will run into the device state rate limit. But in the case where 10 devices collectively send 100 requests in a 10-second window, you should be OK with respect to the device state limit, depending on how the requests are queued. In that latter case, if the requests aren't queued correctly, say one device's messages arrive as a burst of more than 30 within 15 seconds, you can still hit the device state limit.
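
To put rough numbers on that: if 10 devices each report state 10 times in a 10-second window, each device state topic sees roughly 15 calls per 15 seconds (assuming an even spread), well under the 30-call limit; if one device generates all 100 calls, its topic sees 100 calls in 10 seconds and will be throttled. If you do end up queueing on your side, one illustrative way to smooth bursts, just a sketch and not a Losant feature, would be to pace state reports per device:

```typescript
// Illustrative per-device pacing: allow at most 30 state reports per device
// in any 15-second window, and hold the rest in your own queue until the
// window frees up. This is a sketch of one approach, not a Losant API.
const DEVICE_STATE_LIMIT = 30;
const DEVICE_STATE_WINDOW_MS = 15_000;

// deviceId -> timestamps (ms) of that device's recent state reports
const recentCalls = new Map<string, number[]>();

function canReportState(deviceId: string, now = Date.now()): boolean {
  const cutoff = now - DEVICE_STATE_WINDOW_MS;
  const calls = (recentCalls.get(deviceId) ?? []).filter((t) => t > cutoff);
  recentCalls.set(deviceId, calls);
  if (calls.length >= DEVICE_STATE_LIMIT) return false; // would exceed 30 per 15 s
  calls.push(now);
  return true;
}
```

Anything that can't be reported immediately would simply stay queued on your side until its device's window clears.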

I hope this helps. I will spend some more time thinking about potential solutions to this for you.

Thank you,
Heath