azure-stream-analytics

Basic query with TIMESTAMP BY not producing output

本秂侑毒 submitted on 2019-12-30 07:19:07

Question: I have a very basic setup in which I never get any output if I use the TIMESTAMP BY statement. I have a Stream Analytics job that reads from an Event Hub and writes to Table Storage. The query is the following:

SELECT * INTO MyOutput FROM MyInput TIMESTAMP BY myDateTime;

If the query uses the TIMESTAMP BY clause, I never get any output events. I do see incoming events in the monitoring, and there are no errors in either the monitoring or the maintenance logs. I am pretty sure that the…
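A common cause of this symptom (an assumption here; the truncated excerpt does not confirm it) is that the values in myDateTime fall outside the job's late-arrival and out-of-order tolerances, so events are silently dropped or adjusted. A diagnostic sketch that compares the application timestamp against the arrival time, using the EventEnqueuedUtcTime property that Stream Analytics exposes for Event Hub inputs:

```sql
-- Diagnostic sketch (input/output aliases are assumptions): large positive
-- lagSeconds values suggest events arrive long after their application
-- timestamp, which the late-arrival policy would drop or adjust.
SELECT
    myDateTime,
    EventEnqueuedUtcTime,
    DATEDIFF(second, myDateTime, EventEnqueuedUtcTime) AS lagSeconds
INTO MyOutput
FROM MyInput
```

Running this without TIMESTAMP BY lets the events flow through so the lag can be inspected in the output.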

Getting error in Azure Stream Analytics with DocumentDB as sink

早过忘川 submitted on 2019-12-25 08:46:49

Question: I'm using Azure Stream Analytics to stream events from Event Hubs to DocumentDB. I have configured the input, query, and output as documented, and tested it with sample data; it returned results as expected. But when I start the streaming job and send the same payload as the earlier sample data, I get this error message:

There was a problem formatting the document [id] column as per DocumentDB constraints for DocumentDB db:[my-database-name], and collection:[my-collection-name].

My…
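One likely explanation (an assumption, since the excerpt is truncated before the resolution) is that DocumentDB requires the document id to be a string, while the incoming field is numeric or a GUID; note also that Stream Analytics lower-cases output column names when writing to DocumentDB. A sketch that casts an assumed eventId field to a string and aliases it as id:

```sql
-- Sketch (eventId and the aliases are assumed names, not from the excerpt):
-- DocumentDB expects id to be a string, so cast explicitly before output.
SELECT
    CAST(eventId AS NVARCHAR(MAX)) AS id,
    *
INTO DocDbOutput
FROM MyInput
```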

Azure Stream Analytics: Multiple Windows + JOINS

流过昼夜 submitted on 2019-12-25 03:08:12

Question: My architecture:

1 Event Hub with 8 partitions and 2 TPUs
1 Stream Analytics job
6 windows based on the same input (from 1 mn to 6 mn)

Sample data:

{side: 'BUY', ticker: 'MSFT', qty: 1, price: 123, tradeTimestamp: 10000000000}
{side: 'SELL', ticker: 'MSFT', qty: 1, price: 124, tradeTimestamp: 1000000000}

The Event Hub PartitionKey is ticker. I would like to emit, every second, the following data: (total quantity bought / total quantity sold) in the last minute, the last 2 mn, the last 3 mn, and so on. What I…
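The "emit every second over the last N minutes" requirement maps naturally onto a hopping window with a 1-second hop. A sketch for the 1-minute case (input/output aliases and the epoch-milliseconds interpretation of tradeTimestamp are assumptions; the longer windows would repeat the SELECT with 120 s, 180 s, and so on, and the sell-side sum would need a zero guard in production):

```sql
-- Sketch: 60-second window hopping every 1 second, grouped by ticker.
SELECT
    ticker,
    SUM(CASE WHEN side = 'BUY'  THEN qty ELSE 0 END) /
    SUM(CASE WHEN side = 'SELL' THEN qty ELSE 0 END) AS buySellRatio
INTO RatioOutput1mn
FROM Trades TIMESTAMP BY DATEADD(millisecond, tradeTimestamp, '1970-01-01T00:00:00Z')
GROUP BY ticker, HoppingWindow(second, 60, 1)
```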

U-SQL get file paths from pattern

孤者浪人 submitted on 2019-12-25 00:45:32

Question: I need to get a list of files and then filter this set:

DECLARE @input_file string = @"\data\{*}\{*}\{*}.avro";
@filenames = SELECT filename FROM @input_file;
@filtered = SELECT filename FROM @filenames WHERE {condition}

Something like this, if it's possible...

Answer 1: The way to do that is to define virtual columns in your fileset. You can then extract and manipulate these virtual columns as if they were data fields extracted from your file. Example:

DECLARE @input_file string = "/data/{_partition1}/{…
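The answer's example is cut off, but the virtual-column pattern it describes can be sketched as follows (the names _partition1, _partition2, and _filename are illustrative, not from the original; Extractors.Text() stands in for a real Avro extractor, which would come from a custom assembly such as the Microsoft.Analytics.Samples.Formats library):

```sql
// U-SQL sketch of fileset virtual columns: each {name} in the path becomes
// a column that can be selected and filtered like extracted data.
DECLARE @input_file string = "/data/{_partition1}/{_partition2}/{_filename}.avro";

@filenames =
    EXTRACT _partition1 string,
            _partition2 string,
            _filename   string,
            payload     string
    FROM @input_file
    USING Extractors.Text();

@filtered =
    SELECT _filename
    FROM @filenames
    WHERE _partition1 == "2019";
```

Filtering on a virtual column like _partition1 also enables partition elimination, so only matching files are read.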

An error occurred Send Events: Azure Function Output Adapter failed to write events Azure Function as Stream Analytics Job Output

╄→尐↘猪︶ㄣ submitted on 2019-12-24 21:06:05

Question: I have an Azure IoT DevKit (MXChip) and I am sending the sensor data to the IoT Hub. I have also set up a Stream Analytics job with the IoT Hub as input and with SQL Server and an Azure Function as outputs. The output is being written to the SQL database, so I can confirm that the query is correct. When I check my Stream Analytics job log, I see an error like the one below:

{ "channels": "Operation", "correlationId": "4a9c9b61-631a-4f4f-a403-c2869b2af66c", "description": "", "eventDataId": "97250faf…

Send IoT Hub Cloud-to-Device message from Stream Analytics Output (Using Event Hub endpoint)

谁说胖子不能爱 submitted on 2019-12-24 19:19:03

Question: We use Stream Analytics successfully for ingesting event messages sent from IoT Hub devices; Stream Analytics supports IoT Hub as an input stream out of the box. But now we have a requirement to send the Stream Analytics output to an IoT Hub device. We are able to set up an Event Hub connection to the IoT Hub as the output sink for Stream Analytics, but an error event is raised:

Message: Access to the Event Hub has been denied. The token may have an invalid signature.

Which raises the…

Select the first element in a JSON array in Microsoft stream analytics query

半腔热情 submitted on 2019-12-23 14:24:13

Question: So I have a bit of a problem. I retrieve some weather data from an external API. This is returned as JSON and sent to an Azure IoT Hub. Stream Analytics processes the JSON into a proper format, but I have a problem here: the element Current_Condition is an array. It always has one element at position [0]. I only need to get the data of that array from that very first position, without a filter on things like id. Below is the complete data:

{ "deviceId": "aNewDevice",…
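Stream Analytics provides GetArrayElement for exactly this case: it returns the record at a given index of an array field, and GetRecordPropertyValue can then read individual properties out of that record. A sketch (the input/output aliases and the 'temp_C' property name are assumptions, since the full payload is truncated above):

```sql
-- Sketch: pick the first (and only) element of the Current_Condition array
-- and read one assumed property from it.
SELECT
    deviceId,
    GetRecordPropertyValue(GetArrayElement(Current_Condition, 0), 'temp_C') AS tempC
INTO WeatherOutput
FROM WeatherInput
```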

Stream Analytics: Dynamic output path based on message payload

喜你入骨 submitted on 2019-12-20 03:09:11

Question: I am working on an IoT analytics solution which consumes Avro-formatted messages fired at an Azure IoT Hub and (hopefully) uses Stream Analytics to store messages in Data Lake and Blob storage. A key requirement is that the Avro containers must appear exactly the same in storage as they did when presented to the IoT Hub, for the benefit of downstream consumers. I am running into a limitation in Stream Analytics regarding granular control over individual file creation. When setting up a new output…

Stream Analytics Egress to Azure Functions

孤街浪徒 submitted on 2019-12-13 17:33:46

Question: Microsoft announced support for sending data from Azure Stream Analytics to Azure Functions a few days back: https://azure.microsoft.com/en-us/blog/new-in-stream-analytics-output-to-azure-functions-built-in-anomaly-detection-etc/ I tried this but couldn't send data to Azure Functions. Is there any guide on how to send a data packet from IoT Hub -> Azure Stream Analytics -> Azure Functions? The output works fine for other sinks. This is the query I have:

WITH rpidata AS ( SELECT *, DATEADD(Hour, 3,…
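The quoted query is cut off, but the shape it starts (a CTE that shifts the timestamp, then routes to an output) can be sketched as follows (the field names, the 3-hour shift target, and the output alias are assumptions, not the asker's actual query):

```sql
-- Sketch of the IoT Hub -> Stream Analytics -> Azure Function pattern:
-- compute a shifted timestamp in a CTE, then select into the Azure Function
-- output alias configured on the job.
WITH rpidata AS (
    SELECT
        *,
        DATEADD(hour, 3, EventEnqueuedUtcTime) AS localTime
    FROM [iothub-input]
)
SELECT *
INTO [azure-function-output]
FROM rpidata
```

With an Azure Function output, the function must also return an HTTP success status for each batch; a non-2xx response or an oversized batch causes the adapter to report write failures.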