azure-stream-analytics

Accessing Array Elements in Azure Stream Analytics

Asked by 拥有回忆 on 2019-12-11 19:38:36
Question: Let's say I have a JSON object coming into an Azure Stream Analytics job: { "coordinates": { "type": "Point", "LongLat": [ 115.17348, -8.72263 ] } } How can I get the individual values of "Long" and "Lat"? I can isolate the whole array value [115.17348, -8.72263] with: SELECT coordinates.LongLat However, I'm having trouble grabbing individual elements of the array. I've seen fuzzy documentation on the web about GetArrayElement(), and Stream Analytics keeps saying…
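A minimal sketch of how GetArrayElement() is typically used for this, assuming the payload shown above; the input/output alias names are placeholders, and the index argument is zero-based:

```sql
-- Extract the two elements of coordinates.LongLat into scalar columns.
-- GetArrayElement(array, index) uses a zero-based index.
SELECT
    GetArrayElement(coordinates.LongLat, 0) AS Longitude,
    GetArrayElement(coordinates.LongLat, 1) AS Latitude
INTO [output]
FROM [input]
```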

Azure Stream Analytics - no output events

Asked by 僤鯓⒐⒋嵵緔 on 2019-12-11 10:58:41
Question: I have a problem with an Azure Stream Analytics job. The job monitor shows incoming input events (from Event Hub), but there are no output events or errors. The job is really simple, just writing every input to Azure Blob storage: SELECT * FROM input Any suggestions as to what could be wrong? Update: it was a bug in Azure Stream Analytics and has already been fixed by Microsoft. Answer 1: Did you try to include an INTO clause? SELECT * INTO [output] FROM [input] Answer 2: Since you have verified that events are coming into…
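For reference, the full pass-through pattern the first answer suggests looks like this; [eventhub-input] and [blob-output] are placeholder names that would have to match the aliases configured on the job:

```sql
-- Pass every event through unchanged, with an explicit INTO clause
-- naming the configured Blob storage output.
SELECT *
INTO [blob-output]
FROM [eventhub-input]
```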

Azure Stream Analytics is not feeding DocumentDB output sink

Asked by 老子叫甜甜 on 2019-12-11 08:00:52
Question: I am trying to integrate Azure Stream Analytics with DocumentDB and use it as an output sink. The problem is that no documents are created in DocumentDB while the processing job is running. I tried to test my query, and I have even tried mirroring the output to a storage account. A JSON file containing all the values is created in the storage container, but DocumentDB stays empty. Here is my query: WITH Res1 AS ( SELECT id, concat( cast( datepart(yyyy,timestamp) as nvarchar(max)), '-', cast( datepart(mm…
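The excerpt cuts the query off, but the visible pattern is building a "yyyy-MM" string, presumably as a partition key for the DocumentDB sink. A hypothetical completion under that assumption ([input], [docdb-output], and the partitionkey column name are placeholders):

```sql
-- Build a "yyyy-MM" key from the event timestamp and write to DocumentDB.
-- The output column name must match the partition key configured on the sink.
WITH Res1 AS (
    SELECT
        id,
        CONCAT(
            CAST(DATEPART(yyyy, [timestamp]) AS NVARCHAR(MAX)), '-',
            CAST(DATEPART(mm, [timestamp]) AS NVARCHAR(MAX))
        ) AS partitionkey
    FROM [input]
)
SELECT * INTO [docdb-output] FROM Res1
```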

Does azure stream analytics read data coming from all partitions

Asked by 若如初见. on 2019-12-11 04:25:13
Question: Azure Event Hub has a partition feature for scalability. When reading data with an app service, one EventProcessorHost can be tied to only one partition, so there is no way to act collectively on data coming from multiple partitions. But with Stream Analytics, we can aggregate data based on time. So, does it take care of all the partitions while aggregating? That is, if readings are spread across 8 partitions, the aggregate should include all of those readings. Thanks. Answer 1: Yes.
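A sketch of the kind of windowed aggregate in question; unless the query is explicitly written with PARTITION BY, Stream Analytics combines events from all Event Hub partitions before aggregating. The field names here are assumptions, not from the original post:

```sql
-- One-minute tumbling-window average per device, computed over events
-- from every Event Hub partition of the input.
SELECT
    deviceId,
    AVG(reading) AS avgReading
FROM [input] TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(minute, 1)
```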

Azure Storm vs Azure Stream Analytics

Asked by 我是研究僧i on 2019-12-08 18:15:59
Question: Looking to do real-time metric calculations on event streams, what is a good choice in Azure: Stream Analytics or Storm? I am comfortable with either SQL or Java, so I'm wondering what the other differences are. Answer 1: It depends on your needs and requirements. I'll try to lay out the strengths and benefits of both. In terms of setup, Stream Analytics has Storm beat. Stream Analytics is great if you need to ask a lot of different questions often. Stream Analytics can also only handle CSV or JSON…

Wrapping JSON into output Stream Analytics query

Asked by 我的未来我决定 on 2019-12-08 09:37:13
Question: I am using a Stream Analytics query to filter my input complex JSON object. Input: { "id": "001", "firstArray": [ { "tid": 9, "secondArray": [ { "key1": "value1", "key2": "value2" }, {...} ] }, { "tid": 8, "secondArray": [ { "key1": "value1", "key2": "value2" }, {...} ] } ] } This is my query: WITH T1 AS ( SELECT FirstArray.ArrayValue.Tid as Tid, FirstArray.ArrayValue.secondArray as SecondArray FROM inputfromeventhub MySource OUTER APPLY GetElements(MySource.firstArray) AS FirstArray )…
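A hypothetical continuation of the query above that also flattens the inner array with a second APPLY; note that the function the current Stream Analytics language documents for this is GetArrayElements (the post's GetElements appears to be an older spelling), and the alias names below are placeholders:

```sql
-- Flatten both nesting levels: first the outer firstArray, then each
-- element's secondArray, yielding one row per inner record.
WITH T1 AS (
    SELECT
        FirstArray.ArrayValue.tid AS Tid,
        FirstArray.ArrayValue.secondArray AS SecondArray
    FROM [inputfromeventhub] MySource
    OUTER APPLY GetArrayElements(MySource.firstArray) AS FirstArray
)
SELECT
    T1.Tid,
    InnerArr.ArrayValue.key1 AS key1,
    InnerArr.ArrayValue.key2 AS key2
FROM T1
CROSS APPLY GetArrayElements(T1.SecondArray) AS InnerArr
```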

System.Object type which is not supported by PBI (Power BI)service

Asked by 落花浮王杯 on 2019-12-08 08:10:26
Question: I have set the output of an Azure Stream Analytics job to Power BI, but I am getting a warning in the job that the System.Object type is not supported by the PBI service. I am also not able to see any data in Power BI, although I can see the dataset created there, which confirms that the job's output is arriving. Below is the sample data sent to Power BI. I know the error happens because one of the properties is an object. Is there anything I can do in Power BI to handle this? {"test": {"name":"testApp", "date":"2015-07-31T10:38:45.1276956+05:30", "flag":true, "val":"2015-07-31T10:38:45…
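One common workaround sketch is to fix this on the Stream Analytics side rather than in Power BI: flatten the nested record into top-level scalar columns so no System.Object column reaches the service. The property paths are taken from the sample payload; the input/output names are placeholders:

```sql
-- Project the nested "test" record's properties as flat scalar columns
-- before sending rows to the Power BI output.
SELECT
    test.name AS name,
    test.[date] AS [date],
    test.flag AS flag,
    test.val AS val
INTO [powerbi-output]
FROM [input]
```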

SMS Messaging from Azure IOT Hub

Asked by 纵然是瞬间 on 2019-12-08 07:29:03
Question: I am currently building a project that uses an Arduino Uno to collect weather data such as temperature and humidity. This data is passed on to Azure IoT Hub, where the messages are processed and stored in a SQL database, again in Azure. Finally, the data is displayed on a website that users can sign up to in order to view the weather data I have collected. I am trying to implement SMS notifications into the system so that if the temperature hits a certain threshold, say 0 degrees Celsius,…
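A hedged sketch of one common pattern for this: have Stream Analytics route threshold breaches to a dedicated output (for example a Service Bus queue consumed by a Logic App or Azure Function that calls an SMS provider such as Twilio). All names and field paths below are assumptions, not from the original post:

```sql
-- Route only sub-zero temperature readings to a separate alert output;
-- a downstream Logic App / Function can then send the SMS.
SELECT
    deviceId,
    temperature,
    eventTime
INTO [alert-output]
FROM [weather-input] TIMESTAMP BY eventTime
WHERE temperature <= 0
```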

Stream Analytics Job -> DataLake output

Asked by 断了今生、忘了曾经 on 2019-12-06 13:32:39
I want to set up CI/CD (ARM template) for a Stream Analytics job with its output set to Data Lake Store. https://docs.microsoft.com/en-us/azure/templates/microsoft.streamanalytics/streamingjobs/outputs#microsoftdatalakeaccounts The issue comes with refreshToken: "It is recommended to put a dummy string value here when creating the data source and then going to the Azure Portal to authenticate the data source which will update this property with a valid refresh token." Furthermore, after 90 days the refresh token expires and you need to do "Renew Authorization". https://docs.microsoft.com/en-us/azure…
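A sketch of the output fragment in question, following the ARM template reference linked above; the angle-bracketed values are placeholders you would fill in, and "dummy" is the literal dummy string the documentation recommends before authenticating in the portal:

```json
{
  "name": "datalake-output",
  "properties": {
    "datasource": {
      "type": "Microsoft.DataLake/Accounts",
      "properties": {
        "accountName": "<your-adls-account>",
        "tenantId": "<tenant-guid>",
        "filePathPrefix": "streaming/{date}/{time}",
        "dateFormat": "yyyy/MM/dd",
        "timeFormat": "HH",
        "refreshToken": "dummy",
        "tokenUserPrincipalName": "<user@tenant>",
        "tokenUserDisplayName": "<display name>"
      }
    }
  }
}
```

Because the valid refresh token only exists after a manual portal step, fully unattended deployment of this output is awkward; the template can create the resource, but authorization has to be renewed out-of-band.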