azure-stream-analytics

Authorization failure when creating a Stream Analytics job

Submitted by 故事扮演 on 2019-12-06 07:01:28
I've been trying (and failing) to create an Azure Stream Analytics job programmatically. I was originally following this example: https://azure.microsoft.com/en-gb/documentation/articles/stream-analytics-dotnet-management-sdk/ But it pops up a dialog for you to log in, and I want to be able to do this server side. It looks like I need to use Azure AD to use the Resource Manager APIs. I've been working my way through this: https://msdn.microsoft.com/en-us/library/azure/dn790557.aspx#bk_portal And the code looks like this: var authContext = new AuthenticationContext("https://login.microsoftonline.com
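For a fully server-side flow with no login dialog, the usual route is the OAuth2 client-credentials grant against the Azure AD token endpoint. A minimal Python sketch of the request such a flow sends (the tenant ID, client ID, and secret are placeholders for an Azure AD app registration of your own, not values from the question):

```python
# Sketch: non-interactive token request for the Azure Resource Manager
# API using the OAuth2 client-credentials grant. All credentials below
# are placeholders for your own Azure AD app registration.

TOKEN_ENDPOINT = "https://login.microsoftonline.com/{tenant}/oauth2/token"

def build_token_request(tenant_id, client_id, client_secret):
    """Return the URL and form body for a client-credentials token request."""
    url = TOKEN_ENDPOINT.format(tenant=tenant_id)
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Resource Manager audience; the returned bearer token is what
        # the Stream Analytics management client needs.
        "resource": "https://management.core.windows.net/",
    }
    return url, body

# Usage (network call commented out; requires the `requests` package):
# import requests
# url, body = build_token_request("my-tenant-id", "my-client-id", "my-secret")
# token = requests.post(url, data=body).json()["access_token"]
```

The access token in the response is then passed as the bearer credential when constructing the management SDK client, replacing the interactive `AuthenticationContext` prompt.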

Simplest way to log all messages from an Azure Event Hub

Submitted by こ雲淡風輕ζ on 2019-12-03 13:28:34
I'm using a service which outputs to an Event Hub. We want to store that output, to be read once per day by a batch job running on Apache Spark. Basically we figured we'd just get all messages dumped to blobs. What's the easiest way to capture messages from an Event Hub to Blob Storage? Our first thought was a Stream Analytics job, but it requires parsing the raw message as CSV/JSON/Avro, and our current format is none of those. Update: We solved this problem by changing our message format. I'd still like to know if there's any low-impact way to store messages to blobs. Did EventHub have a solution
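For payloads that aren't CSV/JSON/Avro, one low-impact option is a small custom receiver that writes the raw bytes straight to blobs, one blob per partition per time window, so the daily Spark job just reads one date folder. A sketch of the blob-naming scheme such a receiver might use (the path layout and `.bin` extension are assumptions, not anything from the question):

```python
from datetime import datetime

def blob_name(hub, partition_id, enqueued):
    """Date-partitioned blob path so a daily Spark job reads one folder.

    `enqueued` is the event's enqueued time as a datetime.
    """
    return "{hub}/{d:%Y/%m/%d}/partition-{p}/{d:%H%M%S}.bin".format(
        hub=hub, p=partition_id, d=enqueued)

# A receiver loop (e.g. the azure-eventhub SDK, not shown here) would
# upload each event's raw bytes under:
#   blob_name("myhub", event_partition_id, event_enqueued_time)
```

This keeps the messages opaque end to end: nothing in the pipeline needs to parse them until Spark does.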

Connection Test Failed when trying to add an Azure function as an output sink to Stream Analytics Job

Submitted by 你说的曾经没有我的故事 on 2019-12-02 12:36:49
Question: I always get "Connection Test Failed" when trying to add an Azure Function as an output sink to a Stream Analytics job. The Azure Function works fine by itself and I can also call it using Postman. As soon as I add the Azure Function, I get the "Connection Test failed" error message: "Azure function returned an HTTP error. An error occurred while sending the request."
Answer 1: I got the same issue. The solution that worked for me was to set Minimum TLS version: 1.0 (by default it was 1.2) for my
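The same TLS setting the answer changes in the portal can also be applied from the Azure CLI; a sketch, assuming the Azure CLI is installed and logged in (the app and resource-group names are placeholders):

```shell
# Lower the Function App's minimum inbound TLS version to 1.0
# (placeholder names; adjust to your own function app and resource group)
az functionapp config set \
  --name my-function-app \
  --resource-group my-resource-group \
  --min-tls-version 1.0
```

Note that lowering the minimum TLS version weakens transport security for every caller of the function, so it is a workaround rather than a long-term fix.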

Azure Stream Analytics - Error with customized “timestamp by” while applying window tumbling

Submitted by 前提是你 on 2019-12-01 12:55:35
I have a json file as below: {"imei": {"imei": "358174069248418F", "imeiBinary": "NYF0BpJIQY8=","imeiNotEncoded": "358174069248418","valid": 1},"dataPackets": [["msy.mxp.datapacket.AlarmNotification",{"version": 1, "id": 21, "op": 2,"sizeDynamic": 0, "alarmStatus": 4}],["msy.mxp.datapacket.IOStatus",{"version": 1,"id": 15, "op": 2,"sizeDynamic": 0,"ioStatus": 135,"ioDirections": 120}], ["msy.mxp.datapacket.LogicalStatus",{"version": 1,"id": 16, "op": 2,"sizeDynamic": 0,"logicalStatus": 5} ],[ "msy.mxp.datapacket.Position", {"version": 1,"id": 19,"op": 2,"latitude": 40.835243,"longitude": 14
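Each `dataPackets` entry is a two-element array of `[type name, payload record]`, which is the shape any query applying `TIMESTAMP BY` to a nested field has to navigate. A quick Python sketch of that structure (the sample below is trimmed to two packets and a few fields from the question's JSON):

```python
import json

# Trimmed version of the message from the question.
SAMPLE = json.loads("""
{"imei": {"imei": "358174069248418F", "valid": 1},
 "dataPackets": [
   ["msy.mxp.datapacket.AlarmNotification", {"version": 1, "id": 21, "alarmStatus": 4}],
   ["msy.mxp.datapacket.IOStatus", {"version": 1, "id": 15, "ioStatus": 135}]]}
""")

def packets_by_type(message):
    """Index each [typeName, payload] pair in dataPackets by its type name."""
    return {name: payload for name, payload in message["dataPackets"]}

# packets_by_type(SAMPLE)["msy.mxp.datapacket.IOStatus"]["ioStatus"]  -> 135
```

In Stream Analytics SQL the equivalent navigation would go through the array/record functions (e.g. `GetArrayElement` and field access) rather than a dictionary, but the nesting to traverse is the same.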

Latest value in PowerBi from ASA

Submitted by 一笑奈何 on 2019-12-01 09:05:33
Is it possible to show the latest value that has arrived in Power BI from Stream Analytics? In the card visual, for example, I imagine having a filter on a measurementtime field selecting the latest value, or something similar?
Lukasz P.: The best you can do right now is use Q&A to ask a question like "show value in the last 10 seconds". It's a valid request; could you submit an item through support.powerbi.com?
Source: https://stackoverflow.com/questions/32199941/latest-value-in-powerbi-from-asa

Basic query with TIMESTAMP by not producing output

Submitted by 可紊 on 2019-11-30 23:50:11
I have a very basic setup in which I never get any output if I use the TIMESTAMP BY clause. I have a Stream Analytics job which is reading from Event Hub and writing to Table Storage. The query is the following: SELECT * INTO MyOutput FROM MyInput TIMESTAMP BY myDateTime; If the query uses the timestamp clause, I never get any output events. I do see incoming events in the monitoring, and there are no errors either in the monitoring or in the maintenance logs. I am pretty sure that the source data has the right column in the right format. If I remove the timestamp clause, then everything
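One common cause of this symptom (a general observation, not the thread's confirmed answer) is that the `myDateTime` values either don't parse as ISO 8601 or are far enough in the past that the job's late-arrival policy silently drops or adjusts every event. A small offline sanity check for sample payloads (the one-day tolerance is an illustrative assumption):

```python
from datetime import datetime, timedelta, timezone

def is_usable_timestamp(value, max_age=timedelta(days=1)):
    """True if `value` parses as ISO 8601 and is recent enough that a
    late-arrival policy with the given tolerance would not drop it."""
    try:
        # fromisoformat() on older Pythons doesn't accept a trailing "Z".
        ts = datetime.fromisoformat(value.replace("Z", "+00:00"))
    except ValueError:
        return False
    if ts.tzinfo is None:
        ts = ts.replace(tzinfo=timezone.utc)
    return datetime.now(timezone.utc) - ts <= max_age

# Run this over a handful of raw events from the hub; if everything
# comes back False, TIMESTAMP BY is the likely reason for empty output.
```

If the timestamps are valid but old (e.g. replayed historical data), raising the late-arrival tolerance in the job's event-ordering settings is worth trying before changing the query.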

Can one have multiple queries in streaming analytics job?

Submitted by 三世轮回 on 2019-11-29 10:54:09
As the title says, can you have more than one query in an Azure Streaming Analytics job? If so, how should that be structured?
Vignesh Chandramohan: Yes, you can have multiple queries in a Stream Analytics job. You would do something like the following:
select * into type1Output from inputSource where type = 1
select * into type2Output from inputSource where type = 2
The job has two outputs defined, called type1Output and type2Output. Each query writes to a different output.
Source: https://stackoverflow.com/questions/36287058/can-one-have-multiple-queries-in-streaming-analytics-job
