azure-stream-analytics

Using split in Azure Stream Analytics

Submitted by 烈酒焚心 on 2020-01-07 09:03:06
Question: I have strings in the format "1234.567.111" and I wish to break them into three integers. I do not see a split function in Azure Stream Analytics. Is it possible to do this another way? Update: I have added a request for a split function here; I would appreciate it if you voted for it.
Answer 1: I wish Stream Analytics had a split function. You may have to use CHARINDEX and SUBSTRING for now: https://msdn.microsoft.com/en-us/library/azure/dn835064.aspx It's a bit of a pain, but the following…
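For example, a minimal sketch of the CHARINDEX/SUBSTRING approach the answer describes. The input name InputStream, output name OutputSink, and field name rawValue are assumptions, and it presumes CHARINDEX accepts an optional start position (as in T-SQL) and that the string always contains exactly two dots:

-- Hypothetical input/output/field names; splits "1234.567.111" into three numbers.
WITH Positions AS (
    SELECT
        rawValue,
        CHARINDEX('.', rawValue) AS firstDot,
        CHARINDEX('.', rawValue, CHARINDEX('.', rawValue) + 1) AS secondDot
    FROM InputStream
)
SELECT
    CAST(SUBSTRING(rawValue, 1, firstDot - 1) AS bigint) AS part1,
    CAST(SUBSTRING(rawValue, firstDot + 1, secondDot - firstDot - 1) AS bigint) AS part2,
    CAST(SUBSTRING(rawValue, secondDot + 1, LEN(rawValue) - secondDot) AS bigint) AS part3
INTO OutputSink
FROM Positions

bigint is used because Stream Analytics has no plain int type; a different number of separators would require adjusting the CHARINDEX arithmetic.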

Azure Stream Analytics job expensive for small data?

Submitted by 大兔子大兔子 on 2020-01-06 05:51:36
Question: In order to write sensor data from an IoT device to a SQL database in the cloud, I use an Azure Stream Analytics job. The job has an IoT Hub input and a SQL database output, and the query is trivial; it just passes all data through. According to the MS price calculator, the cheapest way of accomplishing this (in western Europe) is around 75 euros per month (see screenshot). In reality, only one message per minute is sent through the hub, and the price is fixed per month (regardless of the amount…

Azure Stream Analytics query to detect missing alive event for a specific deviceId

Submitted by 倾然丶 夕夏残阳落幕 on 2020-01-05 03:59:05
Question: I do not see a way to analyse a stream for the absence of a specific event with the Azure Stream Analytics query language. The stream may contain DeviceAlive and BeaconDetected events, each containing a DeviceId and, in the case of BeaconDetected, also a BeaconId. I want to generate an error event if the DeviceAlive event is missing. How can I achieve this? I tried to use reference data with all valid deviceIds, but I am not allowed to do a LINQ-style "contains" query like this: SELECT * FROM inputStream…
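A common starting point for detecting the absence of events in Stream Analytics is to join the stream with itself and keep rows that have no matching later event. A sketch, assuming hypothetical field names EventType and EventTime, an output named ErrorOutput, and a 60-second liveness window:

-- Flag a device whose DeviceAlive event is not followed by another within 60 seconds.
WITH Alive AS (
    SELECT DeviceId, EventTime
    FROM InputStream TIMESTAMP BY EventTime
    WHERE EventType = 'DeviceAlive'
)
SELECT t1.DeviceId, t1.EventTime AS LastAliveTime
INTO ErrorOutput
FROM Alive t1
LEFT OUTER JOIN Alive t2
    ON t1.DeviceId = t2.DeviceId
    AND DATEDIFF(second, t1, t2) BETWEEN 1 AND 60
WHERE t2.DeviceId IS NULL

Note that this only catches devices that stop reporting after at least one DeviceAlive event; a device that never reports at all would still need the reference-data approach the asker mentions.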

Azure Stream Analytics -> how much control over path prefix do I really have?

Submitted by 妖精的绣舞 on 2020-01-04 06:36:07
Question: I'd like to set the prefix based on some of the data coming from Event Hub. My data is something like: {"id":"1234",...} and I'd like to write a blob prefix that is something like: foo/{id}/guid... Ultimately I'd like to have one blob for each id, which will help how it gets consumed downstream by a couple of things. What I don't see is a way to create prefixes that aren't related to date and time. In theory I can write another job to pull from blobs and break it up after the stream analytics…

Export Custom Event Dimensions to SQL from Application Insights using Stream Analytics

Submitted by 别等时光非礼了梦想. on 2020-01-03 13:09:07
Question: I'm following the example walkthrough Export to SQL from Application Insights using Stream Analytics. I am trying to export custom event dimensions (context.custom.dimensions in the JSON example below), which get added as a nested JSON array in the data file. How do I flatten the dimensions array at context.custom.dimensions for export to SQL? JSON: { "event": [ { "name": "50_DistanceSelect", "count": 1 } ], "internal": { "data": { "id": "aad2627b-60c5-48e8-aa35-197cae30a0cf",…
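Stream Analytics can unnest an array with CROSS APPLY and GetArrayElements, reading properties of each element with GetRecordPropertyValue. A minimal sketch, assuming the exported blobs are the input aliased A, a hypothetical dimension key 'SelectedDistance', and a SQL output named SqlOutput:

-- Flatten context.custom.dimensions into one row per array element.
SELECT
    A.internal.data.id AS EventId,
    GetRecordPropertyValue(Dim.ArrayValue, 'SelectedDistance') AS SelectedDistance
INTO SqlOutput
FROM Input A
CROSS APPLY GetArrayElements(A.context.custom.dimensions) AS Dim

Because each element of the dimensions array usually holds a single name/value pair, the flattened rows may need to be grouped back up per event id before landing in SQL.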

Rules engine for Stream Analytics on Azure

Submitted by 懵懂的女人 on 2020-01-03 01:34:45
Question: I'm new to Azure and to analytics, and I'm trying to understand the streaming alert rules engine. I have used some sample data as input and have queries to filter the data. However, I'm not sure what a rules engine means: is it just queries, or is there more to it? And is there a way to have all the rules in one place? If so, how?
Answer 1: The main way to define logic for ASA is SQL, which provides a way to define rules with SQL statements (e.g. SELECT DeviceID ... WHERE temperature > 50). Multiple…
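Several rules can live in a single query, for instance by tagging each matching event with the rule that fired. A sketch, assuming hypothetical fields Temperature and Battery and an output named AlertsOutput:

-- Two rules combined in one statement; extend the CASE and WHERE clauses per rule.
SELECT
    DeviceID,
    Temperature,
    Battery,
    CASE
        WHEN Temperature > 50 THEN 'HighTemperature'
        WHEN Battery < 10 THEN 'LowBattery'
    END AS Rule
INTO AlertsOutput
FROM InputStream
WHERE Temperature > 50 OR Battery < 10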

Stream Analytics Job -> DataLake output

Submitted by 99封情书 on 2020-01-02 10:03:58
Question: I want to set up CI/CD (an ARM template) for a Stream Analytics job whose output is a Data Lake Store. https://docs.microsoft.com/en-us/azure/templates/microsoft.streamanalytics/streamingjobs/outputs#microsoftdatalakeaccounts The issue comes with refreshToken: "It is recommended to put a dummy string value here when creating the data source and then going to the Azure Portal to authenticate the data source which will update this property with a valid refresh token" Furthermore, after 90 days…

Latest value in PowerBi from ASA

Submitted by 你离开我真会死。 on 2019-12-30 10:35:11
Question: Is it possible to show the latest value that has arrived in Power BI from Stream Analytics? In the card visual, for example, I imagine having a filter on a measurementtime field that selects the latest value, or something similar.
Answer 1: The best you can do right now is use Q&A to ask a question like "show value in the last 10 seconds". It's a valid request; could you submit an item through support.powerbi.com? Source: https://stackoverflow.com/questions/32199941/latest-value-in-powerbi-from-asa
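On the Stream Analytics side, one way to approximate this is to push only the most recent reading per short window, so the card effectively shows the latest value. A sketch, assuming hypothetical fields measurementtime and temperature and a Power BI output named PowerBiOutput:

-- Emit only the newest event every 10 seconds.
WITH Latest AS (
    SELECT TopOne() OVER (ORDER BY measurementtime DESC) AS LatestReading
    FROM InputStream TIMESTAMP BY measurementtime
    GROUP BY TumblingWindow(second, 10)
)
SELECT
    LatestReading.measurementtime AS measurementtime,
    LatestReading.temperature AS temperature
INTO PowerBiOutput
FROM Latest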
