Question
I am building a flow for importing data, so that some aggregate columns can be uploaded to an Azure SQL database and later to a tabular model. I would like to describe the flow so someone can comment on it, including pros and cons.
At this stage of development the flow is:
1.
The user imports a CSV file through my web service (ASP.NET Core 2.1) into an Azure SQL database; for the import I use the SqlBulkCopy class in .NET Core. The web service and database are hosted in Azure. Some of the data imports take about 20 minutes.
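The import step can be sketched with `SqlBulkCopy` roughly as follows (the connection string, destination table name, and the CSV-to-`DataTable` parsing are placeholders, not part of the original question):

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumes the CSV has already been parsed into a DataTable whose
// columns match dbo.ImportedData (hypothetical table name).
void BulkImport(DataTable rows, string connectionString)
{
    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.ImportedData";
        bulk.BatchSize = 10_000;   // commit in batches to limit transaction log pressure
        bulk.BulkCopyTimeout = 0;  // no timeout; some imports run ~20 minutes
        bulk.WriteToServer(rows);
    }
}
```

For imports this long, the `WriteToServer(IDataReader)` overload lets you stream the CSV instead of materializing the whole file as a `DataTable`, which keeps the web service's memory use flat.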
2.
When the data import is finished, I call an Azure Machine Learning web service, which computes helper columns so that I can later retrieve data from the tabular model more easily and efficiently with MDX queries. These helper columns indicate, for example, whether a user was active in the previous month.
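Triggering the Azure ML web service from .NET might look like the following sketch; the endpoint URL, API key, and request payload are assumptions, since a published Azure ML web service defines its own input schema:

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

async Task TriggerHelperColumnCalculationAsync(string endpointUrl, string apiKey)
{
    using (var client = new HttpClient())
    {
        // Azure ML web services authenticate with a bearer API key.
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", apiKey);

        // Hypothetical request body; the real schema comes from the
        // published service's API documentation page.
        var payload = new StringContent(
            "{\"Inputs\":{\"input1\":[]},\"GlobalParameters\":{}}",
            Encoding.UTF8, "application/json");

        var response = await client.PostAsync(endpointUrl, payload);
        response.EnsureSuccessStatusCode();
    }
}
```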
3.
When the R script finishes its calculation, it updates the Azure SQL Database table with the new column(s).
4.
When the columns have been updated in the database, I tell Azure Analysis Services to refresh the model. This is not done from the .NET Core service, because .NET Core does not support ADOMD.NET, so I created another web service (.NET Framework 4.7) that performs the refresh automatically.
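For the refresh itself, note that it is the Tabular Object Model (AMO/TOM, `Microsoft.AnalysisServices.Tabular`) that triggers model processing; ADOMD.NET is the query client. A minimal .NET Framework sketch, where the connection string and model name are placeholders:

```csharp
using Microsoft.AnalysisServices.Tabular;

void RefreshModel(string connectionString, string modelName)
{
    using (var server = new Server())
    {
        // e.g. "Provider=MSOLAP;Data Source=asazure://<region>.asazure.windows.net/<server>;User ID=...;Password=..."
        server.Connect(connectionString);

        Database db = server.Databases.FindByName(modelName);
        db.Model.RequestRefresh(RefreshType.Full); // queue a full reprocess of the model
        db.Model.SaveChanges();                    // send the refresh command to the server
    }
}
```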
5.
Finally, the new data appears in the tabular model, and I can query it with MDX using the ADOMD.NET library.
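Querying the refreshed model with MDX via ADOMD.NET could look like this sketch (the server, catalog, and MDX statement are placeholders):

```csharp
using System;
using Microsoft.AnalysisServices.AdomdClient;

void QueryModel(string mdx)
{
    // Hypothetical connection string for an Azure Analysis Services instance.
    var connStr = "Data Source=asazure://<region>.asazure.windows.net/<server>;Catalog=MyModel";
    using (var conn = new AdomdConnection(connStr))
    {
        conn.Open();
        using (var cmd = new AdomdCommand(mdx, conn))
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // Each row holds the axis/measure values of the MDX result.
                Console.WriteLine(reader.GetValue(0));
            }
        }
    }
}
```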
Please tell me if there is a better solution for this flow.
Answer 1:
Azure SQL Database supports in-database R execution for feature engineering, model training, and inference. It is currently in preview but will reach general availability soon: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-machine-learning-services-overview
Also, at //Build, Microsoft announced a serverless compute tier for Azure SQL Database, which is a good fit for low-frequency jobs like this.
This can hopefully simplify your workflow dramatically.
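With Machine Learning Services, the R step could run inside the database via `sp_execute_external_script`, removing the external Azure ML web-service hop entirely. A hedged sketch of invoking it from .NET; the table and column names (`dbo.Users`, `LastActive`, `ActiveLastMonth`) are invented for illustration:

```csharp
using System.Data.SqlClient;

void ComputeHelperColumnsInDatabase(string connectionString)
{
    // The R script receives the rows of @input_data_1 as InputDataSet and
    // returns OutputDataSet; SQL Server marshals the data frames for you.
    const string sql = @"
EXEC sp_execute_external_script
    @language     = N'R',
    @script       = N'
        result <- InputDataSet
        result$ActiveLastMonth <- as.integer(result$LastActive >= Sys.Date() - 30)
        OutputDataSet <- result',
    @input_data_1 = N'SELECT UserId, LastActive FROM dbo.Users'
WITH RESULT SETS ((UserId INT, LastActive DATE, ActiveLastMonth INT));";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        conn.Open();
        cmd.CommandTimeout = 0; // the script may run for a while on large tables
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read()) { /* persist ActiveLastMonth back to the table */ }
        }
    }
}
```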
Source: https://stackoverflow.com/questions/56166966/calculating-helping-columns-on-r-script-on-azure-machine-learning-so-they-could