azure-table-storage

Securing an Azure Table Storage connection string

Question: How do you secure an Azure Table Storage connection string? It is not safe to store it in code or in a plain config file. What is the best practice? Can some kind of certificate-based authentication be used?

Answer 1: You can encrypt the connection strings section of the config file. A certificate can be created from the Visual Studio Command Prompt and used as the encryption key. Once you have the certificate, add it to your solution and encrypt the connection strings section. After that, upload the cert …
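The answer is cut off, but the same idea can also be applied programmatically. The following is a minimal sketch (not the answer's exact method) that protects the connectionStrings section with the built-in RSA protected-configuration provider; the certificate-based provider the answer refers to would be plugged in by name instead, and the "StorageConnection" setting name is a placeholder.

    using System.Configuration;
    using System.Web.Configuration;

    public static class ConfigEncryption
    {
        // Sketch: encrypt the <connectionStrings> section of web.config in place.
        // "RsaProtectedConfigurationProvider" is the built-in provider; a custom
        // certificate-based provider would be referenced by its configured name here.
        public static void ProtectConnectionStrings()
        {
            Configuration config = WebConfigurationManager.OpenWebConfiguration("~");
            ConfigurationSection section = config.GetSection("connectionStrings");

            if (section != null && !section.SectionInformation.IsProtected)
            {
                section.SectionInformation.ProtectSection("RsaProtectedConfigurationProvider");
                config.Save(ConfigurationSaveMode.Modified);
            }
        }
    }

At runtime the protected section is decrypted transparently, so ConfigurationManager.ConnectionStrings["StorageConnection"].ConnectionString keeps returning the plain value without any other code changes.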

Read retry on Azure Table

Question: I often get the error "No connection could be made because the target machine actively refused it" when reading from Azure Table Storage, sometimes on port 80 (HTTP) and sometimes on 443 (HTTPS). I know I can set up retries for writes with .SaveChangesWithRetries(), but how do I apply retries to reads? For reference, I read with code like: DataServiceQuery<datatype> query = tableContext.CreateQuery<datatype>("table"); IQueryable<datatype> results = from q in query select q

Answer 1: In the end I had to use the semi-official Transient Fault Handling …
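The answer is truncated, but it points at the Transient Fault Handling Application Block ("Topaz"). As a rough illustration of that approach only, a read can be wrapped in a retry policy as below; the exact package and namespace names vary between versions, so treat the usings as assumptions, and tableContext/datatype are the question's own objects.

    using System;
    using System.Data.Services.Client;                 // DataServiceQuery<T>
    using System.Linq;
    using Microsoft.Practices.TransientFaultHandling;  // RetryPolicy, FixedInterval, detection strategies
                                                        // (namespaces differ slightly across package versions)

    // Retry transient failures (such as "connection actively refused") up to 3 times,
    // waiting 2 seconds between attempts.
    var retryPolicy = new RetryPolicy<StorageTransientErrorDetectionStrategy>(
        new FixedInterval(3, TimeSpan.FromSeconds(2)));

    var results = retryPolicy.ExecuteAction(() =>
    {
        DataServiceQuery<datatype> query = tableContext.CreateQuery<datatype>("table");
        // Materialize inside the delegate so the HTTP call actually runs under the retry policy.
        return (from q in query select q).ToList();
    });

With the storage client of that era, wrapping the LINQ query with .AsTableServiceQuery() before executing it is another common way to get built-in retries on reads.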

Azure Durable Functions - status is Running even after completing all the tasks

Question: Related to my previous question (link): I have created a durable orchestration with an HttpStart function so that it can be called via an HTTP request (code based on the MSDN sample). In my durable job I wrote logic to loop and call an activity trigger 600 times: public static async Task<List<BusinessRules.RuleResponse>> Run( [OrchestrationTrigger] DurableOrchestrationContext context) { var data = await context.CallActivityAsync<BusinessRules.CustomersData>("XmlDeserialiser"); var tasks = new List<Task<BusinessRules …
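The orchestrator code is cut off, but the pattern being built is the standard fan-out/fan-in. A hypothetical completion in the Durable Functions 1.x style is sketched below; it reuses the question's BusinessRules types and "XmlDeserialiser" activity (not defined here), while "ApplyRule" is a made-up name for the activity called 600 times.

    using System.Collections.Generic;
    using System.Linq;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;   // from the Microsoft.Azure.WebJobs.Extensions.DurableTask package

    public static class RuleOrchestrator
    {
        [FunctionName("RuleOrchestrator")]
        public static async Task<List<BusinessRules.RuleResponse>> Run(
            [OrchestrationTrigger] DurableOrchestrationContext context)
        {
            var data = await context.CallActivityAsync<BusinessRules.CustomersData>("XmlDeserialiser", null);

            // Fan out: schedule all 600 activity calls without awaiting them one by one.
            var tasks = new List<Task<BusinessRules.RuleResponse>>();
            for (int i = 0; i < 600; i++)
            {
                tasks.Add(context.CallActivityAsync<BusinessRules.RuleResponse>("ApplyRule", i));
            }

            // Fan in: the orchestration is only marked Completed after every scheduled task
            // has finished, which is why the runtime status stays "Running" until then.
            var responses = await Task.WhenAll(tasks);
            return responses.ToList();
        }
    }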

Adding a property to an entity in Azure Table Storage seems to add a null property to all entities in the table

Question: I'm trying to dynamically add properties to entities in Azure Table Storage using the ReadingEntity and WritingEntity events, as described in "How to add new properties to an entity saved in Azure Table Storage?". The purpose is to save a List property on my POCO into Table Storage. To accomplish this, every time I need to add a new item to the list for an entity, I get that entity out of Table Storage, count the number of properties with a certain name that the entity already has, and then add …
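The question is truncated, but for comparison: with the later Microsoft.WindowsAzure.Storage SDK the same per-entity property can be added through DynamicTableEntity, without the ReadingEntity/WritingEntity plumbing. A rough sketch follows; the table name, keys, and the "Item_3" property name are placeholders.

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Table;

    public static class DynamicPropertyExample
    {
        public static void AddListItem(string connectionString)
        {
            var account = CloudStorageAccount.Parse(connectionString);
            CloudTable table = account.CreateCloudTableClient().GetTableReference("pocos");

            // A DynamicTableEntity carries an arbitrary property bag for one entity only.
            var entity = new DynamicTableEntity("customer-123", "row-1");
            entity.Properties["Item_3"] = new EntityProperty("newest list item");

            // InsertOrMerge writes only the properties present on this instance, so other
            // entities in the table are not modified. Entities without the new property just
            // show a blank/null cell in tooling, because Table Storage has no fixed schema.
            table.Execute(TableOperation.InsertOrMerge(entity));
        }
    }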

How to set the properties of an Azure Table through the Python SDK

Question: I am trying to enable CORS for a specific Azure storage account/table from the Python SDK. Unfortunately the docs do not cover this topic. From looking here I know that I must use set_table_service_properties() and pass the storage_service_properties argument, but I don't know how this argument is supposed to be formatted. Should I create a dictionary that, when passed to the XML converter, will produce something like this? <?xml version="1.0" encoding="utf-8"?> <StorageServiceProperties> <Logging> …

Query RowKey with greater-than in Azure Table Storage

Question: I used this link and the quoted white paper to work out how to sort data inserted into Table Storage. The entities stored have this simplified schema: public class Bla : TableEntity { public Bla() {} public Bla(string partitionKey) { PartitionKey = partitionKey; // rowkey + partition = guid/pk // used to order blas RowKey = (DateTime.UtcNow.Ticks - DateTime.MinValue.Ticks).ToString(); } } I can easily get a page (maximum page size 1,000) sorted ascending by RowKey like so: var query = …
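The query in the question is truncated. The follow-up the title asks about, a greater-than condition on RowKey for paging forward, can be sketched as follows; Bla comes from the question, and lastSeenRowKey is a placeholder for the last RowKey of the previous page.

    using System.Collections.Generic;
    using System.Linq;
    using Microsoft.WindowsAzure.Storage.Table;

    public static class BlaQueries
    {
        // Fetch the next page of Bla entities whose RowKey sorts after the last one already seen.
        public static List<Bla> NextPage(CloudTable table, string partitionKey, string lastSeenRowKey)
        {
            string filter = TableQuery.CombineFilters(
                TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partitionKey),
                TableOperators.And,
                TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.GreaterThan, lastSeenRowKey));

            // Results come back ordered by RowKey within the partition; Take caps the page size.
            var query = new TableQuery<Bla>().Where(filter).Take(1000);
            return table.ExecuteQuery(query).ToList();
        }
    }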

How would Azure storage be billed?

Question: The "Storage, measured in GB" section of the Azure pricing page (http://www.windowsazure.com/en-us/pricing/details/) says: "Storage is billed in units of the average daily amount of data stored (in GB) over a monthly period. For example, if you consistently utilized 10 GB of storage for the first half of the month and none for the second half of the month, you would be billed for your average usage of 5 GB of storage." I don't clearly understand the term "utilization" here. Let's say I …

Table Storage SDK

Question: I am trying to load data from a CSV file into Azure Table Storage row by row using Python. String columns are inserted directly, but the date column, which appears in the source in the format 2018-02-18T11:29:12.000Z, is still loaded as a string. This means I am unable to query the records by the date column. Is there a way to create an entity definition (a data type for each column) for the table and use it when loading the records, so that dates are not stored as strings?

Return more than 1,000 entities from a Windows Azure Table query

Question: My question is exactly this one. However, the Azure Storage API has changed and all the answers I can find deal with the old version. How do I handle queries that return more than 1,000 items in the current API version? A query fetching fewer than 1,000 items looks like this: var query = new TableQuery<TermScoreEntity>() .Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, Name)); var table = _tableClient.GetTableReference("scores"); foreach (var …
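The snippet is cut off at the foreach. In that generation of the SDK there are two usual ways past the 1,000-entity limit: ExecuteQuery, which follows continuation tokens internally while you enumerate, or an explicit ExecuteQuerySegmented loop. A sketch of the explicit loop, reusing the question's query, _tableClient, Name, and TermScoreEntity:

    using System.Collections.Generic;
    using Microsoft.WindowsAzure.Storage.Table;

    var query = new TableQuery<TermScoreEntity>()
        .Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, Name));
    var table = _tableClient.GetTableReference("scores");

    var allResults = new List<TermScoreEntity>();
    TableContinuationToken token = null;
    do
    {
        // Each segment returns at most 1,000 entities plus a token pointing at the next segment.
        TableQuerySegment<TermScoreEntity> segment = table.ExecuteQuerySegmented(query, token);
        allResults.AddRange(segment.Results);
        token = segment.ContinuationToken;
    } while (token != null);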

Azure Table Partitioning Strategy

Question: I am trying to come up with a partition key strategy based on a DateTime that does not run into the append-only write bottleneck often described in the best-practices guidance. Basically, if you partition by something like YYYY-MM-DD, all writes for a particular day end up in the same partition, which reduces write performance. Ideally, a partition key should distribute writes evenly across as many partitions as possible. To accomplish this while still basing the key on a DateTime …
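The question is truncated, but a common way to finish that thought is to prefix the date with a hash-derived bucket so one day's writes spread over several partitions. A rough, hypothetical sketch (the bucket count and key format are arbitrary choices, not something stated in the question):

    using System;

    public static class PartitionKeys
    {
        // Spreads writes for a single day across `bucketCount` partitions by prefixing
        // the date with a bucket derived from some stable per-entity value (e.g. a device id).
        public static string ForDate(DateTime timestampUtc, string entityId, int bucketCount = 16)
        {
            // Note: string.GetHashCode() is not stable across processes/runtimes;
            // a real implementation would use a deterministic hash (e.g. MD5 or FNV-1a).
            int bucket = (entityId.GetHashCode() & 0x7fffffff) % bucketCount;

            // e.g. "07_2019-12-24": queries for a given day now fan out over 16 partitions,
            // which is the usual trade-off for removing the append-only write hotspot.
            return $"{bucket:D2}_{timestampUtc:yyyy-MM-dd}";
        }
    }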