azure-storage-blobs

AzureBlobStorage Connector - CreateFile/DeleteFile - Id, relationship with Azure Storage Blobs client library for .NET

Submitted on 2021-01-29 09:27:03
Question: I have a PowerApp that uses the AzureBlobStorage connector (the connector). However, this app has to interact with data that is bulk uploaded using the Azure Storage Blobs client library for .NET (the api). When you create a blob using the connector you get an Id, which can then be used to delete the blob. However, when creating blobs with the api I cannot see how to get that Id (you just use the blob id, which is the filename). Hence data that is bulk created cannot be deleted in

Generating ZIP files in azure blob storage

Submitted on 2021-01-29 08:46:08
Question: What is the best method to zip large files present in Azure blob storage and download them to the user as an archive file (zip/rar)? Can using Azure Batch help? Currently we implement this function in a traditional way: we read the stream, generate the zip file, and return the result, but this takes many resources on the server and much time for users. I'm asking about the best technical solution (preferably using Microsoft technologies). Answer 1: There are a few ways you can do this **from azure
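One lightweight approach is to build the archive in memory on the server with the standard library and stream it back. The sketch below separates the zip-building step (pure Python, runnable anywhere) from the Azure wiring, which is shown only as hedged comments; the container and connection-string names there are placeholders, not from the original question.

```python
import io
import zipfile

def build_zip(named_streams):
    """Pack an iterable of (name, bytes) pairs into an in-memory ZIP archive.

    Note: this holds the whole archive in memory; for very large blobs a
    temp file or a streaming-zip library would be a better fit.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for name, data in named_streams:
            zf.writestr(name, data)
    buf.seek(0)
    return buf

# Hypothetical Azure wiring (names are placeholders):
# from azure.storage.blob import ContainerClient
# container = ContainerClient.from_connection_string(conn_str, "mycontainer")
# pairs = ((b.name, container.download_blob(b.name).readall())
#          for b in container.list_blobs())
# archive = build_zip(pairs)  # then return archive as the HTTP response body
```

Because `build_zip` takes a plain iterable, the blobs can be downloaded lazily one at a time instead of loading them all up front.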

Microsoft Azure Storage denying my OneClick application from downloading

Submitted on 2021-01-29 06:09:58
Question: Through MAS, I have a blob container set up with the public access level set to "Public read access for blobs only", allowing a download of the setup.exe by visiting "https://myblob.blob.core.windows.net/myprogram/myprogram.htm". The problem stems from actually running setup.exe, in that it gives me: The following properties have been set: Property: [AdminUser] = true {boolean} Property: [InstallMode] = HomeSite {string} Property: [NTProductType] = 1 {int} Property: [ProcessorArchitecture] =

read excel files from “input” blob storage container and export to csv in “output” container with python

Submitted on 2021-01-29 05:02:43
Question: I'm trying to develop a script in Python to read an .xlsx file from a blob storage container called "source", convert it to .csv, and store it in a new container (I'm testing the script locally; if it works I will include it in an ADF pipeline). So far I have managed to access the blob storage, but I'm having problems reading the file content. from azure.storage.blob import BlobServiceClient, ContainerClient, BlobClient import pandas as pd conn_str = "DefaultEndpointsProtocol=https
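A common pattern for this is to download the blob as bytes, hand them to pandas through a `BytesIO` wrapper, and upload the CSV encoding to the output container. The sketch below is a minimal version of that idea; the container and blob names in the commented wiring are hypothetical, and `pd.read_excel` assumes `openpyxl` is installed for .xlsx support.

```python
import io

import pandas as pd

def xlsx_bytes_to_csv_bytes(xlsx_bytes):
    """Parse .xlsx content downloaded from a blob and re-encode it as CSV."""
    df = pd.read_excel(io.BytesIO(xlsx_bytes))  # requires openpyxl for .xlsx
    return frame_to_csv_bytes(df)

def frame_to_csv_bytes(df):
    """The CSV-encoding step on its own, for data already in a DataFrame."""
    return df.to_csv(index=False).encode("utf-8")

# Hypothetical Azure wiring (container/blob names are placeholders):
# from azure.storage.blob import BlobServiceClient
# svc = BlobServiceClient.from_connection_string(conn_str)
# data = svc.get_blob_client("source", "report.xlsx").download_blob().readall()
# svc.get_blob_client("output", "report.csv").upload_blob(
#     xlsx_bytes_to_csv_bytes(data), overwrite=True)
```

The key point is that `download_blob().readall()` returns bytes, and pandas never needs a file on disk when given a `BytesIO` stream.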

How to upload files to azure blob storage?

Submitted on 2021-01-28 19:00:45
Question: I need to upload files to an Azure blob container every day from a local system. I use azcopy with SAS for doing it. But what I encountered is that the SAS for the container keeps changing on every refresh. So is there any better way I can upload files using Python or azcopy? Or is there any way to get the SAS token from Azure without logging in and pass that SAS token to the azcopy command? As of now I use this command with azcopy: .\azcopy "Sourcefilepath" "Destblobpath?SAS_Token" --recursive=true Each day I

Azure Block Blob: “The specified block list is invalid” when committing previously committed blocks

Submitted on 2021-01-28 13:18:27
Question: Using the Azure.Storage.Blobs .NET SDK v12, I'm trying to append to a block blob. Based on various documentation, it seems the way to do this is to commit a list of previously committed block IDs along with new block IDs. I've made sure that the block IDs are all the same length. But when I try to commit a block ID that has already been committed, I get a "400: The specified block list is invalid" error. Here is some simplified code which illustrates the problem: // Create a blob container and
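The same append flow exists in the Python v12 SDK, and the constraint is the same: every block ID in the committed list must be a valid base64 string, and the service rejects a commit whose IDs have differing lengths. A sketch of a fixed-width ID scheme, with the stage/commit wiring shown as hedged comments (the client objects are placeholders):

```python
import base64

def make_block_id(index, width=6):
    """Build a fixed-width, base64-encoded block ID. All IDs in one commit
    must decode cleanly and have the same encoded length, or the service
    answers "400: The specified block list is invalid"."""
    return base64.b64encode(f"{index:0{width}d}".encode()).decode()

# Append flow sketch in the Python v12 SDK (client names are placeholders):
# from azure.storage.blob import BlobBlock
# committed, _ = blob_client.get_block_list("committed")
# ids = [b.id for b in committed]          # previously committed block IDs
# new_id = make_block_id(len(ids))
# blob_client.stage_block(new_id, b"more data")
# blob_client.commit_block_list([BlobBlock(i) for i in ids + [new_id]])
```

Reading the committed list back with `get_block_list` before each commit avoids re-deriving IDs that may not match what was actually committed.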

Copy an archived blob to an online tier

Submitted on 2021-01-28 09:21:38
Question: I need to copy a blob from the archive tier to the hot tier in another container. If I use the StartCopy method I get a "This operation is not permitted on an archived blob" error. Here is my code: CloudBlockBlob blobSource = (CloudBlockBlob)item; CloudBlockBlob blobTarget = ArchiveContainer.GetBlockBlobReference(blobSource.Name); blobTarget.StartCopy(blobSource); It should be possible to do based on this article, but I didn't find any code sample. Is it possible to do with Microsoft.Azure
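The article the question refers to describes copying an archived source to a new destination blob while requesting an online tier on the copy itself. A minimal sketch of that idea using the Python v12 SDK's `start_copy_from_url` (a different SDK than the Microsoft.Azure classic library in the question, named here as an alternative); the tier and priority strings follow the documented values, and the blob clients are assumed to exist:

```python
def rehydrate_copy(source_blob_client, target_blob_client):
    """Copy an archived blob to an online tier in another container by
    requesting the destination tier on the copy operation, rather than
    copying first and changing the tier afterwards."""
    return target_blob_client.start_copy_from_url(
        source_blob_client.url,
        standard_blob_tier="Hot",       # destination lands in the hot tier
        rehydrate_priority="Standard",  # or "High" for faster rehydration
    )
```

The copy is asynchronous on the service side; the destination stays in a rehydrate-pending state until the archived data is available.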

Azure Blob Error: StorageException: The condition specified using HTTP conditional header(s) is not met

Submitted on 2021-01-28 09:01:40
Question: So I have a function that simply downloads text from blob storage every 10 minutes and checks for a result. This function can run for days, but it often (roughly every day) fails before finishing with the following error: Caused by: com.microsoft.azure.storage.StorageException: The condition specified using HTTP conditional header(s) is not met. My code is pretty simple: public String downloadTextBlob(CloudBlobDirectory dir, String filename) { try { return dir.getBlockBlobReference(filename)
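This error typically appears when the blob is overwritten between the first read and a later conditional (etag-matched) read of the same download, so a fresh attempt usually succeeds. A generic retry wrapper illustrating that recovery strategy, sketched in Python rather than the question's Java (the `download_text_blob` and `StorageException` names in the comment stand in for the question's own code):

```python
import time

def retry(fn, attempts=3, delay_seconds=10, retriable=(Exception,)):
    """Re-run a whole download when it fails mid-way: a 'conditional
    header(s) not met' failure suggests the blob changed underneath the
    read, and a new attempt sees a consistent version."""
    for attempt in range(attempts):
        try:
            return fn()
        except retriable:
            if attempt == attempts - 1:
                raise  # out of attempts; surface the original error
            time.sleep(delay_seconds)

# e.g. retry(lambda: download_text_blob(directory, filename),
#            retriable=(StorageException,))  # names from the question's code
```

Narrowing `retriable` to the storage exception type keeps unrelated bugs from being silently retried.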

How to upload a large string in an Azure Blob?

Submitted on 2021-01-28 08:16:37
Question: Right now I'm trying to figure out how to work with Azure, and now I'm stuck on a problem while storing my data in the storage account. I have three strings and want to store each of them in a separate blob. With the first two, my code works fine, but the third one causes some retries and ends with a timeout. My code is running within an Azure Function. Here is a minimal example: from azure.storage.blob import BlobClient blob_client = BlobClient.from_connection_string( conn_str = '<STORAGE
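One common cause of retries and timeouts on a large payload is sending it as a single oversized PUT; staging it block by block keeps each request small. The chunking step is pure Python and shown runnable below, while the block-staging wiring with the v12 SDK is sketched as hedged comments (the `blob_client` and `big_string` names are placeholders):

```python
def chunked(data, chunk_size=4 * 1024 * 1024):
    """Split a payload into chunks so it can be staged block-by-block
    instead of uploaded in one oversized request."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

# Hypothetical wiring with the v12 SDK: stage each chunk, then commit.
# import base64, uuid
# from azure.storage.blob import BlobBlock
# ids = []
# for part in chunked(big_string.encode()):
#     bid = base64.b64encode(uuid.uuid4().hex.encode()).decode()
#     blob_client.stage_block(bid, part)
#     ids.append(bid)
# blob_client.commit_block_list([BlobBlock(i) for i in ids])
```

Alternatively, `upload_blob` can be tuned instead of hand-chunking, e.g. by lowering the single-put threshold when constructing the client so the SDK chunks transparently.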