I am working on saving files to a SQL blob in a varbinary(max) column, and I have the save side of things working now (I believe). What I can't figure out is how to read the data back out.
You are making it more difficult than it needs to be. The example below uses MySQL just because it was handy; the ADO.NET providers all work pretty much the same way. Some things will need to be tweaked to handle very large data items (more of a server setting than a DB provider issue).
string sql = "INSERT INTO BlobDemo (filename, fileType, fileData) VALUES (@name, @type, @data)";
byte[] imgBytes;
using (MySqlConnection dbCon = new MySqlConnection(MySQLConnStr))
using (MySqlCommand cmd = new MySqlCommand(sql, dbCon))
{
string ext = Path.GetExtension(filename);
dbCon.Open();
cmd.Parameters.Add("@name", MySqlDbType.String).Value = "ziggy";
cmd.Parameters.Add("@data", MySqlDbType.Blob).Value = File.ReadAllBytes(filename);
cmd.Parameters.Add("@tyoe", MySqlDbType.String).Value = ext;
int rows = cmd.ExecuteNonQuery();
}
The file data is fed directly to the DB provider.
Is there a way to get the full path, including the file extension, of a file stored in an SQL blob?
No. Your code and the code above save only the bytes which make up an image or any other file; the original path is not stored anywhere unless you save it yourself.
This will read the data back, save it to a file, and start the associated app:

string SQL = "SELECT fileName, fileData, fileType FROM BlobDemo WHERE Id = @id";
string ext = "";
// temp file name without an extension; the real extension is appended below
string tempFile = Path.Combine(@"C:\Temp\Blobs\",
        Path.GetFileNameWithoutExtension(Path.GetTempFileName()));

using (MySqlConnection dbCon = new MySqlConnection(MySQLConnStr))
using (MySqlCommand cmd = new MySqlCommand(SQL, dbCon))
{
    cmd.Parameters.Add("@id", MySqlDbType.Int32).Value = 14;
    dbCon.Open();

    using (MySqlDataReader rdr = cmd.ExecuteReader())
    {
        if (rdr.Read())
        {
            // the extension stored with the data tells us what kind of file it is
            ext = rdr.GetString(2);
            File.WriteAllBytes(tempFile + ext, (byte[])rdr["fileData"]);
        }
    }

    // OS run test: open the new file in whatever app is associated with the extension
    Process prc = new Process();
    prc.StartInfo.FileName = tempFile + ext;
    prc.Start();
}
In both cases, File.ReadAllBytes() and File.WriteAllBytes() will do most of the work for you, no matter the file type. There is no need to scoop out the data 1 KB at a time. If the blob is something like an image you wish to use in the app:
using (MySqlDataReader rdr = cmd.ExecuteReader())
{
    if (rdr.Read())
    {
        ext = rdr.GetString(2);
        using (MemoryStream ms = new MemoryStream((byte[])rdr["fileData"]))
        using (Image tmp = Image.FromStream(ms))
        {
            // copy the image so the stream can be disposed; GDI+ otherwise
            // requires the stream to stay open for the lifetime of the Image
            picBox.Image = new Bitmap(tmp);
        }
    }
}
The blob bytes can be fed straight to the MemoryStream; there is no need to create a temp file at all when the goal is just to display the image.
In all, Ceiling Cat made it back just fine (the image was 1.4 MB, shown zoomed; another test with a 15.4 MB image also worked; both are larger than I would care to store in a DB).
Depending on how this is used, consider archiving the images somewhere on the file system and saving just the filename, perhaps with the Id added to ensure the names are unique and to visually link them to the record. Not only will large blobs of data bloat the DB, but there is obviously some overhead involved in converting to and from bytes which can be avoided.
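A minimal sketch of that file-system approach; the archive folder, recordId, and the reuse of the fileName column are assumptions for illustration:

// Sketch only: the archive folder and recordId are hypothetical
string archiveDir = @"C:\AppData\ImageArchive";
string archiveName = recordId.ToString() + "_" + Path.GetFileName(sourceFile);
File.Copy(sourceFile, Path.Combine(archiveDir, archiveName));

// store only the unique file name in the DB, not the bytes
string sql = "UPDATE BlobDemo SET fileName = @name WHERE Id = @id";
using (MySqlConnection dbCon = new MySqlConnection(MySQLConnStr))
using (MySqlCommand cmd = new MySqlCommand(sql, dbCon))
{
    cmd.Parameters.Add("@name", MySqlDbType.String).Value = archiveName;
    cmd.Parameters.Add("@id", MySqlDbType.Int32).Value = recordId;
    dbCon.Open();
    cmd.ExecuteNonQuery();
}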
If you want or need to delete these at some point after the associated app is done with them (not really a component of the question), use a temp file in a specific directory so you can delete everything in it (conditionally¹) when the app ends, or at start up:
private string baseAppPath = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData),
    "Company Name", "Product Name", "Temp Files");
Append a temp filename and the actual extension for individual files. Alternatively, you could maintain a List<string> trashCan to store the name of each file you create, to be deleted later.
¹ Whenever you do delete them, do allow that files could still be open in the app associated with the extension.
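A minimal cleanup sketch along those lines, using the baseAppPath folder above; the catch allows for files still held open by their associated app:

// delete leftover temp files at start up or shutdown
foreach (string file in Directory.GetFiles(baseAppPath))
{
    try
    {
        File.Delete(file);
    }
    catch (IOException)
    {
        // the file is still open in the associated app; skip it this time
    }
}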
With the .NET SQL Server provider, you can use a little-known but cool class called SqlBytes. It was designed specifically to map varbinary fields, but there are not many examples of how to use it.
Here is how you can save to the database with it (you can use a stored procedure or direct SQL as I demonstrate here; we just presume MyBlobColumn is a varbinary column).
string inputPath = "YourInputFile";
using (var conn = new SqlConnection(YourConnectionString))
{
    conn.Open();
    using (var cmd = conn.CreateCommand())
    {
        // note we define a '@blob' parameter in the SQL text
        cmd.CommandText = "INSERT INTO MyTable (Id, MyBlobColumn) VALUES (1, @blob)";
        using (var inputStream = File.OpenRead(inputPath))
        {
            // open the file and map it to an SqlBytes instance
            // that we use as the parameter value
            var bytes = new SqlBytes(inputStream);
            cmd.Parameters.AddWithValue("@blob", bytes);

            // under the covers, the provider pulls the inputStream through the SqlBytes parameter
            cmd.ExecuteNonQuery();
        }
    }
}
To read the file out into a stream from the database, here is how you can do it.
string outputPath = "YourOutputFile";
using (var conn = new SqlConnection(YourConnectionString))
{
    conn.Open();
    using (var cmd = conn.CreateCommand())
    {
        // this is a regular direct SQL command, but you can use a stored procedure as well
        cmd.CommandText = "SELECT MyBlobColumn FROM MyTable WHERE Id = 1";

        // note the usage of SequentialAccess to lower memory consumption (read the docs for more)
        using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            if (reader.Read())
            {
                // again, we map the result to an SqlBytes instance
                var bytes = reader.GetSqlBytes(0); // column ordinal, here 1st column -> 0

                // File.Create truncates any existing file; the target could be
                // any stream (an asp.net response, a MemoryStream, etc.)
                using (var file = File.Create(outputPath))
                {
                    bytes.Stream.CopyTo(file);
                }
            }
        }
    }
}
With these techniques, we never allocate any byte[] or MemoryStream instances; the data simply flows between the file stream and SQL Server.
You will need to use a SqlCommand to retrieve data stored in a varbinary(MAX) column unless you use a FileTable, which exposes the contents via a UNC path, much like regular files on a file system, but managed by SQL Server.
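If the FileTable route fits, here is a minimal sketch of locating a blob through its UNC path; the dbo.Documents FileTable and its name column are assumptions for illustration:

using (var conn = new SqlConnection(YourConnectionString))
using (var cmd = conn.CreateCommand())
{
    // GetFileNamespacePath(1) returns the full UNC path of the row's file;
    // dbo.Documents and the name filter are hypothetical
    cmd.CommandText = "SELECT file_stream.GetFileNamespacePath(1) " +
                      "FROM dbo.Documents WHERE name = @name";
    cmd.Parameters.AddWithValue("@name", "report.pdf");
    conn.Open();
    string uncPath = (string)cmd.ExecuteScalar();

    // with the UNC path in hand, ordinary file APIs work
    byte[] contents = File.ReadAllBytes(uncPath);
}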
If the blob may be large, the "chunk" technique you are currently using will reduce memory requirements, but at the expense of more verbose code. For reasonably sized blobs, you can read the entire column contents at once, without chunking; whether that is feasible depends on both the size of the blob and the memory available on the client.
var buffer = (byte[])cmd.ExecuteScalar(); // the entire blob in one call
using (var fs = File.Create(outputPath))
    fs.Write(buffer, 0, buffer.Length);
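For comparison, a sketch of the chunked technique using SequentialAccess and the reader's GetBytes method; the 8 KB buffer size is arbitrary:

// streamed read: only one small buffer is ever allocated
using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
using (var fs = File.Create(outputPath))
{
    if (reader.Read())
    {
        var buffer = new byte[8192];
        long offset = 0;
        long read;
        // GetBytes returns the number of bytes copied; 0 means end of data
        while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
        {
            fs.Write(buffer, 0, (int)read);
            offset += read;
        }
    }
}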