46

I have a table named Blob (Id (int), Data (Image)). I need to use SqlDataReader to get that image data. Note that I don't want to Response.BinaryWrite() the data to the browser. I just need that binary data as byte[] to use for some internal operations. The only way I can think of is getting the id using SqlDataReader and then again using SqlCommand.ExecuteScalar() to get the data as byte[] for a given id. Can I use just the SqlDataReader (SqlCommand.ExecuteReader) to get that image data as byte[]? Am I missing anything?

7 Answers

76

You should be able to get it via: (byte[])reader["Data"].

Also note that the image data type is deprecated, and will be removed in a future version of SQL Server; use varbinary(max) instead.
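A minimal sketch of that cast, assuming the Blob table from the question (the connection string and id value are placeholders; the cast throws if the value is NULL, hence the IsDBNull guard):

```csharp
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT Data FROM Blob WHERE Id = @id", conn))
{
    cmd.Parameters.AddWithValue("@id", 42);
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        if (reader.Read() && !reader.IsDBNull(0))
        {
            // The indexer returns object; for image/varbinary columns that object is a byte[].
            byte[] data = (byte[])reader["Data"];
        }
    }
}
```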

2
  • 3
    I don't even see an Items collection on the SqlDataReader in .NET 4.0.
    – Zack
    Commented May 22, 2015 at 13:50
  • 1
    Is the Items collection deprecated?
    – AJ_83
    Commented Jun 3, 2015 at 14:33
20

Yes, you can use SqlDataReader.GetBytes. You probably want to pass in null for the buffer in the first call, to find out how much data there is, and then call it again with an appropriately-sized buffer.

You may just be able to use the indexer and cast the result to a byte array - I'm not sure. Worth a try :)
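A sketch of the two-call pattern described above, assuming the column ordinal is 0 and the whole value fits comfortably in memory:

```csharp
// First call with a null buffer: returns the total length without copying any data.
long size = reader.GetBytes(0, 0, null, 0, 0);

// Second call: fill a buffer of exactly that size in one go.
byte[] data = new byte[size];
reader.GetBytes(0, 0, data, 0, (int)size);
```

The cast of size to int assumes the value is under 2 GB, which is also the upper bound of a .NET byte[] index.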

3
  • 1
    int ndx = rdr.GetOrdinal("<ColumnName>"); if(!rdr.IsDBNull(ndx)) { long size = rdr.GetBytes(ndx, 0, null, 0, 0); //get the length of data byte[] values = new byte[size]; int bufferSize = 1024; long bytesRead = 0; int curPos = 0; while (bytesRead < size) { bytesRead += rdr.GetBytes(ndx, curPos, values, curPos, bufferSize); curPos += bufferSize; } } Commented Mar 20, 2011 at 20:33
  • 2
    @ydobonmai: Why bother with a different buffer size when you already have a buffer of the right size? And you should use bytesRead instead of curPos; you're currently always assuming that it reads a full buffer amount.
    – Jon Skeet
    Commented Mar 21, 2011 at 6:24
  • 1
    SqlDataReader.GetBytes() sounds like a very efficient method since you can reuse the same array (if it has the proper size). So you don't create a new array for each column as (byte[])reader["Data"] would do.
    – mmmmmmmm
    Commented May 6, 2015 at 19:24
16

In .NET Framework 4.5 you can use the GetStream method to access the binary data as a Stream.
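A sketch of that approach, assuming the reader was opened with CommandBehavior.SequentialAccess (advisable for large BLOBs) and the binary column is at ordinal 1:

```csharp
using System.IO;

// GetStream exposes the column as a Stream without materializing it all up front.
using (Stream blobStream = reader.GetStream(1))
using (var ms = new MemoryStream())
{
    blobStream.CopyTo(ms);
    byte[] data = ms.ToArray();
}
```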

7

From MSDN. Not sure why I couldn't find that before.

    SqlConnection pubsConn = new SqlConnection("Data Source=localhost;Integrated Security=SSPI;Initial Catalog=pubs;");
    SqlCommand logoCMD = new SqlCommand("SELECT pub_id, logo FROM pub_info", pubsConn);

    FileStream fs;                          // Writes the BLOB to a file (*.bmp).
    BinaryWriter bw;                        // Streams the BLOB to the FileStream object.

    int bufferSize = 100;                   // Size of the BLOB buffer.
    byte[] outbyte = new byte[bufferSize];  // The BLOB byte[] buffer to be filled by GetBytes.
    long retval;                            // The bytes returned from GetBytes.
    long startIndex = 0;                    // The starting position in the BLOB output.

    string pub_id = "";                     // The publisher id to use in the file name.

    // Open the connection and read data into the DataReader.
    pubsConn.Open();
    SqlDataReader myReader = logoCMD.ExecuteReader(CommandBehavior.SequentialAccess);

    while (myReader.Read())
    {
      // Get the publisher id, which must occur before getting the logo.
      pub_id = myReader.GetString(0);  

      // Create a file to hold the output.
      fs = new FileStream("logo" + pub_id + ".bmp", FileMode.OpenOrCreate, FileAccess.Write);
      bw = new BinaryWriter(fs);

      // Reset the starting byte for the new BLOB.
      startIndex = 0;

      // Read the bytes into outbyte[] and retain the number of bytes returned.
      retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);

      // Continue reading and writing while there are bytes beyond the size of the buffer.
      while (retval == bufferSize)
      {
        bw.Write(outbyte);
        bw.Flush();

        // Reposition the start index to the end of the last buffer and fill the buffer.
        startIndex += bufferSize;
        retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);
      }

      // Write any remaining bytes; the BLOB size need not be an exact multiple of the buffer size.
      if (retval > 0)
          bw.Write(outbyte, 0, (int)retval); // the original MSDN sample wrote retval - 1 bytes here, a bug
      bw.Flush();

      // Close the output file.
      bw.Close();
      fs.Close();
    }

    // Close the reader and the connection.
    myReader.Close();
    pubsConn.Close();
2
  • 5
    One note on this: if you don't set CommandBehavior.SequentialAccess on the ExecuteReader call (eg getting the reader from another part of a data layer), it will do horrible things to your memory (creating a blob-sized byte array for every call to GetBytes()); This solution should ONLY be used with CommandBehavior.SequentialAccess as demonstrated here.
    – Tao
    Commented May 27, 2011 at 14:51
  • 5
    Could anyone offer comments on how one decides on the buffer size? The example uses 100, but why not 10 and why not 10,000? What are the factors that one needs to consider? Commented Nov 27, 2013 at 7:50
5

Use this function for safe and flexible byte reading:

    /// <summary>
    /// Reads all available bytes from reader
    /// </summary>
    /// <param name="reader"></param>
    /// <param name="ordinal"></param>
    /// <returns></returns>
    private byte[] GetBytes(SqliteDataReader reader, int ordinal)
    {
        byte[] result = null;

        if (!reader.IsDBNull(ordinal))
        {
            long size = reader.GetBytes(ordinal, 0, null, 0, 0); // get the length of the data
            result = new byte[size];
            const int bufferSize = 1024;
            long bytesRead = 0;
            while (bytesRead < size)
            {
                // Read at most bufferSize bytes, and advance by the number of bytes
                // actually returned (GetBytes may return fewer than requested).
                int chunk = (int)Math.Min(bufferSize, size - bytesRead);
                bytesRead += reader.GetBytes(ordinal, bytesRead, result, (int)bytesRead, chunk);
            }
        }

        return result;
    }
2
  • Since you already have the correct size, better do int bufferSize = size > int.MaxValue ? int.MaxValue : (int)size;
    – marsze
    Commented Nov 7, 2018 at 11:18
  • 1
    In practice there is no need for the loop. Just call reader.GetBytes once with the correct size parameter : reader.GetBytes(ordinal, 0, result, 0, result.Length)
    – ZunTzu
    Commented Dec 20, 2019 at 10:19
2

No need to use the reader. Just use a DataSet to fetch the values from the database (via a stored procedure or any other method), cast the value to byte[] (code below), and store it in a byte array. Your work is done.

byte[] productImage;
productImage = (byte[])ds.Tables[0].Rows[0]["Image"];
1
  • Good, if it works.
    – Zeek2
    Commented Jan 20 at 9:51
1

This is an old question, and I had been using Anton Bakulev's answer above for a while, until I ran into a case where my data was actually bigger than the 2 GB that int curPos could handle. When I tried changing the bufferIndex argument to 0, anything beyond bufferSize came back corrupted. (Also, that tiny buffer size made loading anything above 2 MB a literal PAIN.)

No, you probably should not have over 2 GB of data in a single column in your database; try to avoid that. But just in case, here is a more robust, streamlined version of the code, as a SqlDataReader extension method:

public static byte[] ParseStrictByteArray(this SqlDataReader reader, string columnName)
{
    int colIdx = reader.GetOrdinal(columnName);
    long size = reader.GetBytes(colIdx, 0, null, 0, 0);
    // Note: allocating more than 2 GB in one array also requires <gcAllowVeryLargeObjects>.
    byte[] imageValue = new byte[size];
    // Essentially, we are loading all this data in memory either way...
    // Might as well do it in one swoop if we can.
    int bufferSize = (int)Math.Min(int.MaxValue, size);
    // int.MaxValue = 2,147,483,647, roughly 2 GB, so if the data > 2 GB we have to read in chunks.

    if (size > bufferSize)
    {
        // long, so the field offset does not overflow past 2 GB.
        long position = 0;
        // We need to copy the data over in chunks, but not too small a buffer:
        // multiple calls to the reader also hurt performance a lot.
        bufferSize = 104857600; // roughly 100 MB
        byte[] buffer = new byte[bufferSize];
        while (position < size)
        {
            // GetBytes may return fewer bytes than requested; advance by the actual count.
            long read = reader.GetBytes(colIdx, position, buffer, 0,
                (int)Math.Min(bufferSize, size - position));
            // Shift the buffer into the final array.
            Array.Copy(buffer, 0, imageValue, position, read);
            position += read;
        }
    }
    else
    {
        // Single read into the image buffer.
        reader.GetBytes(colIdx, 0, imageValue, 0, bufferSize);
    }

    return imageValue;
}
2
  • Interesting. Couldn't you more simply set buffSize = Integer.MaxValue ??
    – Zeek2
    Commented Jan 20 at 9:58
  • @Zeek2 Um, this was a while ago so I don't necessarily remember this very well, but if you look at the else portion of where it does a straight reader.GetBytes, it uses the bufferSize, so if you leave the code as is, you wouldn't want to change the initial declaration. However, if you change the else reader.GetBytes line to use size rather than bufferSize, it would probably work just fine, indeed.
    – krokador
    Commented Jan 21 at 13:31
