I was advised a while ago that it was commonplace to use std::vector as an exception-safe dynamic array in C++ rather than allocating raw arrays... for example
MSVC does range checking in operator[] even in release builds. I don't know whether that is standard-compliant. (I actually found debug code in their implementation that made it break correct code.) There is a switch to disable it, though.
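For reference, the switch is a preprocessor macro. A minimal sketch, assuming MSVC 2005/2008 where _SECURE_SCL controls the release-build checking (from MSVC 2010 onwards the corresponding knob is _ITERATOR_DEBUG_LEVEL):

// Define before any standard header, and identically in every translation unit,
// otherwise containers in different TUs end up with mismatched layouts.
#define _SECURE_SCL 0                // disable checked iterators in release (VC8/VC9)
// #define _ITERATOR_DEBUG_LEVEL 0   // the VC10+ equivalent

#include <vector>

int main()
{
    std::vector<char> v(10);
    v[9] = 'x';                      // unchecked operator[] once the macro is 0
}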
Could you use iterators instead of pointers?
{
    std::vector<char> scoped_array(size);
    std::vector<char>::iterator pointer = scoped_array.begin();
    // do work
}   // exception-safe deallocation when scoped_array goes out of scope
This brought an interesting question to my mind, which I promptly asked here. In your case, you can avoid using pointers in the following way:
template<class InputIterator, class OutputIterator>
OutputIterator copy_n(InputIterator first, InputIterator last,
                      OutputIterator result, std::size_t n)
{
    // Copy at most n elements, stopping early if the input range is exhausted.
    for (std::size_t i = 0; i < n; ++i) {
        if (first == last)
            break;
        *result++ = *first++;
    }
    return result;
}
std::ifstream file("path_to_file");
std::vector<char> buffer;
buffer.reserve(n);               // reserve rather than size: back_inserter appends
copy_n(std::istream_iterator<char>(file),
       std::istream_iterator<char>(),
       std::back_inserter(buffer),
       n);
This will copy at most n chars from the file into the buffer. When you iterate over the buffer, use:
for (std::vector<char>::iterator it = buffer.begin(); it != buffer.end(); ++it)
instead of a counter.
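One caveat, assuming the goal is a byte-for-byte copy: std::istream_iterator<char> uses formatted extraction and silently skips whitespace. A minimal self-contained sketch using std::istreambuf_iterator instead, reusing the copy_n above (the path and n are placeholders):

#include <cstddef>
#include <fstream>
#include <iterator>
#include <vector>

int main()
{
    const std::size_t n = 1024;           // placeholder size
    std::ifstream file("path_to_file");   // placeholder path

    std::vector<char> buffer;
    buffer.reserve(n);

    // Unformatted input: whitespace and newlines are preserved.
    copy_n(std::istreambuf_iterator<char>(file),
           std::istreambuf_iterator<char>(),
           std::back_inserter(buffer),
           n);

    for (std::vector<char>::iterator it = buffer.begin(); it != buffer.end(); ++it) {
        // use *it
    }
}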
As far as the C++ standard is concerned, operator[] isn't guaranteed not to check; it's just that (unlike at()) it's not guaranteed to check.
You'd expect that in a non-checking implementation, &scoped_array[scoped_array.size()] would result in a legal pointer either within, or one past the end of, an array allocated by the vector. This isn't explicitly guaranteed, but for a given implementation you could verify it by looking at the source. For an empty vector there might not be an allocation at all (as an optimisation), and I don't see anything in the vector part of the standard which defines the result of scoped_array[0] other than Table 68.
Going from Table 68, you might say that the result of your expression is &*(a.begin() + 0), which illegally dereferences a past-the-end iterator. If your implementation's vector iterator is just a pointer then you probably get away with this; if not, you might not, and obviously yours isn't.
I forget the outcome of the argument over whether &* on a pointer that must not be dereferenced is a no-op or not. IIRC it's not clear from the standard (some ambiguity somewhere), which provoked requests to fix the standard to make it explicitly legal. That suggests it does in fact work on all or most known implementations.
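To make the hazard concrete, here is the empty-vector case the above is worried about (a sketch; whether it "works" depends entirely on your implementation):

std::vector<char> v;    // size() == 0, possibly no allocation at all
char* p = &v[0];        // i.e. &*(v.begin() + 0): dereferences a past-the-end
                        // iterator; undefined behaviour, and a checked
                        // implementation will assert right here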
Personally I wouldn't rely on this, and I wouldn't disable the checking. I'd rewrite your code:
char* pointer = (scoped_array.size() > 0) ? &scoped_array[0] : 0;
Or in this case just:
char* pointer = (n > 0) ? &scoped_array[0] : 0;
It just looks wrong to me to use index n of a vector without knowing that the size is at least n+1, regardless of whether it actually works in your implementation once you've disabled the checking.
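Concretely, the guarded pointer would be used like this (a sketch; file and n come from the original code, and the read is simply skipped when the buffer is empty):

std::vector<char> scoped_array(n);
char* pointer = (n > 0) ? &scoped_array[0] : 0;
if (pointer)
    file.read(pointer, n);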
If you want cleaner behaviour in this scenario, you could replace the use of a[0] with a.at(0), which will throw if the index is invalid.
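For example (a minimal sketch; checked_pointer is just an illustrative helper name, and std::out_of_range is what at() is specified to throw for a bad index):

#include <stdexcept>
#include <vector>

char* checked_pointer(std::vector<char>& v)
{
    return &v.at(0);             // throws instead of invoking undefined behaviour
}

void example(std::vector<char>& v)
{
    try {
        char* p = checked_pointer(v);
        // use p
        (void)p;
    } catch (const std::out_of_range&) {
        // v was empty
    }
}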
A pragmatic solution would be to initialise the vector with n+1 entries and constrain access to 0..n-1 (as this code already does):
void foo(std::istream& file, int n) {
    std::vector<char> scoped_array(n + 1);   // n+1 so the vector is never empty
                                             // and &scoped_array[0] is always valid
    char* pointer = &scoped_array[0];
    file.read(pointer, n);
    for (int i = 0; i < n; ++i) {
        // do something with scoped_array[i]
    }
}
See LWG issue 464; this is a known issue. C++0x (which is partially implemented by MSVC 2010) solves it by adding a .data() member to std::vector.
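With that member the pointer can be obtained without any indexing at all. A minimal sketch, assuming a C++0x-capable compiler such as MSVC 2010 (data() may be called even on an empty vector; you just must not dereference the result in that case):

#include <fstream>
#include <vector>

void foo(std::ifstream& file, int n)
{
    std::vector<char> scoped_array(n);
    file.read(scoped_array.data(), n);   // data() replaces &scoped_array[0]
}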