Before variable-length arrays were supported, I would dynamically allocate them like this:
int foo(size_t n)
{
    int *arr = malloc(n * sizeof(int));
    if (arr == NULL)
        return -1;  /* allocation failed */
    /* ... use arr, then free(arr) ... */
}
In reality it is prohibitively expensive to check for out-of-memory conditions everywhere. The enterprisey way to deal with massive data is to limit data sizes: define a hard cap at a single early checkpoint, and fail fast and gracefully when the cap is hit.
What I just suggested is simple and stupid, but it's what every ordinary (non-scientific, non-specialized) product does, and it's what customers normally expect.