Question
Based on a comment of someone in another thread:
VLAs introduce more problems than they solve, because you never know if the declaration is going to crash for x being too large for the stack.
This code will overflow because sizeof(a) is too large for the stack:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int n = 100000000;
    int a[4][n];                 /* 4 * n * sizeof(int) ~ 1.6 GB of automatic storage */
    printf("%zu\n", sizeof(a));
    return 0;
}
But this one cannot, because sizeof(a) is 8 (the size of a pointer on my computer):
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int n = 100000000;
    int (*a)[n];
    printf("%zu\n", sizeof(a));
    a = malloc(sizeof(*a) * 4);
    free(a);
    return 0;
}
Is my assumption correct?
Can we determine if the use of a VLA is dangerous or not (may overflow) based on the sizeof of the object?
Answer 1:
int (*a)[n];
is not a VLA, but a pointer to a VLA, so the OP's two examples are not a close enough comparison.
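A minimal sketch contrasting the two declarations; n is kept deliberately small here only so the demo is safe to run:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int n = 1000;                           /* deliberately small so the VLA fits any stack */

    int vla[n];                             /* a real VLA: n * sizeof(int) bytes of automatic storage */
    int (*p)[n] = malloc(n * sizeof(int));  /* pointer to a VLA type: storage comes from the heap */

    if (p == NULL)                          /* heap allocation can be checked; a VLA cannot */
        return 1;

    printf("sizeof(vla) = %zu\n", sizeof vla);  /* n * sizeof(int) */
    printf("sizeof(p)   = %zu\n", sizeof p);    /* size of a pointer, e.g. 8 */
    printf("sizeof(*p)  = %zu\n", sizeof *p);   /* n * sizeof(int), same as the VLA */

    free(p);
    return 0;
}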
As @M.M commented, preventing stack overflow is a problem with any automatic allocation: recursion can over-consume the stack, and large local variables can too.
A VLA is simply one of the features more likely to be used egregiously.
// Qualified use of VLA
int len = snprintf(NULL, 0, "%d", some_int);
assert(len > 0);
char vla_good[len+1];
len = snprintf(vla_good, len+1, "%d", some_int);

// Unqualified
int x;
scanf("%d", &x);
char vla_bad[x]; // who knows what x may be; did scanf() even work?
VLAs introduce more problems than they solve, because you never know if the declaration is going to crash for x being too large for the stack.
Can we determine if the use of a VLA is dangerous?
Use the right tool for the task. Usually a worst-case, small, fixed-size array will do. VLAs have limited uses. Robust code would ensure the array element count is not foolish before declaring a VLA, as in the sketch below.
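A minimal sketch of such a check, assuming a hypothetical application-specific cap VLA_LIMIT:
#include <stdio.h>
#include <stdlib.h>

#define VLA_LIMIT 1024   /* hypothetical cap, well below any realistic stack limit */

int main(void)
{
    int x;

    /* Only declare the VLA once the count is known to be sane. */
    if (scanf("%d", &x) != 1 || x <= 0 || x > VLA_LIMIT) {
        fprintf(stderr, "bad or unreasonable count\n");
        return EXIT_FAILURE;
    }

    char buf[x];                              /* safe: x is bounded by VLA_LIMIT */
    printf("declared a VLA of %zu bytes\n", sizeof buf);
    return EXIT_SUCCESS;
}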
Note that VLAs, available since C99, are only optionally supported in C11.
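Code that must build on implementations that opt out can test the C11 feature macro __STDC_NO_VLA__ and fall back to heap allocation. A minimal sketch (the helper work() is hypothetical, and the VLA branch assumes at least C99):
#include <stdlib.h>

/* Hypothetical helper: uses a VLA where the implementation supports one,
   otherwise falls back to heap allocation. */
void work(size_t n)
{
#if defined(__STDC_NO_VLA__)
    int *a = malloc(n * sizeof *a);   /* VLAs unavailable: allocate on the heap */
    if (a == NULL)
        return;
    /* ... use a ... */
    free(a);
#else
    int a[n];                         /* C99, or C11/C17 without the opt-out */
    (void)a;
    /* ... use a ... */
#endif
}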
VLAs are not bad; they are just drawn that way.
Source: https://stackoverflow.com/questions/42123355/large-vla-overflow