My dataset has 500,000 points in 960 dimensions; the file is 1.9 GB (1,922,000,000 bytes). The code works for smaller data sets, but for this one it crashes with std::bad_alloc.
std::bad_alloc means a problem allocating memory, so yes, you are most likely out of memory. Unfortunately, there is no reliable way to "handle" this kind of exception; all you can really do is catch it and exit the application gracefully.
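For illustration, a minimal sketch of catching it and exiting cleanly. The sizes here just mirror the question, and the single std::vector allocation is an assumed stand-in for whatever your code actually allocates:

```cpp
#include <cstdlib>
#include <iostream>
#include <new>      // std::bad_alloc
#include <vector>

int main() {
    try {
        // Assumed stand-in for the real allocation:
        // 500,000 points x 960 dims of floats is ~1.9 GB in one block.
        const std::size_t points = 500000;
        const std::size_t dims = 960;
        std::vector<float> data(points * dims);
        // ... load and process the dataset ...
    } catch (const std::bad_alloc& e) {
        // Allocation failed: report and exit cleanly instead of crashing.
        std::cerr << "Out of memory: " << e.what() << '\n';
        return EXIT_FAILURE;
    }
    return EXIT_SUCCESS;
}
```

Two caveats: if you build a 32-bit executable, a single ~1.9 GB contiguous allocation is likely to fail no matter how much RAM is installed; and on systems that overcommit memory (e.g. Linux by default), the allocation can appear to succeed and the process gets killed later, so catching std::bad_alloc is not a complete safeguard.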