I am encountering something extremely strange on my system right now.
#include <cstddef>  // for size_t
#include <cstdint>
#include <iostream>

constexpr std::size_t N = 1l << 31; // 2^31 elements * 8 bytes = 16 GiB

int main() {
    auto buf = new int64_t[N]{}; // value-initialization zeroes the array
    for (std::size_t i = 0; i < N; ++i) {
        if (buf[i] != 0) {
            std::cout << "buf[" << i << "]=" << buf[i] << std::endl;
            break;
        }
    }
    delete[] buf;
}
Compiled with g++ -O3 on gcc version 13.3.0 (Ubuntu 13.3.0-6ubuntu2~24.04),
the above code keeps producing the result
buf[x]=131072
where x is some random index that changes between runs. The value is almost always 131072 (2^17) as above, but occasionally it is 262144 (2^18) instead.
Since these numbers are powers of two, 128Ki and 256Ki respectively, I'm suspecting some kind of OS or memory-paging glitch? Or maybe my DRAM has some stuck bits? Or is there some arcane limitation in g++ when initializing very large arrays?
Would changing 1l<<31 to 1ul<<31 fix it? Or should I use std::vector<int> buf(N, 0) instead (which would anyway be more in line with the C++ Core Guidelines)? I'm also seeing "squashfs error: xz decompression failed, data probably corrupt" errors on the same system...