mystran wrote:
- int64_t is handy for some stuff
- large allocs and/or memory maps are hardly a problem on 64-bit,
I'm nowhere near doing anything of that scale yet. If I ever have to load in huge swaths of samples, then I can start worrying. However, I started programming with 4k of RAM, so minimizing resource use is deeply ingrained in my programming style. I laugh at myself when I throw more memory at a speed problem and end up using only 64k more in the end.
Old programming habits die hard as I drag myself out of the 70s and into the world of "limitless" resources.
With large allocs in particular, hitting the "maximum contiguous block" limit is actually a lot easier than hitting the "maximum memory per process" limit. The reason is that while you might be able to get 3GB (or whatever) of memory into the process, the address space can get pretty fragmented, especially after the process has been running for a while.
So you might end up in a situation where you have 1GB of free heap space, but malloc()/new might still fail for a contiguous block of 200MB, which is only about 10 minutes of audio as 32-bit floats at 44.1kHz stereo.