"Does C++ Have a Future?"

DSP, Plugin and Host development discussion.

Post

v1o wrote:With the millions of people who learned programming through the various learning institutions out there, we really shouldn't be having these issues.
That's the actual issue. The quality of coders has gone down. Anyone can write code, but not many people study the tool deeply enough to learn how to use it properly.

It's going to be the same for machine learning: everyone can read a book on deep learning, but how many actually know what they are doing?

Post

Guillaume Piolat wrote:Perhaps at one point C++ was the only language to provide all this (especially your last point); now there are several competitors that can do every single thing you have listed.
Hmmm, I'm probably missing a lot. Besides D and Rust (on which I commented earlier) and (to a certain extent) Pascal, I'm not aware of alternatives. Would you care to give some names and maybe examples?

Post

I'm not sure either. The huge selling point of C++ (as was also the case with Fortran, for performance) is the compiler's ability to generate very efficient code.

Post

Z1202 wrote:
Guillaume Piolat wrote:Perhaps at one point C++ was the only language to provide all this (especially your last point); now there are several competitors that can do every single thing you have listed.
Hmmm, I'm probably missing a lot. Besides D and Rust (on which I commented earlier) and (to a certain extent) Pascal, I'm not aware of alternatives. Would you care to give some names and maybe examples?
Well, I was about to say the same: Rust, D and Pascal (minor note: Rust doesn't have goto, and Pascal doesn't have "zero-cost" abstractions, so is Pascal "out"?). Nim is an even smaller competitor that offers zero-cost abstractions and native performance, though it compiles to C rather than directly to native code. The LLVM release notes mention Zig, which takes more of its inspiration from C. That space is getting pretty crowded; Java and C# have plans to become more ahead-of-time compiled to narrow the gap with native code (though it's hard to know what they will do with their huge runtimes and JITs).
This is arguably good for C++, since every competitor will have a harder time standing out. If one of the new kids does take over (Swift or Go would be the most likely?), the future of C++ will be maintaining existing C++ money-making products. The RedMonk charts suggest there are only a few slots for popular languages.

EDIT: https://godbolt.org/ to compare generated code
Check out our VST3/VST2/AU/AAX/LV2:
Inner Pitch | Lens | Couture | Panagement | Graillon

Post

Swift can't: it's tied to Apple and their bugs.
Go? Perhaps, but I don't see a lot of traction behind Go, and Google has a known tendency to move on to a different target and ditch whatever isn't successful enough. So it doesn't seem very trustworthy.

Post

Z1202 wrote:
Guillaume Piolat wrote:Perhaps at one point C++ was the only language to provide all this (especially your last point); now there are several competitors that can do every single thing you have listed.
Hmmm, I'm probably missing a lot. Besides D and Rust (on which I commented earlier) and (to a certain extent) Pascal, I'm not aware of alternatives. Would you care to give some names and maybe examples?
I've got my eye on Nim, which is transpiled down to C.

I also took a look at Zig, which is very interesting with its built-in but unobtrusive safety features. It's not quite ready for prime time, but it looks very promising.
I started on Logic 5 with a PowerBook G4 550 MHz. I now have a MacBook Air M1 and it's ~165x faster! So why is my music not proportionally better? :(

Post

syntonica wrote:
Z1202 wrote:
Guillaume Piolat wrote:Perhaps at one point C++ was the only language to provide all this (especially your last point); now there are several competitors that can do every single thing you have listed.
Hmmm, I'm probably missing a lot. Besides D and Rust (on which I commented earlier) and (to a certain extent) Pascal, I'm not aware of alternatives. Would you care to give some names and maybe examples?
I've got my eye on Nim, which is transpiled down to C.
(Edit: oops! I do see it already mentioned up there. But compared to Rust, Go, and all the other current upstarts, I believe it's best of breed.)
I also took a look at Zig, which is very interesting with its built-in but unobtrusive safety features. It's not quite ready for prime time, but it looks very promising.
:dog: My iPad has gone haywire this morning! ...sorry...
I started on Logic 5 with a PowerBook G4 550 MHz. I now have a MacBook Air M1 and it's ~165x faster! So why is my music not proportionally better? :(

Post

All this discussion appears to be about syntax. Nobody wants a virtual machine or a garbage collector, I guess. Additionally, everybody wants low-level access to the operating system, integration with assembly, and an optimizing compiler. What do operating system APIs consist of? If we don't count the oddities of OS X, C and often C++ headers. Will that change anytime soon? Unlikely. Is there anything really wrong with C++ other than the fact that parsing a large set of include files increases compilation time? Unlikely. Unless you want a very different set of features, it's unlikely that somebody will come up with a very different solution. This case looks pretty conclusive to me. Some day C++ will have modules instead of include files, and that's all there is to it.
~stratum~

Post

Include files are redundant?

Post

camsr wrote:Include files are redundant?
One could add that information to a library file in binary form that can be read faster. That is essentially what precompiled headers already do, but if you have a large number of small projects they don't decrease compilation time much.
~stratum~

Post

stratum wrote:
camsr wrote:Include files are redundant?
One could add that information to a library file in binary form that can be read faster. That is essentially what precompiled headers already do, but if you have a large number of small projects they don't decrease compilation time much.
I'm not convinced that parsing headers is a significant cost anyway, unless you actually have to go and fetch the header from the disk (i.e. it's not in cache yet). Parsing a bunch of text might have been a performance problem back in the '90s, but it's not a huge deal these days. I mean, <windows.h> is probably the reason precompiled headers were invented (since it pulls in pretty much everything Microsoft ever invented), yet these days you don't even notice any difference when you #include it (without precompiled headers and after the first build that has to load the crap from the physical disk).

The headers that cause compilation slowdown are usually the ones that rely on a quadrillion template instantiations (e.g. STL and Boost) that the compiler has to generate and then optimise, but precompiled headers are unlikely to do anything for these, since it's all stuff that happens after parsing.

Post

mystran wrote:
stratum wrote:
camsr wrote:Include files are redundant?
One could add that info to a library file in a binary form that can be read faster. While this is already what precompiled headers do, if you have a large number of small projects they do not decrease compilation time much.
I'm not convinced that parsing headers is a significant cost anyway, unless you actually have to go and fetch the header from the disk (i.e. it's not in cache yet). Parsing a bunch of text might have been a performance problem back in the '90s, but it's not a huge deal these days. I mean, <windows.h> is probably the reason precompiled headers were invented (since it pulls in pretty much everything Microsoft ever invented), yet these days you don't even notice any difference when you #include it (without precompiled headers and after the first build that has to load the crap from the physical disk).

The headers that cause compilation slowdown are usually the ones that rely on a quadrillion template instantiations (e.g. STL and Boost) that the compiler has to generate and then optimise, but precompiled headers are unlikely to do anything for these, since it's all stuff that happens after parsing.
I have about 120 GB of source code plus related object files on disk now (including 32/64-bit release/debug builds for Windows and release-only builds of open-source libraries for two different Linux distributions, so it's possibly much less for a single platform; edit: roughly 32 GB of that is the Subversion database and 24 GB is open-source code plus prebuilt libraries, but it's still large).
Will windows.h stay in the disk cache during a rebuild? I'm not sure. This disk used to be a mechanical one, and back then compilation took longer. With an SSD it's better.
~stratum~

Post

stratum wrote: I have about 120 GB of source code plus related object files on disk now (including 32/64-bit release/debug builds for Windows and release-only builds of open-source libraries for two different Linux distributions, so it's possibly much less for a single platform; edit: roughly 32 GB of that is the Subversion database and 24 GB is open-source code plus prebuilt libraries, but it's still large).
Will windows.h stay in the disk cache during a rebuild? I'm not sure. This disk used to be a mechanical one, and back then compilation took longer. With an SSD it's better.
Do you seriously rebuild all of that from scratch on a regular basis during development (i.e. when you actually have to stand by and wait for the rebuild)?

edit: And even if you do, the time to parse the input files is probably STILL going to be fairly insignificant compared to the rest of the work the compiler has to do.

Post

mystran wrote: Do you seriously rebuild all of that from scratch on a regular basis during development (i.e. when you actually have to stand by and wait for the rebuild)?

edit: And even if you do, the time to parse the input files is probably STILL going to be fairly insignificant compared to the rest of the work the compiler has to do.
Of course I do not rebuild the open-source libraries, but the part that we have written ourselves takes about 9 minutes to rebuild (debug, on a Core i7-4770) and possibly longer for release, and sometimes I do that 2-3 times a day (because the Microsoft build system sometimes tries to be too intelligent and messes things up, or sometimes it is necessary to edit a commonly used file). Add the need to build 32-bit Windows and Linux versions before committing, and naturally it takes a bit more than a few coffee breaks. Nothing serious as long as there is nothing that needs to be done immediately, but when there is, it can become a problem.
~stratum~

Post

mystran wrote: I'm not convinced that parsing headers is a significant cost anyway, unless you actually have to go an fetch the header from the disk (ie. it's not in cache yet). Parsing a bunch of text might have been a performance problem back in the 90s, but it's not a huge deal these days.
It's not just parsing: the text is imported, run through the preprocessor, and parsed; then comes semantic analysis, symbol tables get built, struct layouts / class definitions get laid out, etc. My understanding of compiler internals is pretty limited, but I don't imagine it's as cheap as you think it is. Parsing text into tokens, yes, but building symbol tables and all the other data structures you need for creating the actual object file, I'd guess not.

Modules and symbolic imports would help for sure, but I suspect C++ will always be slow to compile simply because it's such a huge language.
Chris Jones
www.sonigen.com

