
Isn't "modern C++" pretty much an oxymoron, and doesn't it pretty much fit the above definition of "legacy software"?


It's.. not?

The definition of "legacy software" in this discussion is: software that grew so many "temporary" fixes and workarounds instead of necessary architectural changes that it's in permanent maintenance mode, and entire parts of it are untouchable because nobody understands what they do due to the exponential increase in complexity stemming from the abundance of these hacks, exacerbated by system knowledge evaporating with engineer turnover.

That doesn't apply to the C++ language and compilers (both rapidly evolving over the past decade), nor, generally, to projects written in it (that's not a property of the language).

On that note, we've integrated some Fortran code written in the 80s that I wouldn't call "legacy" under this definition: the algorithm was clear, the implementation was documented well enough that modifying it, if necessary, would not have been hard, and using it from our code was very simple (it was one of the flavors of gradient descent that converged much better than several others).


> software that grew so many "temporary" fixes and workarounds instead of necessary architectural changes that it's in permanent maintenance mode, and entire parts of it are untouchable because nobody understands what they do due to the exponential increase in complexity stemming from the abundance of these hacks, exacerbated by system knowledge evaporating with engineer turnover

Pretty much this. What else am I supposed to think when I read this? https://stackoverflow.com/questions/17103925/how-well-is-uni...

We aren't on the ARPANET anymore: I'd expect even low-level programming languages to support Unicode natively. The way C equates "char" with both character and byte is fundamentally broken.

(This is also, or perhaps more, an issue with Unicode: IMHO we should have increased the byte size to something like 32 bits (during the transition to 64-bit words?) so that one character fits in one byte. The increased hardware cost for text storage would have been quickly offset by the decrease in developer costs. But here we are.)



