
Go figure: people are downvoting me, but I never once said developers must be stupid or lazy. This is a very common mistake developers make: premature optimization without considering the actual bottlenecks, and without testing whether theoretical optimizations actually make any difference. I know I'm guilty of this!

I never called anyone lazy or stupid; I just wondered whether they blindly trusted some stats without actually testing them.

> FWIW, the PC install size was reasonable at launch. It just crept up slowly over time

Wouldn't this mean their optimization mattered even less back then?



    premature optimization
One of those absolutely true statements that can obscure a bigger reality.

It's certainly true that a lot of optimization can and should be done after a software project is largely complete. You can see where the hotspots are, optimize the most common SQL queries, whatever. This is especially true for CRUD apps where you're not even really making fundamental architecture decisions at all, because those have already been made by your framework of choice.
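
To make that concrete, here's a toy sketch of the measure-first workflow (Python; the function names and timings are made up for illustration):

    import cProfile, pstats, time

    def load_user():            # hypothetical: cheap in practice
        time.sleep(0.01)

    def run_report_query():     # hypothetical: the real hotspot
        time.sleep(0.2)

    def handle_request():       # a stand-in CRUD request handler
        load_user()
        run_report_query()

    cProfile.run("handle_request()", "profile.out")
    pstats.Stats("profile.out").sort_stats("cumulative").print_stats(5)

The profile makes it obvious which call deserves the optimization effort, instead of guessing up front.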

Other sorts of projects (like games or "big data" processing) can be a different beast. You do have to make some of those big, architecture-level performance decisions up front.

Remember, for a game... you are trying to process player inputs, do physics, and render a complex graphical scene in 16.7 milliseconds or less (a 60 fps target). You need to make some big decisions early on; performance can't entirely just be sprinkled on at the end. Some of those decisions don't pan out.
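
Something like this toy loop (not any real engine; the subsystem names are placeholders) shows why: every subsystem shares one fixed budget, every single frame:

    import time

    FRAME_BUDGET = 1.0 / 60.0       # ~16.7 ms per frame at 60 fps

    def process_input(): pass      # hypothetical stand-ins for real subsystems
    def step_physics(): pass
    def render_scene(): pass

    for _ in range(3):             # a few frames, for illustration
        start = time.perf_counter()
        process_input()
        step_physics()
        render_scene()
        elapsed = time.perf_counter() - start
        if elapsed > FRAME_BUDGET:
            print(f"frame overran budget: {elapsed * 1000:.1f} ms")
        else:
            time.sleep(FRAME_BUDGET - elapsed)  # idle out the rest of the frame

If any one system blows the budget, the whole frame is late, which is why the big structural decisions have to be made early.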

    > FWIW, the PC install size was reasonable at launch. It just crept up slowly over time

    Wouldn't this mean their optimization mattered even less back then?
I don't see a reason to think this. What are you thinking?


> One of those absolutely true statements that can obscure a bigger reality.

To be clear, I'm not misquoting Knuth, if that's what you mean. I'm arguing that in this case, specifically, the optimization was premature, as evidenced by the fact that it didn't really have an impact (they explain that other processes running in parallel dominated the load times) and that it caused trouble down the line.
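
Back-of-the-envelope version of why it didn't matter (the numbers are made up, just to show the shape of the problem):

    # When load tasks run in parallel, wall-clock load time is roughly the
    # slowest task, so shaving a faster one changes nothing overall.
    def total_load_time(task_seconds):
        return max(task_seconds)

    with_duplication    = [8.0, 20.0]    # asset reads sped up by duplicated data; other init
    without_duplication = [11.0, 20.0]   # slower asset reads; same other init

    print(total_load_time(with_duplication))     # 20.0
    print(total_load_time(without_duplication))  # 20.0 -- same either way

The disk-layout trick only helps if asset reads are the critical path, and here they weren't.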

> Some of those decisions don't pan out.

Indeed, some premature optimizations pan out and some don't. I'm not arguing otherwise! In this case, it was a bad call. It happens to all of us.

> I don't see a reason to think this. What are you thinking?

You're right; I got this backwards. While the time savings would have been minimal, the data duplication wasn't that big, so the cost (for something that didn't pan out) wasn't that bad either.



