
I think maybe you are misreading the rule. It doesn't say don't optimize; it says that when optimizing, don't guess from the code where the bottleneck is, go measure it.


Yeah, but some things, like how you structure your data (which then drives CPU cache misses), aren't something you can easily adjust or tune after the fact.

Usually when you encounter one of those, it's a rewrite/rearchitecture of a whole module or subsystem before you see any gains. Been there, done that, not excited to repeat it.
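
A minimal sketch of the kind of layout decision being described (array-of-structs vs struct-of-arrays); the struct fields and loop are hypothetical, not from the thread, but they show why this is a rearchitecture rather than a tuning knob:

    #include <cstddef>
    #include <vector>

    // Hypothetical array-of-structs layout. Summing just `x` drags every
    // other field through the cache along with it.
    struct ParticleAoS {
        float x, y, z;
        float mass;
        float other[12];   // unrelated data that still gets fetched
    };

    float sum_x_aos(const std::vector<ParticleAoS>& ps) {
        float sum = 0.0f;
        for (const auto& p : ps) sum += p.x;   // roughly one cache line per element
        return sum;
    }

    // Struct-of-arrays layout: the same loop now walks contiguous floats,
    // so each cache line is full of useful data. Switching to this touches
    // every piece of code that builds or consumes the particles.
    struct ParticlesSoA {
        std::vector<float> x, y, z, mass;
    };

    float sum_x_soa(const ParticlesSoA& ps) {
        float sum = 0.0f;
        for (float v : ps.x) sum += v;
        return sum;
    }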


The other rule is to choose your data structures wisely.


The number of people who don't understand when to use a list/array vs. a dictionary/hash table vs. a lookup object is too damn high. A huge amount of basic optimization is constantly at their fingertips, and they nearly always choose to make a list/array and use LINQ to join/merge multiple relational data sets instead of the optimized standard collections.
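
A hedged illustration of that point, written in C++ rather than LINQ/C# to keep it self-contained; the customer/order shape is made up. The idea is simply to build a hash-table lookup once instead of re-scanning a list for every join key:

    #include <string>
    #include <unordered_map>
    #include <vector>

    // Hypothetical relational data: orders referencing customers by id.
    struct Customer { int id; std::string name; };
    struct Order    { int customer_id; double total; };

    // The list-scan version of this join is O(orders * customers): for every
    // order, linearly search the customer list. Building a hash table first
    // makes each lookup O(1), so the whole join is roughly O(orders + customers).
    std::vector<std::string> join_customer_names(const std::vector<Customer>& customers,
                                                  const std::vector<Order>& orders) {
        std::unordered_map<int, const Customer*> by_id;
        by_id.reserve(customers.size());
        for (const auto& c : customers) by_id[c.id] = &c;

        std::vector<std::string> names;
        names.reserve(orders.size());
        for (const auto& o : orders) {
            auto it = by_id.find(o.customer_id);
            if (it != by_id.end()) names.push_back(it->second->name);
        }
        return names;
    }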


Sure, sometimes your approach is at fault, but as the rule says, sometimes it's just a surprising bottleneck. Having done a lot of embedded work, I've found that many of those can be fixed in a straightforward manner. Sometimes you do have to change your approach, though.



