But translation units / small .cc files can be built in parallel and cached, so on multi-core machines it's desirable to have many small translation units. Except of course when there's eventually one large translation unit that needs everything, and then link time dominates ...
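
Roughly the shape of that trade-off, with made-up file names (widget.h / widget.cc / main.cc are just for illustration):

    // widget.h - small shared interface
    #pragma once
    int widget_area(int w, int h);

    // widget.cc - one small translation unit; it rebuilds (and
    // re-caches) on its own when only this file changes
    #include "widget.h"
    int widget_area(int w, int h) { return w * h; }

    // main.cc - another small TU; a parallel build (make -j / ninja)
    // compiles widget.cc and main.cc concurrently, and ccache can serve
    // unchanged ones from cache; only the final link of the .o files is
    // the serial step that can come to dominate
    #include <cstdio>
    #include "widget.h"
    int main() { std::printf("%d\n", widget_area(3, 4)); }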

The article emphasizes a common issue with headers. Xcode and Visual Studio work around this to some extent with pre-compiled headers, something that can be really hard to set up with ccache. If Figma's whole team is using Macs (they mention getting everybody MacBooks?) then I wonder if they could just switch to Xcode and use its built-in pch support. While that introduces a dependency on Xcode :( maybe their whole C++ stack will get effectively rewritten in the next couple of years anyway?
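
For reference, a bare-bones sketch of what pch looks like with plain clang outside of Xcode (common_pch.h and foo.cc are made-up names, not anything from Figma's actual setup):

    // common_pch.h - hypothetical prefix header holding the expensive,
    // rarely-changing includes shared by most translation units.
    // Compile it once, then reuse the result for every TU (roughly
    // what Xcode's prefix-header support automates for you):
    //   clang++ -std=c++17 -x c++-header common_pch.h -o common_pch.pch
    //   clang++ -std=c++17 -include-pch common_pch.pch -c foo.cc -o foo.o
    #pragma once
    #include <string>
    #include <vector>
    #include <unordered_map>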

OK, if you only have as many .cc files as you have CPU cores, then sure, stop making them bigger. But that's not an issue for big projects, which already have far more translation units than cores.

Compilation caching fails as soon as you touch a central .h file. On projects like that you start dreading any change to the headers, because development slows to a crawl as compile times explode.
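
Concretely (made-up example, but this is the mechanism): ccache's cache key covers every header a translation unit pulls in, so one touched central header is a cache miss for every TU that includes it:

    // config.h - a central header included, directly or transitively,
    // by hundreds of .cc files (hypothetical)
    #pragma once
    constexpr int kMaxRetries = 3;  // bump this one constant and the
                                    // preprocessed input of every
                                    // including TU changes, so they all
                                    // miss the cache and recompile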

I worked on V8, and over time the .cc files got smaller and the build times got much worse. Some people felt this was neater, but unless you are in an office with 20 beefy workstations running distcc, the effect is brutal.
