
Long build times.

20 comments, last by SuperVGA 3 years, 9 months ago

With more threads, which require a proportionate amount of RAM, you can easily compile more things at the same time; with a good SSD compilation is rarely an IO-bound activity.

For example, GNU Make has a user-friendly "-j" option to specify the number of tasks that should be executed concurrently, of which a typical C/C++ build can have hundreds. The difference between make -j4 and make -j32 is a strong selling point for the newer AMD CPUs.
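To illustrate (a hedged sketch; the file names are hypothetical, not from this thread): in a Makefile where every object file has its own rule, `make -jN` can run up to N compile jobs at once, and only the final link has to wait on all of them.

```make
# Sketch of a build where compilation parallelizes under `make -j`.
# Each .o rule is independent, so `make -j8` runs up to 8 compilers at once.
CXX      := g++
CXXFLAGS := -O2
OBJS     := main.o renderer.o physics.o audio.o

game: $(OBJS)
	$(CXX) -o $@ $(OBJS)        # the link itself is still one serial job

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $< -o $@
```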

Omae Wa Mou Shindeiru


LorenzoGatti said:

With more threads, which require a proportionate amount of RAM, you can easily compile more things at the same time; with a good SSD compilation is rarely an IO-bound activity.

For example, GNU Make has a user-friendly "-j" option to specify the number of tasks that should be executed concurrently, of which a typical C/C++ build can have hundreds. The difference between make -j4 and make -j32 is a strong selling point for the newer AMD CPUs.

I was referring to linking specifically. AFAIK -j has no effect on linking.

hplus0603 said:
Unreal build times are long (but not “hours”) because: Unreal includes a MASSIVE amount of separate subsystems and libraries.

True, except for the ‘but not “hours”' part. On three of my last four Unreal titles, which included both Fortnite and Ark, a full rebuild or a clean build of the game code itself required more than a day on a single computer. (Those were modern computers from within the past 3 years, 64GB memory, either SSD or NVMe drives.) With Incredibuild distributing it across the studio it could be done in about an hour, but outside of work environments most people don't have 200+ CPU cores doing the processing. When we brought a machine in fresh, someone would run setup scripts and start the build process, where it would spin for about three days both building everything and pulling down the DDC.

In my last non-Unreal title, a C++ game exclusive to the Switch, a full rebuild / clean build took about 45 minutes on a single computer. With IncrediBuild distributed across the studio it took about 15.

Building the UE4 engine from source the first time (or doing a clean rebuild) with no game code or extra components takes about an hour on 4.25.

Always use incremental builds when possible.

For home users doing incremental builds, building a UE4 title is often a multi-minute process, not a multi-hour process. But if you have to do a full engine+game+content rebuild, expect the hours.

SuperVGA said:

I was referring to linking specifically. AFAIK -j has no effect on linking.

Only if you link a single executable as the very last step of your build, so that you have one long task that cannot run concurrently with anything else.
As already discussed in other posts, building many executables rather than one is a realistic scenario; moreover, linking can often be split into potentially parallel jobs (several large dynamic link libraries instead of one very large statically linked executable).
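A hedged sketch of that split (library and variable names are hypothetical): turn large subsystems into shared libraries with independent link rules, so `make -j` can run those link jobs concurrently, leaving only a small serial link for the executable itself.

```make
# Three large subsystems become shared libraries whose link steps are
# independent rules; under `make -j` they can be linked concurrently,
# and the final executable link becomes a much smaller serial step.
game: main.o librenderer.so libphysics.so libaudio.so
	g++ -o $@ main.o -L. -lrenderer -lphysics -laudio

librenderer.so: $(RENDERER_OBJS)
	g++ -shared -o $@ $^

libphysics.so: $(PHYSICS_OBJS)
	g++ -shared -o $@ $^

libaudio.so: $(AUDIO_OBJS)
	g++ -shared -o $@ $^
```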

Omae Wa Mou Shindeiru

LorenzoGatti said:

SuperVGA said:

I was referring to linking specifically. AFAIK -j has no effect on linking.

Only if you link a single executable as the very last step of your build, so that you have one long task that cannot run concurrently with anything else.
As already discussed in other posts, building many executables rather than one is a realistic scenario; moreover, linking can often be split into potentially parallel jobs (several large dynamic link libraries instead of one very large statically linked executable).

It's been a while since I saw a game with multiple executables. And how would you deal with using DLLs on Linux? Perhaps there's a cross-platform alternative I haven't heard of yet.

Linking a single executable at the very last step is the most common approach (I think), and I suspect all cross-platform games that don't use a launcher or frontend executable do this.

SuperVGA said:
how would you deal with using DLLs on Linux

Just build your plugin as a .so instead, and build it as a .dylib on macOS. Shared libraries have been a thing on UNIX for longer than on Windows :-)
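A minimal sketch of handling the extension difference in one Makefile (the `uname` switch and variable names are a common convention, not a quote from this thread):

```make
# Pick the shared-library extension and link flags per platform.
UNAME := $(shell uname -s)
ifeq ($(UNAME),Darwin)
    SHLIB_EXT   := dylib
    SHLIB_FLAGS := -dynamiclib
else
    SHLIB_EXT   := so
    SHLIB_FLAGS := -shared
endif

plugin.$(SHLIB_EXT): plugin.o
	$(CXX) $(SHLIB_FLAGS) -o $@ $^

plugin.o: plugin.cpp
	$(CXX) -fPIC -c $< -o $@     # position-independent code for the shared lib
```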

UNIX even pioneered weak dynamic symbols, which don't even check whether the symbols are resolvable until you start the program and they're needed. This saves more time in linking, but leaves time bombs for when you run, so I don't recommend it.

enum Bool { True, False, FileNotFound };

hplus0603 said:

SuperVGA said:
how would you deal with using DLLs on Linux

Just build your plugin as a .so instead, and build it as a .dylib on macOS. Shared libraries have been a thing on UNIX for longer than on Windows :-)

UNIX even pioneered weak dynamic symbols, which don't even check whether the symbols are resolvable until you start the program and they're needed. This saves more time in linking, but leaves time bombs for when you run, so I don't recommend it.

That's good to know. I don't think I'll go that route, though. Maybe for development, to get running again more quickly after a change.
Then again, it might be tedious to have two such different ways to build the project.

I loathe DLLs. There's just something about them, even though, when versioned etc., they should be just fine and reliable. I guess I'm hell-bent on having the executable stuff in the executable (save user scripts etc.)

SuperVGA said:

I loathe DLLs. There's just something about them, even though, when versioned etc., they should be just fine and reliable. I guess I'm hell-bent on having the executable stuff in the executable (save user scripts etc.)

You still have “the executable stuff in the executable”, it's only arranged in pieces like a picnic table with detachable legs. In particular, it shouldn't be hard to align DLL content with suitable boundaries for link-time optimization.

Omae Wa Mou Shindeiru

LorenzoGatti said:

I loathe DLLs. There's just something about them, even though, when versioned etc., they should be just fine and reliable. I guess I'm hell-bent on having the executable stuff in the executable (save user scripts etc.)

You still have “the executable stuff in the executable”, it's only arranged in pieces like a picnic table with detachable legs. In particular, it shouldn't be hard to align DLL content with suitable boundaries for link-time optimization.

That's true. I guess I meant the behaviour rather than the executable stuff. Good analogy. I want the whole table in there, but just talking about it here is making me consider using DLLs. I'll probably turn, eventually.

SuperVGA said:
Please elaborate on how you'd go about reducing linking time by increasing RAM or cores (or machines)?

Sorry, I didn't see this part until now. As I said in the other answer: Split into .so files, use weak (late-resolved) symbols. There is still a final linker phase where they are all collected, but with weak symbols, that's pretty fast. (At the expense of making starting the game slower …)

frob said:
True, except for the ‘but not “hours”' part.

I was talking about the “re-build the code” part, not the “bake lighting” part.

You are absolutely correct: a full, production-level bake, on large, detailed levels, with full pre-computed GI, can take however long you want! I once poked at setting up a VPN link to Amazon to use on-demand compute instances for baking, but because I didn't have a big enough level for it to matter, I never followed through. And Lightmass, while somewhat able to split work across nodes, doesn't scale anywhere near infinitely with available nodes. But you'd still get the benefit of the 128-core, 2048-GB-RAM, 40-Gbps networked hosts on demand, which you probably don't have sitting around the studio. (Well, maybe Epic does, what with making money hand over fist on Fortnite :-)

enum Bool { True, False, FileNotFound };
