Basically every service has already been packed so full of stuff that an instance can barely fit on a server, so you won't be able to run that monstrosity locally. Which is why they started doing "microservices", out of necessity: when each binary gets over a few gigabytes you don't have many other options. Their microservices still take gigabytes each, but the split let them keep adding more code. But each of those depends on hundreds of other microservices. And those microservices are of course properly locked down, so you won't be able to talk to production servers from your development machine.
Are there really a lot of binaries "over a few gigabytes"? On x86_64, relative jumps only get 32-bit offsets. How would you even build and link something that large?
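For context, here's a minimal sketch of where that ±2 GiB limit actually bites and the usual workaround. The sizes are purely illustrative, and the exact linker message and code-model behaviour vary by toolchain, so treat this as a sketch rather than a guaranteed reproduction:

```
/* big.c -- a sketch of the 2 GiB limit. The default "small" code model
 * addresses static data RIP-relatively with a signed 32-bit displacement,
 * so code plus static data must fit within roughly +/-2 GiB. */
static char huge[3UL << 30];               /* 3 GiB of zero-initialized data */

int main(void) {
    huge[(3UL << 30) - 1] = 1;             /* constant offset ~3 GiB away */
    return (int)huge[(3UL << 30) - 1];
}

/* On a typical GNU toolchain (details vary):
 *
 *   gcc -O2 big.c
 *     -> link error along the lines of
 *        "relocation truncated to fit: R_X86_64_PC32 ..."
 *
 *   gcc -O2 -mcmodel=medium big.c
 *     -> links: data above -mlarge-data-threshold is placed in .ldata/.lbss
 *        and addressed with full 64-bit offsets
 *
 * -mcmodel=large lifts the same restriction for code itself (calls and
 * jumps), at the cost of larger, slower call sequences. */
```

In practice the multi-gigabyte totals come mostly from data and debug info rather than text, which is part of why keeping the text side under 2 GiB is the fight worth having.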
During my tenure I received several peer bonuses for unblocking the search or ads release by keeping the [redacted] under 2 GiB. These binaries are huge because their authors are lazy, not as an inevitable consequence of any technical choice that has been made. It was always easy for me (for some reason, easier than it was for the program's own maintainers) to use `blaze query` and a bit of critical thinking to break up some library that was invariably called `:util` into smaller pieces, and thereby remove tens of megabytes from the linker output. People are just so lazy that they don't think about the consequences of their build targets.
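To make the `:util` point concrete, here's a hedged sketch of the kind of split involved. The package and target names are made up; the query functions in the comments (`deps`, `rdeps`) are standard Bazel/blaze query functions for seeing who actually depends on what before cutting:

```
# BUILD -- hypothetical package, illustrating the split.
# Before: one catch-all target, so linking against any helper drags in all of it.
#
#   cc_library(
#       name = "util",
#       srcs = glob(["*.cc"]),
#       hdrs = glob(["*.h"]),
#   )
#
# Queries that help decide where to cut (target names are hypothetical):
#   blaze query 'deps(//base:util, 1)'        # what :util pulls in directly
#   blaze query 'rdeps(//..., //base:util)'   # every target that links it
#
# After: narrower targets; each consumer switches to the piece it actually
# uses, and the dead weight falls out of the linker output.

cc_library(
    name = "string_util",
    srcs = ["string_util.cc"],
    hdrs = ["string_util.h"],
)

cc_library(
    name = "net_util",
    srcs = ["net_util.cc"],
    hdrs = ["net_util.h"],
    deps = [":string_util"],
)
```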
Most developers' main targets are much, much smaller.