We have a repository containing files (third-party libraries, etc.) needed by other builds. It is huge, around 20 GB, and a fresh sync takes approximately 6 hours to complete.
Right now I have set up the syncing of this repo as a dedicated build configuration which syncs to a fixed path so that other build configurations will know where the data is located. This works fine when running only one build agent.
Will this still work with more than one build agent, or can it happen that one agent gets the sync job for the huge repo while a build that depends on this data gets started on another agent (which doesn't have the data)?
If this does not work, then the only way I can think of to get it working is to sync this repo as part of each build configuration that depends on it. In that case, I would like to be able to specify that this particular repo should never be cleaned and should be shared among all build configurations that depend on it.
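To make the fallback concrete, here is a rough sketch of what I imagine as a pre-build step, assuming the repo is a Git repository (the function name, branch, and paths are placeholders I made up, not actual settings):

```shell
#!/bin/sh
# Hypothetical pre-build step: keep one shared, never-cleaned copy of the
# huge repo per agent, and only pay the full sync cost on the first run.
sync_shared_repo() {
    repo_url="$1"    # where the huge repo lives (placeholder)
    shared_dir="$2"  # fixed agent-local path shared by all build configs

    if [ ! -d "$shared_dir/.git" ]; then
        # First run on this agent: full clone (the slow ~6-hour path).
        git clone -q "$repo_url" "$shared_dir"
    else
        # Every later run: incremental fetch plus hard reset,
        # never a clean checkout of the whole 20 GB.
        git -C "$shared_dir" fetch -q origin
        git -C "$shared_dir" reset -q --hard origin/master
    fi
}
```

Each dependent build configuration would call something like `sync_shared_repo <url> /opt/build-deps/huge-repo` as its first step, with that directory excluded from whatever cleanup the build server performs.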
Is it somewhat clear what I am trying to achieve? Do you have any tips on how to handle this in a way that never causes a 6-hour checkout, except when initializing a new build agent for the first time?