I have some doubts about how to set up a build pipeline with fairly complex dependencies. As long as the builds are simple things are easy, but I've run into issues in the following situation.
I have one build which compiles some code. The result is shipped as an artifact dependency to several suites of tests. Only when all suites have completed successfully does the following step run, which packages the software using artifacts from the compile build. Finally, as soon as the packaging is complete, some other builds start.
We have issues synchronizing all those builds so that they all run within the same consistency boundary, that is, they all work on the same binaries compiled in the first step.
At the moment we rely on snapshot dependencies to maintain this consistency and on artifact dependencies to move artifacts around, but we're having some issues. The first is that as long as changes are committed to the repository, the compile build keeps being triggered, and since the builds later in the chain have snapshot dependencies on it, they won't run until the compile stops running, that is, until there are no more changes in the repository. This happens because, while they sit in the queue, any newer changes in their snapshot dependencies make them switch to the newer version. So as long as changes keep arriving in the VCS, these builds stay queued and only start once the commits stop.
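For reference, the snapshot/artifact terminology matches TeamCity's dependency model; assuming that's the tool in use, our current wiring looks roughly like the following Kotlin DSL sketch. The build names (`Compile`, `TestSuiteA`) and the artifact path are hypothetical placeholders, not our real configuration:

```kotlin
import jetbrains.buildServer.configs.kotlin.*
import jetbrains.buildServer.configs.kotlin.triggers.vcs

// Hypothetical compile build: re-triggered on every commit.
object Compile : BuildType({
    name = "Compile"
    triggers {
        vcs { }
    }
})

// Hypothetical test suite downstream of the compile build.
object TestSuiteA : BuildType({
    name = "TestSuiteA"
    dependencies {
        // The snapshot dependency pins the whole chain to one set of sources,
        // but also makes a queued build switch to a newer Compile run.
        snapshot(Compile) {
            onDependencyFailure = FailureAction.FAIL_TO_START
        }
        // The artifact dependency moves the compiled binaries into this build.
        artifacts(Compile) {
            artifactRules = "binaries.zip!** => binaries"  // hypothetical path
        }
    }
})
```

The packaging build and the post-package builds repeat the same pattern, each taking a snapshot dependency on everything upstream so the whole chain shares one consistency boundary.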
What I would like to achieve is a build pipeline that, for each change committed to the VCS, runs the whole chain from start to end.