Type: Improvement
Resolution: Unresolved
Priority: Major
Labels: None
At least in our case, a project can produce quite a number of artifacts, some of them quite large and some of them changing only occasionally from one build to the next (i.e. some artifacts change on every build, others far less frequently). It seems that both disk space and bandwidth could be saved by de-duplicating these seldom-changed artifacts across builds.
I imagine an algorithm where the server keeps a database of checksums and sizes of the artifacts it has already stored. When a slave is about to send the artifacts of a build, it first offers their checksums and sizes. If the server finds potential matches, it could perform further verification (e.g. comparing random samples of the suspected duplicates), and once a duplicate is confirmed, the server can copy or link the existing artifact locally and tell the slave not to send it at all.
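For illustration only, here is a minimal sketch of how such a handshake might look, assuming a hypothetical server-side `ArtifactStore` with a `negotiate` method and a `read_sample` callback that fetches raw bytes from the slave's copy for spot checks; none of these names correspond to an existing API.

```python
import hashlib
import os
import random

SAMPLE_COUNT = 4    # number of random probes used to confirm a match (assumed)
SAMPLE_SIZE = 4096  # bytes per probe (assumed)


def fingerprint(path):
    """Return (sha256 hex digest, size) for a local artifact file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest(), os.path.getsize(path)


class ArtifactStore:
    """Hypothetical server-side index of already-stored artifacts."""

    def __init__(self):
        # (digest, size) -> path of the stored copy
        self.index = {}

    def negotiate(self, offers, read_sample):
        """Given the slave's offers [(name, digest, size)], return the names
        that still have to be uploaded; confirmed duplicates are linked
        locally instead of being transferred again."""
        needed = []
        for name, digest, size in offers:
            stored = self.index.get((digest, size))
            if stored and self._samples_match(stored, size, name, read_sample):
                self._link(stored, name)
            else:
                needed.append(name)
        return needed

    def _samples_match(self, stored_path, size, name, read_sample):
        # Compare a few randomly chosen byte ranges of the stored copy
        # against the slave's copy before declaring a duplicate.
        with open(stored_path, "rb") as f:
            for _ in range(SAMPLE_COUNT):
                offset = random.randrange(max(size - SAMPLE_SIZE, 0) + 1)
                f.seek(offset)
                if f.read(SAMPLE_SIZE) != read_sample(name, offset, SAMPLE_SIZE):
                    return False
        return True

    def _link(self, stored_path, name):
        # Placeholder: a real store would hard-link or copy the existing file
        # into the new build's artifact directory and update self.index.
        pass
```

This is only meant to show the shape of the exchange (offer checksums and sizes, spot-check candidates, transfer only what is missing); how the server links or copies confirmed duplicates is left open.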
What happens when the older artifact gets pruned? Won't the link then point to nothing, or am I wrong?