
If I use 'darcs get' to fetch a bunch of different repositories from cvs.haskell.org to my local filesystem, they won't all end up hard-linked together, surely?
Not automatically in that case, no. But you could use darcs optimize --relink to restore them to linked status. Or better yet:
Just to be precise, if A, B and C are the repositories, optimally you'd do something like:

  (cd B; darcs optimize --relink --sibling ../C)
  (cd A; darcs optimize --relink --sibling ../B --sibling ../C)

This will link anything that can be linked from C into B, then anything that can be linked from either B or C into A. But you shouldn't worry about being optimal; just call ``optimize --relink'' with all the other likely repositories as siblings, and you'll end up converging to maximal sharing. Optimize --relink is relatively fast, and it should be safe, so nothing prevents you from relinking often (for example, each time you pull a new pool of changes).
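To illustrate the sharing that ``optimize --relink'' converges to, here is a minimal sketch in Python: identical patch files in sibling repositories end up as hard links to a single inode, so the data is stored only once. The repository layout and the file name example.gz are hypothetical, not produced by darcs itself.

```python
import os
import tempfile

# Hypothetical sibling repositories A, B and C, each with a patches directory.
root = tempfile.mkdtemp()
for repo in ("A", "B", "C"):
    os.makedirs(os.path.join(root, repo, "_darcs", "patches"))

# One patch file exists in C.
patch = os.path.join(root, "C", "_darcs", "patches", "example.gz")
with open(patch, "wb") as f:
    f.write(b"patch contents")

# Relinking replaces B's and A's copies of the same patch with
# hard links to C's file, as optimize --relink would.
for repo in ("B", "A"):
    os.link(patch, os.path.join(root, repo, "_darcs", "patches", "example.gz"))

# All three directory entries now share a single inode.
print(os.stat(patch).st_nlink)  # 3
```

Because a hard link is just another directory name for the same inode, deleting or re-downloading the patch in one repository never corrupts the others, which is why relinking is safe to do often.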
1) Check out the most recent common ancestor
2) darcs get it n times across the local filesystem (resulting in a bunch of hard-linked patches)
3) darcs pull the appropriate repo that you want in each one of them
Yes, this will avoid the extra network traffic. However, you should still run ``optimize --relink'' manually after doing that, as ``get'' doesn't currently link pristine trees (it only links patches). Juliusz
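If you want to verify that relinking actually worked, you can compare device and inode numbers of the corresponding files in two repositories. A small sketch, with hypothetical file names (a real check would walk _darcs/patches in each repository):

```python
import os
import shutil
import tempfile

def same_file(path_a, path_b):
    """True if both paths are hard links to the same underlying file."""
    sa, sb = os.stat(path_a), os.stat(path_b)
    return (sa.st_dev, sa.st_ino) == (sb.st_dev, sb.st_ino)

d = tempfile.mkdtemp()
src = os.path.join(d, "patch.gz")
with open(src, "wb") as f:
    f.write(b"patch contents")

linked = os.path.join(d, "linked.gz")
os.link(src, linked)            # hard link: shares src's inode
copied = os.path.join(d, "copied.gz")
shutil.copy(src, copied)        # independent copy: its own inode

print(same_file(src, linked), same_file(src, copied))  # True False
```

A copied pristine tree (what ``get'' currently produces) will show distinct inodes until you run ``optimize --relink'' over it.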