27
votes

Sorry to bring up this topic again, as there are so many related questions already - but none that covers my problem directly.

What I'm looking for is a good version control system that can handle just two simple requirements:

  1. store large binary files (>1GB)
  2. support a repository that's >1TB (yes, that's TB)

Why? We're in the process of repackaging a few thousand software applications for our next big OS deployment and we want those packages to follow version control.

So far I've got some experience with SVN and CVS; however, I'm not quite satisfied with the performance of either with large binary files (a few MSI or CAB files will be >1GB). Also, I'm not sure they will scale well with the amount of data we're expecting over the next 2-5 years (like I said, an estimated >1TB).

So, do you have any recommendations? I'm currently also looking into SVN Externals as well as Git Submodules, though that would mean several individual repositories for each software package, and I'm not sure that's what we want.

10
Are you sure you want a version control system? It means that after every minor change to any >1GB binary file, a >1GB copy of the old version of that file sits somewhere on disk. You might consider using a database instead, since many databases support blob formats that let you store the data on disk rather than inside the database itself (much faster that way). - Neil
You can also consider Git with git-lfs: see my answer below - VonC
@Neil Wrong. For example, Subversion supports binary diffs by design and won't create a 1GB copy of a 1GB file for every minor change. - bahrep

10 Answers

13
votes

Take a look at Boar, "Simple version control and backup for photos, videos and other binary files". It can easily handle huge files and huge repositories.

5
votes

Old question, but perhaps worth pointing out that Perforce is in use at lots of large companies, and in particular at games development companies, where multi-terabyte repositories with many large binary files are common.

(Disclaimer: I work at Perforce)

3
votes

Version control systems are for source code, not binary builds. You are better off just using standard network file server backup tapes for binary file backup - even though it's largely unnecessary when you have source code control since you can just rebuild any version of any binary at any time. Trying to put binaries in source code control is a mistake.

What you are really talking about is a process known as configuration management. If you have thousands of unique software packages, your business should have a configuration manager (a person, not software ;-) ) who manages all of the configurations (a.k.a. builds) for development, testing, release, release-per-customer, etc.

3
votes

Update May 2017:

Git, with the addition of GVFS (Git Virtual File System), can support virtually any number of files of any size, starting with the Windows repository itself: "The largest Git repo on the planet" (3.5M files, 320GB).
This is not yet >1TB, but it can scale there.

The work done on GVFS is slowly being proposed upstream (that is, to Git itself), but it is still a work in progress.
GVFS is implemented on Windows, but will soon be available for Mac (because the team at Microsoft developing Office for Mac demands it) and Linux.
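As a rough sketch of how that looks in practice (based on the GVFS project's README; the URL is only an example and the enlistment layout may differ - treat this as an approximation, and note the tooling currently runs on Windows only):

    rem clone via GVFS instead of plain git; file contents are fetched lazily on first access
    gvfs clone https://dev.example.com/DefaultCollection/_git/BigRepo
    cd BigRepo\src

    rem from here on, normal git commands work against the virtualized working tree
    git status
    git checkout -b repackaging-work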


April 2015

Git can actually be considered a viable VCS for large data with Git Large File Storage (LFS) (by GitHub, April 2015).

git-lfs (see git-lfs.github.com) can be tested with a server that supports it, lfs-test-server (or directly with github.com itself):
you store only metadata in the git repo, and the large files elsewhere.

https://cloud.githubusercontent.com/assets/1319791/7051226/c4570828-ddf4-11e4-87eb-8fc165e5ece4.gif
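For illustration, a minimal git-lfs workflow for a repository of MSI/CAB packages might look like this (the server URL below is a placeholder for your own LFS endpoint, such as an lfs-test-server instance):

    # one-time setup of the LFS hooks and filters in this repo
    git lfs install

    # track the large package formats; only lightweight pointers go into Git
    git lfs track "*.msi" "*.cab"
    git add .gitattributes

    # optional: point at a self-hosted LFS endpoint (placeholder URL)
    git config lfs.url https://lfs.example.com/my-packages

    # commit as usual; the binary content is uploaded to the LFS store on push
    git add package.msi
    git commit -m "Add repackaged MSI"
    git push origin master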

2
votes

If you really have to use a VCS, I would use SVN, since SVN does not require copying the entire repository history to the working copy. But it still needs roughly double the disk space locally, since it keeps a pristine copy of each checked-out file.

With this amount of data I would look for a document management system, or (lower tech) use a read-only network share with a defined input process.
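If you do go the SVN route, a sparse working copy can keep that local footprint small by checking out only the packages you are currently working on. A rough sketch (repository URL and package paths are placeholders):

    # check out only the top level, without any package contents
    svn checkout --depth immediates https://svn.example.com/packages packages-wc
    cd packages-wc

    # pull down just the one package you need to work on
    svn update --set-depth infinity app-foo

    # commit changes to that package as usual
    svn commit -m "Repackage app-foo 2.1" app-foo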

2
votes
  • store large binary files (>1GB)
  • support a repository that's >1TB (yes, that's TB)

Yep, that is one of the cases Apache Subversion should fully support.

So far I've got some experience with SVN and CVS, however I'm not quite satisfied with the performance of both with large binary files (a few MSI or CAB files will be >1GB). Also, I'm not sure if they scale well with the amount of data we're expecting in the next 2-5 years (like I said, estimated >1TB)

Up-to-date Apache Subversion servers and clients should have no problem controlling this amount of data, and they scale well. Moreover, there are various repository replication approaches that should improve performance if you have multiple sites with developers working on the same projects.

I'm currently also looking into SVN Externals as well as Git Submodules, though that would mean several individual repositories for each software package and I'm not sure that's what we want..

svn:externals has nothing to do with support for large binaries or multi-terabyte projects. Subversion scales well and supports very large data and code bases in a single repository. Git does not: with Git you have to divide and split the projects into multiple small repositories, which leads to a lot of drawbacks and a constant PITA. That's why Git has a lot of add-ons, such as git-lfs, that try to make the problem less painful.

1
votes

Given the scale of data you are describing, you might be much better off simply relying on a NAS device that provides a combination of filesystem-accessible snapshots and single-instance store / block-level deduplication...

(The question also mentions .cab & .msi files: usually the CI software of your choice has some method of archiving builds. Is that what you are ultimately after?)

1
votes

This is an old question, but one possible answer is https://www.plasticscm.com/. Their VCS can handle very large files and very large repositories. They were my choice when we were evaluating tools a couple of years ago, but management pushed us elsewhere.

0
votes

There are a couple of companies with products for "Wide Area File Sharing." They can replicate large files to different locations, but have distributed locking mechanisms so only one person can work on any of the copies. When a person checks in an updated copy, that is replicated to the other sites. The major use is CAD/CAM files and other large files. See Peer Software (http://www.peersoftware.com/index.aspx) and GlobalSCAPE (http://www.globalscape.com/).

0
votes

The perks that come with a versioning system (changelog, easy RSS access, etc.) are nonexistent on a simple file share.

If you only care about the versioning metadata features and don't actually care about keeping the old data, then a solution that uses a VCS without storing the data in the VCS may be an acceptable option.

git-annex is the first one that came to my mind, but judging from its "what git-annex is not" page, there are other similar, though not quite identical, alternatives.

I have not used git-annex, but from the description and walkthrough it sounds like it could work for your situation.
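For reference, a minimal git-annex session might look like the following (again, I haven't used it myself, so treat this as a sketch based on its documentation; the file name and repository description are placeholders):

    # create a repo and enable the annex
    git init packages && cd packages
    git annex init "packaging workstation"

    # add a large installer; git records a pointer, the content goes into the annex
    git annex add big-installer.msi
    git commit -m "Add big-installer.msi"

    # later, retrieve or drop the actual content as needed
    git annex get big-installer.msi
    git annex drop big-installer.msi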