Update November 2011:
Using Git, as forcefully advocated by Tilo, is obviously a better choice for a number of reasons (decentralization, private commits, branching, merging, ...).
I am fully aware of the differences between Centralized and Decentralized VCS, as detailed in "Describe your workflow of using version control (VCS or DVCS)".
However, using it in a large enterprise is not easy.
I know. I introduced Git in a large enterprise:
I exposed the general reasons in "Can we finally move to DVCS in Corporate Software? Is SVN still a 'must have' for development?" (while still saying that DVCS is a very valid choice)
But I really detailed the major pain points of installing Git in an enterprise in "Distributed Version Control Systems and the Enterprise - a Good mix?".
I actually presented those pain points at the latest CodeKen 2011 (which replaced the former DevDays 2011).
The presentation was called "Introducing DVCS in a big corporation", and the tweets were eloquent:
- Grundlefleck: My summary: one does not simply git into enterprise
- ben_sim: @VonC_ redefines pain at #codeken2011: recompiling git and all its dependencies so you can use it on a production server in enterprise.
The trick with a DVCS is: you still need a "server", a centralized place for all the developers to get a "blessed" version of their repo.
And in a large corporation, those mutualized servers are... mutualized: a lot of other services are already running on them, which means you cannot just add your Git to one of them (you won't have the right libraries on that server).
Plus, you will need your own ssh and httpd for your users to push/pull to/from that server.
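To give an idea, here is a minimal sketch of what running your own smart-HTTP service looks like; all the paths below (the non-system httpd, the repository root, the location of git-http-backend) are assumptions that vary with your build and distribution:

```sh
# Minimal smart-HTTP sketch: your own (non-system) httpd fronting
# git-http-backend. All paths are hypothetical; adjust to your build.
cat >> $HOME/httpd/conf/httpd.conf <<'EOF'
SetEnv GIT_PROJECT_ROOT /home/git/repositories
SetEnv GIT_HTTP_EXPORT_ALL
ScriptAlias /git/ /home/git/usr/local/libexec/git-core/git-http-backend/
<Location /git>
    AuthType Basic
    AuthName "Git repositories"
    AuthUserFile /home/git/httpd/git.htpasswd
    Require valid-user
</Location>
EOF
$HOME/httpd/bin/apachectl -k restart
```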
I have actually automated the installation process for Git and all its dependencies/related services in the GitHub project compileEverything.
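The gist of it, heavily simplified: build everything under a non-root prefix. A sketch (the version number and URL are placeholders; the real script also builds dependencies such as zlib, openssl, libcurl and expat the same way first):

```sh
# Heavily simplified no-root Git build; the actual script compiles
# zlib, openssl, libcurl, expat, ... under the same prefix first.
PREFIX=$HOME/usr/local
curl -O https://www.kernel.org/pub/software/scm/git/git-1.7.7.tar.gz
tar xzf git-1.7.7.tar.gz
cd git-1.7.7
./configure --prefix=$PREFIX
make all
make install
export PATH=$PREFIX/bin:$PATH
git --version
```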
So yes, use Git.
On a client PC, you can install and start using it in minutes!
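For instance, on a typical Linux client (a Debian-ish example; the package name and project name are illustrative):

```sh
sudo apt-get install git-core     # simply 'git' on more recent distributions
git init myproject
cd myproject
echo "hello" > README
git add README
git commit -m "First commit"
```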
But on a server of a large enterprise? This isn't so easy.
Remember 2009:
- Git support on Windows was possible, but still in progress,
- Git itself didn't include "smart http", which meant the only authenticated protocol for pull/clone and push operations was ssh: convincing users to generate and manage public/private key pairs is... problematic, to say the least (see the sketch after this list).
HTTPS was possible for pull and clone, even if the requests were quite inefficient. For push... the setup involved WebDAV and was complex. Smart HTTP changed all that.
- the authorization layer was clunky (gitosis), and gitolite was barely starting.
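To illustrate the key-management point from the list above, this is what each user had to go through (server account and distribution channel are made up):

```sh
# Per user, per client machine: generate a public/private key pair
ssh-keygen -t rsa
# ... then get ~/.ssh/id_rsa.pub appended, somehow (mail, ticket, ...),
# to the server's ~git/.ssh/authorized_keys -- and repeat for every
# developer and every machine they work from.
```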
And you do need an authorization layer in a big enterprise.
Not every user can access every repository; some of those repositories are quite confidential.
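For the record, this is the kind of per-repository rule that gitolite ended up making manageable (repository and user names below are made up):

```sh
# Hypothetical gitolite rules: access is edited by cloning the special
# gitolite-admin repository, not by touching the server directly.
git clone git@gitserver:gitolite-admin
cd gitolite-admin
cat >> conf/gitolite.conf <<'EOF'
repo payroll-app
    RW+     =   alice       # lead: read/write, may rewind branches
    R       =   bob         # auditor: read-only
EOF
git add conf/gitolite.conf
git commit -m "Restrict access to payroll-app"
git push                    # gitolite compiles and applies the rules
```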
Original answer, back in 2009:
Within a corporation, in a closed centralized environment, Perforce is certainly a good option.
It checks out your workspace fast and manages changesets nicely.
It supports "locking by default", which helps to see who is working on the same file.
However, you must be in sync with the P4 server at all times, and you need the proper infrastructure to support it, including backup and DRP scenarios: if the server is unavailable and you go on working, deleting a file for instance, that file will not be restored on subsequent updates.
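Concretely, the day-to-day cycle looks like this, and every step except the edit itself is a server round-trip (the depot path and messages are illustrative):

```sh
p4 sync //depot/project/...       # update the workspace (talks to the server)
p4 edit src/main.c                # check out: files are read-only until then
# ... edit ...
p4 submit -d "Fix null check"     # atomic changeset, server required again
p4 sync -f src/main.c             # force re-sync: the usual recovery when a
                                  # file was deleted outside Perforce
```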
The main complaint I hear from the teams using it (since I only handle a few administration tasks on the tool) is the notion of "Inter-File Branching": a bit confusing at first, but very useful.
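Since a Perforce branch is just another depot path, "Inter-File Branching" boils down to relating files across those paths; a minimal sketch (the paths are hypothetical):

```sh
# Branch by integrating main into a new release path
p4 integrate //depot/project/main/... //depot/project/rel1.0/...
p4 submit -d "Create rel1.0 branch"
# Later, merge a fix from the release branch back into main
p4 integrate //depot/project/rel1.0/... //depot/project/main/...
p4 resolve                        # merge any conflicting revisions
p4 submit -d "Merge rel1.0 fixes back into main"
```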
Depending on the level of integration of Perforce with your IDE (Eclipse here) and editors, the main intrusive aspect will be making sure your workspace stays in sync.
Other interesting links for you to read:
- What are the advantages to Perforce?
- What are the benefits of using Perforce instead of Subversion?