Does anyone still use CVS?

Mercurial's documentation is also more complete and will help you learn the differences faster. And if your team is currently engaged on a project, the prospect of migrating everything to another revision control system is unappealing; if they were to switch, it would most likely be to SVN.

SVN is currently the king of server-based version control. It has all of the good features of CVS and improves upon them. In corporate environments, you are more likely to come across CVS or SVN than Git or Mercurial, so familiarity with single-server technology, while not a requirement, will ease transitions in the workplace.

With its wide usage and its software maturity, SVN has a large knowledge base, and users will find help readily available from other users. Git has a clear speed advantage over its competitors, and for projects that lend themselves to distributed systems, it is a clear improvement.

The primary downside cited for Git is that it can at times be difficult to explain to others, and there is likely to be a slowdown in productivity as programmers adapt to it. Once it is learned, however, the speed increases and better branch management will reclaim that time and more.

For those absolutely repulsed by Git (and it does have its sworn enemies in the programming world), Mercurial offers a bridge between SVN and Git that is well documented and used in many well-known projects. The Windows-friendly version of Git has also made strides that bring its speed closer to that of the Linux version, so it could still be on the table if you are not developing on Linux.

To find out which one is best for you, consider the project and the developers. And talk to them! All of these systems are fully functional. Here are some hosting options to check out: view activity, browse files, compare revisions, a great user interface, integration with a bunch of popular services (including Twitter), and a handy widget for Mac OS so you can monitor account activity across all of your projects from one simple interface.

Free for 2 people, with 1 project for up to MB storage. Free for 1 person, with 1 project for up to MB storage. They offer a decent feature set and represent excellent value for money. It is implemented as a Windows shell extension, which makes it integrate seamlessly with Explorer, and it is easy to use. Whatever client you are using, you need to create a new source control repository.

You will need to enter the URL of your repository, along with your username and password, in your client. Once you have your head around these basic concepts, the documentation provided with each client is a good starting point for more advanced features. Besides the ability to back up your code safely in the cloud, allowing your developers to work from anywhere, version control also allows you to find and revert changes which might have broken your regression tests.
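As a rough sketch of what that first connection looks like from the command line (the URLs, username, and project name below are placeholders, not taken from the article):

    # Subversion: check out a working copy from the hosted repository
    svn checkout https://svn.example.com/myproject/trunk myproject --username alice

    # Git: clone the hosted repository instead
    git clone https://git.example.com/myproject.git

After that, the client caches or prompts for your credentials on the first commit or push.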

By performing a diff between the current and previous versions, you can easily determine what changes were made and then rectify them as necessary. Rob Rawson is a co-founder of Time Doctor, software to improve productivity and help you keep track of what your team is working on, even when working from home.
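Coming back to the diff-and-revert point above, a minimal command-line sketch (revision keywords are real, the file name is illustrative):

    # Subversion: compare the previous revision of a file with the current one
    svn diff -r PREV:HEAD src/app.c
    # roll it back if the change broke the tests
    svn merge -r HEAD:PREV src/app.c && svn commit -m "Revert breaking change"

    # Git equivalent
    git diff HEAD~1 HEAD -- src/app.c
    git revert HEAD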

Also, it is easier to use. Uh, Mercurial less powerful than Git? I don't think so!

The only real downsides are: it's a bit slower than Git (Git's a rocket ship, there's no denying that), and it just doesn't have the market share it used to have (boo, Atlassian!). Git is the worst versioning system. You listed what it promises, not what it delivers. Git often fails at doing a simple merge or branch switch when there are many branches, is extremely slow, occupies a lot of space, and its overcomplicated procedures often make you lose an entire day on an operation that should be done in 15 minutes.

It is awful. Even VSS is better. Pros: easier to learn than Git and just as powerful; better documentation, with actual help output when you ask for it; a distributed model. Cons: no merging of two parents (uhm, so what am I merging when I merge?).

Less out-of-the-box power? Wrong: fewer default-configured chances to shoot yourself in the foot. I am not going to hide that I am biased in favor of Mercurial, because it has a design, it has a concept, and, for the most part, it is a consistent tool that helps me get my job done. As opposed to Git, which does honor to its name and does quite the opposite.

Hello, great article. The title mentions three tools, but you talk about four. Thank you for the great article. The pricing is lower than most of the tools listed, and when you see the product you quickly understand what a good user experience means. Any of the others is a better choice. Subversion is the one choice that lets you control access over who is able to do what to which subtrees.

Mercurial works a lot like Subversion for people working alone. When you are disconnected from the net (perhaps on a long flight) you can still commit code when you want to. Git is a low-level tool that could be made usable with wrappers around it to suit your development process. Unless you know its internals, feedback when something goes wrong (conflicts needing to be resolved, for example) can be totally and inexcusably misleading.

If your team already uses and prefers Git, go for it. Anyone else would benefit from something more structured. Thank you for the great mention of Unfuddle. Very much appreciated, and we would be thrilled to answer any questions from you or your readers. Thanks again! Try Kiln from Fog Creek Software (fogcreek.com). If you have binary data. Does Git allow one-to-many configuration?

I am using TFS Source Control, but it requires you to upload a solution and download the entire solution, not just an individual file. As a starter, this is what I understood to be the problem: at some point the two want to combine what they did in a single repository. Is that the problem mentioned above?


I have selected Git for my team, but I like the analysis presented. I have a small development team that works on many different projects using different apps with different file types, including Microsoft Dynamics SL, Crystal Reports, Visual Cut, Access, and lots of documents and images, etc.

Half of the team works remotely, and finds connecting over the VPN pretty slow. Software Engineering Stack Exchange is a question and answer site for professionals, academics, and students working within the systems development life cycle. I want to choose a version control system for my company. These days I see that Git is the most used, so I'm left wondering: would there be any specific reason to still use Subversion, or should I go directly to Git?

SVN is not dead at all. It is still in extremely wide use, and it's not going anywhere anytime soon. SVN is much simpler to use than distributed version control, especially if you're not actually running a distributed project that needs distributed version control.

If you only have one central repository (which is all your company will need if they're still small enough to have gotten by without source control so far), it's much simpler to use SVN to interact with it. For example, with SVN you can pull changes from the repository, or commit your local changes to it, with a single operation, whereas Hg and Git require two or three steps to do the equivalent work.
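A small sketch of the step count being compared here (commands only, no project-specific names):

    # SVN: one command in each direction against the central repository
    svn update
    svn commit -m "Describe the change"

    # Git: the same round trip is usually two or three commands
    git pull --rebase
    git commit -am "Describe the change"
    git push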

It's significantly faster now than it was a couple of years ago, and at this point there's really no good reason to look at Hg or Git for your project unless you actually need the advanced features of distributed version control. Client tooling hasn't been mentioned yet. You can certainly do everything with a command-line script, but having GUI integration can be a real productivity boost. This may change in the future, but I'd certainly weigh this into your decision just as much as the version control functions.

Just like everything else, a version control system isn't a goal in itself, just a tool to get you where you're going. Pick the one that's going to get you there fastest based on your situation.

I'm a Git fan. Recently I had to admit that one of the downsides of Git is that it identifies versions with hashes, as opposed to SVN's revision numbers. A revision number can be passed on more easily by phone or the like. And that's the only pro I can imagine; in Git there are tags that can serve the same purpose. Anyway, I just couldn't imagine developing without quick branch switching and stashing. These two features alone beat SVN, where as far as I remember the same task required creating and checking out a whole tree into separate directories to achieve the same goal.
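For readers who haven't used them, a quick sketch of the three features mentioned (the tag and branch names are made up):

    # a tag gives a release a human-readable name instead of a raw hash
    git tag -a v1.4 -m "Release 1.4"

    # quick branch switching plus stashing, without a second working directory
    git stash                # shelve uncommitted work
    git switch hotfix-123    # jump to another branch in place
    git switch -             # jump back
    git stash pop            # restore the shelved work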

Those so-called "advanced features of distributed version control" come with time, and you don't have to learn them at the very beginning. Don't be scared of them. They are here to help you, not to get in the way. And there's no problem with setting up a central repository for a DVCS. With SVN you can easily check out parts of a repository down to the folder level, whereas with Git you get the whole repository, including all the history.
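To illustrate the difference (the repository URLs below are placeholders):

    # SVN: check out a single folder of the repository
    svn checkout https://svn.example.com/repo/trunk/docs docs

    # Git: the default clone takes everything, but shallow and sparse clones narrow it
    git clone --depth 1 --filter=blob:none --sparse https://git.example.com/repo.git
    cd repo
    git sparse-checkout set docs    # only docs/ appears in the working tree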

Distributed version control is a different beast to tackle. It requires substantial learning for each developer. If you have the buffer to accommodate the learning process for each developer, you should move to a good distributed version control system. Distributed version control seems to be an eventuality; it is here to stay for a very long time, and it is better that we adapt to it sooner rather than later. If the company is well established, with a lot of source code in the existing version control system, moving to a new system is a big task, but if the company is small or starting up, moving to a new version control system is very easy.

But if you stick with an older version control system in a new setup, you will hit a bottleneck somewhere in the future where you will eventually have to plan a version control migration anyway.

So I would strongly recommend that you choose a distributed version control system such as Git for your project. Someone mentioned tooling for Visual Studio as a reason to stick with SVN. Yes, there are all those complex guides out there which describe how Git is trivial once you understand that branches are just homeomorphic endofunctors mapping submanifolds of a Hilbert space.

But you know what? The elephant in the room is of course branching: branches just work in Git and Hg. By contrast, in SVN they are painful at best and broken at worst (merging multiple heads). Of course you can still use SVN. You can also still use Windows XP. However, the majority of users who have tried both agree that one of the alternatives is vastly superior.
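As a rough side-by-side of what "branches just work" means in practice (branch names are illustrative):

    # Git: branching and merging are cheap, local operations
    git switch -c feature-x
    # ...commit work...
    git switch main
    git merge feature-x

    # SVN: a branch is a server-side copy, and merging is a separate bookkeeping step
    svn copy ^/trunk ^/branches/feature-x -m "Create branch"
    svn merge ^/branches/feature-x    # run from an up-to-date trunk working copy, then commit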

So if you faced a difficult merge because your colleague had just committed something huge, you could just set your clock back, commit before him, walk over to his desk, and tell him he broke the build. SourceSafe also allowed you to check out files in non-exclusive mode, which made it more like CVS. Our merge strategy with SourceSafe was to always do a 3-way merge using Araxis between: your latest changes, the version of the code before you made changes, and the latest version of the code on the server. So yes, we would have 3 full checkouts of the project locally to accomplish this.

I guess it boils down to a "patch" workflow, except you get to both create and apply the patch yourself. We used a real-world commit token (a rubber duck, IIRC) to make sure only one person was doing merges at a time. TortoiseSVN was an easy sell when we discovered it. Perforce was like that when I used it.

Edit: While I'm sure this was what I experienced, it might be because of configuration by the organization I worked for, but I doubt it, as I remember reading everything I could find about Perforce, since I disliked it so much and wanted to find out why everyone seemed to like it. Perforce has supported concurrent versioning since at least if not earlier.

Individual files could be marked as requiring locks, which is useful for binary files for which concurrent changes cannot be merged (one of the big reasons why Perforce is so popular in game development), but it's not the default or the only option. It sounds like your organization had severely misconfigured Perforce. When you're working with binary, unmergeable files it's a mandatory requirement.

FWIW you could configure P4 to only lock certain file extensions. I also found it useful to find out who was "working" on a file if I had to touch it.

I started my software engineering career using CVS, and I was still using CVS until fairly recently. Two reasons. I'd played with Git but hadn't really understood the power of trivial branching (though I was one of those CVS power users who could branch, but tended to use my IDE to manage it).

I remember thinking to myself, oh this is like CVS, because that is how I used it when I played with it. The bigger reason is that I was managing a team of developers that rarely worked on the same thing. We all worked in the same room.

The codebase was relatively small (35k LOC). I could see no good reason to make the change when CVS was "good enough". It was the same reason we used the same old crufty bug tracker: too many features to write to spend time upgrading infrastructure, unless it promised something like a 2x improvement. We did add automated testing and scripting around deploys because the benefits were obvious.

Now I love git and the power to branch and stage commits but I am still not sure it's needed for colocated teams of that size. I'll second your suggestion that centralized version control systems have advantages for small projects.

I use SVN for most of my personal projects, because no one else is contributing. Only one branch. In fact, I have never branched in SVN. The simplicity is a major benefit. The main downside is how condescending some Git users can be to SVN users. The one thing I'd like is the ability to commit without an internet connection, which distributed systems can easily do.

But this hasn't been enough of an issue to motivate a switch. SVN also maintains one killer advantage over Git to this day: storing large binary files. I think about the reproducible research movement and think to myself that SVN is strictly better for many such projects. And Subversion, being centralized, allows you to opt in to a "lock, change, commit" workflow, on a per-file basis or by file type. Which is great for binary files, as pretty much any binary file (Excel sheets, Photoshop documents, or even PNG assets) can't be easily merged, so Git's "work independently, then merge later" workflow can never work with those files.
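A minimal sketch of that opt-in locking workflow in Subversion (the file path is made up):

    # mark the file so working copies keep it read-only until someone takes the lock
    svn propset svn:needs-lock '*' assets/mockup.psd
    svn commit -m "Require locks on the mockup" assets/mockup.psd

    svn lock assets/mockup.psd -m "Editing the mockup"
    # ...edit the binary file...
    svn commit -m "Updated mockup" assets/mockup.psd   # committing releases the lock by default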

At least Git stores its local copy of the repository compressed; SVN keeps a second, uncompressed copy of the current revision in order to run diffs. So neither is great for large binaries, although at least Subversion will only grab the latest revision. But the use of compression and binary deltas does mean that for regular text-based code a Git checkout including all history can be smaller than a Subversion checkout with just the latest version.

For me it was my personal projects that made me switch to Git. I was tired of maintaining an SVN server just for myself. With Git you don't need a server at all; your full repository is stored locally and you can just back it up like you do everything else.

You also don't need a server for SVN; it can use another directory as the repository. I don't find the maintenance burdensome. I keep my software up to date, including Git, so this isn't an argument against SVN. And I run a weekly backup script that first checks the integrity of the repositories. This check was motivated by a hard drive failure which caused a small amount of data loss. Git probably would have helped in this case, but now I think I'm good. I haven't done much of anything else specifically for SVN in the past 5 years.
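For anyone curious, a sketch of that serverless SVN setup with an integrity check before backup (the paths are placeholders):

    # create a repository in a plain local directory and check it out via file://
    svnadmin create ~/repos/myproject
    svn checkout file://$HOME/repos/myproject ~/work/myproject

    # weekly: verify the repository, then archive it
    svnadmin verify ~/repos/myproject && tar czf myproject-repo.tgz -C ~/repos myproject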

To be fair, most of my issues were with Apache, not SVN itself. The most egregious thing I remember was that Apache changed configuration directives twice during the time I maintained it in a way that broke my SVN server, but there were plenty of other papercuts.

This was also during SVN's heyday. I'm sure it's much more stable now. HelloNurse on July 8: My recollections of using Subversion, a few years ago, are mainly about terrible performance and frequent unrecoverable damage to working copies and repositories. And of course, a policy of not branching because it's difficult and dangerous doesn't mean taking advantage of simplicity.

Reason on July 8: Managing branches is a pain with Subversion compared to Git, sure. But dangerous? There are major projects out there still using Subversion. GCC, for example. They have lots of branches. As for performance... sounds to me like you didn't know what you were doing with SVN. If I complain that Git caused some problems, Git users are likely (and often justified) to tell me that I was using Git wrong.

But it's rare that the same reasoning is extended towards SVN by many Git users. The worst I've had with SVN was a broken working copy, which is usually easily fixed. With Git, problems like that occur much more frequently.
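For what it's worth, "easily fixed" for an SVN working copy usually amounts to something like this (a sketch, not taken from the comment itself):

    svn cleanup        # release stale locks and finish interrupted operations
    # or, if local state is beyond saving, discard it and re-sync
    svn revert -R .
    svn update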

In the past 6 months I likely have completely wiped my local Git repository for a particular project alone more times than I've ever had to fix a broken SVN working copy. I think Git has a terrible and confusing UI compared against other distributed systems. As for performance, again, I use SVN with small projects, so performance hasn't been an issue. Branching would provide no benefit in my case.

If I wanted to branch, I'd switch to a distributed system. My experience talking to some Git people is that they often branch as a habit without considering what could be gained from branching.

HelloNurse on July 8: If network failures can leave working copies in pieces, in an unknown state, it isn't a matter of knowing what one's doing. Out of the box, Git and similar modern VCS systems offer better safety, and better auditing whenever something goes wrong. There's just never a reason not to branch. It keeps ideas, efforts, and tasks separated really nicely, and has essentially no cost. It lets me have a completely different environment to try things out, wreck, and abandon without ever touching the branches that are important.

When I'm done, a simple merge brings it all in at once. These are all completely valid reasons to branch.

They also don't apply to most of my projects. And for the ones they do apply to, I use Git. My small projects tend to be relatively simple, and often contain a lot that's not code. Some contain almost no code at all. For example, I've had people recommend branching to keep track of different versions of the same paper they're writing. But this immediately struck me as a waste of time. I'll be submitting only one version of the paper. Why keep multiple internal versions?

I am not convinced by the argument that I can have a branch for each person I ask to read the paper. Merging in the handwritten changes they provide me is not hard. Branching would just be extra work in this case. It can be nice to try different organizational structures sometimes, but I've found it easier to simply have a different TeX file in that case. Or better yet, multiple TeX files for each part, and then a set of master TeX files that organize the paper differently.

If someone has a good argument for distributed version control in this use case, I'd be happy to hear it. In both CVS and SVN you can just check out from a local directory. The same is possible with SVN too. The SVN documentation used to warn that concurrent access in this configuration could lead to data loss. Even though I'm the only human who accessed my repository (I did use scripts), I always avoided this setup because of that.

GordonS on July 8: I resisted moving from SVN to Git for my personal projects for a long time, mainly because I was too lazy. But conflicts, or repos getting into an inconsistent state that required fixing, were too regular an occurrence, and eventually I snapped. It seems as though our profession has such a low barrier to entry, especially with open source, that a lot of tooling is seen as having no associated cost. To draw an admittedly flawed comparison: I work at a contract engineering and manufacturing firm.

There are some products that we produce by the tens or hundreds of thousands, and these benefit greatly from a lot of automation in assembly, testing, packaging, etc. We also do low-count production runs that quite simply don't get much automation, because the per-unit cost would end up being astronomical. There's no reason to tool up as if for a mass-production run if you're making 10 pieces.

In our field, the barrier to entry seems free, though. So while Git was designed to meet the needs of the Linux kernel, people also use it for their own personal 1 kLOC side project. It doesn't stop there, of course; introductions to making a simple web app are often filled with tooling, frameworks, etc. that need to be included, configured, and used.

Undoubtedly these make sense for large projects, but are used for personal sites as well. Sure, and now git is well enough known to be a good default. Note that I realize that you didn't argue for no version control. How weird. The only thing I remember about CVS was that to clone something from CVS, you had to know some root directory this presumably was the webroot , and sourceforge.

With CVS, specify just ".". But yes, it was sometimes annoying that the module name wasn't clearly documented. I enjoyed skipping the part of the article that explains how CVS worked, because I lived it. I wonder though, have we reached the end? Is there anything beyond Git? When I used RCS, I would always lament, "it would be nice if two of us could work on a file at the same time". When I was using CVS, I'd lament, "It would be nice if two of us could work on a group of files at the same time and merge our changes".
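For readers who never used it, the root-directory-plus-module dance mentioned above looked roughly like this (the host and module name are placeholders):

    # the CVSROOT had to be known out of band, typically published on the project page
    cvs -d :pserver:anonymous@cvs.example.org:/cvsroot/project login
    cvs -d :pserver:anonymous@cvs.example.org:/cvsroot/project checkout modulename
    # checking out the module "." pulls everything under the root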

But using Git, my only lament is, "I wish this were easier for new developers" and "it would be great if there were a standard workflow". Neither one really demands a new paradigm in VCS though. The ability to split and merge repositories as easily as we can split and merge branches might open up some new use patterns. The particular context I'm thinking of is scientific repositories. These tend to grow in size and scope in an unplanned manner.

Pieces inevitably need to be split off for a collaboration, to be made public, or because someone is changing institutions and needs to take part of the project with them. RCS was okay if you were a sysadmin, terrible for everyone else. CVS was okay, but still limited. SVN was more advanced, but buggy as hell. Git is more advanced and less buggy, but over-complicated and unintuitive.

Git was actually started in order to complete the core functionality and let someone else make the front end for the VCS. But somewhere along the line people just decided they didn't need a user-friendly front end, and now the core is what people use every day. Unless someone comes up with a really slick universal front end for it, it's probably time for a new VCS.

DAG of commits, content-addressable file system. What about Mercurial? Good question. I never gave it a fair shot, and it used to have some flaws, but maybe it's cleaned those up by now. Really slick, really easy, really powerful, especially combined with the Evolve extension. Please leave everybody on Git so that I can have a competitive advantage over them. I literally couldn't pull the OpenBSD src tree for months. I'm one of the few people who deliberately learned and used CVS for a while in recent times.

I did not have any public repositories at the time and needed a VCS for my configuration and some documents (Org mode, mostly), and the model where I could have a central repository in a local directory which I could easily back up was compelling. Then I figured out a filesystem layout where I could back up all my work easily and this became useless, so I switched to Mercurial. Nowadays I'm considering going just Git, because it's what everybody uses, and Magit is a compelling piece of software.

RCS is good for, e.g., this kind of thing: I have a pool of Elisp files which contain the personal bits of my Emacs configuration, and I use RCS on them because their histories are not related to one another. The repo is central and can totally reside in a local tree, and check-ins from different checkouts go directly to that repo.
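A tiny sketch of that per-file RCS usage (the file name is just an example):

    ci -u init.el                         # first check-in creates init.el,v; -u keeps a working copy
    co -l init.el                         # check out locked for editing
    # ...edit...
    ci -u -m"Tweak keybindings" init.el   # record the new revision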

This is akin to sharing one. When did it start to be like this? Making code better is a dick move now? Who rewrites stuff passive-aggressively? What does that even mean? Nice article, but the conclusion doesn't really follow from how the article is built up. As the article re-examines this obsolete version management tool, it becomes clear it's pretty easy and straightforward and can do a lot of the things that Git can, to a certain degree.

On top of that it's dead easy to set up and use; in fact, its simplicity might be an indication that it's not all that obsolete and might be exactly the right fit for new, small personal projects. Those are some really optimistic takeaways. I hope you'll try CVS and report back, especially on a merge, or on looking at project history including deleted files. I don't think any of these statements will stand up to scrutiny. Sometimes that's all a project needs.

I never used Perforce or ClearCase. I played with Bazaar, Monotone, Mercurial, and Darcs, but not enough to really appreciate them. Those were interesting times. One of the engineers had made an SVN repository for all our design specs and had cooked up a simple intranet page where the latest version of a design could always be shared by a permanent URL, along with a history of all earlier versions.

That was my first experience with version control, and I remember thinking it was magic. Though I had experience with CVS and Subversion (mostly through open source, mostly on SourceForge), I remember installing Trac around the time it first came out, and using that as the equivalent of what would be done with GitHub today.

Of course you had to have one install of it per project (or per 'organization', depending on your repository setup), and run it on a server somewhere. Trac was great though, especially for the time: Subversion server, source and changeset browser, tickets, wiki, roadmap.

Aside from my own personal stuff, I switched several open source projects to it, and got a couple of companies onto it. I worked on a lot of PHP stuff that could be deployed from source and never really saw the need then.

Now I think it's essential and don't work without it. In other words, the people coming from CVS and SVN and complaining that Git added a step for them were either doing an impeccable job of keeping their source dirs clean at all times, or they were implicitly admitting that they weren't keeping track of what they were adding to their own repos.
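For context, the "added step" in question is Git's staging area; a rough comparison (file names are illustrative):

    # Git: you explicitly choose what goes into the commit
    git status
    git add src/main.c
    git commit -m "Fix the parser"

    # CVS/SVN: committing takes every tracked, modified file in one go
    svn commit -m "Fix the parser"   # untracked files are simply left behind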

I would guess there are old projects that fit the former description. But I know from experience there are old projects that clearly fit the latter. SVN also allowed you to commit only specific files, so if your working directory wasn't clean, you could mostly still commit just the parts you wanted to commit. This post does not do justice to just how awful CVS is. OpenBSD still uses it, and it's the main reason I've only rarely contributed patches. CVS is just that crappy. But that was 32 years ago.

It's not bad because it's old, but it's not 32 years of good either. The Commodore 64 was great for its time, but I'm not going to load my version control from a cassette player today. Git has its problems, especially on usability, but it's much better than all the others in that list! Ensorceled on July 8: Every one of those was a major upgrade in functionality except the last.


