I am trying to provide a lot of context below for this problem so that experienced people can read the symptoms; I expect more questions will need answering to get to the bottom of it.
The short form of this question: how do I remove the source control history from a project in Delphi (that is, from the local project folder)? Even after removing the .svn and .local directories, Delphi still won't svn import, and there is nothing in the repository. This seems to be a problem with the Delphi integration: some local cache of activity that does not rely on the repository for its information.
Longer form: I recently set up the Subversion client and server included with Delphi XE3. The server is running as a Windows service; the setup was done with sc, as per the svn-book.
I figured it all out by trial and error, with a fair bit of error, over the last few days.
In trying to clean up my source code and repositories to get to a clean install, I found that I needed to remove repositories from the server and re-create them. I also thought that removing the .svn directories from the source would remove all traces of source control. It did not. So I additionally removed the .local files, which do hold some version history.
When I load the project group and switch to the import tab, I still see recent comments showing my initial commit.
In addition, when I try to import, a dialog comes up saying that one of my big key .pas files is already under version control, while another file, a .dproj, it says is not under version control. When I check the repository with
svn ls
I see that some folders were created, but there are no files in the repository.
The config file is set up such that I should need to supply a password, yet it never asks for one. I just left the default security in place because I don't need to concern myself with it much; just enough to stop mistakes, since it is on a local network. SVN import didn't require a password either.
I can manually add files with svn import. I am using the svn: protocol prefix with svnserve.exe running as a service. Authentication is the default, but it works without passwords for some reason; it shouldn't do that.
svnserve.conf has the following (comments omitted for brevity):
[general]
anon-access = none
auth-access = write
password-db = D:\SVNRepos\conf\passwd
realm = Root
force-username-case = none
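For completeness, the passwd file that password-db points to uses the standard [users] format; the entries below are placeholders, not my real ones:

[users]
harry = harryssecret
sally = sallyssecret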
svn --version says:
svn, version 1.7.5 (r1336830)
compiled May 11 2012, 02:21:17
First of all, SVN is not CVS. While CVS is a legacy system with many problems, SVN is a modern centralized version-control system (git fans may disagree, but that is another topic).
Subversion 1.7 is out of date and no longer supported. The current release is Subversion 1.9, which has many improvements over 1.7; SVN 1.8 is still supported as well. Therefore, you should definitely upgrade your client and server to the latest version.
Returning to your problem: you haven't specified any errors that you get in Delphi XE3 IDE. Do you get any?
When you svn add files in a Subversion working copy, you schedule them to be committed the next time you run svn commit. So there is a chance that you haven't actually committed them to the repository.
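For example, from the root of the working copy (the path is a placeholder):

cd C:\Dev\MyProject
svn status
svn commit -m "initial import of project sources"

svn status marks files that are scheduled but not yet committed with an A; until svn commit succeeds, nothing reaches the repository.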
If your code was already imported into the Subversion repository, then it has to be there. Double-check this with svn ls -R <URL> (recursive listing) again. Then you can check out a working copy using svn checkout <URL>.
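For example (the URL and destination path are placeholders):

svn ls -R svn://yourserver/SVNRepos/MyProject
svn checkout svn://yourserver/SVNRepos/MyProject C:\Dev\MyProject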
BTW, .local files are not related to Subversion. They appear to be project files of the IDE, but I'm not sure.
And if you have issues with Subversion server setup on Windows, there are packages that should help you. See the binary packages page.
I'm using Jenkins v1.546, hosted on a Windows Server 2008 R2 SP1 machine.
I've set up a fairly simple job for building a Maven Java project. It polls the SCM with no schedule and picks up remote build triggers, requiring an authentication token. It uses Subversion and performs clean checkouts with svn update. Additionally, it has a post-build step that archives some build artifacts (i.e., the resulting WAR and WSDLs).
The issue I'm experiencing is that the builds it stores on the filesystem contain invalid characters in their filenames. This causes our automatic backup process to blow up, since it is unable to alter or remove the directories/files containing the '$'. I cannot move or delete those folders or files myself either, but if I rename one and remove the $, then things work fine. Also, if I try to follow one of the links with the $ in it, it doesn't resolve. None of the other jobs seem to do this; just my job, of course. Does anyone know why this may be occurring and what I can do to resolve it?
I've attached multiple screenshots that show the bad filename and my Jenkins job setup. I had to white out some company information. If I can provide any additional information to help troubleshoot this, just let me know.
Also, as an update, I did some additional research, looking through the changelogs for each released version of Jenkins since my version (latest is 1.557). I saw three possible issues in the changelogs that could be related, but it's hard for me to tell. I cannot simply upgrade our Jenkins to test out this theory, since I'll need to provide a reason for upgrading beyond a hunch.
https://issues.jenkins-ci.org/browse/JENKINS-21023
https://issues.jenkins-ci.org/browse/JENKINS-20534
https://issues.jenkins-ci.org/browse/JENKINS-21958
The $ is a perfectly valid character in a Windows directory name. You can manually make a folder containing it, and delete it, without any problems.
The com.company$moduleName syntax is used by Jenkins Maven-style jobs to separate the modules of your build. If you don't see this structure for other people's jobs, it is because they are either not building a Maven-style job, or they don't have multiple modules in a single job.
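For instance, a Maven job with two modules (the names here are hypothetical) ends up on disk as:

jobs\MyMavenJob\modules\com.company$module-core
jobs\MyMavenJob\modules\com.company$module-web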
What is strange, though, is that these are symlinks (I don't see that in my environment). It is possible that the location referenced by a symlink has been deleted while the link remains. In that case, you would not be able to navigate to the location through the link, which is what you are experiencing.
Is it possible that your backup software is deleting the target directories before deleting the links?
In any case, do a simple dir on the directory containing the links to see what they link to, and then verify that those target locations exist. If they don't, you need to figure out who or what is deleting the links' targets.
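For example, run from the job's builds directory (a sketch; the exact path varies with your Jenkins layout):

dir /A:L

dir /A:L restricts the listing to reparse points (symlinks and junctions) and shows each link's target in square brackets, so you can check whether the target still exists.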
Edit:
This seems to be more closely related to the issue you are facing. Unfortunately, it's marked as "unresolved":
https://issues.jenkins-ci.org/browse/JENKINS-20725
The issue stems from the fact that the symlinks reference their targets with / instead of \.
My Maven plugin (not the Maven version) is 2.6; see if upgrading the Maven plugin in Jenkins helps you. I am also running Maven 3.2.2 from the automatic installers. Try that too, as I don't see symlinks in my modules.
I have successfully installed the latest release of Subversion following these instructions: http://www.drbob42.com/examines/examinD3.htm. Then I installed the Delphi IDE integration too.
If I open a Delphi project, right-click on the Project Manager in the Delphi IDE, and choose "Tortoise SVN" and then "Repository browser", I can see all the files of my project in the local repository.
At that stage I added one line to my application source, saved, and then tried to commit. The SVN commit form shows up, but it says "No files were changed since the last commit. There is nothing for TortoiseSVN to do here". As I had just changed the source of my application, I was expecting SVN to show that. Why isn't this happening?
I chose the cleanup option, and the error message says "Cleanup failed to process the following paths..... is not a working directory".
Thanks
You need to check out the files from the repository into a working directory, and then modify the files that are in that working directory. (You don't directly modify files in the repo.)
Use "File->Open from Version Control", and complete the dialog. It will check out the files into the folder you specify as "Destination" (which should not be your repository - it should be a separate directory!), and then you modify the files in that working directory and commit (check in) your changes to the repository.
Background
I have the following components:
My local solution (.NET 4.5) which makes use of NuGet packages.
A PowerShell build script in my solution that has targets to build, run unit tests, do Web.config transforms, etc.
A build server without an internet connection running CruiseControl.NET that calls my build script to build the files. It also serves as the (IIS7) environment for the dev build.
A production server with IIS7 that does not have internet access.
Goal
I would like to utilize NuGet packages in my solution and have them stored locally as part of source control, without having to rely on an internet connection or a NuGet package server on my build and production servers.
Question
How can I tell MSBuild to properly deploy these packages, or is this the default behavior of NuGet?
Scott Hanselman has written an excellent article entitled How to access NuGet when NuGet.org is down (or you're on a plane). If you read through this article, you'll see at the end that the suggestions he makes are primarily temporary-type solutions and he goes out of his way to say that you should never need the offline cache except in those emergency situations.
If you read at the bottom of his article, however, he makes this suggestion:
If you're concerned about external dependencies on a company-wide scale, you might want to have a network share (perhaps on a shared builder server) within your organization that contains the NuGet packages that you rely on. This is a useful thing if you are in a low-bandwidth situation as an organization.
This is what I ended up doing in a similar situation. We have a share which we keep with the latest versions of various packages that we rely on (of course, I'm assuming you're on some type of network). It works great and requires just a little work to update the packages on a semi-regular basis (we have a quarterly update cycle).
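In practice, pointing NuGet at such a share is one command (the source name and UNC path are hypothetical):

nuget sources add -Name CompanyPackages -Source \\buildserver\NuGetPackages

The same path can also be added under <packageSources> in a NuGet.config checked in alongside the solution.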
Another article that may also be of help to you (was to me) is: Using NuGet to Distribute Our Company Internal DLLs
By default, NuGet puts all your dependencies in a packages/ folder. You can simply add this folder to your source control system, and NuGet will not need to download anything from the internet when you do your builds. You'll also want to ensure that NuGet Package Restore isn't configured on your solution.
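If your ignore rules exclude binaries, you may need to force-add the folder once; with git, for instance (a sketch, assuming the default packages/ location):

git add -f packages/
git commit -m "check in NuGet packages for offline builds"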
You'll have to make a decision; either you download/install the packages at build time (whether it be using package restore, your own script, or a build tool that does this for you), or you put the /packages assemblies in source control as if they were in a /lib directory.
We've had so many problems with using package restore and NuGet's Visual Studio extension internally that we almost scrapped NuGet completely because of its flaws, despite the fact that 1 of our company's 2 products is a private NuGet repository.
Basically, the way we manage the lifecycle is by using a combination of our products BuildMaster and ProGet such that:
ProGet caches all of our NuGet packages (both packages published by ourselves and ones from nuget.org)
BuildMaster performs both the CI and deployment aspects and handles all the NuGet package restoration, so we never have to deal with massive checked-in libraries or the solution-munging nightmare that is package restore
If you were to adopt a similar procedure, it may be easiest to create a build artifact in your first environment which includes the installed NuGet package assemblies, then simply deploy that artifact to your production without having to repeat the process.
Hope this helps,
-Tod
I know this is an old discussion, but how in the world is it bad to store all the files required to build a project, just because of their size?
The idea that you should simply replace a library if it becomes unavailable is crazy. Code costs money, and since you don't control the libraries on git or in NuGet, a copy should be kept available.
One requirement that many companies have is auditing. What if a library were found to be stealing your data? How would you know for sure, if the library has been removed from NuGet and you can't even build the code to double-check?
The one-size-fits-all NuGet and git ways of the web are not OK.
I think the way NuGet worked in the past, where the files were stored locally and optionally placed in source control, is the way to go.
I just signed up for the Team Foundation Services cloud service, since I had failed to implement TFS on my server and local machine. I want to change the source provider from the previous TFS system I abandoned to the new cloud one, but it still refers to the old one. I've gone through all the options available, including running a program from CodePlex that removes source control bindings. For some reason, it refuses to let me unmap my source control from the old server and bind it to the new TFS cloud service. Why?
EDIT: I noticed there are some hidden .suo files in my project directories. Is this where TFS 2012 stores its settings? I deleted these files, and somehow I was then able to map my source to the new server. There were also workspaces that appeared when I ran TFS from the command line that didn't appear in VS.
I tried everything, including uninstalling the local TFS server and removing all traces of the old server connection info from my system, and I still couldn't get it to switch to another server. It was like a pit bull that wouldn't let go. Then I had the thought that Microsoft might have wanted to make TFS look less cluttered by hiding its ugly plumbing in hidden files, the way Git and Mercurial do. Sure enough, there were .suo files hidden in my directory and subfolders. I recursively deleted them all and was able to connect to the new server.
The following command should recursively remove all hidden SUO files from your solution folder:
del /S /A:H *.suo
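If you want to see what will be removed before deleting anything, the same pattern works with dir:

dir /S /A:H *.suo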
I am using Git with a new ASP.NET MVC project. I have a line in my .gitignore file to ignore DLLs:
*.dll
I would like to add something along the lines of the following to include (i.e., not ignore) DLLs in my NuGet packages folder:
!/packages/*.dll
The problem I'm encountering is that not all NuGet packages are created equal, and depending on the package in question, DLLs may be nested an arbitrary number of levels deep in the path hierarchy. It seems that I simply need a recursive solution along the lines of:
!/packages/**/*.dll
!/packages/**/*
I have not yet found a solution that works via msysgit (or any Windows distribution of Git).
Does anyone know of a way to make this work?
Leave your top-level .gitignore alone, keeping *.dll in it.
Create another .gitignore file in the packages directory and put !*.dll in it.
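So you end up with two files (a minimal sketch):

# .gitignore (repository root)
*.dll

# packages/.gitignore
!*.dll

Patterns in a nested .gitignore take precedence over the parent's for everything below that directory, and since only files (not directories) are excluded, DLLs under packages/ are re-included at any depth.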
Another option to consider is not including your NuGet DLLs in your repository at all, and instead downloading them the first time you build your project. This is what we do with all of our NuGet dependencies.
UPDATE
Nuget handles this now without having to manually create your own build events. See the details on this page: http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages
Original Answer:
We put the NuGet.exe application in a tools folder under our solution, and then add the following to our project's pre-build event:
"$(SolutionDir)Tools\NuGet.exe" install "$(ProjectDir)packages.config" -o "$(SolutionDir)Packages"
The first time we build the app, it downloads all of the dependencies; on subsequent builds, NuGet is smart enough to see that they already exist at the correct versions and skips them.
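For reference, packages.config is the standard per-project manifest that the command above reads; a minimal example (the package id and version are placeholders):

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Newtonsoft.Json" version="4.5.11" targetFramework="net45" />
</packages>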