Here is some information about our setup; I feel I need to provide it in order to explain why I want to do this.
Our entire dev environment lives in VMs. This way we can switch easily between different versions of the product. We work over an awfully slow VPN (downloading the source takes 30 minutes to an hour). The code is tightly integrated with the OS (COM registration, VB6 DLLs, OCXs, etc.), so this is currently the best way for us to work.
I cannot change the way we work.
I am currently setting up a base VM to distribute to teammates so they can get working faster. I want to download the source code once into this base image; when team members start using the VM, they recreate the workspace, point it at the existing code, and just do a Get Latest.
The problem is that TFS doesn't recognize what is already in the folder and simply downloads everything again.
How can I make TFS check what is present locally before downloading everything from the server, like a normal Get Latest does?
Since you're using TFVC, you should set up a TFVC proxy server. The proxy server will allow you to synchronize your code from a fast, local source.
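Once the proxy is installed at the remote site, each developer points their client at it. A minimal sketch, assuming the proxy host is named tfsproxy (8081 is the default proxy port):

    REM point the TFVC client at the site-local proxy (host name is an assumption)
    tf proxy /enabled:true /address:http://tfsproxy:8081

    REM later gets are then served from the proxy's local cache
    tf get /recursive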
I am using Docker containers and have docker-compose files for both local development and the production environment. I want to try Google Cloud Platform for my new app, specifically Google Kubernetes Engine. My toolset is Docker for Mac with Kubernetes on my local machine.
It is super important for developers to be able to change code and to see changes live for local development.
Use cases:
A backend developer makes changes to a basic Flask API (or whatever you use) and should see the changes in the reloaded app immediately.
A frontend developer makes changes to the HTML layout and should see the changes on the web page immediately.
At the moment I am using docker-compose files to mount the source code into local containers. But Kubernetes does not support relative paths for mounting source code.
Ideally I should be able to set the variable
Deployment.spec.template.spec.volumes.hostPath
as a relative path to my repo. For example, on our team developers clone the repo to these folders:
/User/BACKEND_developer/code/project_repo
/User/FRONTEND_developer/code/project_repo
Obviously you can't commit and build the image after every little change to the source code.
So what is the best practice for local development with Kubernetes? Do I need some additional tool to modify the .yaml files for every developer?
@tgogos is right.
The best way to achieve your goal is to use Skaffold.
It will rebuild the container whenever it sees changes in the source code.
Skaffold has a pluggable architecture that allows you to choose the tools in your developer workflow that work best for you.
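For instance, a minimal skaffold.yaml could look like this (a sketch only; the image name, manifest paths, and schema version are assumptions to adapt to your project):

    # skaffold.yaml (sketch; image name and paths are assumptions)
    apiVersion: skaffold/v2beta29
    kind: Config
    build:
      artifacts:
      - image: my-registry/flask-api   # rebuilt on every source change
    deploy:
      kubectl:
        manifests:
        - k8s/*.yaml                   # your Deployment/Service definitions

Running skaffold dev then watches the source tree, rebuilds the image, and redeploys to your local cluster on every change.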
A very promising approach for dynamic languages is the hybrid approach recently introduced by Skaffold, which takes advantage of the usual auto-reload mechanisms. You can define two sets of files:
Changing a file in the first set triggers the full rebuild+push+deploy mechanism.
Changing a file in the second set only syncs the file between the local machine and the container.
Such a hybrid approach is well suited to a large class of technology stacks (Node.js, React, Angular, Python) where you can use the native hot-reload mechanism for source code changes and trigger the full rebuild only when it's needed (for example, when adding a dependency). This helps a lot in keeping latency low.
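In skaffold.yaml terms, the two sets are expressed with sync rules on the artifact; extending the sketch above (the glob patterns are assumptions):

    build:
      artifacts:
      - image: my-registry/flask-api
        sync:
          manual:
          - src: 'app/**/*.py'        # synced straight into the running container
            dest: .
          - src: 'static/**/*.html'
            dest: .
    # anything not matched by a sync rule (e.g. requirements.txt)
    # falls through to the full rebuild+push+deploy path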
I spoke about this in my recent talk at All Day DevOps. Here is an example based on Node.js.
We are a team that would like to replicate TFS from one site to another. The sites are in different domains and cannot communicate by any means. Please suggest best practices for this. In addition, I am also looking for a standalone tool to give me a detailed report of the TFS environment (including the work items, etc.) along with the SQL Server attached to it. The intention is to replicate the same environment so that a full backup goes through fine.
You want to set up a complete clone of your environment at another site, disconnected from yours. Some key points follow:
You need a proper backup of the current TFS data, see Backup TFS
Size the target environment in terms of disk, memory, network, etc.
Install on the new site a compatible SQL Server version
Install on the new site the same (or newer) TFS version
Study the instructions to Clone TFS and apply them on the new site
Plan for changed environment: Active Directory domain, user accounts
The topology could be different; you will have to rebuild your Build and Test infrastructure or, at least, properly remove the old references from the new site
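To make the last steps concrete, the clone itself usually boils down to restoring the SQL backups on the new site and re-stamping the server identity. A minimal sketch; the server, instance, and database names here are assumptions:

    REM restore the backed-up configuration/collection databases on the new SQL instance
    sqlcmd -S NEWSQL -Q "RESTORE DATABASE [Tfs_Configuration] FROM DISK = N'D:\Backups\Tfs_Configuration.bak'"

    REM point the restored databases at the new SQL instance
    TfsConfig remapDBs /DatabaseName:NEWSQL;Tfs_Configuration /SQLInstances:NEWSQL

    REM give the clone a new identity so clients cannot confuse it with the original
    TfsConfig changeServerID /SQLInstance:NEWSQL /DatabaseName:Tfs_Configuration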
What you want to do is not possible.
You will need to put your TFS server somewhere accessible to both locations. I would recommend either VSO (TFS.visualstudio.com) or a custom IaaS instance and domain.
OK, I've been convinced that SVN is the way to go in a previous posting, but I haven't yet had the epiphany. I'm not sure how I would set Subversion up for my development environment.
Here's my current setup. I'm not keen to mess with it, and it would be really nice if Subversion could sit alongside it:
Work:
N:\Projects
N:\Projects\Lib
N:\Projects\App1
N:\Projects\App1\Help
N:\Projects\App1\Images
N:\Projects\App2
..etc
N: is on a separate server in the building.
There are several other development machines with the tools installed locally, but all development takes place referencing the files on the server - i.e. no source code is kept on the workstations.
Home:
Laptop with same development toolset, and the sources in c:\Projects\App1.. etc, i.e. a mirror of the setup on n:\Projects at work.
The sources between N:\Projects and C:\Projects are currently kept aligned with a custom app in conjunction with Dropbox. File exclusions make sure that non-source files don't get synced.
I want to run Subversion with this setup.
Where do I put the repository?
Assuming I can have the repository in a mutually accessible place, will SVN remove the current need to sync between work and home?
In order to embrace Subversion, you will replace your shared source directory with a Subversion repository that lives on the server. Each developer workstation will check out a copy of the whole source code locally (however, this could be a private area on a network server if you like).
You could retain your N:\Projects tree as a read-only copy of the daily build, or whatever. But one of the goals of Subversion is to mediate between two people editing the same file at roughly the same time. This is not compatible with a shared directory containing writable source code. Also, having multiple developers "share" the same Subversion working directory in some way is doomed to failure.
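As a minimal sketch of that migration (the repository location and server name are assumptions, and whether you want one repository for the whole tree or one per project is your call):

    REM create the repository on the server and import the existing tree
    svnadmin create D:\Repositories\projects
    svn import N:\Projects file:///D:/Repositories/projects/trunk -m "Initial import"

    REM each developer then checks out a private working copy
    svn checkout http://yourserver/svn/projects/trunk C:\Projects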
Why not create an internet-accessible (free) trial Subversion account and play around a bit to get yourself familiar before you move your entire source code tree into it? Just so you don't delete everything you own by accident. Maybe start with one dummy project, hosted on the internet. Without even paying a cent, you could use this site:
http://www.projectlocker.com/
Then you can set up your very own starter Subversion server. You can create a brand new Delphi application (File -> New Delphi Application), add a button, double-click that button, and write a message box thingy, or whatever it is you like to do in demo apps. Now create a Subversion repository (perhaps they call them projects up on ProjectLocker) and add the folders you saved this project into to that repository.
Now you can play with (a) TortoiseSVN, (b) the SVN integration built into RAD Studio XE, if you have RAD Studio XE, and (c) the version control plugins that come in the JCL, if you don't have RAD Studio XE.
Also, may I suggest that if you want to have any hope of knowing what you're doing, you learn how to add, commit, and update from the command line. It's really not that hard, and it will pay off later.
Knowing you can type svn co http://reposite.something.com/svn/myproject to check out a project to your disk is very handy. Sometimes I think GUIs are training wheels for your brain; you cripple yourself if you don't learn command lines.
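The whole daily cycle is only a handful of commands (the file name here is just an illustration):

    svn co http://reposite.something.com/svn/myproject
    cd myproject
    REM ...edit some files, then review and record your changes...
    svn status
    svn add NewUnit.pas
    svn commit -m "Add NewUnit with the message box demo"
    REM pick up everyone else's changes
    svn update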
A benefit of a hosted Subversion service like the one I showed above is that you have an offsite backup. Of course, such hosting is always free, even for large projects, if you are writing something open source; then you can host on SourceForge. Otherwise, you're going to (a) need to use your own internet-accessible host or (b) pay for hosting; if not, you're not going to be able to easily access your repository at home and at work.
Personally, if it was my own business, or my professional job to write software, I would host my own subversion server, and it would be private (LAN) only, and I would use a VPN to access it from home.
1: You definitely want a repository accessible from both locations. Either that, or you will need to use a distributed versioning system like Mercurial or Git.
2: Yes, there will be no more need for your custom sync app. This is exactly the job for your versioning system. Syncing manually in addition to using SVN is not necessary and would even create lots of conflicts.
Your shared directory should be removed; each machine should instead have its own copy of the code as a working copy of the SVN repository.
Place the SVN server on the machine that currently holds the files, or on any server that everyone, including your home computer, has access to.
Commit and update every day, multiple times a day, and manage merges if needed.
For home access, the simplest option is either to get a dedicated server on the net or to forward the correct port on your router (you will obviously need some access control in place) so that your repository is accessible from outside. With a good router you could limit access to your home IP or to a list of IPs.
The other solution, as others have said, is a "distributed" version control system, where every commit is done locally in your own repository on your own PC. That repository is merged into the "main" repository to share code, and the changes of other team members are pulled back into your local repository. (Technically you don't need any "main" repository on a DVCS, but for a company that's what you will have.)
See Git or Mercurial for good DVCSes (Git's syntax sucks, but it's the most widely used system and technically the best one).
Put the repository in the safest place. That usually means a good redundant server (disks, etc.) in a controlled server room, and one which is properly backed up. When you switch to a VCS, the source code you work on typically lives in sandboxes on the local machines, because each developer must have their own; changes are then sent to and retrieved from the server. Be aware that some tools may have issues on a remote directory, because of the way, for example, the SMB protocol works; check that such setups are explicitly supported if you need them. Unless you have paramount security needs, IMHO working in local sandboxes is faster and easier.
If you can access the SVN server from home (e.g. via a VPN), it will be no different from working at the office. You will "sync" (update/commit) your laptop sandbox the same way; you don't need a local server and repository. If you do need a local server (perhaps because you can't access the central repository from outside, or you need to work disconnected yet still version files), there are ways to replicate across SVN servers, but at that point a distributed VCS would probably work better in such a scenario.
Let's say I have my TFS team project setup the way I want it, and all the code between my machine and the team project is in sync (i.e. if I do a get latest it says everything is up to date).
What I would like to do is test whether or not I can pull the project back down to my local machine from TFS source control and have everything work properly. By work properly, I mean I'm able to build all the projects, run the web sites, etc.
I thought what I could do is just blow away the code on my local machine and then do a Get Latest. But TFS seems to think that my local machine and TFS are still in sync (this is a bit different from the way Visual SourceSafe worked).
In a nutshell, I'm just trying to test whether or not if another developer were to pull this team project down to their machine, that I know the project is setup correctly with all the necessary dependencies, etc. such that the other developer could build and run the project. But since I only have one machine to test this with currently, I need to do this test on the same machine.
The only way I've found to do it is to use "Get specific version" and force it to overwrite existing files, but it seems like if I delete the stuff off my hard drive, it should know when I do a get latest that "hey, the files aren't there anymore, I need to pull them down".
Any ideas on how I can do this? Thanks.
Notwithstanding the other answer highlighting the merits of having an automated build process and continuous integration...
The easiest way to validate what you've checked in is to create a new workspace with the same folder mappings, albeit to a different location on your hard drive. You can then Get Latest into this new workspace and confirm that everything builds locally (see the command-line sketch after this list); this should prove that:
The correct versions of existing files are in source control
All the required files have actually been added to source control
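With the command-line client, that looks roughly like this (the workspace name, server URL, and paths are assumptions):

    REM create a second workspace and map the same server path to a fresh folder
    tf workspace /new ValidationWS /collection:http://tfsserver:8080/tfs/DefaultCollection
    tf workfold /map $/MyTeamProject C:\ValidationWS /workspace:ValidationWS

    REM a clean get into the empty folder proves everything is in source control
    cd C:\ValidationWS
    tf get /recursive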
Alternatively, if you'd rather not check in your changes until you've validated them, your best bet is to 'Shelve' all your changes (ticking the box to undo your pending changes), then 'Unshelve' that shelveset into a new workspace and do your testing against that instance of the codebase... or even ask a colleague to pull down your shelveset and do the validation (typically this is called a 'Buddy Build').
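A hedged command-line equivalent (the shelveset name is an assumption):

    REM shelve the pending changes and undo them locally (/move does both)
    tf shelve /move /comment:"Ready for buddy build" ReadyForValidation

    REM from the second workspace, pull the shelveset in and build
    tf unshelve ReadyForValidation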
TFS is a little different from VSS in that local workspace state is maintained, so that not every file has to be compared on every Get. In addition to removing the code from your development machine, you should also delete your local workspace. Check out "Working with Version Control Workspaces" on MSDN:
http://msdn.microsoft.com/en-us/library/ms181383.aspx
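For example (the workspace and owner names are assumptions; on older servers the switch is /server: rather than /collection:):

    REM delete the workspace record so TFS forgets what it thinks you already have
    tf workspace /delete MyWorkspace;DOMAIN\you /collection:http://tfsserver:8080/tfs/DefaultCollection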
Really, though, the best way to make sure that your code can be pulled down and built easily is to create an automated build in TFS for continuous integration. That way you know immediately if you have done something that would make the solution unbuildable.
Check out the overviews of Team Foundation Build on MSDN:
http://msdn.microsoft.com/en-us/library/ms181710.aspx
The answers above are good, except they will not completely test your entire scenario. If you have references outside of your solution (such as DLLs in the GAC, or DLLs from an SDK installed on your machine), creating a new workspace or deleting the code and getting latest won't find those problems.
The only way to make sure is to pull down the code on another computer. If you don't have another computer handy, you can use a virtual machine.
Do Get Specific Version and specify the latest. This will force TFS to download everything, ignoring the current synchronisation status.
TFS uses your workspace to know what is synched between the server and local.
I don't think there is an option to make Get Latest behave the way you want; Get Specific Version, specifying Latest Version and Overwrite, is the closest equivalent.
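From the command line, the equivalent is a forced get (a sketch; /force is shorthand for /all plus /overwrite):

    REM re-download every file, ignoring the workspace's idea of what is in sync
    tf get /all /overwrite /recursive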