I am new to this whole containerization and backend-as-a-service world, but I wanted to give Appwrite a shot because it seemed very easy and well suited for a small project I am about to build. The only problem is I do not know that much about Docker, and I am a bit unsure whether and how I will be able to move the Appwrite instance that is running locally, with all the changes I have made to it (i.e. created projects, existing DB documents, functions, etc.), to a production server or any other computer. How might I be able to do this? Thanks!
If you're looking to move the configuration for your project AND the data, the best thing to do would be to:
Backup your project
Move the backups and the appwrite folder to the new location
Start Appwrite
Restore the backup
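For the backup and restore steps, a minimal sketch of what this can look like, assuming the default container names from Appwrite's docker-compose setup (verify with docker ps; the names and paths here are placeholders):

    # dump the MariaDB database that stores projects, documents, users, etc.
    docker exec appwrite-mariadb sh -c 'exec mysqldump --all-databases -uroot -p"$MYSQL_ROOT_PASSWORD"' > dump.sql
    # archive the appwrite folder (docker-compose.yml, .env) together with the dump
    tar -czf appwrite-backup.tar.gz appwrite/ dump.sql

On the new machine you reverse it: extract the archive, start Appwrite with docker compose up -d, and feed dump.sql back into the MariaDB container.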
If you only need to migrate the schema for your collections, you can use the Appwrite CLI to create an appwrite.json file for your project and then deploy it to another instance. The CLI can also be used to manage your Appwrite Functions.
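A hedged sketch of the CLI route (the exact subcommands vary between Appwrite CLI versions, so double-check against the CLI docs):

    appwrite login                # authenticate against the source instance
    appwrite init project         # creates the appwrite.json file for the project
    # point the CLI at the new instance, then push the config:
    appwrite deploy collection    # recreate collections from appwrite.json
    appwrite deploy function      # deploy your functions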
References:
YouTube Video on Backing Up and Restoring
Docs on Backups
Docs on Appwrite CLI
I am using Docker containers and have docker-compose files for both the local development and production environments. I want to try Google Cloud Platform for my new app, specifically Google Kubernetes Engine. My toolset is Docker for Mac with Kubernetes on the local machine.
It is super important for developers to be able to change code and see the changes live during local development.
Use cases:
A backend developer makes changes to a basic Flask API (or whatever you use) and should see the changes in the reloaded app immediately.
A frontend developer makes changes to the HTML layout and should see the changes on the web page immediately.
At the moment I am using docker-compose files to mount the source code into the local containers, but Kubernetes does not support relative paths for mounting source code.
Ideally I should be able to set
Deployment.spec.template.spec.volumes.hostPath
to a relative path to my repo. For example, developers on our team clone the repo to folders like these (see the manifest fragment after the list):
/User/BACKEND_developer/code/project_repo
/User/FRONTEND_developer/code/project_repo
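To make this concrete, here is roughly what the fragment would have to look like today; hostPath only accepts absolute paths, so each developer would need a different hard-coded value (the image name is a placeholder):

    # fragment of a Deployment manifest
    spec:
      template:
        spec:
          containers:
            - name: api
              image: project_repo/api:dev
              volumeMounts:
                - name: src
                  mountPath: /app
          volumes:
            - name: src
              hostPath:
                path: /User/BACKEND_developer/code/project_repo  # must be absolute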
Obviously you can't commit and build the image after every little change to the source code.
So what is the best practice for local development with Kubernetes? Do I need some additional tool to modify the .yaml files for every developer?
#tgogos is right.
The best way to achieve your goal is to use Skaffold.
It will rebuild the container whenever it sees changes in the source code.
Skaffold has a pluggable architecture that allows you to choose the tools in the developer workflow that work best for you.
A very promising approach for dynamic languages is the hybrid approach recently introduced by Skaffold, which allows you to take advantage of the usual auto-reload mechanisms. You can define two sets of files:
Changing a file in the first set triggers the full rebuild+push+deploy mechanism.
Changing a file in the second set only syncs the file between the local machine and the container.
Such a hybrid approach is well suited to a large class of technology stacks, like Node.js, React, Angular, and Python, where you can use the native hot-reload mechanism for source code changes and trigger the full rebuild only when it's needed (for example, when adding a dependency). This helps a lot in keeping the latency low.
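As a hedged sketch of such a setup (apiVersion, image name, and paths are placeholders; check the Skaffold docs for your version), a skaffold.yaml with a manual sync rule can look like:

    apiVersion: skaffold/v2beta29
    kind: Config
    build:
      artifacts:
        - image: project_repo/backend
          sync:
            manual:
              # second set: synced straight into the running container,
              # where Flask's auto-reloader picks the files up
              - src: 'src/**/*.py'
                dest: /app
    # anything not matched (e.g. requirements.txt) falls into the first
    # set and triggers a full rebuild + push + deploy
    deploy:
      kubectl:
        manifests:
          - k8s/*.yaml

Running skaffold dev then watches the source tree and applies whichever of the two actions matches the changed file.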
I spoke about this in my recent talk at All Day DevOps. Here is an example based on Node.js.
Here is some information about our setup; I feel I need to provide it in order to better explain why I want to do this.
Our entire dev environment lives in VMs. This way we can switch easily between different versions of the product. We work over an awfully slow VPN (downloading the source takes 30 minutes to 1 hour). The code is tightly integrated with the OS (COM registration, VB6 DLLs, OCXs, etc.), so this is currently the best way for us to work.
I cannot change the way we work
I am currently setting up a base VM to distribute to teammates so they can get working faster. I want to download the source code once into this base VM; when team members start using the VM, they recreate the workspace, point it at the existing code, and just do a get-latest.
The problem is that TFS doesn't recognize what is already in the folder and simply downloads everything again.
How can I make TFS check what is present locally before downloading everything from the server, like it does during a normal get-latest?
Since you're using TFVC, you should set up a TFVC proxy server. The proxy server will allow you to synchronize your code from a fast, local source.
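If I remember the tf.exe syntax right, configuring a client to use the proxy looks like this (the proxy URL is a placeholder):

    rem point the client at the proxy explicitly
    tf proxy /enabled:true /address:http://tfsproxy.corp.local:8081
    rem or let it discover a proxy registered for your collection
    tf proxy /configure

After that, get-latest operations are served from the proxy's local cache instead of crossing the slow VPN.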
I'm working on creating some docker images to be used for testing on dev machines. I plan to build one for our main app as well as one for each of our external dependencies (postgres, elasticsearch, etc). For the main app, I'm struggling with the decision of writing a Dockerfile or compiling an image to be hosted.
On one hand, a Dockerfile is easy to share and modify over time. On the other hand, I expect that advanced configuration (customizing application property files) will be much easier to do in vim before simply committing a new image.
I understand that I can get to the same result either way, but I'm looking for PROS, CONS, and gotchas with either direction.
As a side note, I plan on wrapping this all together using Fig. My initial impression of this tool has been very positive.
Thanks!
Using a Dockerfile:
You have an 'audit log' that describes how the image is built. For me this is fundamental if it is going to be used in a production pipeline where more people are working and maintainability should be a priority.
You can automate the building process of your image, which is an easy way of keeping the container up to date with system updates, or of making it part of a continuous delivery pipeline.
It is a cleaner way of creating the layers of your container (each Dockerfile command is a different layer).
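For contrast, the two workflows in one line each (image and container names are placeholders):

    docker build -t myapp:latest .      # reproducible: rebuilt from the Dockerfile "audit log"
    docker commit my_container myapp    # snapshot of hand-made changes, with no record of how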
Changing a container and committing the changes is great for testing purposes and for fast development for a conceptual test. But if you plan to use the result image for some time, I would definitely use Dockerfiles.
Apart from this, if you have to modify a file and doing it with shell tools (awk, sed, ...) turns out to be very tedious, you can add any file you wish from outside during the build process.
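For example, rather than editing property files with vim inside a running container, you can keep the customized file next to the Dockerfile; a hedged sketch with made-up paths:

    FROM ubuntu:14.04
    # bake the customized property file in at build time
    COPY application.properties /opt/myapp/conf/application.properties
    CMD ["/opt/myapp/bin/run.sh"]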
I totally agree with Javier, but you need to understand that an image created from a Dockerfile can differ from an image built with the same version of the Dockerfile one day later.
Maybe your build process automatically retrieves the latest updates of an app, or of the OS, etc.
And in that case, if you need to reproduce a crash or whatever, you can't rely on the Dockerfile alone.
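A one-line illustration of the point: a Dockerfile containing something like this produces different images depending on the day it is built, because it pulls whatever package versions exist at build time:

    RUN apt-get update && apt-get -y upgrade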
OK, I've been convinced in a previous posting that SVN is the way to go, but I haven't yet seen the epiphany. I'm not sure how I would set Subversion up for my development environment.
Here's my current setup. I'm not keen to mess with it, and it would be really nice if Subversion could sit alongside it:
Work:
N:\Projects
N:\Projects\Lib
N:\Projects\App1
N:\Projects\App1\Help
N:\Projects\App1\Images
N:\Projects\App2
..etc
N: is on a separate server in the building.
There are several other development machines with the tools installed locally, but all development takes place referencing the files on the server - i.e. no source code is kept on the workstations.
Home
Laptop with same development toolset, and the sources in c:\Projects\App1.. etc, i.e. a mirror of the setup on n:\Projects at work.
The sources between N:\Projects and C:\Projects are currently kept aligned with a custom app in conjunction with Dropbox. File exclusions make sure that non-source files don't get synced.
I want to run Subversion with this setup.
Where do I put the repository?
Assuming I can have the repository in a mutually accessible place, will SVN remove the current need to sync between work and home?
In order to embrace Subversion, you will replace your shared source directory with a Subversion repository that lives on the server. Each developer workstation will check out a copy of the whole source code locally (however, this could be a private area on a network server if you like).
You could retain your N:\Projects tree as a read-only copy of the daily build, or whatever. But one of the goals of Subversion is to mediate between two people editing the same file at roughly the same time. This is not compatible with a shared directory containing writable source code. Also, having multiple developers "share" the same Subversion working directory in some way is doomed to failure.
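Concretely, the switch could look something like this (paths and URLs are placeholders; the exact checkout URL depends on how the server is set up):

    svnadmin create /var/svn/projects        # once, on the server
    svn import N:\Projects http://server/svn/projects/trunk -m "Initial import"
    svn checkout http://server/svn/projects/trunk C:\Projects    # per developer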
Why not create an internet-accessible (free) trial Subversion account and play around a bit to get yourself familiar before you move your entire source code tree into it? Just so you don't delete everything you own by accident. Maybe start with one dummy project, hosted on the internet. Without even paying a cent, you could use this site:
http://www.projectlocker.com/
Then you can set up your very own starter Subversion server. You can create a brand new Delphi application (File -> New Delphi Application), add a button, double-click that button, and write a message-box thingy, or whatever it is you like to do in demo apps. Now create a Subversion repository (perhaps they call them projects up on ProjectLocker) and add the folders you saved this project into to that repository.
Now you can play with (a) TortoiseSVN, (b) the SVN integration built into RAD Studio XE, if you have RAD Studio XE, and (c) the version control plugins that come in the JCL, if you don't have RAD Studio XE.
Also, may I suggest that if you want to have any hope of knowing what you're doing, you learn how to add, commit, and update from the command line. It's really not that hard, and it will pay off later.
Knowing you can type svn co http://reposite.something.com/svn/myproject to check out a project to your disk is very handy. Sometimes I think GUIs are training wheels for your brain; you cripple yourself if you don't learn the command line.
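For reference, the handful of commands that cover day-to-day use (the URL is the same placeholder as above, and the file name is made up):

    svn checkout http://reposite.something.com/svn/myproject    # get a working copy
    svn add NewUnit.pas                                         # put a new file under version control
    svn commit -m "Describe the change"                         # send your changes to the repository
    svn update                                                  # pull in everyone else's changes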
A benefit of a hosted Subversion service like the one I showed above is that you get an offsite backup. Of course, such hosting is always free, even for large projects, if you are writing something open source; then you can host on SourceForge. Otherwise, you're going to (a) need to use your own internet-accessible host or (b) pay for hosting; without one of those, you're not going to be able to easily access your repository both at home and at work.
Personally, if it were my own business, or my professional job to write software, I would host my own Subversion server, it would be private (LAN only), and I would use a VPN to access it from home.
1: You definitely want a repository accessible from both locations. Either that, or you will need to use a distributed version control system, like Mercurial or Git.
2: Yes, there will be no more need for your custom sync app. This is exactly the job of your version control system. Syncing manually in addition to using SVN is unnecessary and would even create lots of conflicts.
Your shared directory should be removed, and each machine should instead hold a working copy of the SVN repository.
Host the SVN server on the server that currently holds the files, or on any server that everyone, including your home computer, can access.
Commit and update every day, multiple times a day, and manage merges when needed.
For home access, the simplest option is either to get a dedicated server on the net or to redirect the correct port on your router (but you will obviously need some access control in place) so that your repository is accessible from outside. If needed, you could limit access to your home IP, or to a list of IPs, with a good router.
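For the router option, a hedged sketch: svnserve, Subversion's own server, listens on port 3690 by default, so you run it on the office box and forward that port (the host name and path are made up):

    svnserve -d -r /var/svn            # -d: run as a daemon, -r: root of the served repositories
    # from home, after forwarding port 3690 on the router:
    svn checkout svn://office.example.com/projects/trunk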
The other solution, as others have said, is another kind of version control system, called "distributed", where every commit is done locally in your own repository on your own PC; that repository is merged into the "main" repository to share code, and the changes of the other team members are pulled back into your local repository. (Technically you don't need any "main" repository with a DVCS, but in a company that's what you will have.)
See Git or Mercurial for good DVCSs (Git's syntax sucks, but it's the most widely used system and technically the best one).
Put the repository in the safest place. That usually means a good redundant server (disks, etc.) in a controlled server room, one that is properly backed up. When you switch to a VCS, the source code you work on typically lives in sandboxes on the local machines, because each developer must have their own; changes are then sent to and received from the server. Be aware that some tools may have issues working on a remote directory, because of the way, for example, the SMB protocol works; check that this is explicitly supported if you need it. Unless you have paramount security needs, IMHO working in local sandboxes is faster and easier.
If you can access the SVN server from home (e.g. via a VPN), it will be no different from working at the office: you will "sync" (update/commit) your laptop sandbox the same way, and you don't need a local server and repository. If you do need a local server (reasons could be that you can't access the central repository from outside, or that you need to work disconnected yet still version files, etc.), there are ways to replicate across SVN servers, but at that point a distributed VCS would probably work better in such a scenario.
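One such replication option is svnsync, which maintains a read-only mirror (URLs are placeholders; note the mirror also needs a pre-revprop-change hook that allows svnsync to write revision properties):

    svnadmin create /var/svn/mirror
    svnsync initialize file:///var/svn/mirror http://server/svn/projects
    svnsync synchronize file:///var/svn/mirror    # rerun to pull in new revisions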