Dockerfile dotnet restore locally instead of reaching to nuget server - docker

I'm working on a .NET Core 3.1 project for a company. I have a pretty simple Dockerfile configured that leverages RUN dotnet restore to retrieve all the necessary NuGet packages for my project. This works fine for me: it builds and runs the container no problem.
The problem comes when the company pulls my solution onto their network and tries to run the same Dockerfile. Their firewall rules block SSL connections to external sites (with explicit exceptions that NuGet is not part of). So when they try to build the container, the build fails when the restore tries to access "https://api.nuget.org/v3/index.json", which makes sense to me for the reasons above.
If I update my dotnet build to include the NuGet packages as part of the published output, can I update the Dockerfile to reference that published output for NuGet packages instead of doing a dotnet restore from nuget.org itself? Essentially, is there a good way to handle the Docker build process so it can be done offline (i.e. without relying on external sources)?

You could create a local cache using NuSave, then add it to your Docker container, which can then be used to restore the solution offline.
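As a rough sketch of that idea, assuming the packages have already been downloaded into a packages/ folder next to the solution (the folder name, image tags, and MyApp project file are assumptions), the Dockerfile could point dotnet restore at the local folder instead of nuget.org:

# Build stage that never has to reach nuget.org
FROM mcr.microsoft.com/dotnet/core/sdk:3.1 AS build
WORKDIR /src

# Bring the pre-downloaded .nupkg files into the image
COPY packages/ ./packages/
COPY . .

# Restore exclusively from the local folder, then publish without restoring again
RUN dotnet restore MyApp.csproj --source /src/packages
RUN dotnet publish MyApp.csproj -c Release -o /app --no-restore

FROM mcr.microsoft.com/dotnet/core/aspnet:3.1
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "MyApp.dll"]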

As Lex Li mentioned above, the better approach was to do a self-contained publish with dotnet and then copy those files into the container, so that there was no need to do a dotnet restore at all.
I typically like to do a dotnet restore to keep the image size small, but in this case this was the right way to do it.
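For reference, a minimal sketch of that approach (the runtime identifier, image tag, and MyApp project name are assumptions). On a machine that does have NuGet access, publish a self-contained build:

dotnet publish -c Release -r linux-x64 --self-contained true -o ./publish

Then the Dockerfile only copies the published output and never runs dotnet restore:

# runtime-deps has no SDK; nothing is restored or compiled at image build time
FROM mcr.microsoft.com/dotnet/core/runtime-deps:3.1
WORKDIR /app
COPY ./publish .
ENTRYPOINT ["./MyApp"]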

Related

Best practice for running Symfony 5 project with Docker and Docker-Swarm

I have an existing Symfony 5 project with a MySQL database and an Nginx web server. I want to dockerize this project, but on the web I have found differing opinions on how to do that.
My plan is to write a multi-stage Dockerfile with at least a dev and a prod stage and build it for Docker Swarm. In my opinion it makes sense to install the complete code during the build and to have multiple composer.json files (one for every stage). On the web I have found opinions that you should not install the app fresh on every build, but instead copy the vendor and var folders into the container. Another opinion was to start the installation only after the container's build process is finished, but I think that way the service is not ready when the app is deployed.
What do you think is the best practice here?
Build exactly the same image for all environments
Do not build two different images for prod and dev. One of the main Docker benefits is that you can provide exactly the same environment for production and dev.
You should control your environment with ENV vars. For example, you can enable Xdebug for dev and disable it for prod.
Composer has an option to install only production packages and skip the dev ones (composer install --no-dev). You should use this feature.
If you decide to install some dev-only packages, still try to use the same Dockerfile for both environments. Do not use Dockerfile.prod and Dockerfile.dev; it will introduce a mess in the future.
Multistage build
You can do a multi-stage build, as described in the official Docker documentation, if your build environment requires many more dependencies than your runtime.
A classic example is compiling a program: during compilation you need a lot of libraries, but you produce a single binary, so your runtime does not need all the dev libraries. The first stage does the build; in the second stage you just copy the binary over, and that's it.
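A minimal sketch of how such a multi-stage build could look for a PHP/Symfony project; the image tags and paths below are my own assumptions rather than anything prescribed above:

# Stage 1: install dependencies with Composer (build-time tooling only)
FROM composer:2 AS vendor
WORKDIR /app
COPY composer.json composer.lock ./
RUN composer install --no-dev --no-scripts --prefer-dist

# Stage 2: slim runtime image that never contains Composer itself
FROM php:8.1-fpm
WORKDIR /var/www/app
COPY . .
COPY --from=vendor /app/vendor ./vendor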
Build all packages into the docker image
You should build your application while the Docker image is being built. All libraries and packages should be copied into the image; you should not install them when the application is starting. Reasons:
Application starts faster when everything is installed
Some of the libraries could change or be removed in the future. You would be in trouble and would probably spend a lot of time debugging.
Implement health checks
You should implement a health check. Applications require external dependencies like passwords, API keys, and some non-sensitive data. Usually, we inject this data with environment variables.
You should check that all required variables are passed and have a valid format before your application starts. You can implement this in a health check, or you can check it in the entrypoint.
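As an illustration, an entrypoint script along these lines can fail fast when a required variable is missing; the variable names are just placeholders:

#!/bin/sh
set -e

# Refuse to start if a required environment variable is empty or unset
for var in DATABASE_URL APP_SECRET; do
    eval "value=\${$var:-}"
    if [ -z "$value" ]; then
        echo "Missing required environment variable: $var" >&2
        exit 1
    fi
done

# Hand control over to the real command (e.g. php-fpm)
exec "$@"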
Test your solution before it is released
You should implement a mechanism for testing your images. For example, in the CI:
Run your unit tests before the image is built
Build the Docker image
Start the new application image with dummy data. If you require a PostgreSQL DB, you can start it in another container.
Run integration tests.
Publish the new version of the image only if all tests pass.
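A rough sketch of such a CI sequence as plain shell commands; every image name, registry, and helper script below is a placeholder:

set -e

# 1. Unit tests before the image is built
./vendor/bin/phpunit

# 2. Build the candidate image
docker build -t myapp:candidate .

# 3. Start the candidate together with a throwaway database and dummy data
docker network create ci-net
docker run -d --name ci-db --network ci-net -e POSTGRES_PASSWORD=test postgres:14
docker run -d --name ci-app --network ci-net \
    -e DATABASE_URL="postgresql://postgres:test@ci-db:5432/app" myapp:candidate

# 4. Integration tests against the running container
./run-integration-tests.sh

# 5. Publish only if everything above succeeded
docker tag myapp:candidate registry.example.com/myapp:latest
docker push registry.example.com/myapp:latest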

Copying huge files into container (from arbitrary location)?

Docker newbie doing a POC. We have one web service that relies on a huge 3rd party library (30GB+). I'm playing around with how to get it into the container so I can run the installer.
From my understanding (and like I said, I'm a newbie), the Dockerfile is going to produce layers, right? So if the first step is installing the 30GB library, that should become a cached layer, correct?
A few ugly catches:
1) the library needs to be updated once a month
2) we have a build / deployment group, so I'd like to set it up in a way where they can do it... they aren't developers, so this would ideally be as automated as possible for them. They do build the code and have a powershell script to deploy to all the VMs we currently use.
3) they use Team Build to build the Visual Studio solutions, but they just push the build button; they don't really do much beyond that.
So ideally, they'd download the monthly update zip, put it in a specific location, then run the Team Build, and the Dockerfile would copy the zip into the container and run the monthly install.
Obviously I don't want to rebuild the entire image from scratch on every build, which is why I was mentioning that it would be the first step.
Or do I need to go the whole route of building a custom image?
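For what it's worth, the layering described in the question might look roughly like this in a Dockerfile; the base image, paths, and installer invocation are all hypothetical:

FROM mcr.microsoft.com/dotnet/framework/aspnet:4.8
WORKDIR /install

# Step 1: copy and install the huge third-party library first.
# Docker caches this layer and only rebuilds it when library.zip changes
# (roughly once a month).
COPY third-party/library.zip .
RUN powershell -Command "Expand-Archive library.zip -DestinationPath C:/library; Start-Process C:/library/setup.exe -ArgumentList '/quiet' -Wait"

# Step 2: copy the service itself. Code changes only rebuild from here down,
# so the huge cached layer above is reused.
COPY publish/ /inetpub/wwwroot/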

Best practice/way to develop Golang app to be run in Docker container

Basically what the title says... Is there a best practice or an efficient way to develop a Golang app that will be Dockerized? I know you can mount volumes to point to your source code, and it works great for languages like PHP where you don't need to compile your code. But for Go, it seems like it would be a pain to develop alongside Docker since you pretty much only have two options I guess.
The first would be to have a Dockerfile that just uses an onbuild image so it builds and starts the Go app when a container is run, which means building a new image on every change (however small). Or, you mount your source code directory into the container, then attach to the container itself and do the go build/run manually, as you normally would.
Those two ways are really the only ways I see it happening, unless you just don't develop your Go app in a Docker container at all: develop it as normal, then use the scratch image method where you pre-build the Go binary and copy it into your container when you are ready to run it. I assume that is probably the way to go, but I wanted to ask people with more experience on the subject and maybe get some feedback on the topic.
Not sure it's the best practice, but here is my way.
1) A Makefile is MANDATORY
2) Use my local machine and my Go tools for small iterations
3) Use a dedicated build container based on golang:{1.X,latest}, mounting the code directory to build a release, mainly to ensure that my code will build correctly on the CI. (Tip: here is my standard go build command for release builds: CGO_ENABLED=0 GOGC=off go build -ldflags '-s -w')
4) Test the code
5) Then use a FROM scratch image to build the release container (copy the binary + entrypoint)
6) Push your image to your registry
Steps 3 to 6 are tasks for the CI.
Important note: this is changing due to the new multi-stage builds feature (https://docs.docker.com/engine/userguide/eng-image/multistage-build/), so there are no more separate build vs release containers.
The build container and release container are merged into one multi-stage build, so a single Dockerfile along these lines (not sure about the exact syntax, but you will get the idea):
FROM golang:latest AS build
WORKDIR /go/src/myrepos/myproject
# Copy the sources into the build stage and produce a static binary
COPY . .
RUN CGO_ENABLED=0 go build -o mybin

FROM scratch
# Only the compiled binary ends up in the final image
COPY --from=build /go/src/myrepos/myproject/mybin /usr/local/bin/mybin
ENTRYPOINT [ "/usr/local/bin/mybin" ]
Lately, I've been using https://github.com/thockin/go-build-template as a base for all of my projects. The template comes with a Makefile that will build/test your application in a Docker container.
As far as I understood from your question, you want to have a running container in which to develop a Golang application. The same thing can be done on your host machine as well, but the nice part is that if you can build such a setup, it is effectively your own small cloud Platform-as-a-Service (PaaS).
The basic requirements for the container will be: an Ubuntu image and other packages such as an editor, the Go compiler, and so on.
I would suggest looking at the Docker development environment:
https://docs.docker.com/opensource/project/set-up-dev-env/
The Docker development environment runs inside a container, and the files are mounted from one of the host's directories. The container image is built from a base Ubuntu image, with the packages needed to compile the Docker source code added on top.
I hope this gets you close to what you are looking for.

Can I use EF6 migrations in a separate project with a ASP.NET Core Web Application (.NET Framework)?

I've been trying to work this out for a little while now and I can't see how it's possible, which seems odd given that .NET Core is properly released and EF6 is still recommended for some mainline cases. However, I'm new to .NET Core, so hopefully I'm just missing something obvious.
I have an ASP.NET Core Web Application and have been trying to add EF models and migrations in a separate project. I've tried two main paths and both have hit different roadblocks.
Attempt 1: Create the EF library as an old-fashioned class library
This works OK within Visual Studio (although you have to mess around with the start-up project), but when I try to use "dotnet restore" to build things from the command line (so I can get my CI set up) I get:
Errors in C:\MyProject\src\My.Website\project.json
Unable to resolve 'My.DataModel' for '.NETFramework,Version=v4.5.2'.
Using the "dotnet" command appears to be the new way of running command line builds, but perhaps it's not the right way for this case?
Attempt 2: Create EF library as a .NET Core class library
This way fixes the error when running "dotnet restore" but has the rather more fatal flaw of breaking the package manager console commands for creating migrations (they appear to require a .csproj file to work). This answer looked like it might be a solution, but I couldn't get it to work for me.
I think that my attempt 1 is more likely to be the right way to go. In which case this question may boil down to "how do I use 'dotnet restore' with non-.NETCore libraries?"
I've just found this thread, I've tried a few things from there and they haven't worked yet, but I'll keep looking at it.
As far as I can tell the answer is "No" if you want what I would consider a sensible CI set up. If you can run a full install of VS2015 on the CI server you may be able to get this to work somehow.
I got further with my attempt 1 but ended up at what looks like a terminal (at the moment) road block. Currently Visual Studio 2015 will create lock.json files that correctly include my EF project. However, running "dotnet.exe restore" will then remove the lines referencing my EF project and break the build.
I hope that eventually dotnet.exe will catch up with Visual Studio and correctly handle project dependencies but for now this seems like a terminal issue. Even if I commit the lock.json files to source control the CI server needs to run dotnet restore to pull down dependencies and so will break the lock.json files.
I had to jump through quite a few other hoops to get this far, so I'll document them below in case it helps anyone else.
Nuget
You need to install nuget onto your build server (or have your build download it). This is probably fairly common, but our build server didn't have nuget already on it. I have tried nuget 2.8.6 which appears to only download the .csproj dependencies, and nuget 3.5.0-rc1 which appears to behave very similarly to dotnet.exe.
Dotnet
Again, you need to install this on the build server (not unreasonable). However, it will produce an error when trying to handle the project dependency, so you may need to ensure that your build doesn't fail at that point (e.g. by doing this).
Msbuild
At this point my local Msbuild complained about the EF project with:
Your project.json doesn't list 'win' as a targeted runtime. You should add '"win": { }' inside your "runtimes" section in your project.json, and then re-run NuGet restore.
That project didn't have a project.json file at all at that point so I added one with the following contents:
{
  "frameworks": {
    "net452": {}
  },
  "runtimes": {
    "win": {}
  }
}
This was hinted at on this thread. However, I've just noticed that the presence of this file stops nuget 3.5.0-RC1 from downloading the Entity Framework dependency for my EF project at all (nuget 2.8.6 ignores this file and downloads EF fine).
The end
At this point my local msbuild will build the project as long as I've opened it in VS2015 so that VS2015 has had a chance to build the correct lock.json files.
On my CI server if I comment out the "dotnet restore" step and commit my lock files to source control my EF project reference isn't a problem any more (I still have an issue with the reference to Entity Framework itself, but that's probably a separate issue). However, leaving out the dotnet restore step is not a viable solution - things are only working because that step was there before and has already downloaded the relevant dependencies.
Note that if you're going to try to reproduce this, VS2015 will notice and re-write the lock.json files back to a working version immediately if you run "dotnet restore" so you need to close VS2015 or have a separate checkout to see the issue.

How to use NuGet packages on build server/production server without internet?

Background
I have the following components:
My local solution (.NET 4.5) which makes use of NuGet packages.
A PowerShell build script in my solution that has targets to build, run unit tests, do Web.config transforms, etc.
A build server without an internet connection running CruiseControl.NET that calls my build script to build the files. It also serves as the (IIS7) environment for the dev build.
A production server with IIS7 that does not have internet access.
Goal
I would like to utilize NuGet packages from my solution and have them be stored locally as part of source -- without having to rely on an internet connection or nuget package server on my build and production servers.
Question
How can I tell MSBuild to properly deploy these packages, or is this the default behavior of NuGet?
Scott Hanselman has written an excellent article entitled How to access NuGet when NuGet.org is down (or you're on a plane). If you read through this article, you'll see at the end that the suggestions he makes are primarily temporary-type solutions and he goes out of his way to say that you should never need the offline cache except in those emergency situations.
If you read at the bottom of his article, however, he makes this suggestion:
If you're concerned about external dependencies on a company-wide scale, you might want to have a network share (perhaps on a shared builder server) within your organization that contains the NuGet packages that you rely on. This is a useful thing if you are in a low-bandwidth situation as an organization.
This is what I ended up doing in a similar situation. We have a share which we keep with the latest versions of various packages that we rely on (of course, I'm assuming you're on some type of network). It works great and requires just a little work to update the packages on a semi-regular basis (we have a quarterly update cycle).
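If you go that route, a nuget.config along these lines can point all restores at the internal share instead of nuget.org; the share path below is just a placeholder:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Ignore any machine-wide sources such as nuget.org -->
    <clear />
    <!-- Internal network share holding the approved .nupkg files -->
    <add key="InternalPackages" value="\\buildserver\NuGetPackages" />
  </packageSources>
</configuration>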
Another article that may also be of help to you (was to me) is: Using NuGet to Distribute Our Company Internal DLLs
By default, Nuget puts all your dependencies in a packages/ folder. You can simply add this folder to your source control system and Nuget will not need to download anything from the internet when you do your builds. You'll also want to ensure that Nuget Package Restore isn't configured on your solution.
You'll have to make a decision; either you download/install the packages at build time (whether it be using package restore, your own script, or a build tool that does this for you), or you put the /packages assemblies in source control as if they were in a /lib directory.
We've had so many problems with using package restore and NuGet's Visual Studio extension internally that we almost scrapped NuGet completely because of its flaws, despite the fact that 1 of our company's 2 products is a private NuGet repository.
Basically, the way we manage the lifecycle is by using a combination of our products BuildMaster and ProGet such that:
ProGet caches all of our NuGet packages (both packages published by ourselves and ones from nuget.org)
BuildMaster performs both the CI and deployment aspect and handles all the NuGet package restoration so we never have to deal with massive checked-in libraries or the solution-munging nightmare that is package restore
If you were to adopt a similar procedure, it may be easiest to create a build artifact in your first environment which includes the installed NuGet package assemblies, then simply deploy that artifact to your production without having to repeat the process.
Hope this helps,
-Tod
I know this is an old discussion, but how in the world is it bad to store all the files required to build a project, just because of their size?
The idea that you should simply replace a library if it is no longer available is crazy. Code costs money, and since you don't control the libraries on Git or in NuGet, a copy should be kept available.
One requirement that many companies have is an audit. What if a library was found to be stealing your data? How would you know for sure, if the library has been removed from NuGet and you can't even build the code to double-check?
The one-size-fits-all NuGet and Git ways of the web are not OK.
I think the way NuGet worked in the past, where the files were stored locally and optionally placed in source control, is the way to go.
