Azure DevOps feed shows packages from upstream sources. Can I disable that? - tfs

I have two Azure DevOps (on-premises) feeds, in separate projects.
I have two NuGet packages, one in each feed. Both packages have a dependency on the Dapper package from the NuGet Gallery. One feed does not list the upstream dependencies and lists no upstream feed sources, while the other one lists the upstream dependencies and lets me pick nuget.org, Maven, npmjs, and PyPI feed source options when viewing packages.
I don't really want the feed to show upstream dependencies, because it clutters everything, and I can't figure out why the two feeds behave differently. Is this a config setting somewhere?

If you create a feed with the Upstream sources option enabled, you will see exactly this behavior: dependency packages pulled from upstream are saved automatically into your feed.
To avoid this, you can check whether the package has the Allow externally sourced versions option enabled, as the documentation describes.
You can also delete the upstream sources entirely: go to Feed settings -> Upstream sources and remove them there.
See the linked thread for more details.
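If you'd rather script the removal, there is also a Feed Management REST API; the following is only a rough sketch (the organization, project, feed name, PAT, and api-version are placeholders, and an on-premises server uses its collection URL instead of feeds.dev.azure.com):

    # Sketch: clear a feed's upstream sources via the REST API (PAT auth; all names are placeholders)
    curl -u :$PAT \
         -X PATCH \
         -H "Content-Type: application/json" \
         -d '{"upstreamSources": []}' \
         "https://feeds.dev.azure.com/myorg/myproject/_apis/packaging/feeds/MyFeed?api-version=6.0-preview.1"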

Related

Azure Artifacts not seeing upstream sources

So I have an Azure Artifacts feed. This feed has some packages; one of them is Newtonsoft.Json, let's say version 10.0.4. The feed is set up with an upstream source for nuget.org.
When I add this feed to my project, my expectation is that when I ask for the latest Newtonsoft.Json, I would be able to get it. However, in Visual Studio or during NuGet restore, it fails to see any other version; the only version it can see is 10.0.4. I am expecting that, since I specified an upstream source, it would get the packages from upstream.
I tried unlisting the 10.0.4 version, and now it says Newtonsoft.Json is not found at all.
What am I missing? What could be going on here?
For me that was a problem with permissions to the Azure Artifacts package repo.
If you are part of a group which has "Reader" access to the Azure Artifacts package repository, your NuGet restores or npm installs (for JavaScript packages) WILL NOT be able to add packages to the Azure Artifacts package repository mirror/proxy.
Set the permissions of your teams/people with access to the Azure Artifacts package repository one level higher than "Reader": set them to "Collaborator".
This is an error that's easy to miss, since if you are the creator of the Artifacts package repo, you won't get the access errors other people might be having, because you're the owner of the repo.
Once you've enabled an upstream source, any user connected to your feed can install a package from the remote feed, and your feed will save a copy.
If you add your Artifacts feed to the package sources in Visual Studio and then search for Newtonsoft.Json within that feed, the default behavior is that you can only see the saved copy of the package.
To install the latest Newtonsoft.Json, you can search for and install it from the nuget.org package source (you need to add nuget.org to the package sources if it is not already listed).
Or you can use the command line to install the latest version by specifying the version parameter. See below:
Install-Package Newtonsoft.Json -Version 12.0.3
You can also add the latest Newtonsoft.Json version as a package dependency directly by editing your .csproj or packages.config file. When you restore your packages, if that version is not found in your Artifacts feed, it will be downloaded from the upstream source.
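For example, assuming the dotnet CLI is available and using placeholder names for the organization and feed, the same flow can be scripted:

    # Register the Azure Artifacts feed as a package source (URL and name are placeholders;
    # an authenticated feed will also need a credential provider or PAT)
    dotnet nuget add source "https://pkgs.dev.azure.com/myorg/_packaging/MyFeed/nuget/v3/index.json" --name MyFeed

    # Ask for an explicit newer version; if the feed only has 10.0.4 cached,
    # restore pulls 12.0.3 through the upstream source and the feed saves a copy
    dotnet add package Newtonsoft.Json --version 12.0.3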

Pointing Jenkins to Use Another Plugin Repository

Good afternoon,
As I understand Jenkins, if I need to install a plugin, it goes to the Jenkins plugins site.
The problem I have is that Jenkins is installed on a closed network and cannot access the internet. Is there a way I can download all of the plugins, place them on a web server on my local LAN, and have Jenkins reach out and download plugins as necessary? I could download everything and install one plugin at a time, but that seems a little tedious.
You could follow some or all of the instructions for setting up an Artifactory mirror of the plugin repo.
It will need to be an HTTP/HTTPS server, and you will find that many plugins have a multitude of dependencies.
The closed network problem:
You can take a cue from the Jenkins Docker install-plugins.sh approach ...
This script takes as input a list of plugins, optionally with versions (e.g., install-plugins.sh workflow-aggregator:2.6 pipeline-maven:3.6.5 job-dsl:1.70), and will download all the plugins and their dependencies into a working directory.
Our approach is to keep that list in a file under version control and redirect it to the command line input (i.e., install-plugins.sh $(< plugins.lst)).
You can download the plugins where you do have internet access, move them onto your network, copy them manually into your ${JENKINS_HOME}/plugins directory, and restart the instance.
The tedious list problem:
If you only specify top-level plugins (i.e., what you actually need), every time you run the script it will resolve the latest dependencies. That makes for a short list, but the dependencies drift whenever they get updated at https://updates.jenkins.io. You can use a two-step approach to address this: use the short list to download the required plugins and dependencies, then store the generated explicit list for future reference and repeatability, as sketched below.
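A minimal sketch of that flow, assuming the install-plugins.sh script from the Jenkins Docker image and a version-controlled plugins.lst of top-level plugins (/usr/share/jenkins/ref/plugins is the script's default download directory and may differ in your setup):

    # Step 1 (machine with internet): resolve top-level plugins plus all dependencies
    ./install-plugins.sh $(< plugins.lst)

    # Step 2: record the exact set that was resolved, for repeatability
    # (filenames only; capture the script's output if you need pinned versions)
    ls /usr/share/jenkins/ref/plugins > plugins-resolved.lst

    # Step 3 (closed network): copy the downloaded .jpi files into Jenkins and restart
    cp /usr/share/jenkins/ref/plugins/*.jpi "${JENKINS_HOME}/plugins/"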

NuGet Build Triggers

We are considering a move to Azure DevOps/TFS, and we have built a prototype workflow which seems to work well.
The only outstanding thing to replicate from our current CI process is the triggering of builds based on NuGet package updates.
Our build pipeline is a tree, where libraries at the top of the tree produce NuGet packages that are consumed as dependencies by libraries further downstream.
Using TeamCity, one of our build steps inspects the dependencies of a solution, identifies the topmost-level dependencies, and adds them as NuGet build triggers, ensuring that the next time a dependency builds successfully, the downstream library is triggered as well.
How can that be replicated in Azure Dev Ops?
I think you might be after something like NuKeeper:
NuKeeper automates the routine task of discovering and applying NuGet package updates.
NuKeeper will compare the NuGet packages used in your solution to the latest versions available on NuGet.org, and:
- List available NuGet package updates on .NET code on the local file system or on a GitHub server.
- Apply NuGet package updates to .NET code on the local file system.
- Make pull requests containing updates to code on a GitHub server.
Image stolen from Shayne Boyer's blog.
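For what it's worth, NuKeeper also ships as a dotnet global tool, so a minimal local run might look like the following (command names are from the NuKeeper CLI; run it from your solution directory):

    # Install NuKeeper once as a global dotnet tool
    dotnet tool install nukeeper --global

    # Report which package updates are available for the local solution
    nukeeper inspect

    # Apply available updates to the local code
    nukeeper update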

Correct usage of Nexus IQ for Javascript based projects

I have just started out trying to use Nexus IQ Server to scan a JavaScript-based project of mine which uses libraries from npm and bower.
I am using the Jenkins Nexus Platform Plugin and have configured a build step to connect to our Nexus IQ Server instance. As part of the plugin, I have configured it to scan for JavaScript files within the locations of the built project where the npm and bower dependencies are installed.
The final report that gets generated on our Nexus IQ Server is huge; in fact, it reaches the limit of results it can display (10,000 rows) and so is unable to show everything it finds.
I'm not 100% sure I am doing things right here, and wondered whether anyone else out there has experience of how to get sensible results from Nexus when scanning npm/bower-installed dependencies.
I'm looking at the License Analysis section now and can see over 3,000 rows of various 'Not supported' license threats coming from libraries that haven't been directly included in the project (i.e., not listed in my project's package.json file), but I guess these are transitive dependencies of the libraries I have specified.
Can anyone offer advice on the best approach to getting Nexus IQ to handle JavaScript projects that rely on npm/bower dependencies?

How to use NuGet packages on build server/production server without internet?

Background
I have the following components:
My local solution (.NET 4.5) which makes use of NuGet packages.
A PowerShell build script in my solution that has targets to build, run unit tests, do Web.config transforms, etc.
A build server without an internet connection running CruiseControl.NET that calls my build script to build the files. It also serves as the (IIS7) environment for the dev build.
A production server with IIS7 that does not have internet access.
Goal
I would like to use the NuGet packages from my solution and have them stored locally as part of the source, without having to rely on an internet connection or a NuGet package server on my build and production servers.
Question
How can I tell MSBuild to properly deploy these packages, or is this the default behavior of NuGet?
Scott Hanselman has written an excellent article entitled How to access NuGet when NuGet.org is down (or you're on a plane). If you read through this article, you'll see at the end that the suggestions he makes are primarily temporary-type solutions and he goes out of his way to say that you should never need the offline cache except in those emergency situations.
If you read at the bottom of his article, however, he makes this suggestion:
If you're concerned about external dependencies on a company-wide scale, you might want to have a network share (perhaps on a shared builder server) within your organization that contains the NuGet packages that you rely on. This is a useful thing if you are in a low-bandwidth situation as an organization.
This is what I ended up doing in a similar situation. We have a share which we keep with the latest versions of various packages that we rely on (of course, I'm assuming you're on some type of network). It works great and requires just a little work to update the packages on a semi-regular basis (we have a quarterly update cycle).
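As a rough sketch of that setup (the share path, source name, and package name are placeholders; nuget add and nuget sources are standard nuget.exe commands):

    # Lay a package out into the shared folder feed's structure
    nuget add MyCompany.Utilities.1.2.0.nupkg -Source \\buildserver\nuget-share

    # Register the share as a package source on each build/dev machine
    nuget sources add -Name CompanyShare -Source \\buildserver\nuget-share

A folder-based feed like this needs no server process at all, which suits build and production machines that have no internet access.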
Another article that may also be of help to you (was to me) is: Using NuGet to Distribute Our Company Internal DLLs
By default, NuGet puts all your dependencies in a packages/ folder. You can simply add this folder to your source control system, and NuGet will not need to download anything from the internet when you do your builds. You'll also want to ensure that NuGet Package Restore isn't configured on your solution.
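In practice that's just committing the folder; for example, with git (a sketch, assuming the packages/ folder sits next to your solution):

    # Vendor the restored packages so builds never touch the network
    git add packages/
    git commit -m "Check in NuGet packages for offline builds"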
You'll have to make a decision; either you download/install the packages at build time (whether it be using package restore, your own script, or a build tool that does this for you), or you put the /packages assemblies in source control as if they were in a /lib directory.
We've had so many problems with using package restore and NuGet's Visual Studio extension internally that we almost scrapped NuGet completely because of its flaws, despite the fact that 1 of our company's 2 products is a private NuGet repository.
Basically, the way we manage the lifecycle is by using a combination of our products BuildMaster and ProGet, such that:
- ProGet caches all of our NuGet packages (both packages published by ourselves and ones from nuget.org).
- BuildMaster performs both the CI and the deployment aspects and handles all the NuGet package restoration, so we never have to deal with massive checked-in libraries or the solution-munging nightmare that is package restore.
If you were to adopt a similar procedure, it may be easiest to create a build artifact in your first environment which includes the installed NuGet package assemblies, then simply deploy that artifact to your production without having to repeat the process.
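A rough sketch of that flow (the solution name and paths are placeholders, and tar here just stands in for whatever artifact packaging your CI tool provides):

    # On the connected build environment: restore packages once during the build
    nuget restore MySolution.sln

    # Bundle the build output, including the restored package assemblies, as the artifact
    tar -czf site-artifact.tar.gz -C MyWebApp/bin .

    # Production then receives and unpacks the artifact; no package restore runs there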
Hope this helps,
-Tod
I know this is an old discussion, but how in the world is it bad to store all the files required to build a project, just because of their size?
The idea that you should simply replace a library if it becomes unavailable is crazy. Code costs money, and since you don't control the libraries on Git hosts or on NuGet, a copy should be available.
One requirement that many companies have is auditing. What if a library were found to be stealing your data? How would you know for sure, if the library has been removed from NuGet and you can't even build the code to double-check?
The one-size-fits-all NuGet and Git ways of the web are not OK.
I think the way NuGet worked in the past, where the files were stored locally and optionally placed in source control, is the way to go.
