I have an F# script referencing a package in
%USERPROFILE%\.nuget\packages
FSI is happy with
#I #"C:\Users\Username\.nuget\packages"
but not
#I #"%USERPROFILE%\.nuget\packages"
How can the #I path be the nuget packages directory for the current user?
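A hedged workaround (my assumption, not something stated in the question): #I takes a literal path and does not expand environment variables, so one option is to let the shell do the expansion and hand the directory to FSI via its --lib option instead of hard-coding it in the script. The script name below is hypothetical:
REM cmd expands %USERPROFILE% before FSI sees the argument
dotnet fsi --lib:"%USERPROFILE%\.nuget\packages" MyScript.fsx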
Create an F# .fsx script and type this:
#r "nuget: FSharp.Data"
and get:
Error FS3217 ...: error NU1301: Unable to load the service index for source https://pkgs.dev.azure.com/.../index.json. F# Miscellaneous Files ...\Script.fsx 1 Active
I suspect this is because I have a custom NuGet source defined in addition to nuget.org. Is there a way I can force the script to look only at nuget.org?
I find the NuGet client configuration confusing at best.
I'm using Visual Studio 2022.
I fixed it by disabling the other sources:
1. Tools -> Manage Packages for Solution -> package source settings icon
2. Disable the sources you don't want
3. RESTART Visual Studio
Then it will work.
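If you prefer not to click through the dialog, the same change can presumably be made by editing the user-level nuget.config directly (typically %APPDATA%\NuGet\NuGet.Config); a minimal sketch:
<configuration>
  <packageSources>
    <!-- <clear /> drops every source inherited from other config files; only nuget.org is added back. -->
    <clear />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>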
It would be nice if you could specify the package source in the script; I find NuGet config a bit of a black art, and I'd rather control it explicitly in the script than have to change config (which I will inevitably change back and forget).
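For what it's worth, newer F# versions appear to support exactly that: as far as I can tell, F# 5 and later scripts can point the nuget resolver at a feed with an #i directive. Whether it overrides the sources from nuget.config is something to verify against your FSI version:
// Assumption: F# 5+ FSI; #i adds nuget.org as a package source for this script only.
#i "nuget:https://api.nuget.org/v3/index.json"
#r "nuget: FSharp.Data"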
I want to build a Docker container which I can use for our continuous integration. Within that, I want to build an application which needs Node, .NET Core and, within the .NET Core project, T4 transformations.
How can I transform T4 files which need external assemblies, e.g. System.Data.DataSetExtensions, in a Linux Docker container?
I tried the mono t4 engine (https://github.com/mono/t4). I installed it in my Docker container and I am able to do simple T4 transformations. But our T4 files need assemblies, e.g.:
<#@ Assembly Name="System.Data" #>
<#@ Import Namespace="System.Data" #>
.... some code
DataSet dsProcedures = new DataSet();
..... more code
So I used dotnet restore to install all our dependencies within the Docker container; System.Data.DataSetExtensions is provided as a NuGet package.
I then tried to use the command line parameters of dotnet-t4 to provide the path of the assembly. That did not help, so I changed the assembly directive to
<#@ Assembly Name="/root/.nuget/packages/system.data.datasetextensions/4.5.0/lib/netstandard2.0/System.Data.DataSetExtensions.dll" #>
At least it no longer complains that it cannot find the assembly, but now it returns the following error:
ERROR: The type or namespace name 'DataSet' could not be found (are you missing a using directive or an assembly reference?)
I am using the following Dockerfile
FROM microsoft/dotnet:sdk
RUN dotnet tool install -g dotnet-t4
ENV PATH="/root/.dotnet/tools:${PATH}"
Is there any way to transform T4 files with assembly dependencies in a Linux Docker container?
Would you switch to a different transformation engine?
Or is the only option to use a Windows Docker container?
The problem was a bug in the mono/t4 library, which was fixed in version 2.0.4.
For more information, check the GitHub issue: https://github.com/mono/t4/issues/46
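In case it helps, the Dockerfile from the question only needs the tool version pinned to pick up the fix (a minimal sketch; 2.0.4 is the release named above):
FROM microsoft/dotnet:sdk
# Pin dotnet-t4 to the release that contains the fix.
RUN dotnet tool install -g dotnet-t4 --version 2.0.4
ENV PATH="/root/.dotnet/tools:${PATH}"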
My NuGet Pack step in TFS isn't working because it is looking for my DLL at the wrong path.
My actual path to the DLL is:
D:\Agent1\_work\21\s\src\MyProjectName\bin\Any CPU\Release\netstandard2.0\MyProjectName.dll
But the path the NuGet Pack step uses is missing the Any CPU part, and I don't see an option to set it:
'D:\Agent1\_work\21\s\src\MyProjectName\bin\Release\netstandard2.0\MyProjectName.dll'.
The NuGet Pack task takes the following arguments:
Path to csproj or nuspec file(s) to pack
Configuration to package
I suspect that you haven't set the configuration property to Any CPU, resulting in the task not being able to find your DLL.
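If the task itself doesn't expose the platform, one hedged workaround (my assumption, not verified against your TFS version) is to pack from a command-line step instead and pass the platform as an MSBuild property, since nuget pack on a csproj accepts -Properties:
nuget pack src\MyProjectName\MyProjectName.csproj -Properties "Configuration=Release;Platform=Any CPU"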
I'm trying to build the projects in a project group from the command line using MSBuild. After reading this page, my batch file looks like this:
SET BDS=C:\Program Files (x86)\Embarcadero\Studio\17.0
SET FrameworkDir=C:\Windows\Microsoft.NET\Framework
SET FrameworkVersion=v3.5
"%ProgramFiles(x86)%\MSBuild\14.0\Bin\MSBuild" .\Source\MyProjectGroup.groupproj /t:build /p:config=Debug /p:Platform=Win32 /verbosity:minimal /fileLogger /fileLoggerParameters:LogFile=Build.log;Verbosity=detailed;Append=true
The build fails if I try to perform a "clean" build (that is, get the source files from source control and run the build from the command line).
It looks like it tries to build the projects in the order they are placed in the .groupproj file. Consider this example:
there are two package projects, package A and package B;
package B requires package A;
package B is placed before package A in groupproj file.
In this case, "clean" build will fail, but if I reorder projects in project group, or build package A first, build will be successful.
E.g., MSBuild targets for C# resolve dependencies from project references.
But groupproj neither include dependencies info:
<Projects Include="NativePackages\Drawers\Drawers.dproj">
    <Dependencies/>
</Projects>
nor are the DCCReference items in the .dproj files processed:
<DCCReference Include="Drawers.dcp"/>
Am I doing something wrong?
Is there an option/property that triggers this?
Can the MSBuild targets for Delphi resolve dependencies automatically?
UPDATE
I know about "Dependencies..." context menu item in Project Manager (it just affects Dependencies tag in groupproj file).
In Jenkins, how can I use the FxCop Runner Plugin with reference .dlls in addition to the .dll to be analysed? I'm using the FxCop Runner Plugin for Jenkins in order to analyse the builds, but I got the following error:
One or more referenced assemblies could not be found. Use the '/directory' or '/reference' switch to specify additional assembly reference search paths.
How can I add references with the Jenkins runner plugin?
Which version of FxCop are you running?
Two things:
1. Add the /gac flag to FxCop
2. Change AssemblyReferenceResolveMode in the FxCopCmd.exe.config file from StrongName to StrongNameIgnoringVersion (see the sketch below)
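The relevant part of FxCopCmd.exe.config ends up looking roughly like this (a sketch from memory; verify the key name against your copy of the file):
<configuration>
  <appSettings>
    <!-- Was "StrongName"; ignoring the version lets references resolve when assembly versions differ. -->
    <add key="AssemblyReferenceResolveMode" value="StrongNameIgnoringVersion" />
  </appSettings>
</configuration>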
OR (this is NOT what you want, but) create an FxCop project file, add all the references there, and then call it from the command line:
"C:\Program Files (x86)\Microsoft Fxcop 10.0\FxCopCmd.exe" /project:"C:\Jenkins\jobs\8.5-StaticCodeAnalsis\Project.FxCop" /out:"C:\Jenkins\jobs\8.5-StaticCodeAnalsis\workspace\FxCopReport.xml"