Foreach Loop Container SSIS

I have a Foreach Loop Container in my SSIS package, under which a File System Task is placed that moves files from the source folder to the destination folder, and some other tasks are connected to that container.
Every time I run the package, all the tasks run one by one regardless of whether there is any new file in the source folder, which always takes time.
Is there any way that the tasks run only if a new file has been added to the source folder, and if there is no new file, only the container runs, the package fails, and a Script Task shows the message "No new file found"?

You could store the names of the files that were already loaded in a table, then use a Script Task to check whether the files in the folder are present in that table; if any new file is found, execute the entire process.
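As a rough sketch of that check, assuming a hypothetical tracking table dbo.LoadedFiles(FileName), folder path, and connection string (inside the package this logic would live in the Script Task; it is shown here as a standalone PowerShell script for brevity):

# Compare the source folder against the table of already-loaded file names.
# 'dbo.LoadedFiles', the folder path, and the connection string are all
# illustrative assumptions, not taken from the original package.
$sourceFolder = 'C:\SourceFolder'
$connectionString = 'Server=.;Database=ETL;Integrated Security=True'

$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = 'SELECT FileName FROM dbo.LoadedFiles'
$loaded = @{}
$reader = $command.ExecuteReader()
while ($reader.Read()) { $loaded[$reader.GetString(0)] = $true }
$reader.Close()
$connection.Close()

# Any file in the folder that is not in the table counts as new.
$newFiles = Get-ChildItem -Path $sourceFolder -File |
    Where-Object { -not $loaded.ContainsKey($_.Name) }

if ($newFiles.Count -eq 0) {
    throw 'No new file found'   # failing here stops the package with the message
}
# ...otherwise continue and let the container process the new files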

Related

Copy template folder to current folder with environment variable

I would like to speed up the process of creating folders. I have a template folder with subdirectories that I use often, and every time I start a new project I have to go to the path where the template folder lives and copy it.
The thing is, I thought of creating an environment variable that would do this for me; the variable would execute a command like this: robocopy "%~d0\model folder\" "%cd%" /e
That is: copy the contents of the model folder path that is on that same drive and paste it into the folder where the command prompt is open.
Obviously this doesn't work, because if the command prompt is already open in a folder, it won't be able to access the source path to make the copy...
Any idea how I could make it work?
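One way this could work, sketched below in PowerShell (the same idea works from cmd with %TEMPLATE_DIR%): store the template's full path once in an environment variable instead of deriving it from the current drive, then call robocopy from whatever folder the new project lives in. 'TEMPLATE_DIR' and the template path are hypothetical names for illustration.

# Run once: store the template path in a user-level environment variable
# (new shells started afterwards will see it).
[Environment]::SetEnvironmentVariable('TEMPLATE_DIR', 'D:\model folder', 'User')

# Later, from the new project's folder:
robocopy "$env:TEMPLATE_DIR" "$PWD" /e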

Deleting folder and script after execution

I am trying to run a script that is deployed by SCCM into a randomly named folder on each computer. I need to make sure that after the execution is completed, the script and the folder are deleted. I found various posts where the folder name was specific, but in my case the folder name differs every time. I am new to PowerShell scripts, and any help is greatly appreciated.
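A minimal sketch of one common approach, assuming the script runs under PowerShell: the script never needs to know the random folder name, because $PSScriptRoot always resolves to the directory the running script file lives in, so it can remove that directory as its final step.

# ... main work of the script goes here ...

# Self-cleanup: $PSScriptRoot is the (randomly named) folder SCCM dropped us in.
# Move out of it first; a folder that is the current location can't be deleted.
Set-Location $env:TEMP
Remove-Item -LiteralPath $PSScriptRoot -Recurse -Force -ErrorAction SilentlyContinue

Whether the final Remove-Item succeeds can depend on what still holds handles inside the folder, so treat this as a starting point rather than a guaranteed recipe.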

How to copy nupkg from build process to drop folder

TL;DR: In a release step, how do I find a .nupkg file that was definitely created in a build process and copy it to a drop folder for use in a release task?
Using TFS 2018, I am trying to copy a .nupkg file created in a prior Build task to the drop folder.
...In the Build Process...
From the log, I know that the file was created.
Successfully created package 'C:\agent_work\9\a\StaticHelpers.1.0.0.nupkg'.
What I am trying to figure out is how I can find this file and copy it to the drop folder. Using Build Variables for inspiration, I have tried the following. At first, I thought it was successful because of what the log said.
Source Folder: $(Agent.BuildDirectory)
Contents: *\*.nupkg
Target Folder: drop
Result:
found 1 files
Copying C:\agent_work\9\a\StaticHelpers.1.0.0.nupkg to drop\a\StaticHelpers.1.0.0.nupkg
All that means is that I can create a release process that takes that file and copies it in a copy release step, right?
...In the Release Process...
Not right. There was nothing in the drop folder when I created a copy file release task and tried to select the NuGet package that was definitely created in the build. What I need to do is take that *.nupkg file created during the build process and copy it to a network share.
So I tried to hard-code the folder based on what I copied from the build log.
Source Folder: drop\a
The release failed, showing this in the log:
[error]Unhandled: Not found SourceFolder: C:\agent_work\r4\a\drop
Either I am copying the file to the wrong location or I am reading from the wrong location. What folders do I need to use so that I can see the *.nupkg file in my release task?
In your build process, don't use a Copy Files task; use a Publish Artifacts task. That will publish an artifact "attached" to the build, which a release will automatically pick up during deployment.
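As a rough sketch (the values here are the common defaults, not taken from the original build definition), the build-side task might be configured like:

Task: Publish Build Artifacts
Path to publish: $(Build.ArtifactStagingDirectory)
Artifact name: drop
Artifact publish location: Server

The release then sees the artifact under $(System.DefaultWorkingDirectory) without needing any hard-coded agent paths.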

Build vNext CodedUI Item Deployment - items specified in .testsettings not being copied

I have a CodedUI test suite. I'm attempting to use a Build-Deploy-Test (B-D-T) workflow in TFS 2015 R2 to deploy the test DLLs to a machine group and execute them.
The tests rely on a couple files - an html start page and an XML file.
First, I attempted to have the tests just run from the bin folder, or from wherever they're deployed onto the machines in the machine group. No suggestion I found on SO or in the first 3 pages of Google worked. Whenever I ran a test locally, it would copy the test DLLs to the TestResults folder and execute from there.
Then, I attempted to use a .testsettings file to deploy the files along with the DLLs into the TestResults folder. I still truly don't understand why this is the best or only solution (if I can just execute from the bin, please let me know how), but it does work locally.
Now, in TFS 2015, when I deploy my testing software to the test agent (a member of the machine group), I can see my supporting files exactly where they should be in the test drop location. Then the test starts. A temp folder is created in a byzantine region of AppData, where only the DLLs and config are copied, but not the files specified in the .testsettings file! The .testsettings file is specified in the "Run Functional Tests" task, which I set by browsing to the correct file in source control.
I can provide any information you require. This should be so simple but it has been an enormous headache.
In the configuration, you need to set up where the files are copied to on the remote machine, using a "Windows Machine File Copy" task executed by the agent. You need to make sure all the files needed to run are copied to the server.
Then, in your .testsettings file, specify which of the files you copied over should be deployed to the temporary test context directory. Only files already copied to the test agent can be selected here.
The File copy task would look something like:
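A rough sketch of those settings (the machine group name and paths are hypothetical):

Task: Windows Machine File Copy
Source: $(System.DefaultWorkingDirectory)\drop
Machines: $(TestMachineGroup)
Admin Login: an account valid on the target machines
Password: that account's password
Destination Folder: C:\Tests

The Destination Folder value is then the location whose files the .testsettings deployment section can pick up.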

Can't see Jenkins jobs copied from one instance to another in a destination instance folder?

I am copying Jenkins jobs from one instance to the other. I created a folder called "Old_Jobs" in the destination instance under the jobs directory. If I copy all the jobs into this Old_Jobs directory and reload the configuration from disk, I can't see those jobs in the Jenkins GUI. However, if I copy those jobs directly under the "jobs" directory, I can see all of them in the Jenkins GUI.
Is there any way I can see all my copied jobs under /var/lib/jenkins/jobs/Old_Jobs/ directory?
Note: I have tried changing permissions to 777 on the destination folder, but it didn't work.
Ownership is also correct in the destination instance.
AFAIK, all the jobs are listed under /jobs/.
Since you have created one more directory, "Old_Jobs", under /jobs, the required structure is not present.
Also, I remember facing a similar issue (even when keeping the same directory structure), and I had to copy the "/workspace" folder as well to the new instance.
You can refer to the required directory structure here: https://wiki.jenkins.io/display/JENKINS/Administering+Jenkins
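For reference, the layout that page describes looks roughly like this (JOBNAME stands for each job's directory, which must sit directly under jobs):

JENKINS_HOME
 +- config.xml
 +- jobs
     +- JOBNAME
         +- config.xml
         +- builds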
It also mentions the following points:
Moving/copying/renaming jobs. You can:
Move a job from one installation of Jenkins to another by simply copying the corresponding job directory.
Make a copy of an existing job by making a clone of a job directory under a different name.
Rename an existing job by renaming a directory. Note that if you change a job name, you will need to change any other job that tries to call the renamed job.
Those operations can be done even when Jenkins is running. For changes like these to take effect, you have to click "reload config" to force Jenkins to reload the configuration from disk.
