Deleting folder and script after execution - powershell-2.0

I am trying to run a script that is deployed by SCCM into a randomly named folder on client computers. I need to make sure that after the execution completes, the script and its folder are deleted. I found various posts where the folder name was fixed, but in my case the folder name differs every time. I am new to PowerShell scripts and any help is greatly appreciated.
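Since the folder name is random, the script can discover its own location at run time and remove it as its last step. A minimal sketch, kept compatible with PowerShell 2.0 ($PSScriptRoot only exists in 3.0+); moving to $env:TEMP first is just one way to make sure the shell is not sitting inside the folder it is about to delete:

    # Resolve the folder this script was launched from (works in PowerShell 2.0)
    $scriptFolder = Split-Path -Parent $MyInvocation.MyCommand.Path

    # ... the actual work of the script goes here ...

    # Step out of the folder so the current-directory handle doesn't block deletion
    Set-Location $env:TEMP

    # Remove the randomly named folder, the script file included; this works
    # because PowerShell has already read the script into memory
    Remove-Item -LiteralPath $scriptFolder -Recurse -Force

If anything else still has a handle open on the folder (antivirus, an open console), the final Remove-Item will fail, so it is worth testing on a real SCCM deployment.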

Related

Foreach loop container SSIS

I have a Foreach Loop container in my SSIS package containing a File System Task that moves files from a source folder to a destination folder; some other tasks are connected after that container.
Every time I run the package, all the tasks run one by one even when there is no new file in the source folder, which always takes time.
Is there any way to make the tasks run only when a new file has been added to the source folder, and otherwise fail the package and show a "No new file found" message through a Script Task?
You could store the names of the files that have already been loaded in a table, then use a Script Task to check whether the files in the folder are present in that table; if any new file is found, execute the entire process.
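A rough sketch of what that Script Task's Main method could look like in C#. The LoadedFiles table, the connection string, and the User::SourceFolder / User::NewFileFound package variables are all assumptions, not names from the question:

    // Body of an SSIS Script Task (goes inside the generated ScriptMain class)
    public void Main()
    {
        string folder = (string)Dts.Variables["User::SourceFolder"].Value;
        bool newFileFound = false;

        // Hypothetical table LoadedFiles(FileName) holding already-processed names
        using (var conn = new System.Data.SqlClient.SqlConnection("<your connection string>"))
        {
            conn.Open();
            foreach (string path in System.IO.Directory.GetFiles(folder))
            {
                var cmd = new System.Data.SqlClient.SqlCommand(
                    "SELECT COUNT(*) FROM LoadedFiles WHERE FileName = @name", conn);
                cmd.Parameters.AddWithValue("@name", System.IO.Path.GetFileName(path));
                if ((int)cmd.ExecuteScalar() == 0)
                {
                    newFileFound = true;   // at least one file not seen before
                    break;
                }
            }
        }

        Dts.Variables["User::NewFileFound"].Value = newFileFound;
        if (!newFileFound)
        {
            // Surface the requested message and fail the package
            Dts.Events.FireError(0, "New file check", "No new file found.", string.Empty, 0);
            Dts.TaskResult = (int)ScriptResults.Failure;
        }
        else
        {
            Dts.TaskResult = (int)ScriptResults.Success;
        }
    }

Downstream tasks can then be gated on the NewFileFound variable with a precedence constraint, and the names of newly loaded files inserted into LoadedFiles at the end of the process.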

Losing environment variables in .cproject in Eclipse Oxygen

I have some Yocto recipes running that clean out my source directories before building. The problem is that I also have an Eclipse CDT project defined in the same source directory that gets deleted when I do the clean. I modified my scripts so that the .project and .cproject files would get copied to a safe location and then copied back once the build is complete. Everything appears to get restored satisfactorily except for the environment variables defined in the Project's Build Configurations (right-click Project->Properties->C/C++ Build->Environment). They get lost.
Now, I can see the environment variables defined in the org.eclipse... directory, but it appears that they are timestamped somehow and are out of sync with the project. How can I get the project's Build Configurations to restore automatically? (It is a pain to have to redefine these variables every time I do a clean/build.)
Found the problem. There is a directory called .settings that appears alongside the .project and .cproject files and also needs to be saved and then restored. When this is done, the environment variables defined under the project are restored as well.
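In script form the fix is just one more item in the save/restore pair. A sketch, where $SRCDIR and the /tmp backup location are assumptions:

    # Before the Yocto clean: stash all three metadata items
    mkdir -p /tmp/eclipse-meta
    cp -a "$SRCDIR/.project" "$SRCDIR/.cproject" "$SRCDIR/.settings" /tmp/eclipse-meta/

    # ... clean and rebuild ...

    # After the build: restore everything, .settings included, so the
    # Build Configuration environment variables come back too
    cp -a /tmp/eclipse-meta/. "$SRCDIR/"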

Jenkinsfile Pipeline do something when some checked-in file is changed / newly checked out / run in a fresh node

I'm trying to find a way to run some optional code in case a specific file is changed / newly checked out / run in a fresh node.
Specifically, I'm trying to rebuild a Docker image (which is used by the following build steps) only when the checked-in Dockerfile actually changes.
Any pointers are much appreciated!
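For reference, one way to sketch this in a scripted pipeline is to diff the last two commits and only rebuild the image when the Dockerfile was touched. The image name is an assumption, and a fresh or shallow workspace (where HEAD~1 may not exist) would still need a fallback rebuild:

    // Scripted-pipeline sketch; assumes a pipeline-from-SCM git checkout
    node {
        checkout scm
        // 'git diff --quiet' exits non-zero when the Dockerfile changed
        def dockerfileChanged = sh(
            script: 'git diff --quiet HEAD~1 HEAD -- Dockerfile',
            returnStatus: true) != 0
        if (dockerfileChanged) {
            sh 'docker build -t my-build-image .'   // image name is hypothetical
        }
        // ... subsequent build steps that use my-build-image ...
    }

Another option is to fingerprint the Dockerfile (hash it and compare against a value stashed from the previous run), which also covers the fresh-node case.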

Can Jenkins create a build excluding particular files?

I am trying to keep a particular file out of the build: a Thumbs.db file, which Windows creates automatically whenever images are present in a folder, and my solution contains the image folder. So whenever the build is triggered from Jenkins, the Thumbs.db file gets created.
Is there any way I can stop the Thumbs.db file from being created via Jenkins?
I can switch off Thumbs.db creation in Windows, but I have to do it every time a build is created from Jenkins, so I want to stop the file from being created in the first place.
Below is the job creation flow in my Jenkins for the current project:
SCM: I have used the Team Foundation Server plugin and specified my server URL and project path.
The post-build steps simply copy the build to the staging location (a folder).
Any help is appreciated.
Thanks,
AFAIK, on Windows you can only turn Thumbs.db generation on or off for the whole system, not for particular folders.
So you have to choose: on or off.
If you still want to keep it on, you'll have to handle this in your Jenkins build scripts and remove any Thumbs.db files after the build is done.
If your output is an archive, make sure to exclude Thumbs.db from it; most archiving tools support an exclude flag.
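For example, as a Windows batch build step (the 7-Zip command and the output paths are only illustrations of such an exclude flag):

    REM Delete any Thumbs.db left in the workspace; /a also matches hidden files
    del /s /q /f /a "%WORKSPACE%\Thumbs.db"

    REM If the staging output is zipped with 7-Zip, exclude the file up front
    7z a staging.zip ".\output\*" -xr!Thumbs.db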
I hope this helps.

Can't see Jenkins jobs copied from one instance to another in a destination instance subfolder?

I am copying Jenkins jobs from one instance to another. I created a folder called "Old_Jobs" in the destination instance under the jobs directory. If I copy all the jobs into this Old_Jobs directory and reload the configuration from disk, I can't see those jobs in the Jenkins GUI. However, if I copy those jobs directly under the "jobs" directory, I can see all of them in the Jenkins GUI.
Is there any way I can see all my copied jobs under /var/lib/jenkins/jobs/Old_Jobs/ directory?
Note: I have tried changing permissions to 777 on the destination folder, but it didn't work.
Ownership is also correct in the destination instance.
AFAIK, Jenkins only picks up jobs that sit directly under the jobs/ directory.
Since you have created an extra directory, "Old_Jobs", under jobs/, the required structure is not present.
Also, I remember facing a similar issue (even while keeping the same directory structure) and I had to copy the "/workspace" folder to the new instance as well.
You can refer to the required directory structure here: https://wiki.jenkins.io/display/JENKINS/Administering+Jenkins
It also mentions the following points:
Moving/copying/renaming jobs. You can:
Move a job from one installation of Jenkins to another by simply copying the corresponding job directory.
Make a copy of an existing job by cloning its job directory under a different name.
Rename an existing job by renaming its directory. Note that if you change a job name, you will need to update any other job that tries to call the renamed job.
Those operations can be done even while Jenkins is running. For changes like these to take effect, you have to click "reload config" to force Jenkins to reload the configuration from disk.
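On the destination instance that boils down to something like the following; the backup path and job name "MyJob" are assumptions:

    # Copy each job directory straight under jobs/, not into a subfolder
    cp -r /path/to/backup/MyJob /var/lib/jenkins/jobs/
    chown -R jenkins:jenkins /var/lib/jenkins/jobs/MyJob

    # Then Manage Jenkins -> Reload Configuration from Disk, or via the CLI:
    java -jar jenkins-cli.jar -s http://localhost:8080/ reload-configuration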
