Long-Running PowerShell Script Freezes - powershell-2.0

We are using a long-running PowerShell script to perform many small operations that can take an extremely long time overall. After about 30 minutes the script froze. We were able to get it running again by pressing Ctrl-C, which caused the script to resume execution instead of killing the process.
Is there some sort of script timeout or mechanism that prevents long running scripts within PowerShell?

I had this problem due to a bad habit of mine. If you select even a little bit of text inside the PowerShell console, the script's output freezes. Make sure nothing is selected after launching a big script :)

As mentioned, clicking or selecting text in the PowerShell console pauses the script. You can disable this behaviour as follows:
Right-click the title bar
Select Properties
Select Options
Under Edit Options, disable QuickEdit Mode
Note: You won't be able to select text from the PowerShell window anymore.
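If you prefer not to click through the console properties, the same setting can be turned off programmatically for the current console session. A minimal sketch via the Win32 SetConsoleMode API (the constants are the documented Win32 flag values; this does not persist across sessions):
$sig = @'
[DllImport("kernel32.dll", SetLastError = true)]
public static extern System.IntPtr GetStdHandle(int nStdHandle);
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool GetConsoleMode(System.IntPtr hConsoleHandle, out uint lpMode);
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool SetConsoleMode(System.IntPtr hConsoleHandle, uint dwMode);
'@
$console = Add-Type -MemberDefinition $sig -Name 'ConsoleMode' -Namespace 'Win32' -PassThru
$STD_INPUT_HANDLE      = -10     # standard input device
$ENABLE_QUICK_EDIT     = 0x0040  # ENABLE_QUICK_EDIT_MODE
$ENABLE_EXTENDED_FLAGS = 0x0080  # must be set for the QuickEdit bit to take effect
$handle = $console::GetStdHandle($STD_INPUT_HANDLE)
$mode = [uint32]0
$null = $console::GetConsoleMode($handle, [ref]$mode)
# clear the QuickEdit bit and re-apply the mode
$null = $console::SetConsoleMode($handle, ($mode -band (-bnot $ENABLE_QUICK_EDIT)) -bor $ENABLE_EXTENDED_FLAGS)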

Try my kill-timer script. Just change the $ScriptLocation variable to the script you want to run. That script will then run as a background job while the current window keeps track of the timer. After the time expires, the current window kills the background job and writes it all to the logs.
Start-Transcript C:\Transcriptlog-Cleanup.txt # write the transcript log to this location
$p = Get-Process -Id $pid # get a process object for the current PowerShell process (needed for $p.HasExited later)
Write-Host $p.Id
$BJName = "Clean-up-script" # define the background job name
$startTime = (Get-Date) # set the start time
$startTime
$expiration = (Get-Date).AddMinutes(2) # the program expires at this time
# you can change the expiration time by swapping .AddMinutes(2) for .AddSeconds(20), .AddHours(1), or whatever you like
#-----------------
#Timer update function setup
function UpdateTime
{
    $LeftMinutes = ($expiration) - (Get-Date) | Select -Expand Minutes # minutes remaining until expiration
    $LeftSeconds = ($expiration) - (Get-Date) | Select -Expand Seconds # seconds remaining until expiration
    # write the timer state to the console
    Write-Host "------------------------------------------------------------------"
    Write-Host "Timer started at     : " $startTime
    Write-Host "Current time         : " (Get-Date)
    Write-Host "Timer ends at        : " $expiration
    Write-Host "Time on expire timer : " $LeftMinutes "Minutes" $LeftSeconds "Seconds"
    Write-Host "------------------------------------------------------------------"
}
# get the background-job results, print them, and then remove the job
function BJManager
{
    Receive-Job -Name $BJName # receive the background-job results
    Remove-Job -Name $BJName -Force # remove the job
    Write-Host "Retrieving Background-Job info and removing job..."
}
#-----------------
$ScriptLocation = "C:\Local-scripts\Windows-Server-CleanUp-Script-V2.4(Beta).ps1" # change this variable to run a different script
Start-Job -Name $BJName -FilePath $ScriptLocation # start that script as a background job
# don't start the job inside the loop
do { # start the loop
    Write-Host "Working" # do other script work here
    Start-Sleep -Seconds 5 # add a delay to reduce console spam and processing load
    UpdateTime # call the update function to print the remaining time
    Get-Job -Name $BJName | Select Id, State, Location, Name
    if ((Get-Job).State -eq "Failed")
    {
        BJManager
    }
    elseif ((Get-Job).State -eq "Completed")
    {
        BJManager
    }
}
until ($p.HasExited -or (Get-Date) -gt $expiration) # exit when the current process ends or the timer expires
Write-Host "Timer Script Finished"
Get-Job -Name $BJName | Select Id, State, Location, Name
UpdateTime
BJManager
Start-Sleep -Seconds 5 # give it some time to write to the log
Stop-Transcript
Start-Sleep -Seconds 5 # give the logging some time to stop before killing the process
if (-not $p.HasExited) { Stop-Process -Id $p.Id -PassThru } # kill the current process after the time expires

Try adding a percentage-complete calculation to your script, so you can see how far along it is and estimate how much time it will take to complete.
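For example, a minimal sketch using Write-Progress (the $items collection here is a hypothetical stand-in for your own work items):
$items = 1..200 # hypothetical work items; replace with your own collection
for ($i = 0; $i -lt $items.Count; $i++) {
    $percent = [int](100 * $i / $items.Count)
    Write-Progress -Activity 'Processing items' -Status "$percent% complete" -PercentComplete $percent
    # ... perform the actual operation for $items[$i] here ...
}
Write-Progress -Activity 'Processing items' -Completed # clear the progress bar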

Related

PowerShell, wait for first process to end

A PowerShell script should exit if one of the main processes stops.
The script is the main Docker process, so the Docker container should stop if either of the apps (app1, app2) stops.
The current approach is to use an exit event for one of the apps and Wait-Process for the other. Is there a better approach?
$pApp1 = Start-Process -PassThru app1
$pApp2 = Start-Process -PassThru app2
Register-ObjectEvent -InputObject $pApp1 -EventName Exited -Action {
    Get-EventSubscriber | Unregister-Event
    exit 1
}
Wait-Process -Id $pApp2.id
exit 1
Wait for the HasExited property on either of them to change:
$apps = 'app1','app2' |ForEach-Object { Start-Process $_ -PassThru }
while (@($apps | Where HasExited -eq $true).Count -lt 1) {
    Write-Host "Waiting for one of them to exit..."
    Start-Sleep -Seconds 1
}
As of PowerShell 7.2.1, Wait-Process, when given multiple processes, invariably waits for all of them to terminate before returning. Potentially introducing an -Any switch, so as to only wait for any one among them, is the subject of GitHub proposal #16972; it would simplify the solution to Wait-Process -Any -Id $pApp1.id, $pApp2.id.
Delegating waiting for the processes to exit to thread / background jobs avoids the need for an event-based or periodic-polling solution.
# Start all processes asynchronously and get process-information
# objects for them.
$allPs = 'app1', 'app2' | ForEach-Object { Start-Process -PassThru $_ }
# Start a thread job for each process that waits for that process to exit
# and then pass the process-info object for the terminated process through.
# Exit the overall pipeline once the first output object from one of the
# jobs is received.
$terminatedPs = $allPs |
    ForEach-Object { Start-ThreadJob { $ps = $using:_; Wait-Process -Id $ps.Id; $ps } } |
    Receive-Job -Wait |
    Select-Object -First 1
Write-Verbose -Verbose "Process with ID $($terminatedPs.Id) exited."
exit 1
Note:
I'm using the Start-ThreadJob cmdlet, which offers a lightweight, much faster thread-based alternative to the child-process-based regular background jobs created with Start-Job.
It comes with PowerShell (Core) 7+ and in Windows PowerShell can be installed on demand with, e.g., Install-Module ThreadJob -Scope CurrentUser.
In most cases, thread jobs are the better choice, both for performance and type fidelity - see the bottom section of this answer for why.
If Start-ThreadJob isn't available to you / cannot be installed, simply substitute Start-Job in the code above.
PowerShell (Core) 7+-only solution with ForEach-Object -Parallel:
PowerShell 7.0 introduced the -Parallel parameter to the ForEach-Object cmdlet, which in essence brings thread-based parallelism to the pipeline; it is a way to create multiple, implicit thread jobs, one for each pipeline input object, that emit their output directly to the pipeline (albeit in no guaranteed order).
Therefore, the following simplified solution is possible:
# Start all processes asynchronously and get process-information
# objects for them.
$allPs = 'app1', 'app2' | ForEach-Object { Start-Process -PassThru $_ }
$terminatedPs = $allPs |
    ForEach-Object -Parallel { $_ | Wait-Process; $_ } |
    Select-Object -First 1
Write-Verbose -Verbose "Process with ID $($terminatedPs.Id) exited."
exit 1

Jenkins: spawn multiple processes and wait for them to terminate

I've set up a Jenkins build server that runs a nightly build for a Unity project, building two different instances of it. Once these builds are done, a job runs on a different node to copy over the build binaries and run them. What I'm running into is finding a good way for the job to (1) run both executables simultaneously, and (2) wait for both of them to finish before moving to the next build step in the job (where it verifies test logs etc.).
Initially this seemed to work when I tested it on my own computer: https://stackoverflow.com/a/18762607/14764114
... but it does not work in Jenkins, because the Jenkins node runs as a Windows service and thus cannot use the START command in batch.
I'm reading that running separate services might be a solution to explore here, but before I start diving into that I figured I'd ask the community if there isn't a more elegant solution here. In summary, I want to:
Run two executables from a Jenkins build step at the same time (from a Jenkins node running on Windows)
Wait for both executables to exit before continuing to the next build step
In the end I went with this solution, as the Task Scheduler seems to be the only thing capable of starting a Unity game window in my scenario. So I create a task, run it, and then delete it, after which I just wait for the processes to disappear from the task list:
@echo off
echo "Run FirstApp"
schtasks /create /sc MONTHLY /tn FirstAppTask /tr "%TARGET_DIR%\%APP_First%\FirstApp.exe -automatedtest -duration=%TEST_DURATION_SECONDS%"
schtasks /run /tn FirstAppTask
schtasks /delete /f /tn FirstAppTask
echo "Run SecondApp"
schtasks /create /sc MONTHLY /tn SecondAppTask /tr "%TARGET_DIR%\%APP_Second%\SecondApp.exe -automatedtest -duration=%TEST_DURATION_SECONDS%"
schtasks /run /tn SecondAppTask
schtasks /delete /f /tn SecondAppTask
echo "Wait for FirstApp.exe to end"
:LOOP1
tasklist | find /i "FirstApp" >nul 2>&1
IF ERRORLEVEL 1 (
GOTO CONTINUE1
) ELSE (
ping -n 5 ::1 >NUL
GOTO LOOP1
)
:CONTINUE1
echo "Wait for SecondApp.exe to end"
:LOOP2
tasklist | find /i "SecondApp" >nul 2>&1
IF ERRORLEVEL 1 (
GOTO CONTINUE2
) ELSE (
ping -n 5 ::1 >NUL
GOTO LOOP2
)
:CONTINUE2
echo Done running tests
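If the node can run PowerShell build steps instead of batch, the two polling loops above can be collapsed into one. A sketch, assuming the process names FirstApp and SecondApp from the scheduled tasks:
# wait until neither FirstApp nor SecondApp is in the process list anymore
while (Get-Process -Name FirstApp, SecondApp -ErrorAction SilentlyContinue) {
    Start-Sleep -Seconds 5
}
Write-Output 'Done running tests'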

Simultaneous PowerShell script execution

I have written the following script for SQL patching:
cls
$computers = Get-Content D:\Abhi\Server.txt
foreach ($line in $computers)
{
    psexec \\$line -s -u Adminuser -p AdminPassword msiexec /i D:\SQL_PATCH\rsSharePoint.msi SKIPCA=1 /qb
}
What I want is to parallelize the script's execution across all the servers listed in the text file: as soon as I start the script, it should initiate the patching activity on all the servers simultaneously, and it should also track the progress on each server, as the script currently does for just one server.
Kindly help me with this.
Thanks Ansgar Wiechers.
This piece of code did it. It executes the .exe simultaneously on all the servers and tracks their status:
cls
$servers = Get-Content 'D:\Abhi\Server.txt'
$servers | ForEach-Object {
    $comp = $_
    Start-Job -ScriptBlock { psexec \\$input -s -u Adminuser -p AdminPassword C:\SQL_PATCH\SQLServer2008R2SP3-KB2979597-x64-ENU.exe /quiet /action=patch /allinstances /IAcceptSQLServerLicenseTerms } -InputObject $comp
}
While (Get-Job -State Running)
{
    Get-Job | Receive-Job
    #Start-Sleep 2
    #Write-Host "Waiting for update removal to finish ..."
}
Remove-Job *
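If PowerShell Remoting is enabled on the target servers, Invoke-Command is an alternative that fans out in parallel (up to 32 machines at a time by default) without needing psexec. A hedged sketch, assuming the patch executable already exists at the same path on each server:
$servers = Get-Content 'D:\Abhi\Server.txt'
Invoke-Command -ComputerName $servers -ScriptBlock {
    # -Wait makes each remote session block until its own patch run finishes
    Start-Process -FilePath 'C:\SQL_PATCH\SQLServer2008R2SP3-KB2979597-x64-ENU.exe' `
        -ArgumentList '/quiet', '/action=patch', '/allinstances', '/IAcceptSQLServerLicenseTerms' -Wait
}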

Sending a Nagios alert when Graphite does not get data

I am collecting some metrics using Graphite, but sometimes no data comes in (probably because the server has gone down, or there is no network connectivity). I want Nagios to send me an alert during such an event. How do I do that?
You could use the check_file_age script from nagios-plugins to check a single known datapoint of interest per system that you are collecting data from.
check_file_age -w 600 -c 1800 /opt/graphite/storage/whisper/servers/$(uname -n)/cpu/idl.wsp
That would warn you once the metric had not been updated for 10 minutes (600 seconds) and go critical after 30 minutes (1800 seconds).
Alternatively, you could run a find command over all the data points and report any that have not been updated in the last two hours:
#!/bin/bash
OLD_GRAPHS=$(find /opt/graphite/storage/whisper -mmin +120 -type f | wc -l)
if [[ $OLD_GRAPHS -gt 0 ]]; then
    echo "Found ${OLD_GRAPHS} graph(s) without an update in 120 minutes"
    exit 1
fi
echo "All graphs are up to date"
exit 0

PowerShell: debugging an event -Action code block

I have a script watching for file creation in a specific directory.
I'm using Register-ObjectEvent after creating a System.IO.FileSystemWatcher.
It works great, but if I set a breakpoint in the -Action code block, the ISE generates a warning:
WARNING: Breakpoint Line breakpoint on 'D:\My Stuff\Desktop\scripts\fileWatcher.ps1:15' will not be hit
This message appears right after I drop a file into the directory I'm watching, and I can see my Write-Host printing my message right after the above warning.
Is this normal?
I need to add more code and could really use the debugger.
What can/should I do so I can debug this code block?
$fsw = [System.IO.FileSystemWatcher] $path
Register-ObjectEvent -InputObject $fsw -EventName Created -SourceIdentifier DDFileCreated -Action {
    $name = $Event.SourceEventArgs.Name
    $changeType = $Event.SourceEventArgs.ChangeType
    $timeStamp = $Event.TimeGenerated
    Write-Host "The file '$name' was $changeType at $timeStamp" -fore green
    Out-File -FilePath $logFile -Append -InputObject "The file '$name' was $changeType at $timeStamp"
}
If you trigger the event from a direct PowerShell command in the same ISE or console session, it actually works:
$path = (Resolve-Path '~\').Path
if (Test-Path "$path\newfile.txt") { del "$path\newfile.txt" }
$fsw = [System.IO.FileSystemWatcher] $path
Register-ObjectEvent -InputObject $fsw -EventName Created -SourceIdentifier DDFileCreated -Action {
    $name = $Event.SourceEventArgs.Name # put breakpoint here, via ISE or script
    $changeType = $Event.SourceEventArgs.ChangeType
    $timeStamp = $Event.TimeGenerated
    Write-Host "The file '$name' was $changeType at $timeStamp" -fore green
}
'foo' | out-file "$path\newfile.txt" # breakpoint hit
But if you simply hook up the event and wait for some external process to create a file, I see the issue you are hitting.
So if you can add some test code to directly trigger your event, go with that.
If not, read on...
It seems like you cannot enter the debugger while the shell or ISE is waiting for a new command; you can only enter it while another command is executing.
Going with that theory, this is the workaround (works for me):
Set your breakpoint
Run your event subscription code
Run the command "Read-Host" in the ISE console window, or from the prompt in the shell
Wait until you know your event should have fired due to external process
Hit "enter" to allow Read-Host to complete
Voilà, breakpoint hit!
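A condensed sketch of that workaround (the script path and line number are taken from the warning message above; adjust both for your own script):
Set-PSBreakpoint -Script 'D:\My Stuff\Desktop\scripts\fileWatcher.ps1' -Line 15 # 1. set the breakpoint
& 'D:\My Stuff\Desktop\scripts\fileWatcher.ps1' # 2. run the event subscription code
Read-Host # 3. keep a command executing; trigger the file creation now, then press Enter and the breakpoint hits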
The real key is to have a loop running so the script never exits. Then debugging and breakpoints work.
E.g.
# At the end of the script.
while ($true) {
Start-Sleep -Seconds 1
}
