I'm trying to write a simple script that checks a directory and finds any matching filenames.
Once I find a match, I simply want to prepend the current datetime to the filename.
This sounds easy, but I can't quite figure it out.
So far I have got:
$directory = "c:\directoryName\"
$filename = "filename.log"
$d = Get-Date -format "yyyyMMddHHmm"
get-childitem -Path $cfg_logDirectory | foreach-object
{
if ($_.name -eq $cfg_fileName)
{
$fname = $_.name
Rename-Item -NewName $d."_".$fname
}
}
But when I run this (in the ISE at least), a window comes up asking me for:
"Cmdlet ForEach-Object at Command pipeline position 2 - Supply values
for the following parameters: Process[0]"
Is this the way to go, or is there a better way to do this?
Move your opening curly brace to the end of the previous line.
get-childitem -Path $cfg_logDirectory | foreach-object {
You're getting that error because PowerShell is assuming the end of the line is the end of the statement, and then trying to prompt you for the missing parameter to ForEach-Object - in this case, its script block*. Moving the opening brace to the same line tells PowerShell that there are more statements coming.
*Why 'Process'? The foreach-object cmdlet can take three script blocks as parameters: Begin, Process, and End. Begin and End are optional, but Process is required. If you supply a script block without a name, it is assumed to be Process.
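For completeness, here is a minimal working sketch of the whole rename, using the $directory/$filename variables from the question. Note that PowerShell builds strings with the -f format operator (or interpolation), not with '.' concatenation:
$directory = "c:\directoryName\"
$filename = "filename.log"
$d = Get-Date -Format "yyyyMMddHHmm"
Get-ChildItem -Path $directory | ForEach-Object {
    if ($_.Name -eq $filename)
    {
        # Prefix the existing name with the timestamp, e.g. 201302191230_filename.log
        Rename-Item -Path $_.FullName -NewName ("{0}_{1}" -f $d, $_.Name)
    }
}
Using -Filter $filename on Get-ChildItem would avoid the explicit if, but the shape above stays close to the original.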
Related
I have to open 2 URLs from PowerShell. Each creates a log file. Once the log file of the 1st URL is complete, the 2nd URL has to be launched. The completion of the file is indicated by the string "end of log". Each log takes a minimum of 15 minutes. Since the duration is not certain, I didn't use the Sleep cmdlet.
OK, so it's rather simple: you use Select-String to search a text file for a string. You do this in a while loop, so while Select-String returns false (-Quiet makes the cmdlet return a Boolean indicating whether the value exists in the file), wait the specified amount of time and then check again.
While ($(Select-String -Path 'C:\Logs\File1.txt' -Pattern 'end of log' -quiet) -eq $False)
{
Write-Host "No end of log in file 1 waiting.."
Sleep -Seconds 3
}
Since your first script takes 15 minutes to finish, you might want to make it wait longer between checks; if the check is too frequent, it might slow down the first script.
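Putting it together, here is a rough sketch of the whole flow. The URLs and the log path are placeholders, Start-Process is just one way of launching a URL in the default browser, and the five-minute poll interval is arbitrary:
# Placeholder URL - replace with the real one
Start-Process 'http://server/launch-first-job'
while (-not (Select-String -Path 'C:\Logs\File1.txt' -Pattern 'end of log' -Quiet -ErrorAction SilentlyContinue))
{
    Write-Host "No 'end of log' in file 1, waiting..."
    # The job takes at least 15 minutes, so polling every 5 minutes is plenty
    Start-Sleep -Seconds 300
}
# Placeholder URL - replace with the real one
Start-Process 'http://server/launch-second-job'
The -ErrorAction SilentlyContinue is there only so the check doesn't error out before the log file has been created.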
We have an error log directory structure wherein we store all error log files for a particular day in date-wise directories:
errorbackup/20150629/errorlogFile3453123.log.xml
errorbackup/20150629/errorlogFile5676934.log.xml
errorbackup/20150629/errorlogFile9812387.log.xml
errorbackup/20150628/errorlogFile1097172.log.xml
errorbackup/20150628/errorlogFile1908071_log.xml
errorbackup/20150627/errorlogFile5675733.log.xml
errorbackup/20150627/errorlogFile9452344.log.xml
errorbackup/20150626/errorlogFile6363446.log.xml
I want to search for a particular string in the error log files and get a directory-wise count of that string's occurrences. For example, grep "blahblahSQLError" should output something like:
20150629:0
20150628:0
20150627:1
20150626:1
This is needed because we fixed some errors in one of the releases and I want to make sure that there are no occurrences of that error since the day it was deployed to Prod. Also note that there are thousands of error log files created every day. Each error log file is created with a random number in its name to ensure uniqueness.
If you are sure the filenames of the log files will not contain any "odd" characters or newlines, then something like the following should work.
for dir in errorbackup/*; do
printf '%s:%s\n' "${dir#*/}" "$(grep -l blahblahSQLError "$dir/"*.xml | wc -l)"
done
If they can have unexpected names, then I believe you would need to use multiple calls to grep and count the matching files manually. Something like this:
for dir in errorbackup/*; do
    _dcount=0;
    for log in "$dir"/*.xml; do
        # -q: just test whether this file matches, without printing anything
        grep -q blahblahSQLError "$log" && _dcount=$((_dcount + 1));
    done
    printf '%s:%s\n' "${dir#*/}" "$_dcount";
done
Something like this should do it:
for dir in errorbackup/*
do
awk -v dir="${dir##*/}" -v OFS=':' '/blahblahSQLError/{c++} END{print dir, c+0}' "$dir"/*
done
There's probably a cuter way to do it with find and xargs to avoid the loop and you could certainly do it all within one awk command but life's too short....
As a TFS admin, time and again I have to archive/move branches to other folders to make sure that our TFS folders are not cluttered with old, unused branches. But when I try to MOVE the branches, if any of the developers have checked out a file from that branch in their workspace, TFS doesn't allow me to complete the operation. I have to undo all those checkouts (by all the users) before I can MOVE the branch.
The TFS Power Tools provide some relief here. They help you undo other users' checkouts from within Visual Studio (or the command line). Right-click the branch -> Find -> Find by Wildcard. You can see the screenshots below:
But the catch is that it can only perform the UNDO operation for one user at a time. So in a large organization with 100-200 developers working in a branch, if 100 developers have each checked out a file from the branch, I will have to press the UNDO button 100 times to make the branch checkout-free.
I searched extensively and couldn't find any out-of-the-box solution. The solution I finally came up with is a PowerShell script which queries TFS (for a specific branch) to find the list of files checked out to users, then loops through the user list and undoes all the files checked out to each user under the branch.
Does anyone have a better/easier solution? I will wait for input, and if I don't see much response, I will add the script here so people in a similar situation can make use of it.
You should use the TFS Sidekicks. They have the ability to easily discover and undo those changes.
http://www.attrice.info/cm/tfs/
I would, however, question the viability of 'moving' the branches, as TFS does a 'branch+delete' under the covers. You would be better off deleting the branch and using the 'show deleted items' toggle to view that old stuff...
As @MrHinsh mentioned, you can install TFS Sidekicks to solve the problem. Thank you, MrHinsh.
If you do not want to install additional tools, you can achieve the same with the following PowerShell script. Update the following parameters in the script and run it:
TF Location
Branch Name
TFS Collection Name
Temp Log file location
$tfLocation = "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE"
$tfsBranchName = "{enter your TFS branch/folder/file location}"
$tfsCollectionName = "http://tfsserver:8080/tfscollection"
$logFile = "{log file location}"
Function GetUserFileList($tfsBranchName)
{
try{
# Array to hold the (file/user/workspace) objects
$arrayFileUserMapping = @();
If (Test-Path $logFile)
{
Remove-Item $logFile;
}
Set-Location $tfLocation;
.\tf.exe status $tfsBranchName /collection:$tfsCollectionName /user:* /format:detailed /recursive >> $logFile;
Set-Location $PSScriptRoot;
foreach ($line in Get-Content $logFile)
{
If($line.StartsWith("$"))
{
$objCurrFile = New-Object System.Object;
$splitStringFile = $line -Split ";";
$objCurrFile | Add-Member -type NoteProperty -name FileName -value $splitStringFile[0];
}
Else
{
foreach ($singleLine in $line){
If($singleLine.StartsWith(" User"))
{
$splitStringUser = $singleLine -Split ":";
$objCurrFile | Add-Member -type NoteProperty -name User -value $splitStringUser[1];
}
ElseIf($singleLine.StartsWith(" Workspace"))
{
$splitStringWS = $singleLine -Split ":";
$objCurrFile | Add-Member -type NoteProperty -name Workspace -value $splitStringWS[1];
}
}
}
$arrayFileUserMapping += $objCurrFile;
}
$uniqueWorkspaceArray = $arrayFileUserMapping | Group Workspace
$uniqueUserArray = $arrayFileUserMapping | Group User
for($i=0;$i -lt $uniqueUserArray.count; $i++)
{
$workspaceWOSpace = $uniqueWorkspaceArray[$i].Name.Trim()
$userWOSpace = $uniqueUserArray[$i].Name.Trim()
$workspace = $workspaceWOSpace + ";" + $userWOSpace;
Set-Location $tfLocation;
.\tf.exe undo $tfsBranchName /collection:$tfsCollectionName /workspace:""$workspace"" /recursive /noprompt;
Set-Location $PSScriptRoot;
}
}
Catch [system.exception]
{
"Oops, something's wrong!!!"
}
}
GetUserFileList($tfsBranchName);
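For reference, the two tf.exe calls that the script wraps look roughly like this when run by hand from the Visual Studio IDE folder (the server path, collection URL, and workspace spec below are placeholders):
.\tf.exe status '$/Project/Branch' /collection:http://tfsserver:8080/tfscollection /user:* /format:detailed /recursive
.\tf.exe undo '$/Project/Branch' /collection:http://tfsserver:8080/tfscollection /workspace:'WorkspaceName;DOMAIN\user' /recursive /noprompt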
I'm working on the first steps towards creating a powershell script that will read through printer logs (probably using get-WMI cmdlet), and parse through the logs. Afterwards, I plan on having the script output to a .txt file the name of the printer, a counter of the number of times a printer was used (if possible), and specific info found in the logs.
In order to do this, I've decided to try working backwards. Below is a small portion of what the logs will look like:
10 Document 81, A361058/GPR0000151814_1: owned by A361058 was printed on R3556 via port IP_***.***.***.***. Size in bytes: 53704; pages printed: 2 20130219123105.000000-300
10 Document 80, A361058/GPR0000151802_1: owned by A361058 was printed on R3556 via port IP_***.***.***.***. Size in bytes: 53700; pages printed: 2
Working backwards and just focusing on parsing first, I'd like to be able to specifically get the "/GPR", the "R3556" (in general, R**** as this is the printer name), and a counter that shows how often a specific printer appeared in the log files.
It has been a while since I last worked with PowerShell, but at the moment this is what I've managed to create to try to accomplish my goal:
Select-String -Path "C:\Documents and Settings\a411882\My Documents\Scripts\Print Parse Test.txt" -Pattern "/GPR", " R****" -AllMatches -SimpleMatch
The code does not produce any errors; however, I'm also unable to get any output to appear on screen to see if I'm capturing the /GPR and printer name. At the moment I'm just trying to ensure I'm gathering the right output before worrying about any counters. Would anyone be able to assist me and tell me what I'm doing wrong with my code?
Thanks!
EDIT: Fixed a small error with my code that was causing no data to appear on screen. At the moment this code outputs the entire two lines of test text instead of only outputting the /GPR and server name. The new output is the following:
My Documents\Scripts\Print Parse Test.txt:1:10 Document 81, A361058/GPR0000151814_1: owned by A361058 was printed on
R3556 via port IP_***.***.***.***. Size in bytes: 53704; pages printed: 2
20130219123105.000000-300
My Documents\Scripts\Print Parse Test.txt:2:10 Document 80, A361058/GPR0000151802_1: owned by A361058 was printed on
R3556 via port IP_***.***.***.***. Size in bytes: 53700; pages printed: 2
I'd like to try having it eventually look something like the following:
/GPR, R****, count: ## (although for now I'm less concerned about the counter)
You can try this. It only returns a line when /GPR (and "on" from "printed on") is present.
Get-Content .\test.txt | % {
if ($_ -match '(?:.*)(/GPR)(?:.*)(?<=on\s)(\w+)(?:.*)') {
$_ -replace '(?:.*)(/GPR)(?:.*)(?<=on\s)(\w+)(?:.*)', '$1,$2'
}
}
Output:
/GPR,R3556
/GPR,R3556
I'm sure there are better regex versions. I'm still learning it :-)
EDIT: This is easier to read. The regex is still there for the extraction, but I filter out lines containing /GPR first using Select-String instead:
Get-Content .\test.txt | Select-String -SimpleMatch -AllMatches -Pattern "/GPR" | % {
$_.Line -replace '(?:.*)(/GPR)(?:.*)(?<=on\s)(\w+)(?:.*)', '$1,$2'
}
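If you later want the counter the question mentions, one rough way (a sketch built on the same regex) is to group the extracted strings:
Get-Content .\test.txt | Select-String -SimpleMatch -AllMatches -Pattern "/GPR" | % {
    $_.Line -replace '(?:.*)(/GPR)(?:.*)(?<=on\s)(\w+)(?:.*)', '$1,$2'
} | Group-Object | % { "{0}, count: {1}" -f $_.Name, $_.Count }
Against the two sample lines that prints: /GPR,R3556, count: 2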
I generally start with an example of the line I'm matching and build a regex from that, substituting regex metacharacters for the variable parts of the text. This makes the regex longer, but much more intuitive to read later.
Assign the regex to a variable, and then use that variable in subsequent code to keep the messy details of the regex from cluttering up the rest of the code:
[regex]$DocPrinted =
'Document \d\d, \w+/(\D{3})[0-9_]+: owned by \w+ was printed on (\w+) via port IP_[0-9.]+ Size in bytes: \d+; pages printed: \d+'
get-content <log file> |
foreach {
    if ($_ -match $DocPrinted)
    {
        # a successful -match populates the automatic $matches hashtable with the captured groups
        $matches
    }
}
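If you only want the two captured pieces rather than the whole $matches table, the first group of that regex captures the three-letter code (GPR, without the slash) and the second captures the printer name, so inside the if block you could emit, for example:
'{0}, {1}' -f $matches[1], $matches[2]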
PowerShell rookie here. I'm stuck trying to get Select-String to parse the variable below.
I'm calling an external program which logs everything to standard output. This is for a backup sync software I'm using. The output is a lot of information such as source, destination, copied files, changed files, etc. I only care about a few items from the standard output. I'm taking the standard output and trying to parse it for the few items I care about so I only see those, rather than 50 other lines of miscellaneous information.
Here's the code:
$procInfo = New-Object System.Diagnostics.ProcessStartInfo
$procInfo.FileName = "C:\Program Files\Siber Systems\GoodSync\gsync.exe"
$procInfo.RedirectStandardError = $true
$procInfo.RedirectStandardOutput = $true
$procInfo.UseShellExecute = $false
$procInfo.Arguments = "/progress=yes /exit sync DroboBackup"
$proc = New-Object System.Diagnostics.Process
$proc.StartInfo = $procInfo
$proc.Start() | Out-Null
(Get-Date -format T) + ": Backup Process Running. Please Stand By..."
$proc.WaitForExit()
if ($proc.ExitCode -eq 0)
{
"GoodSync Reported: Analyze or Sync Successfully Completed."
} elseif ($proc.ExitCode -eq 1)
{
"GoodSync Error: Analyze had Terminal Errors. Did gsync.exe close abruptly?"
} elseif ($proc.ExitCode -eq 2)
{
"GoodSync Error: Sync had Terminal Errors. Did gsync.exe close abruptly?"
} else
{
"GoodSync Error: General Error. Typo in job name?"
}
$output = $proc.StandardOutput.ReadToEnd()
$output += $proc.StandardError.ReadToEnd()
#$output | Out-File c:\output.txt -Append
$newOutput = $output | Select-String "Copy New","Copy Over","Delete File","Items Synced","Changes:"
$newOutput
What I get after running this is all lines from standard output. I'm not getting my nice and clean parsed return.
So, to try and troubleshoot, I sent $output to a text file, then ran the below, and this is what I got:
$x = Get-Content c:\output.txt | Select-String "Copy New","Copy Over","Delete File","Items Synced","Changes:"
$x
PS C:\> .\test.ps1
Changes: 0, Conflicts: 0, CopyTime: 0, CopyState: 0/0, Errors: 0
So as you can see, it is working against the text file, but not against the variable.
Any insight?
You are reading the standard output using StreamReader's ReadToEnd() method. This will return a single string containing the contents, including the carriage returns. So, when you output this string on the pipeline, Select-String only sees that one big string, not each individual line. What you could do is split the string on the carriage returns before passing it down the pipeline:
$output -split "`n" | Select-String "Copy New","Copy Over","Delete File","Items Synced","Changes:"
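One small caveat: ReadToEnd() on a Windows tool's output usually contains CRLF line endings, so splitting on "`n" alone leaves a trailing carriage return on each line. Select-String will still match, but if that ever bites you, split on the full sequence instead:
$output -split "`r`n" | Select-String "Copy New","Copy Over","Delete File","Items Synced","Changes:"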