URL File: Replace IP Address ONLY

I have a .URL file in my favorites like this:
[DEFAULT]
BASEURL=http://www.facebook.com/
[{000214A0-0000-0000-C000-000000000046}]
Prop3=19,2
[InternetShortcut]
URL=https://31.13.74.144/
IDList=
IconFile=http://www.facebook.com/favicon.ico
IconIndex=1
I want to replace the IP address portion with Facebook's current IP address, which I pull out in a PowerShell script:
# set .url file
$outfile = "C:\Users\aborgetti\Favorites\facebook.url"
# set regex pattern to replace in url file
$regex = '\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b'
# grab the ping
$ping = New-Object System.Net.NetworkInformation.Ping
# grab the facebook IP ONLY
$ips = $($ping.Send("www.facebook.com").Address).IPAddressToString
# output to shell for test
GC $outfile| Where-Object { $_ -match "(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"}| ForEach-Object {$_ -replace $ips}
I think I'm close, but my -match matches and returns the entire line: URL=https://31.13.74.144/
Would it be easier to store the results of GC $outfile| Where-Object { $_ -match "(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"} to a variable and then do a substring?
Am I missing something? I know there's got to be an easier way to do this.
Thanks in advance!

You're using -replace wrong. You're searching for lines containing an IP, and then trying to remove the new IP, which wasn't even in the text yet. -replace 'matchpattern', 'newvalue' is the syntax to replace text; -replace 'matchpattern' alone deletes the matched text.
What you want to do is:
Read all the lines (Get-Content) -> replace every IP with the new IP. Like:
GC $outfile| Foreach-Object { $_ -replace "(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})", $ips}

Or, as a one-liner:
(gc $outfile) -replace '^URL=https://[0-9.]+/', "URL=https://$((Test-Connection www.facebook.com -Count 1).IPV4Address)/"
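Note that neither snippet above writes the change back to the .url file. A complete sketch (my own, assuming Windows PowerShell, where Test-Connection returns an object exposing an IPV4Address property):

# Resolve Facebook's current IP with a single ping
$ip = (Test-Connection www.facebook.com -Count 1).IPV4Address.ToString()

# Replace any dotted-quad in the file and save the result back
$outfile = "C:\Users\aborgetti\Favorites\facebook.url"
(Get-Content $outfile) -replace '\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b', $ip |
    Set-Content $outfile

Reading the whole file with (Get-Content $outfile) in parentheses first means the file is closed before Set-Content reopens it for writing.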

Related

Want to recursively list full path of all files in a directory structure and pipe into text file without wordwrap

I want to get a listing of the full path of every file in a multi-level directory structure and put that information into a text file - BUT, I don't want long paths to be word-wrapped. I want the full path of each file on a single line in the text file, regardless of how long that line is.
I started out with this command:
Get-ChildItem -Recurse | ForEach-Object { Write-Host $_.FullName.ToString() }
I then have tried various means to try and redirect this output to a text file. First I tried this (but it didn't work):
Get-ChildItem -Recurse | ForEach-Object { Write-Host $_.FullName.ToString() } | Out-File C:\temp\out.txt
Next, I tried just piping all of the StdErr, StdOut etc to the text file thus:
Get-ChildItem -Recurse | ForEach-Object { Write-Host $_.FullName.ToString() } *> C:\temp\out.txt
This last command got close - but the longer paths are still wrapped onto the next line. I have been trying all sorts of things to stop this word-wrap from happening. Can anyone tell me how to get the output from the above command into a text file with no word-wraps, even for very long paths?
thanks heaps,
David. :-)
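The thread ends without an answer, but the wrapping comes from Write-Host, which writes to the console buffer and therefore wraps at the buffer width instead of sending strings down the pipeline. A sketch of one fix (my own, not from the thread): emit the FullName strings directly and let Set-Content write them unwrapped:

# FullName is already a string, so no Write-Host is needed
Get-ChildItem -Recurse |
    ForEach-Object { $_.FullName } |
    Set-Content C:\temp\out.txt

Out-File would also work, but it formats to the console width by default, so Set-Content (or Out-File with a large -Width value) is what avoids the wrap.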

Converting text output to object using powershell

I am trying to schedule a list of URLs for maintenance mode in SCOM 2007 using PowerShell. I am trying to read the URLs' display names from a text file and pass them as input to the command below. However, it's not working. Can somebody help me pass the display names from the text file as input?
$URLStuff = Get-Content C:\Display.txt
$URLWatcher = (Get-MonitoringClass -name Microsoft.SystemCenter.WebApplication.Perspective) |
Get-MonitoringObject | where {$_.DisplayName -eq $URLStuff}
get-content is returning you an array of string objects, one per line found in the file. You need to turn your where-object around to search that array for the DisplayName of each object found from SCOM.
$URLWatcher = (Get-MonitoringClass -name Microsoft.SystemCenter.WebApplication.Perspective) |
Get-MonitoringObject | where {$URLStuff -contains $_.DisplayName}
I'm assuming that you've already verified that DisplayName does contain the data you're looking for and will match the contents of Display.txt.
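One extra wrinkle worth checking (my note, not from the thread): lines read with Get-Content keep any trailing whitespace, which would silently defeat the -contains comparison, so trimming each line first is a cheap safeguard:

# Trim each line so stray spaces don't break the -contains match
$URLStuff = Get-Content C:\Display.txt | ForEach-Object { $_.Trim() }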

I cannot get Powershell 2.0 to move a file from one path to another

I need to make changes to the following PowerShell script, but I am having a dickens of a time getting the resulting files to write to a different path...let's call it $destPath.
Consider:
Get-ChildItem $sourcePath | % { [system.io.file]::Move($_.fullname, ($_.FullName -replace '\[|\]|-|,|\(|\)', '') ) }
Based on my understanding of the move syntax, $_.FullName is my original file, and $_.FullName -replace ... is the NEW filename. However, when I try to use $destPath.FullName -replace, I get an error that an empty filename is not legal. Obviously, PowerShell is not recognizing that as a valid pathname for the move command.
What am I missing?
Since you didn't show where $destPath is set, is it possible the variable was never defined at all? Unlikely, but it's worth ruling out.
Another way to make this work:
Rename child-items first at original location. Then move them.
get-childitem *.txt | rename-item -newname {$_.name -replace 'o','O'}
get-childitem *.txt | % {move-item -path $_.fullname -destination .\Test-files}
But I prefer your one liner:)
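For what it's worth, the original error almost certainly comes from $destPath being a plain string: a string has no FullName property, so $destPath.FullName evaluates to $null and the -replace produces an empty filename. A sketch of the one-liner targeting $destPath directly (assuming the destination folder already exists):

Get-ChildItem $sourcePath | ForEach-Object {
    # Strip the unwanted characters from the name, then move into $destPath
    $newName = $_.Name -replace '\[|\]|-|,|\(|\)', ''
    [System.IO.File]::Move($_.FullName, (Join-Path $destPath $newName))
}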

Exporting results from get-acl to a .csv

I am trying to get the following PowerShell script to output its results into a .csv. The .csv file is created, but there is no data. What am I doing wrong?
$rootfolder = Get-ChildItem -recurse -Path [PATH] | where {$_.Attributes -eq 'Directory'}
foreach ($userdir in $rootfolder) {
$userdir.FullName
get-acl $userdir.FullName | foreach {write-host "The owner is : " $_.Owner -Foregroundcolor Green } | export-csv c:\powershell\test.csv
Write-Host "`n"
}
You can't send the output of a foreach further in the pipeline in the way you're attempting; you need to export each item individually. Revised script (note that I've also amended your where-object to test the PSIsContainer property of each item returned by gci):
$rootfolder = Get-ChildItem -recurse -Path [path] | where {$_.PSIsContainer}
foreach ($userdir in $rootfolder) {
get-acl -path $userdir.FullName | export-csv c:\powershell\test.csv -NoTypeInformation -Append
}
However, I think you'll be disappointed with the results. get-acl produces a collection of objects, not a collection of basic strings. For example, if I run get-acl on my profile directory, I get this:
PS C:\Users\ME> get-acl .|select-object access|fl *
Access : {System.Security.AccessControl.FileSystemAccessRule, System.Security.AccessControl.FileSystemAccessRule,
System.Security.AccessControl.FileSystemAccessRule}
Running the above script, all I see in the Access column of the CSV is System.Security.AccessControl.AuthorizationRuleCollection - because that's what's produced, a collection. To get the full ACL, you will need to do additional processing to pull out each element of that collection. Repeat for the other fields that also return collections.
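A sketch of that additional processing (my own, not from the thread; [PSCustomObject] requires PowerShell 3.0+): expand each access rule into its own flat row before exporting:

Get-ChildItem -Recurse -Path [path] | Where-Object { $_.PSIsContainer } | ForEach-Object {
    $path = $_.FullName
    $acl = Get-Acl -Path $path
    # One row per access rule, with plain-string columns that survive Export-Csv
    foreach ($rule in $acl.Access) {
        [PSCustomObject]@{
            Path     = $path
            Owner    = $acl.Owner
            Identity = $rule.IdentityReference.ToString()
            Rights   = $rule.FileSystemRights.ToString()
            Type     = $rule.AccessControlType.ToString()
        }
    }
} | Export-Csv c:\powershell\test.csv -NoTypeInformation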

Powershell Memory Usage

I'm a bit of a noob to PowerShell, so please don't chastise me :-)
So I've got some rather large log files (600 MB) that I need to process. My script essentially pulls out the lines that contain "Message Received", then tokenises those lines and outputs a few of the tokens to an output file.
The logic of the script is fine (although I'm sure it could be more efficient), but the problem is that as I write lines to the output file and the file subsequently grows larger, the amount of memory that PowerShell utilises also increases, to the point of memory exhaustion.
Can anyone suggest how I can stop this occurring? I thought about breaking the log up into temporary files of, say, 10 MB each and processing those instead.
Here's my code; any help you guys could give would be fantastic :-)
Get-Date | Add-Content -Path d:\scripting\logparser\testoutput.txt
$a = Get-Content D:\scripting\logparser\importsample.txt
foreach($l in $a){
#$l | Select-String -Pattern "Message Received." | Add-Content -Path d:\scripting\logparser\testoutput.txt
if
(($l | Select-String -Pattern "Message Received." -Quiet) -eq "True")
{
#Add-Content -Path d:\scripting\logparser\testoutput.txt -value $l
$var1,$var2,$var3,$var4,$var5,$var6,$var7,$var8,$var9,$var10,$var11,$var12,$var13,$var14,$var15,$var16,$var17,$var18,$var19,$var20 = [regex]::split($l,'\s+')
Add-Content -Path d:\scripting\logparser\testoutput.txt -value $var1" "$var2" "$var3" "$var4" "$var16" "$var18
}
else
{}
}
Get-Date | Add-Content -Path d:\scripting\logparser\testoutput.txt
If you do everything in the pipe, only one object at a time (one line from the file in your case) needs to be in memory.
Get-Content $inputFile | Where-Object { $_ -match "Message Received" } |
foreach-object -process {
$fields = [regex]::split($_,'\s+') # An array is created
Add-Content -path $outputFile -value ([String]::Join(" ", $fields[0,1,2,3,15,17]))
}
The $fields[0,1,2,3,15,17] creates an array of the given indices of $fields.
This could also be done in a single pipeline using a calculated expression passed to Select-Object rather than ForEach-Object, but it would be less clear.
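A side note of my own (not from the thread): Add-Content reopens the output file for every line, so for 600 MB logs it is usually noticeably faster to let a single Set-Content at the end of the pipeline do all the writing:

Get-Content $inputFile |
    Where-Object { $_ -match "Message Received" } |
    ForEach-Object { ([regex]::Split($_, '\s+'))[0,1,2,3,15,17] -join ' ' } |
    Set-Content $outputFile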
A working PowerShell example:
$csvFile = "c:\test.txt"
$file_reader = [System.IO.File]::OpenText($csvFile)
$row = "";
while(($row = $file_reader.ReadLine()) -ne $null)
{
# do something with '$row'
Write-Host row: $row
}
$file_reader.Close()
You're effectively storing the entire log file in memory instead of accessing it sequentially, bit by bit.
Assuming that your log file has some internal delimiter for each entry (maybe a newline), you'd read in one entry at a time, never keeping more in memory than absolutely necessary.
You won't be able to rely on the built-in PowerShell stuff because it's, in effect, stupid.
You'll have to forgive my code sample; my PowerShell is a bit rusty.
$reader = New-Object System.IO.StreamReader "testoutput.txt"
$s = ""
while (($s = $reader.ReadLine()) -ne $null)
{
    # do something with '$s',
    # which would contain individual log entries
}
$reader.Close()
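One small hardening worth adding (my note, not from the thread): if anything throws inside the loop, the file handle stays open, so wrapping the read loop in try/finally is the safer pattern:

$reader = New-Object System.IO.StreamReader "testoutput.txt"
try {
    while (($s = $reader.ReadLine()) -ne $null) {
        # process '$s' here
    }
}
finally {
    # Runs even if the loop throws, so the file handle is always released
    $reader.Close()
}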
