I need help with a script that monitors the upload bandwidth.
The script is meant to deal with OBS crashes where OBS simply stops sending data. I have already run it on several streaming PCs without problems, but on a new streaming PC it just won't run.
Here is the script for now:
$threshold = 6000
$timer = new-timespan -Seconds 10
$wmi = 0
$count = 0
$clock = [diagnostics.stopwatch]::StartNew()
while ($clock.elapsed -lt $timer){
    $wmi += (Get-WMIObject -Class Win32_PerfFormattedData_Tcpip_NetworkInterface | Select-Object BytesSentPerSec).BytesSentPerSec
    $count++
}
$kbytes = ($wmi/$count) / 1kb * 8 #to kbit/s
if($kbytes -le $threshold)
{
    Write-Output $kbytes
    Write-Host "Neustart"
    taskkill /f /im obs64.exe
    timeout /t 5
    Start-Process -FilePath "obs64.exe" -WorkingDirectory "C:\Program Files\obs-studio\bin\64bit\" --startstreaming
}
else
{
    Write-Output $kbytes
    Write-Host "Alles OK"
}
The error I get is this:
Method invocation failed because [System.Object[]] does not contain a method named "op_Addition".
At C:\Users\stream1\Desktop\watchdog.ps1:8 char:5
+ $wmi += (Get-WMIObject -Class Win32_PerfFormattedData_Tcpip_Netwo ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (op_Addition:String) [], RuntimeException
+ FullyQualifiedErrorId : MethodNotFound
The script was written for me by someone else; unfortunately, I have no experience with PowerShell programming myself, so I would be very grateful for any help.
Thanks a lot and greetings,
Thomas
"$wmi = 0" --> "$wmi = #()"
Error:
Method invocation failed because [System.Object[]] does not contain a method named "op_Division".
At C:\Users\stream1\Desktop\watchdog.ps1:12 char:1
+ $kbytes = ($wmi/$count) / 1kb * 8 #to kbit/s
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (op_Division:String) [], RuntimeException
+ FullyQualifiedErrorId : MethodNotFound
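A likely cause is that the new streaming PC has more than one network adapter: Get-WMIObject then returns one object per adapter, so BytesSentPerSec is an array, and adding it to a number or dividing by $count fails with exactly these op_Addition/op_Division errors. A minimal sketch of one way around it, summing the per-adapter values with Measure-Object (the threshold and the OBS restart logic from the script above are assumed unchanged):
$threshold = 6000
$timer = New-TimeSpan -Seconds 10
$wmi = 0
$count = 0
$clock = [Diagnostics.Stopwatch]::StartNew()
while ($clock.Elapsed -lt $timer) {
    # Measure-Object -Sum collapses the per-adapter values into a single number
    $wmi += (Get-WmiObject -Class Win32_PerfFormattedData_Tcpip_NetworkInterface |
        Measure-Object -Property BytesSentPerSec -Sum).Sum
    $count++
}
$kbit = ($wmi / $count) / 1KB * 8   # average bytes/s converted to kbit/s
The if/else block comparing against $threshold and restarting OBS can then stay exactly as it is, with $kbit in place of $kbytes.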
I am trying to migrate from SpecFlow Excel to using a JSON data file and the External Data plugin.
Related to "Specflow - using data from external file in feature file": is there a walkthrough for setting up a feature file and a JSON data file with the new #property approach to the data?
I have a Specflow feature like this, which originally came from Specflow Excel.
Feature: EndToEndId
Scenario Outline: Single Payment
Given a client <clientNo> called <cName>
And a broker <bName> with book ref <bRef>
When a deal to <buyOrSell> an amount <fromAmt> of <ccy1> for <ccy2> for value <vDate> at client rate <cRate> and bank rate <bRate>
Then create a deal <dealNo> with client amt <toCAmt> and cover amt <coverAmt> and PnL <PnL> and profit rate <pRate> and <bYes>
And add beneficiary <benName> and country <ctry> and pay type <payType> and charge <charge> and <abYes>
When a credit file <c1> with <c1Name> for <c1Amt> <c1Ccy> for value <c1Date> with ref <EndToEndId1>
Then CreditNotice gets <fIn1Type> for client <fIn1Client>
Examples:
| case | clientNo | cName | bName | bRef | buyOrSell | fromAmt | ccy1 | ccy2 | vDate | cRate | bRate | dealNo | toCAmt | coverAmt | PnL | pRate | bYes | benName | ctry | payType | charge | abYes | c1 | c1Name | c1Amt | c1Ccy | c1Date | EndToEndId1 | fIn1Type | fIn1Client |
| T1: 99549 ###### | 99549 | Gherkin Test | MERCURY | 01W3RG5638 | SELL | 100000 | EUR | GBP | SP | 0.89435 | 0.89935 | ###### | 89435 | 89935 | 500 | 1 | yes | Gherkin Ben | GB | CHAPS | NONE | yes | 99549 | Gherkin Test | 100000 | EUR | SP | 99549 ###### | FullFundsIn | Gherkin Test |
To use this with a JSON file for the data, am I going to need to define each property again like this:
#property:clientNo=clientNo
#property:cName=cName
#property:bName=bName
and a JSON data file like this:
[
{
"case": "T-9: 99549 ######",
"clientNo": "99549",
"cName": "Gherkin Test",
"bName": "MERCURY",
"bRef": "01W3RG5638",
"buyOrSell": "SELL",
"fromAmt": "100,000",
"ccy1": "EUR",
"ccy2": "GBP",
"vDate": "SP",
"cRate": " 0.894350 ",
"bRate": " 0.894350 ",
"dealNo": "######",
"toCAmt": "89,435.00",
"coverAmt": "89,435.00",
"PnL": "0.00",
"pRate": " 1.000000 ",
"bYes": " yes ",
"benName": "Gherkin Ben",
"ctry": "GB",
"payType": "CHAPS",
"charge": "NONE",
"abYes": "yes",
"c1": "99549",
"c1Name": "Gherkin Test",
"c1Amt": "100,000",
"c1Ccy": "EUR",
"c1Date": "SP",
"EndToEndId1": "99549 ######",
"fIn1Type": "FullFundsIn",
"fIn1Client": "Gherkin Test"
},
{
"case": "T-8: 1234 1234",
"clientNo": "99549",
"cName": "Gherkin Test",
"bName": "JUPITER",
"bRef": "01W3RG5639",
"buyOrSell": "SELL",
"fromAmt": "200,000",
"ccy1": "EUR",
"ccy2": "GBP",
"vDate": "SP",
"cRate": " 0.894350 ",
"bRate": " 0.894350 ",
"dealNo": "######",
"toCAmt": "178,870.00",
"coverAmt": "178,870.00",
"PnL": "0.00",
"pRate": " 1.000000 ",
"bYes": " yes ",
"benName": "Gherkin Ben",
"ctry": "GB",
"payType": "FASTER",
"charge": "NONE",
"abYes": "yes",
"c1": "99549",
"c1Name": "Gherkin Test",
"c1Amt": "200,000",
"c1Ccy": "EUR",
"c1Date": "SP",
"EndToEndId1": "1234 1234",
"fIn1Type": "BankRec",
"fIn1Client": "Gherkin Test"
}
]
Thanks for some pointers on how to get the JSON data into the SpecFlow feature file.
The SpecFlow.ExternalData plugin is currently limited to a single parameter.
Please upvote the feature request at https://support.specflow.org/hc/en-us/community/posts/360015106078-Allow-multiple-parameters-to-be-used-with-External-Data-plugin to raise its priority in our backlog.
There is a separate plugin called SpecFlow.Contrib.JsonData, available as a NuGet package, which accepts JSON data as input for SpecFlow scenarios. It is an extension of the SpecFlow.ExternalData plugin and also accepts multiple properties. Check out https://libraries.io/nuget/SpecFlow.Contrib.JsonData.
long time listener, first time caller,
I need to create a PowerShell script (version 2 or lower) that:
-continually monitors one specific directory for new/changed files
-logs the file that was created with a date/time stamp in a log file that's:
--created on a daily basis with the name of "log Date/Time.txt"
-renames the file, appending the date/time
-logs that it renamed it
-maps a drive with a specific username/password combo
-moves it from dirA to dirB (the mapped drive is dirB)
-logs that it moved it
-unmaps the drive
-if for whatever reason it stopped running and we start it back up, it'll rename, map the drive, move, unmap the drive, and log all files from dirA to dirB
In its current form, I've stripped the drive mapping out so I can troubleshoot moving the file on a local drive without also having to troubleshoot a network drive.
I've been staring at this for over a week and am tired of hitting my head against the desk. Can someone PLEASE put me out of my misery and let me know what I've done wrong?
THANK YOU in advance!
I've honestly tried SO many combos and different routines that I don't even know what else to put here.
In the box below, I've put the main part of the script that isn't working correctly. It renames the files as they come in, but doesn't move them.
$rename = $_.Name.Split(".")[0] + "_" + ($_.CreationTime | Get-Date -Format MM.dd.yyy) + "_" + ($_.CreationTime | Get-Date -Format hh.mm.ss) + ".log"
Write-Output "File: '$name' exists at: $source - renaming existing file first" >> $scriptlog
Rename-Item $_ -NewName $rename
Wait-Event -Timeout 3
Move-Item "$_($_.Directory)$name" -destination $destination
Write-Output "File: '$name' moved to $destination on $date" >> $scriptlog
Whole code available below:
#Log Rename/Move script
$userName = "copyuser"
$newpass = Read-Host -Prompt 'Type the new Password'
$password = ConvertTo-SecureString -String $newpass -AsPlainText -Force
$PathToMonitor = "C:\Users\Administrator\Desktop\FolderA"
$destination = "C:\Users\Administrator\Desktop\FolderB"
$scriptlog = "C:\Users\Administrator\Desktop\ScriptLogs\" + [datetime]::Today.ToString('MM-dd-yyy') + "_TransferLog.txt"
$FileSystemWatcher = New-Object System.IO.FileSystemWatcher
$FileSystemWatcher.Path = $PathToMonitor
$FileSystemWatcher.IncludeSubdirectories = $false
$FileSystemWatcher.EnableRaisingEvents = $true
$dateTime = [datetime]::Today.ToString('MM-dd-yyy') + " " + [datetime]::Now.ToString('HH:mm:ss')
Write-Output "*******************************************************************************************" >> $scriptLog
Write-Output "*********************Starting Log Move Script $dateTime**********************" >> $scriptLog
Write-Output "*******************************************************************************************" >> $scriptLog
$Action = {
$details = $event.SourceEventArgs
$Name = $details.Name
$FullPath = $details.FullPath
$OldFullPath = $details.OldFullPath
$OldName = $details.OldName
$ChangeType = $details.ChangeType
$Timestamp = $event.TimeGenerated
$text = "{0} was {1} at {2}" -f $FullPath, $ChangeType, $Timestamp
Write-Output "" >> $scriptlog
Write-Output $text >> $scriptlog
switch ($ChangeType)
{
'Changed' { "CHANGE"
Get-ChildItem -path $FullPath -Include *.log | % {
$rename = $_.Name.Split(".")[0] + "_" + ($_.CreationTime | Get-Date -Format MM.dd.yyy) + "_" + ($_.CreationTime | Get-Date -Format hh.mm.ss) + ".log"
Write-Output "File: '$name' exists at: $source - renaming existing file first" >> $scriptlog
Rename-Item $_ -NewName $rename
Wait-Event -Timeout 3
Move-Item "$_($_.Directory)$name" -destination $destination
Write-Output "File: '$name' moved to $destination on $date" >> $scriptlog
}
}
'Created' { "CREATED"
Get-ChildItem -path $FullPath -Include *.log | % {
$rename = $_.Name.Split(".")[0] + "_" + ($_.CreationTime | Get-Date -Format MM.dd.yyy) + "_" + ($_.CreationTime | Get-Date -Format hh.mm.ss) + ".log"
Write-Output "File: '$name' exists at: $source - renaming existing file first" >> $scriptlog
Rename-Item $_ -NewName $rename
Wait-Event -Timeout 3
Move-Item "$($_.Directory)$rename" -Destination $destination
Write-Output "File: '$name' moved to $destination on $date" >> $scriptlog
}
}
'Deleted' { "DELETED"
}
'Renamed' {
}
default { Write-Output $_ >> $scriptlog}
}
}
$handlers = . {
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Changed -Action $Action -SourceIdentifier FSChange
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Created -Action $Action -SourceIdentifier FSCreate
}
Write-Output "Watching for changes to $PathToMonitor" >> $scriptlog
try
{
do
{
Wait-Event -Timeout 1
Write-host "." -NoNewline
} while ($true)
}
finally
{
# EndScript
Unregister-Event -SourceIdentifier FSChange
Unregister-Event -SourceIdentifier FSCreate
$handlers | Remove-Job
$FileSystemWatcher.EnableRaisingEvents = $false
$FileSystemWatcher.Dispose()
write-output "Event Handler disabled." >> $scriptlog
}
Well, I found my issue. A dumb one, looking back at it: it was how I was building the path to the file(s) to move after renaming them.
Move-Item "$_($_.Directory)$name" -destination $destination
The issue in the above code is the "$_($_.Directory)" part. It needs to be:
Move-Item -path $PathToMonitor$name -destination $destination
Other things may work, and may be better, but at least that fixes the moving-after-renaming issue I was having.
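For reference, a variant of that line that builds the path with Join-Path and moves the renamed file rather than the original name (just a sketch, not tested against the watcher above):
# Sketch: move the renamed file; Join-Path takes care of the path separator
Move-Item -Path (Join-Path -Path $PathToMonitor -ChildPath $rename) -Destination $destination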
Now on to adding the drive mapping and the other stuff that's needed to fully finish this.
If I think about it, I'll post my entire script when it's done, for anyone else it might help in the future.
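For the drive-mapping step, a minimal sketch that should work on PowerShell v2 by shelling out to net use. The share \\server\share and drive letter Z: are placeholders, $userName and $newpass are the variables already defined at the top of the script, and the Move-Item line assumes it runs inside the action block where $rename is set:
# Sketch only: map the share, move the renamed file, then unmap
net use Z: "\\server\share" $newpass /user:$userName
Move-Item -Path (Join-Path -Path $PathToMonitor -ChildPath $rename) -Destination "Z:\"
net use Z: /delete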
I have an action that receives a file correctly and saves it to a destination folder without problems.
When the destination folder already has a file with the same name, the transferTo method first deletes the existing file and then copies the new one (http://docs.spring.io/spring/docs/current/javadoc-api/org/springframework/web/multipart/commons/CommonsMultipartFile.html#transferTo-java.io.File-).
But if the destination file exists, Grails throws the error below: the existing file is deleted, but the uploaded one is not copied.
I'm working on WinXP, so I don't think it is a permissions issue (the file does get deleted, so I guess it has nothing to do with permissions).
| Error 2015-05-24 23:47:58,199 [http-bio-8090-exec-3] ERROR errors.GrailsExceptionResolver - FileNotFoundException occurred when processing request: [POST] /ehr/test/upload - parameters:
doit: upload
overwrite: true
SYNCHRONIZER_TOKEN: 8deaf46b-b6ff-4362-ac70-7223f37ae806
SYNCHRONIZER_URI: /ehr/test/upload
opts\Signos.opt (Access is denied). Stacktrace follows:
Message: opts\Signos.opt (Access is denied)
Line | Method
->> 221 | <init> in java.io.FileOutputStream
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 171 | <init> in ''
| 417 | write . . in org.apache.commons.fileupload.disk.DiskFileItem
| 85 | upload in test.TestController
| 198 | doFilter in grails.plugin.cache.web.filter.PageFragmentCachingFilter
| 63 | doFilter in grails.plugin.cache.web.filter.AbstractFilter
| 1145 | runWorker in java.util.concurrent.ThreadPoolExecutor
| 615 | run in java.util.concurrent.ThreadPoolExecutor$Worker
^ 744 | run . . . in java.lang.Thread
The upload action looks like:
def upload(boolean overwrite)
{
if (params.doit)
{
def errors = []
def f = request.getFile('opt')
def xml = new String( f.getBytes() )
def destination = config.opt_repo + f.getOriginalFilename()
File fileDest = new File( destination )
if (!overwrite && fileDest.exists())
{
errors << "The OPT already exists, do you want to overwrite?"
return [errors: errors, ask_overwrite: true]
}
// Some validation logic here ...
if (errors.size() == 0)
{
// http://docs.spring.io/spring/docs/current/javadoc-api/org/springframework/web/multipart/commons/CommonsMultipartFile.html#transferTo-java.io.File-
// If the file exists, it will be deleted first
f.transferTo(fileDest)
}
}
}
I have put together the following script, based on what I found on Google, for backing up the SSRS encryption keys:
cls
$pwd = "sa#123#123"
$SSRSClass = Get-Wmiobject -namespace "root\microsoft\sqlserver\reportserver\rs_BPSSRS\v10\admin" -class "MSReportServer_ConfigurationSetting"
$key = $SSRSClass.BackupEncryptionKey($pwd)
$stream = [System.IO.File]::Create("c:\\SSRS.snk", $key.KeyFile.Length)
$stream.Write($key.KeyFile, 0, $key.KeyFile.Length)
$stream.Close()
But I'm getting the following errors:
Method invocation failed because [System.Object[]] doesn't contain a method named 'BackupEncryptionKey'.
At line:5 char:38
+ $key = $SSRSClass.BackupEncryptionKey <<<< ($results)
+ CategoryInfo : InvalidOperation: (BackupEncryptionKey:String) [], RuntimeException
+ FullyQualifiedErrorId : MethodNotFound
Exception calling "Create" with "2" argument(s): "Positive number required.
Parameter name: bufferSize"
At line:6 char:35
+ $stream = [System.IO.File]::Create <<<< ("c:\\SSRS.snk", $key.KeyFile.Length)
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
You cannot call a method on a null-valued expression.
At line:7 char:14
+ $stream.Write <<<< ($key.KeyFile, 0, $key.KeyFile.Length)
+ CategoryInfo : InvalidOperation: (Write:String) [], RuntimeException
+ FullyQualifiedErrorId : InvokeMethodOnNull
You cannot call a method on a null-valued expression.
At line:8 char:14
+ $stream.Close <<<< ()
+ CategoryInfo : InvalidOperation: (Close:String) [], RuntimeException
+ FullyQualifiedErrorId : InvokeMethodOnNull
I'm using PowerShell v2. I tried to find out more about this but had no luck. There are around 50+ SSRS servers in our environment, and doing the backup manually is tiresome, hence we came up with this automation. Kindly provide your comments.
Thanks
This piece of code should do the trick:
$ComputerName = "."
$KeyFolder = "C:\Temp"
$KeyPassword = "sa#123#123"
$TimeStamp = Get-Date -Format "-yyyyMMdd-HHmmss"
Get-WmiObject -Namespace "Root\Microsoft\SqlServer\ReportServer" -Class "__Namespace" -ComputerName $ComputerName |
    Select-Object -ExpandProperty Name |
    ForEach-Object {
        $NameSpaceRS = $_
        $InstanceName = $NameSpaceRS.SubString(3)
        $KeyFileName = Join-Path -Path $KeyFolder -ChildPath ($InstanceName + $Timestamp + ".snk")
        "Found Reporting Services in instance '$($InstanceName)' on $($ComputerName); will save key to '$($KeyFileName)' ..."
        $SQLVersion = (Get-WmiObject -Namespace "Root\Microsoft\SqlServer\ReportServer\$($NameSpaceRS)" -Class "__Namespace" -ComputerName $ComputerName).Name
        $SSRSClass = Get-WmiObject -Namespace "Root\Microsoft\SqlServer\ReportServer\$($NameSpaceRS)\$($SQLVersion)\Admin" -Query "SELECT * FROM MSReportServer_ConfigurationSetting WHERE InstanceName='$($InstanceName)'" -ComputerName $ComputerName
        $Key = $SSRSClass.BackupEncryptionKey($KeyPassword)
        If ($Key.HRESULT -ne 0) {
            $Key.ExtendedErrors -join "`r`n" | Write-Error
        } Else {
            $Stream = [System.IO.File]::Create($KeyFileName, $Key.KeyFile.Length)
            $Stream.Write($Key.KeyFile, 0, $Key.KeyFile.Length)
            $Stream.Close()
        }
    }
The errors you mentioned occur because your Get-WmiObject call returns more than one MSReportServer_ConfigurationSetting entry, so $SSRSClass ends up as an array ([System.Object[]]) and the BackupEncryptionKey call fails. Meaning, if you try your original code on a server where only one named instance of SSRS is running, then I suppose it will succeed. Just try this piece and give your comments.
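A quick way to confirm this (just a sketch) is to count what the query returned before calling the method:
# If this prints more than 1, $SSRSClass is an array and the method call will fail
($SSRSClass | Measure-Object).Count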
CHEERS.
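To cover the 50+ servers, one option is a sketch like the following, assuming the block above is saved as a script (say Backup-SsrsKey.ps1) with $ComputerName, $KeyFolder and $KeyPassword turned into parameters, and a hypothetical servers.txt listing one host name per line:
# Sketch: run the parameterized backup script once per server listed in servers.txt
foreach ($server in (Get-Content "C:\Temp\servers.txt")) {
    & "C:\Scripts\Backup-SsrsKey.ps1" -ComputerName $server -KeyFolder "C:\Temp" -KeyPassword "sa#123#123"
}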