Powershell: Must provide a value expression on the right-hand side of the '-' operator - powershell-2.0

#GET TEXT FILE WITH LIST OF "SAMACCOUNTNAME" TO LIST VARIABLE
$list = Get-Content C:\PSSCripts\listofusers.txt
#PULL INFORMATION FROM ACTIVE DIRECTORY TO USERRESULTS VARIABLE
$UserResults = Get-AdUser -filter * -searchbase "OU=THISOU,DC=THISDOMAIN,DC=int" -Properties displayname
#DETERMINE IF USER IS IN THE TXT LIST
foreach ($user in $UserResults)
{
if ($user.SamAccountName -in $list.SamAccountName)
{
#ECHO THEIR NAME TO VERIFY
write-host $user.displayName
}
}
#VERIFY USER TO BE OFFBOARDED VIA Y/N PROMPT - VISUALLY INSPECT LIST
$choice = ""
while ($choice -notmatch "[y|n]"){
$choice = read-host "The following user profiles have been loaded for offboarding. Do you want to continue? Please Verify the users before continuing. (Y/N)"
}
if ($choice -eq "y"){
# LOOP THROUGH USERS AND APPLY CHANGES
foreach ($user in $UserResults)
{
#DETERMINE IF USER IS IN TXT FILE
if ($user.SamAccountName -in $list.SamAccountName)
{
# DISABLE ACCOUNT
Disable-ADAccount -Identity $user
# CHANGE DISPLAYNAME AND DESCRIPTION TO DISPLAY TERMINATED - $USER
$newname = "Terminated - " + $user.displayName
Get-ADUser -Identity $user | Set-ADObject -Description $newname -DisplayName $newname
# CHANGE USER PASSWORD TO "Password1"
$password = "Password1" | ConvertTo-SecureString -AsPlainText -Force
Set-ADAccountPassword -NewPassword $password -Identity $user -Reset
# MOVE USER TO DIFFERENT LOCATION, Disabled Users organizational unit
Move-ADObject -Identity $user -TargetPath "OU=DisabledUsers,DC=THATDOMAIN,DC=int" -Confirm:$false
}
}
}
else {write-host "Script aborted!"}
Getting the following error:
You must provide a value expression on the right-hand side of the '-' operator. At line:11 char:29
if ($user.SamAccountName - <<<< in $list.SamAccountName)
CategoryInfo : ParserError (:) [], ParseException
FullyQualifiedErrorId : ExpectedValueExpression
I have a list of users in a text file with the header SAMACCOUNTNAME. These users are checked against the users in a particular OU. PowerShell echoes the users from my text list (after checking them against all the users in that OU in AD, to verify nothing is being offboarded or changed in error), then prompts to verify (y/n) before moving forward and executing a script I wrote earlier with the help of some redditors from /r/powershell.
I'm not understanding why I'm getting this error, is
-in $list.SamAccountName
Not correct?
Thanks for the help, stackoverflow! First time posting, looking forward to getting better with Powershell and giving back to the community.

The -in operator was introduced in PowerShell 3.0, so the PowerShell 2.0 parser rejects it, which is exactly the ParserError you see. Use "-contains" (array on the left, scalar on the right) or "-eq" instead, depending on which side is an array in your program. Note also that $list comes from Get-Content, so it is an array of plain strings and has no SamAccountName property.
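A minimal sketch of the same check written for PowerShell 2.0 (assuming the text file holds one SamAccountName per line under the header):
# PowerShell 2.0-compatible membership test: put the array on the left and use
# -contains instead of -in. The header line never matches a real account name,
# so it does not need special handling.
$list = Get-Content C:\PSSCripts\listofusers.txt
foreach ($user in $UserResults)
{
    if ($list -contains $user.SamAccountName)
    {
        Write-Host $user.DisplayName
    }
}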

Related

Find user by imported display name via powershell

I've checked code and examples from identical questions on this site, but I'm baffled, because I'm utilizing nearly the exact same code from Get-ADUser with display name as a value and it just pulls empty results for me. I don't know how to continue a conversation on an existing question, so I figured I had to make my own.
Here is what I have:
import-module activedirectory
$csv = Get-Content users.csv
foreach ($user in $csv) {
$user = $user.trim()
write-host $user #A line for testing to show me what name is being analyzed
$samName = Get-ADUser -Filter{displayName -like "$user*"} | select samaccountname
write-host $samName
}
cmd /c pause | out-null
The output I get is:
John Doe
Jane Doe
Billy Ray
But if I change
$samName = Get-ADUser -Filter{displayName -like "$user*"} | select samaccountname
to:
$samName = Get-ADUser -Filter{displayName -like "John Doe*"} | select samaccountname
It will correctly return John Doe's samaccountname.
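A common workaround (an untested sketch, not from the original question) is to pass -Filter as a string so $user actually expands, and to use -ExpandProperty so $samName holds the bare account name instead of a wrapper object:
# Sketch: string filters expand $user reliably, unlike script-block filters in
# some AD module versions; -ExpandProperty returns the raw SamAccountName string.
Import-Module ActiveDirectory
$csv = Get-Content users.csv
foreach ($user in $csv) {
    $user = $user.Trim()
    Write-Host $user
    $samName = Get-ADUser -Filter "displayName -like '$user*'" |
        Select-Object -ExpandProperty SamAccountName
    Write-Host $samName
}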

Remove-ADPrincipalGroupMembership script is not removing user

I have the following script that runs without error but it doesn't remove the user from the group. I am not sure what I am missing. I have tried running this without the username variable and with an actual name but that did not work either. Thanks in advance for the help.
$group = @(
'Aberdeen Refer Team','Air Force Refer Team','Coliseum Refer Team','Denbigh Refer Team','Warwick Refer Team','Wards Corner Refer Team',
'Eagle HarborRefer Team','Chesapeake Refer Team','Willow Oaks Refer Team','poquoson Refer Team','Oyster Point Refer Team','NASA ReferTeam',
"Contact Center Refer Team",'Yorktown Refer Team','WB New Town Refer Team','WB City Refer Team','Stoneybrook Refer Team',
'Hayes Refer Team','Hilltop Refer Team')
$user = Get-aduser "%username%" -Properties MemberOf
if ($user.MemberOf -match $group)
{
foreach ($group in $user)
{
Remove-ADPrincipalGroupMembership -identity $user -MemberOf $group -confirm:$False
}
}
Replace %username% with $env:USERNAME to get the username, and then find all the user's groups with Get-ADPrincipalGroupMembership:
$TeamNames = @('Aberdeen Refer Team','Hilltop Refer Team')
$User = Get-ADUser "$($env:USERNAME)"
# Find all the groups that the user is a member of, and filter down to just those in the $TeamNames list
$UserGroups = Get-ADPrincipalGroupMembership -Identity $user | Where-Object { $TeamNames -contains $_.Name }
# Iterate over the groups
foreach ($Group in $UserGroups)
{
# Remove each group membership from the user
Remove-ADPrincipalGroupMembership -Identity $User -MemberOf $Group -Confirm:$false
}

Exporting Windows Event Logs to CSV - Powershell 2.0

I have a requirement to export Windows Event logs to CSV from our production environment periodically.
I have a simple XML Config file containing a list of machines I need the events from, and a list of Event Ids that I need to retrieve.
From here I'm looping through each machine name in turn, and then each event Id to retrieve the logs and then export to CSV. I'd like one CSV per machine per execution.
Once I've worked out all my variables the PS Command is quite simple to retrieve the log for one Event Id
foreach ($machine in $config.Configuration.Machines.Machine)
{
$csvname=$outputlocation + $machine.Value + "_" + $datestring + ".csv"
foreach ($eventid in $config.Configuration.EventIds.EventId)
{
Get-WinEvent -ComputerName $machine.Value -ErrorAction SilentlyContinue -FilterHashTable @{Logname='Security';ID=$eventid.Value} | where {$_.TimeCreated -gt $lastexecutiondate} | export-csv -NoClobber -append $csvname
}
}
Except I'm unable to append to a CSV each time; PS 2.0 apparently does not support this. I've tried extracting all Event Ids at once, but this seems a bit long-winded and may not allow use of a config file, and I'm fairly new to PowerShell so I haven't had much luck.
I also need to specify multiple LogNames (System, Security and Application), and would prefer to run one statement as opposed to the same statement 3 times and appending, but I'm unsure of how to do this.
Unfortunately at this point Google has me running in circles.
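One way around both issues (a hedged sketch, not from the original thread, reusing the question's $config, $outputlocation, $datestring and $lastexecutiondate variables) is to collect everything for a machine in one query and export once, since Get-WinEvent's -FilterHashtable accepts arrays of log names and event IDs:
# Sketch: one Get-WinEvent call per machine covering all log names and IDs,
# then a single Export-Csv, which sidesteps the missing -Append switch in PS 2.0.
$logNames = 'System', 'Security', 'Application'
foreach ($machine in $config.Configuration.Machines.Machine)
{
    $csvname = $outputlocation + $machine.Value + "_" + $datestring + ".csv"
    $eventIds = $config.Configuration.EventIds.EventId | ForEach-Object { [int]$_.Value }
    Get-WinEvent -ComputerName $machine.Value -ErrorAction SilentlyContinue `
        -FilterHashtable @{ LogName = $logNames; ID = $eventIds } |
        Where-Object { $_.TimeCreated -gt $lastexecutiondate } |
        Export-Csv -Path $csvname -NoTypeInformation -NoClobber
}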
The following is something I culled together to allow me to export the prior 24 hours of events for select event logs - I'm going to create a scheduled task out of it so it pulls a daily.
Hope this helps someone else...
$eventLogNames = "Application", "Security", "System", "Windows PowerShell"
$startDate = Get-Date
$startDate = $startDate.addDays(-1).addMinutes(-15)
function GetMilliseconds($date)
{
$ts = New-TimeSpan -Start $date -End (Get-Date)
[math]::Round($ts.TotalMilliseconds)
}
$serverName = get-content env:computername
$serverIP = gwmi Win32_NetworkAdapterConfiguration |
Where { $_.IPAddress } | # filter the objects where an address actually exists
Select -Expand IPAddress | # retrieve only the property *value*
Where { $_ -notlike '*:*' }
$fileNameDate = Get-Date -format yyyyMMddhhmm
$endDate = Get-Date
$startTime = GetMilliseconds($startDate)
$endTime = GetMilliseconds($endDate)
foreach ($eventLogName in $eventLogNames)
{
Write-Host "Processing Log: " $eventLogName
<# - Remove comment to create csv version of log files
$csvFile = $fileNameDate + "_" + $serverIP +"_" + $eventLogName + ".csv"
Write-Host "Creating CSV Log: " $csvFile
Get-EventLog -LogName $eventLogName -ComputerName $serverName -After $startDate -ErrorAction SilentlyContinue | Sort MachineName, TimeWritten | Select MachineName, Source, TimeWritten, EventID, EntryType, Message | Export-CSV $csvFile #ConvertTo-CSV #Format-Table -Wrap -Property Source, TimeWritten, EventID, EntryType, Message -Autosize -NoTypeInformation
#>
$evtxFile = $fileNameDate + "_" + $serverIP + "_" + $eventLogName + ".evtx"
Write-Host "Creating EVTX Log: " $evtxFile
wevtutil epl $eventLogName $evtxFile /q:"*[System[TimeCreated[timediff(@SystemTime) >= $endTime] and TimeCreated[timediff(@SystemTime) <= $startTime]]]"
}
Why do I get "Failed to export log Security. The specified query is invalid."? I get this for each type of event log (System, Application, etc.). It happens only for the evtx export; I do get the csv file, though.

Excluding author from peer reviewer list in gerrit

I'm setting up access control for my company in Gerrit, and our current internal process has cross-over between peer reviewers and coders (they tend to be the same group of people). We also want to require only one reviewer to peer review the code and submit it if it looks good.
With the default setup any user with the +2: Looks good to me, approved option can peer review their own code.
Is there any way to prevent the author from reviewing their own code, but still allow them to fully review other's code? I haven't been able to find any kind of exclude author in the access control group setup or permissions setups.
The Gerrit Cookbook Example 8 does not strictly prevent the author from reviewing his/her own change, but it does require someone else to +2 it before it can be submitted.
This is working for me, but it's a quick hack:
allows a configurable number of +1s to count as a +2 for manual submission
optionally automatically submit with enough +1 votes
optionally counts -1 votes as countering +1 votes for the purposes of the tally
optionally ignores the uploader's own +1 (you may prefer a check against the author, which I've not done)
I've tweaked my earlier answer so it doesn't assume you're using a mysql server.
You might want to move the logfile somewhere it'll be subject to any normal log rotation - perhaps in ../logs/comment-added.log.
I've tried to pull the configurable bits to the fore. Call this file comment-hook and
put it in $gerrit_root/hooks, chmod it 755 or similar. Set up a robot user in the admin
group so the hook can use the sql interface (and comment +2 on things with enough +1s).
#!/usr/bin/perl
#
# comment-hook for a +2 approval from a simple quorum of +1 votes.
#
# Licence: Public domain. All risk is yours; if it breaks, you get to keep both pieces.
$QUORUM = 2; # Total number of +1 votes causing a +2
$PLEBIANS = 'abs(value) < 2'; # or 'value = 1' to ignore -1 unvotes
$AUTO_SUBMIT_ON_QUORACY = '--submit'; # or '' for none
$AND_IGNORE_UPLOADER = 'and uploader_account_id != account_id'; # or '' to let uploaders votes count
$GERRIT_SSH_PORT = 29418;
$SSH_PRIVATE_KEY = '/home/gerrit2/.ssh/id_rsa';
$SSH_USER_IN_ADMIN_GROUP = 'devuser';
# Hopefully you shouldn't need to venture past here.
$SSH = "ssh -i $SSH_PRIVATE_KEY -p $GERRIT_SSH_PORT $SSH_USER_IN_ADMIN_GROUP\@localhost";
$LOG = "/home/gerrit2/hooks/log.comment-added";
open LOG, ">>$LOG" or die;
sub count_of_relevant_votes {
# Total selected code review votes for this commit
my $relevance = shift;
$query = "
select sum(value) from patch_sets, patch_set_approvals
where patch_sets.change_id = patch_set_approvals.change_id
and patch_sets.patch_set_id = patch_set_approvals.patch_set_id
and revision = '$V{commit}'
and category_id = 'CRVW'
and $relevance
$AND_IGNORE_UPLOADER
;";
$command = "$SSH \"gerrit gsql -c \\\"$query\\\"\"";
#print LOG "FOR... $command\n";
@lines = qx($command);
chomp @lines;
#print LOG "GOT... ", join("//", @lines), "\n";
# 0=headers 1=separators 2=data 3=count and timing.
return $lines[2];
}
sub response {
my $review = shift;
return "$SSH 'gerrit review --project=\"$V{project}\" $review $V{commit}'";
}
# ######################
# Parse options
$key='';
while ( $_ = shift @ARGV ) {
if (/^--(.*)/) {
$key = $1;
}
else {
$V{$key} .= " " if exists $V{$key};
$V{$key} .= $_;
}
}
#print LOG join("\n", map { "$_ = '$V{$_}'" } keys %V), "\n";
# ######################
# Ignore my own comments
$GATEKEEPER="::GATEKEEPER::";
if ($V{comment} =~ /$GATEKEEPER/) {
# print LOG localtime() . "$V{commit}: Ignore $GATEKEEPER comments\n";
exit 0;
}
# ######################
# Forbear to analyse anything already +2'd
$submittable = count_of_relevant_votes('value = 2');
if ($submittable > 0) {
# print LOG "$V{commit} Already +2'd by someone or something.\n";
exit 0;
}
# ######################
# Look for a consensus amongst qualified voters.
$plebicite = count_of_relevant_votes($PLEBIANS);
#if ($V{comment} =~ /TEST:(\d)/) {
# $plebicite=$1;
#}
# ######################
# If there's a quorum, approve and submit.
if ( $plebicite >= $QUORUM ) {
$and_submitting = ($AUTO_SUBMIT_ON_QUORACY ? " and submitting" : "");
$review = " --code-review=+2 --message=\"$GATEKEEPER approving$and_submitting due to $plebicite total eligible votes\" $AUTO_SUBMIT_ON_QUORACY";
}
else {
$review = " --code-review=0 --message=\"$GATEKEEPER ignoring $plebicite total eligible votes\"";
# print LOG "$V{commit}: $review\n";
exit 0;
}
$response = response($review);
print LOG "RUNNING: $response\n";
$output = qx( $response 2>&1 );
if ($output =~ /\S/) {
print LOG "$V{commit}: output from commenting: $output";
$response = response(" --message=\"During \Q$review\E: \Q$output\E\"");
print LOG "WARNING: $response\n";
$output = qx( $response 2>&1 );
print LOG "ERROR: $output\n";
}
exit 0;
Gerrit allows you to set up prolog "submit rules" that define when a change is submittable.
The documentation includes several examples, including one that prevents the author from approving his own change.
You can do it from the GUI in the access tab.
Go to the /refs/heads/ section -> add group 'change owner' in Label Code-Review section -> choose -1..+1
This restricts the change owner's Code-Review permission to the -1..+1 range, so owners cannot +2 (approve) their own changes.
I have just written this prolog filter for our Gerrit installation. I did it as a submit_filter in the parent project because I wanted it to apply to all projects in our system.
%filter to require all projects to have a code-reviewer other than the owner
submit_filter(In, Out) :-
%unpack the submit rule into a list of code reviews
In =.. [submit | Ls],
%add the non-owner code review requirement
reject_self_review(Ls, R),
%pack the list back up and return it (kinda)
Out =.. [submit | R].
reject_self_review(S1, S2) :-
%set O to be the change owner
gerrit:change_owner(O),
%find a +2 code review, if it exists, and set R to be the reviewer
gerrit:commit_label(label('Code-Review', 2), R),
%if there is a +2 review from someone other than the owner, then the filter has no work to do, assign S2 to S1
R \= O, !,
%the cut (!) predicate prevents further rules from being consulted
S2 = S1.
reject_self_review(S1, S2) :-
%set O to be the change owner
gerrit:change_owner(O),
% find a +2 code review, if it exists, and set R to be the reviewer - comment sign was missing
gerrit:commit_label(label('Code-Review', 2), R),
R = O, !,
%if there isn't a +2 from someone else (above rule), and there is a +2 from the owner, reject with a self-reviewed label
S2 = [label('Self-Reviewed', reject(O))|S1].
%if the above two rules didn't make it to the ! predicate, there aren't any +2s so let the default rules through unfiltered
reject_self_review(S1, S1).
The benefits (IMO) of this rule over rule #8 from the cookbook are:
The Self-Reviewed label is only shown when the change is being blocked, rather than adding a Non-Author-Code-Review label to every change
By using reject(O) the rule causes the Self-Reviewed label to literally be a red flag
As a submit_filter instead of a submit_rule, this rule is installed in a parent project and applies to all sub-projects
Please Note: This rule is authored to prevent the Owner from self-reviewing a change, while the example from the cookbook compares against the Author. Depending on your workflow, you may want to replace the 2 gerrit:change_owner(O) predicates with gerrit:commit_author(O) or gerrit:commit_committer(O)

JIRA: Generating per-user time report?

Sorry if SO is not the best place, but I have time-tracking enabled in JIRA and want to be able to generate a time-report for each user over a given date range. The only time-tracking report option I have is very limited and doesn't do what I want, is it possible through standard functionality or a free plugin perhaps?
You might want to check out Tempo Plugin for JIRA timetracking. It offers timesheets, reports and gadgets on user, team, project, and customer levels.
how about this one:
https://plugins.atlassian.com/plugin/details/294
If you don't want to pay a lot of money for a simple action like getting a summary of time per user.
I found this flow useful:
Create a filter that you like to measure (I measure time only by sub tasks)
Export it to excel
Copy and paste it into a google docs spreadsheet
In google docs you have an option to create a Pivot Table, so just create one that the rows are the assignees and the values are the time
You can also create a calculated column to get the time in hours (just divide it by 3600)
Hope it helps
Using the Better Excel Plugin you can take advantage of all reporting features in Microsoft Excel.
This plugin exports any sort of JIRA data (including issue fields and worklogs) to custom Excel templates. The templates can filter by date range and can display your report in an Excel pivot table. If you need further dimensions (like additional grouping by project, by component, by week, by month, etc.), these are super simple to add. You can also visualize the output in a pivot chart.
Tip: there is a default template included in the plugin, called worklog-report.xlsx, which can be used as is, or as starting point for further customization. It looks like this (there is a time-by-project pivot chart in the first worksheet, but I don't have a screenshot about that):
After the template is created, you can merge that with the most current JIRA data any time by a single click, or even generate it and email it to you automatically.
Disclaimer: I'm a developer working on this paid add-on.
You can easily do it with Everhour add-on for JIRA. It allows receiving a comprehensive report for each user over a given date range. And you are absolutely free to build any other layout of your reports and add as many data columns as you need.
Jira Sample Report - Everhour
If you're on Windows you can run the following powershell script to extract the data to CSV file.
## Instructions ##
Open PowerShell ISE (it's installed on all Windows 7 and later PCs)
Create a new PowerShell script (ctrl+n)
Paste the text from the following code block into the new file
##################################################################
# Variables
##################################################################
$username = "myname@asdf.com"
$password = Read-host "What's your Jira password?" -AsSecureString
#$password = ""
$jiraDomain = "asdf.atlassian.net"
$projectKey = "ABC"
$startDate = [datetime]::ParseExact('2017-05-08', 'yyyy-MM-dd', $null)
$endDate = Get-Date
#Get-Date = today
$csvFileName = "c:\temp\Worklog.csv"
##################################################################
# Functions
##################################################################
function get-jiraData {
param( [string]$restRequest)
Invoke-RestMethod -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -Uri $restRequest
}
function get-issues {
param( [string]$projectName)
$uri = "https://${jiraDomain}/rest/api/2/search?jql=project=${projectName}"
$issuesPage = get-jiraData -RestRequest $uri
#write first batch of issues
$issuesPage.issues
#do next batches
do {
$startAt = $issuesPage.startAt + $issuesPage.maxResults  # advance to the next page of results
$uri = "https://${jiraDomain}/rest/api/2/search?jql=project=${projectName}&startAt=$startAt"
$issuesPage = get-jiraData -RestRequest $uri
#write next batch of issues
$issuesPage.issues
} while (($issuesPage.startAt + $issuesPage.maxResults) -lt $issuesPage.total)
}
filter convert-worklog {
$worklog = New-Object System.Object
$worklog | Add-Member -Type NoteProperty -Name Person -Value $_.author.name
$worklog | Add-Member -Type NoteProperty -Name IssueKey -Value $key
$startDate = [datetime]::ParseExact($_.started.Substring(0,16), 'yyyy-MM-ddTHH:mm', $null)
$worklog | Add-Member -Type NoteProperty -Name DateLogged -Value $startDate
$TimeMinutes = $_.timeSpentSeconds / 60
$worklog | Add-Member -Type NoteProperty -Name TimeSpent -Value $TimeMinutes
$worklog | Add-Member -Type NoteProperty -Name Comment -Value $_.comment
$worklog
}
filter extract-worklogs {
#$key = "WL-22"
$key = $_.key
$uri = "https://${jiraDomain}/rest/api/2/issue/${key}/worklog"
$worklogsPage = get-jiraData -RestRequest $uri
#write first batch of worklogs
$worklogsPage.worklogs | convert-worklog
#Check for another batch of worklogs
do {
$startAt = $worklogsPage.startAt + $worklogsPage.maxResults  # advance to the next page of worklogs
$uri = "https://${jiraDomain}/rest/api/2/issue/${key}/worklog?startAt=$startAt"
$worklogsPage = get-jiraData -RestRequest $uri
#write next batch of worklogs
$worklogsPage.worklogs | convert-worklog
} while (($worklogsPage.startAt + $worklogsPage.maxResults) -lt $worklogsPage.total)
}
##################################################################
# Execution
##################################################################
#Setup Authentication variable
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))
#This grabs all the worklogs for a project, then filters them by
$WorkLogs = get-issues -projectName $projectKey | extract-worklogs | ?{ $_.DateLogged -gt $startDate -and $_.DateLogged -lt $endDate } | sort DateLogged
$WorkLogs | export-csv $csvFileName -NoTypeInformation
Modify the variables at the start of the file
Save as a powershell script somewhere on your PC
Run the script by double clicking it
