Sorry if SO is not the best place for this, but I have time tracking enabled in JIRA and want to be able to generate a time report for each user over a given date range. The only time-tracking report option I have is very limited and doesn't do what I want. Is this possible through standard functionality, or perhaps via a free plugin?
You might want to check out the Tempo plugin for JIRA time tracking. It offers timesheets, reports and gadgets at the user, team, project and customer levels.
How about this one:
https://plugins.atlassian.com/plugin/details/294
If you don't want to pay a lot of money for a simple task like getting a summary of time per user, I found this flow useful (a PowerShell alternative to the pivot step is sketched after the list):
Create a filter for what you'd like to measure (I measure time only by sub-tasks)
Export it to Excel
Copy and paste it into a Google Docs spreadsheet
In Google Docs you have the option to create a pivot table, so create one where the rows are the assignees and the values are the time
You can also create a calculated column to get the time in hours (just divide the seconds by 3600)
Hope it helps
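If you'd rather skip the spreadsheet entirely, here is a minimal PowerShell sketch of the same grouping. It assumes the exported CSV has an "Assignee" column and a "Time Spent" column in seconds; your column names may differ:
Import-Csv .\jira-export.csv |
    Group-Object Assignee |
    ForEach-Object {
        [pscustomobject]@{
            Assignee = $_.Name
            # sum the seconds per assignee and convert to hours
            Hours    = [math]::Round((($_.Group | Measure-Object 'Time Spent' -Sum).Sum) / 3600, 2)
        }
    }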
Using the Better Excel Plugin you can take advantage of all the reporting features in Microsoft Excel.
This plugin exports any sort of JIRA data (including issue fields and worklogs) to custom Excel templates. The templates can filter by date range and can display your report in an Excel pivot table. If you need further dimensions (like additional grouping by project, by component, by week, by month, etc.), these are very simple to add. You can also visualize the output in a pivot chart.
Tip: there is a default template included in the plugin, called worklog-report.xlsx, which can be used as is or as a starting point for further customization. It looks like this (there is a time-by-project pivot chart in the first worksheet, but I don't have a screenshot of that):
After the template is created, you can merge it with the most current JIRA data at any time with a single click, or even have it generated and emailed to you automatically.
Disclaimer: I'm a developer working on this paid add-on.
You can easily do this with the Everhour add-on for JIRA. It lets you get a comprehensive report for each user over a given date range, and you are free to build any other report layout and add as many data columns as you need.
Jira Sample Report - Everhour
If you're on Windows you can run the following PowerShell script to extract the data to a CSV file.
## Instructions ##
Open PowerShell ISE (it's installed on all Windows 7 and later PCs)
Create a new PowerShell script (Ctrl+N)
Paste the text from the following code block into the new file
##################################################################
# Variables
##################################################################
$username = "myname#asdf.com"
$password = Read-host "What's your Jira password?" -AsSecureString
#$password = ""
$jiraDomain = "asdf.atlassian.net"
$projectKey = "ABC"
$startDate = [datetime]::ParseExact('2017-05-08', 'yyyy-MM-dd', $null)
$endDate = Get-Date
#Get-Date = today
$csvFileName = "c:\temp\Worklog.csv"
##################################################################
# Functions
##################################################################
function get-jiraData {
param( [string]$restRequest)
Invoke-RestMethod -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -Uri $restRequest
}
function get-issues {
param( [string]$projectName)
$uri = "https://${jiraDomain}/rest/api/2/search?jql=project=${projectName}"
$issuesPage = get-jiraData -RestRequest $uri
#write first batch of issues
$issuesPage.issues
#do next batches
do {
$startAt = $issuesPage.startAt + $issuesPage.maxResults
$uri = "https://${jiraDomain}/rest/api/2/search?jql=project=${projectName}&startAt=$startAt"
$issuesPage = get-jiraData -RestRequest $uri
#write next batch of issues
$issuesPage.issues
} while (($issuesPage.startAt + $issuesPage.maxResults) -lt $issuesPage.total)
}
filter convert-worklog {
$worklog = New-Object System.Object
$worklog | Add-Member -Type NoteProperty -Name Person -Value $_.author.name
$worklog | Add-Member -Type NoteProperty -Name IssueKey -Value $key
$startDate = [datetime]::ParseExact($_.started.Substring(0,16), 'yyyy-MM-ddTHH:mm', $null)
$worklog | Add-Member -Type NoteProperty -Name DateLogged -Value $startDate
$TimeMinutes = $_.timeSpentSeconds / 60
$worklog | Add-Member -Type NoteProperty -Name TimeSpent -Value $TimeMinutes
$worklog | Add-Member -Type NoteProperty -Name Comment -Value $_.comment
$worklog
}
filter extract-worklogs {
#$key = "WL-22"
$key = $_.key
$uri = "https://${jiraDomain}/rest/api/2/issue/${key}/worklog"
$worklogsPage = get-jiraData -RestRequest $uri
#write first batch of worklogs
$worklogsPage.worklogs | convert-worklog
#Check for another batch of worklogs
do {
$startAt = $worklogsPage.startAt + $worklogsPage.maxResults
$uri = "https://${jiraDomain}/rest/api/2/issue/${key}/worklog?startAt=$startAt"
$worklogsPage = get-jiraData -RestRequest $uri
#write next batch of worklogs
$worklogsPage.worklogs | convert-worklog
} while (($worklogsPage.startAt + $worklogsPage.maxResults) -lt $worklogsPage.total)
}
##################################################################
# Execution
##################################################################
#Setup Authentication variable (convert the SecureString password back to plain text for the Basic auth header)
$plainPassword = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($password))
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$plainPassword)))
#This grabs all the worklogs for a project, then filters them by the date range
$WorkLogs = get-issues -projectName $projectKey | extract-worklogs | ?{ $_.DateLogged -gt $startDate -and $_.DateLogged -lt $endDate } | sort DateLogged
$WorkLogs | export-csv $csvFileName -NoTypeInformation
Modify the variables at the start of the file
Save it as a PowerShell script (.ps1) somewhere on your PC
Run the script by double-clicking it
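If double-clicking does not launch the script on your machine (the execution policy sometimes gets in the way), you can also run it from a console; the script path below is just a placeholder:
powershell.exe -ExecutionPolicy Bypass -File "C:\temp\Export-JiraWorklogs.ps1"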
Related
We are using Azure DevOps 2019 on-prem in our firm, and I would like to create an option box field in our Bug work item. I want it to be a combo box whose values are the builds from all the build definitions under the project.
From checking the documentation of the system, I did not find any option for how to do this, or whether it would be better to query the system through the API or query the DB.
I don't think there is a built-in feature like this.
What you can do is create a string field that takes its values from a global list. The first time, create a global list named after the project, for example:
<GLOBALLIST name="MyProject-builds">
</GLOBALLIST>
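You can then reference that global list from a string field in the Bug work item type definition; the field name and refname below are only examples:
<FIELD name="Build" refname="Custom.ProjectBuild" type="String">
  <ALLOWEDVALUES expanditems="true">
    <GLOBALLIST name="MyProject-builds" />
  </ALLOWEDVALUES>
</FIELD>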
Now you can use PowerShell to get the build definitions for this project and update this global list with their names:
Param(
[string]$collection = "http://tfs-server:8080/tfs/collection",
[string]$project = "MyProject",
[string]$filePath = "C:\Globallist.xml"
)
$url = "$collection/$project/_apis/build/definitions?api-version=4.0"
$builds = (Invoke-RestMethod -Uri $url -Method Get -UseDefaultCredentials -ContentType application/json).value.name
witadmin exportgloballist /collection:$collection /f:$filePath
[xml]$globallist = Get-Content $filePath
$globallist.GLOBALLISTS.GLOBALLIST.Where({ $_.name -eq "$project-builds" }).LISTITEM | %{ $_.ParentNode.RemoveChild($_) | Out-Null }
$node = $globallist.GLOBALLISTS.GLOBALLIST.Where({ $_.name -eq "$project-builds" })
$builds.ForEach({
$child = $globallist.CreateElement("LISTITEM")
$att = $globallist.CreateAttribute("value")
$child.Attributes.Append($att) | Out-Null
$child.value = "$_"
$node.AppendChild($child) | Out-Null
})
$globallist.Save($filePath)
witadmin importgloballist /collection:$collection /f:$filePath
You can set up a scheduled build that runs this script each day so the list stays up to date.
You can also improve the script to get all the projects, iterate over them, get the build definition names and update the global list file.
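A minimal sketch of that improvement, assuming the projects REST endpoint on the same collection and one "<project>-builds" global list per project:
$projects = (Invoke-RestMethod -Uri "$collection/_apis/projects?api-version=4.0" -Method Get -UseDefaultCredentials).value.name
foreach ($proj in $projects) {
    $url = "$collection/$proj/_apis/build/definitions?api-version=4.0"
    $builds = (Invoke-RestMethod -Uri $url -Method Get -UseDefaultCredentials -ContentType application/json).value.name
    # then update the "$proj-builds" GLOBALLIST node in the exported XML as shown above
}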
Maybe someone here can help me out with this. I am trying to convert all XLS files to XLSX/M with PowerShell and Excel Interop. So far so good. In my next step, I have to adapt the link sources in each file, which works sometimes (also from XLS to XLSX/M).
I don't know why, but sometimes the original worksheet name does not exist in the linked Excel file, which results in a pop-up that the user has to interact with:
I actually really don’t care so much about the sheet and I just want to ignore the message so that the script can continue.
In my code I use the ChangeLink method, like this:
$workbook.ChangeLink($fileLink_old, $fileLink_new)
I have also deactivated all warnings on the Excel object itself, but nothing helps:
$excel.DisplayAlerts = $False
$excel.WarnOnFunctionNameConflict = $False
$excel.AskToUpdateLinks = $False
$excel.DisplayAlerts = $False
The most convenient way for me would be to just ignore the pop-up.
Is there a way to do this without going through all the cells myself or modifying the externalLinks/_rels inside the Excel file?
Thanks in advance
Stephan
Edit:
My attempt to loop through each cell, which is not really efficient:
ForEach ($Worksheet in @($workbook.Sheets)) {
Write-Host $Worksheet.Name
ForEach ($filelink in $fileLinks){
$worksheetname = $null
$fl_we = $fileLink.Substring(0, $fileLink.LastIndexOf('.'))
$found = $Worksheet.Cells.Find($fl_we.Substring(0, $fl_we.LastIndexOf('\')) + '\[' + $fl_we.Substring($fl_we.LastIndexOf('\')+1))
if($found -ne $null){
Write-Host Search $filelink
Write-Host $Worksheet.Cells($found.Row,$found.Column).Formula
$str_formula = $Worksheet.Cells($found.Row,$found.Column).Formula
$worksheetname = $str_formula.Substring($str_formula.IndexOf(']')+1,$str_formula.IndexOf('!')-$str_formula.IndexOf(']')-2)
Write-Host $worksheetname -ForegroundColor DarkGray
#Add worksheets with filename to list
}
}
}
#Check if worksheet exists in linked file
Our team is quite used to the build quality value in TFS 2015 and earlier XAML builds. This is not possible with the new Build & Release, but we can add tags to a build. However, they are not shown in the list of builds; we can only filter by tags in the build definition history (which shows a list of builds). Is there any way to configure this to show tags for builds, or any other way to show the tags in a list of builds?
We can use the REST API to get these values back; is there a way to modify the web pages or add our own?
Note - we did NOT install SharePoint so we can't use that.
There isn't a way to configure the build list to show tags; it's not supported.
There is a User Voice suggestion here for the feature; you can go and vote it up to get it added in the future.
As a workaround, you can list the tags with the build tags REST API, then filter the builds by tags just as you did.
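For reference, the build tags endpoint looks like this (server, collection, project and build ID are placeholders):
GET http://server:8080/tfs/DefaultCollection/ProjectName/_apis/build/builds/{buildId}/tags?api-version=2.0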
Another way is retrieving the build list, with tags included, using the REST API.
For example, you can use below PowerShell script to get the build list with tags and export the build list to a .csv file.
$Collection = "http://server:8080/tfs/DefaultCollection"
$teamproject = "ProjectName"
$baseUrl = "$Collection/$teamproject/_apis/build/builds?api-version=2.0"
$builds = (Invoke-RestMethod -Uri $baseUrl -Method Get -UseDefaultCredential).value
$BuildResults = @()
foreach($build in $builds){
$customObject = new-object PSObject -property @{
"BuildDefinition" = $build.definition.name
"BuildId" = $build.id
"BuildNumber" = $build.buildNumber
"status" = $build.status
"result" = $build.result
"finishTime" = $build.finishTime
"sourceBranch" = $build.sourceBranch
"sourceVersion" = $build.sourceVersion
"tags" = #($build.tags -join ',')|Select-Object
"RequestedFor" = $build.requestedFor.displayName
}
$BuildResults += $customObject
}
$BuildResults | Select `
BuildDefinition,
BuildId,
BuildNumber,
status,
result,
finishTime,
sourceBranch,
sourceVersion,
tags,
RequestedFor|export-csv -Path E:\user\$teamproject-Build.csv -NoTypeInformation
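Once exported, you can also filter the collected results by tag in the same session, for example (the tag name here is just an example):
$BuildResults | Where-Object { $_.tags -like '*Release*' }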
I've checked code and examples from identical questions on this site, but I'm baffled, because I'm using nearly the exact same code from Get-ADUser with display name as a value and it just returns empty results for me. I don't know how to continue a conversation on an existing question, so I figured I had to make my own.
Here is what I have:
import-module activedirectory
$csv = Get-Content users.csv
foreach ($user in $csv) {
$user = $user.trim()
write-host $user #A line for testing to show me what name is being analyzed
$samName = Get-ADUser -Filter{displayName -like "$user*"} | select samaccountname
write-host $samName
}
cmd /c pause | out-null
The output I get is:
John Doe
Jane Doe
Billy Ray
But if I change
$samName = Get-ADUser -Filter{displayName -like "$user*"} | select samaccountname
to:
$samName = Get-ADUser -Filter{displayName -like "John Doe*"} | select samaccountname
It will correctly return John Doe's samaccountname.
I have a requirement to export Windows Event logs to CSV from our production environment periodically.
I have a simple XML Config file containing a list of machines I need the events from, and a list of Event Ids that I need to retrieve.
From here I'm looping through each machine name in turn, and then each event Id to retrieve the logs and then export to CSV. I'd like one CSV per machine per execution.
Once I've worked out all my variables, the PowerShell command to retrieve the log for one event ID is quite simple:
foreach ($machine in $config.Configuration.Machines.Machine)
{
$csvname=$outputlocation + $machine.Value + "_" + $datestring + ".csv"
foreach ($eventid in $config.Configuration.EventIds.EventId)
{
Get-WinEvent -ComputerName $machine.Value -ErrorAction SilentlyContinue -FilterHashTable @{Logname='Security';ID=$eventid.Value} | where {$_.TimeCreated -gt $lastexecutiondate} | export-csv -NoClobber -append $csvname
}
}
Except I'm unable to append to a CSV each time; PS 2.0 apparently does not support this. I've tried extracting all event IDs at once, but this seems a bit long-winded and may not allow the use of a config file, and I'm fairly new to PowerShell so I haven't had much luck.
I also need to specify multiple LogNames (System, Security and Application), and would prefer to run one statement as opposed to the same statement 3 times with appends, but I'm unsure of how to do this.
Unfortunately at this point Google has me running in circles.
The following is something I put together to allow me to export the prior 24 hours of events for select event logs - I'm going to create a scheduled task out of it so it pulls daily.
Hope this helps someone else...
$eventLogNames = "Application", "Security", "System", "Windows PowerShell"
$startDate = Get-Date
$startDate = $startDate.addDays(-1).addMinutes(-15)
function GetMilliseconds($date)
{
$ts = New-TimeSpan -Start $date -End (Get-Date)
[math]::Round($ts.TotalMilliseconds)
}
$serverName = get-content env:computername
$serverIP = gwmi Win32_NetworkAdapterConfiguration |
Where { $_.IPAddress } | # filter the objects where an address actually exists
Select -Expand IPAddress | # retrieve only the property *value*
Where { $_ -notlike '*:*' }
$fileNameDate = Get-Date -format yyyyMMddhhmm
$endDate = Get-Date
$startTime = GetMilliseconds($startDate)
$endTime = GetMilliseconds($endDate)
foreach ($eventLogName in $eventLogNames)
{
Write-Host "Processing Log: " $eventLogName
<# - Remove comment to create csv version of log files
$csvFile = $fileNameDate + "_" + $serverIP +"_" + $eventLogName + ".csv"
Write-Host "Creating CSV Log: " $csvFile
Get-EventLog -LogName $eventLogName -ComputerName $serverName -After $startDate -ErrorAction SilentlyContinue | Sort MachineName, TimeWritten | Select MachineName, Source, TimeWritten, EventID, EntryType, Message | Export-CSV $csvFile #ConvertTo-CSV #Format-Table -Wrap -Property Source, TimeWritten, EventID, EntryType, Message -Autosize -NoTypeInformation
#>
$evtxFile = $fileNameDate + "_" + $serverIP + "_" + $eventLogName + ".evtx"
Write-Host "Creating EVTX Log: " $evtxFile
wevtutil epl $eventLogName $evtxFile /q:"*[System[TimeCreated[timediff(@SystemTime) >= $endTime] and TimeCreated[timediff(@SystemTime) <= $startTime]]]"
}
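For reference, after variable expansion the wevtutil call ends up looking something like this (the file name and millisecond values below are placeholders from a single run):
wevtutil epl Security 201801151030_192.168.1.10_Security.evtx /q:"*[System[TimeCreated[timediff(@SystemTime) >= 120] and TimeCreated[timediff(@SystemTime) <= 87300000]]]"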
Why do I get "Failed to export log Security. The specified query is invalid."? I get this for each type of event log (System, Application, etc.). It happens only with the evtx export; I still get the CSV file though...