I am trying to write a plugin for SonarQube that uses the blame information provided by the SCM-Activity plugin. The problem is that, in Sonar's database, the blame information seems to either be missing or encrypted.
For example, I ran the following query against Sonar's database in MySQL Workbench:
SELECT p.kee, m.name, pm.text_value
FROM sonar.project_measures pm
JOIN sonar.snapshots s on pm.snapshot_id = s.id
JOIN sonar.metrics m on m.id = pm.metric_id
JOIN sonar.projects p on s.project_id = p.id
WHERE s.root_project_id = 1 and m.domain = 'SCM';
Here's a sample of the result:
As you can see, there are four metrics that pertain to the SCM-Activity Plugin for SonarQube:
authors_by_line
revisions_by_line
last_commit_datetimes_by_line
scm.hash
So, here are my questions:
Why is it that scm.hash is the only metric that has any value in the column text_value, and the other metrics don't? (I tried the other columns in the project_measures table, and none of them seem to have any values either.)
How do I get useful, decrypted information from the scm.hash metric? Is there a ruby method somewhere that I can use in the front end to get it? (I figure there must be, or how else is SonarQube displaying the blame info when I drill down on lines?)
If there are Ruby methods that allow retrieval and decryption of blame info, they must be located in SonarQube's source code itself, as the SCM-Activity Plugin source code seems to be devoid of any Ruby. If I am right, then where in SonarQube's source code are these Ruby methods located? I've been unable to find them.
You see NULL values in text_value because those metrics need to store more than a simple line of text. You have to join the MEASURE_DATA table to get the value of those measures.
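For example, a sketch of the original query extended with that join (untested; the MEASURE_DATA column names here match the older SonarQube 3.x schema, and its DATA column is a BLOB that may need converting to text in your client):

```sql
SELECT p.kee, m.name, CONVERT(md.data USING utf8) AS measure_value
FROM sonar.project_measures pm
JOIN sonar.measure_data md ON md.measure_id = pm.id
JOIN sonar.snapshots s ON pm.snapshot_id = s.id
JOIN sonar.metrics m ON m.id = pm.metric_id
JOIN sonar.projects p ON s.project_id = p.id
WHERE s.root_project_id = 1 AND m.domain = 'SCM';
```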
I have an exam using SQLPlus and sometimes I don't remember the exact syntax of SQL statements, so I was wondering if there is any way to get some nice inline help from inside SQLPlus.
For instance, say I forgot how to use INSERT INTO, and I want some reminder like this:
INSERT INTO table-name (column-names)
VALUES (values)
Is this possible?
I tried the HELP command, but none of it seems to suit my needs.
I Googled it with no success.
No. SQL is a standardized language (at least ANSI SQL) and SQLPlus "just" uses that syntax, so it's not covered by the internal help. The internal help lists only SQLPlus-specific commands (e.g. SET, CONNECT, SPOOL).
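A minimal illustration of what the built-in help does cover (assuming the SQLPlus help tables are installed on your server):

```
SQL> HELP INDEX
SQL> HELP SET
```

HELP INDEX lists the available topics; they are all SQLPlus commands, not SQL statements.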
It is possible to work around that in some way, but it is very limited. You can call the dbms_metadata.get_ddl function for some existing object. Some of those DDLs may contain the statements you are interested in. For example, if you'd like to see a SELECT statement, you could call dbms_metadata.get_ddl for some existing view:
select dbms_metadata.get_ddl('VIEW', 'USER_TABLES', 'SYS')
from dual;
Be aware that this works only for Oracle 11g and earlier; in the newest releases SYS objects are no longer accessible that way (I'm not sure about Oracle 12.1).
More interesting are triggers, procedures, functions, and packages. You cannot use dbms_metadata to get the DDL of packages owned by SYS, but you may be able to connect to one of the sample schemas such as HR (Human Resources), AD (Academic), or SH (Sales History).
The HR schema contains a stored procedure ADD_JOB_HISTORY, which has an INSERT statement inside, so it looks like this:
select dbms_metadata.get_ddl('PROCEDURE', 'ADD_JOB_HISTORY')
from dual;
CREATE OR REPLACE EDITIONABLE PROCEDURE "HR"."ADD_JOB_HISTORY"
( p_emp_id job_history.employee_id%type
, p_start_date job_history.start_date%type
, p_end_date job_history.end_date%type
, p_job_id job_history.job_id%type
, p_department_id job_history.department_id%type
)
IS
BEGIN
INSERT INTO job_history (employee_id, start_date, end_date,
job_id, department_id)
VALUES(p_emp_id, p_start_date, p_end_date, p_job_id, p_department_id);
END add_job_history;
There are better ways and better tools to achieve your goal - see below.
Are you allowed to use SQL Developer instead of SQLPlus? SQL Developer has a nice feature where you can drag and drop a table icon onto a worksheet; you are then prompted to choose what kind of example statement you are looking for (SELECT, INSERT, UPDATE, etc.), and after choosing one you get a sample statement.
But the best way is simply to open the Database SQL Language Reference in a browser:
https://docs.oracle.com/database/121/SQLRF/toc.htm
I have an nFeed plugin on my Jira instance that I'm trying to learn to use. I have a select list in my create screen and I have it configured to JNDI. When I go to my select list, it has 3 options (how many rows in my table), but they're all blank.
Here is my query
SELECT p.PRODUCT_NAME, p.PRODUCT_ID FROM NIRD_Product_Groups p
and my key is 0
I have native filter checked and the display template is {1}
Does anyone have any idea what the problem could be?
It looks like a problem with your SQL query.
- Have you tried enabling the Debug mode of this nFeed field to see whether any error is raised?
- What type of database are you pulling data from? Depending on its type, you may need to put double quotes around the attribute and table names in your query.
- You set the key as 0, which refers to the PRODUCT_NAME attribute. Wouldn't it be safer to set the key to PRODUCT_ID? That sounds more accurate to me.
- Have you tried this configuration in a different web browser?
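Regarding the quoting point, a hypothetical version of your query with quoted identifiers (on Oracle and PostgreSQL, identifiers created with quotes are case-sensitive and must be quoted the same way everywhere):

```sql
SELECT p."PRODUCT_NAME", p."PRODUCT_ID"
FROM "NIRD_Product_Groups" p;
```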
If you need further help, Valiantys provides free support for its plugin here: https://jira.valiantys.com
I need to fetch all build records with the following fields: BuildName, VersionNo, BuildDate, and BuildStatus from TFS_Analysis. Can someone help me write an MDX query for this?
I am unable to see the BuildStatus field in the TFS_Warehouse DB, hence I am trying to get it from TFS_Analysis.
It looks a little bit like SQL, but the way things work is a world apart from SQL. The following is complete guesswork, as there is not much detail in your question!
SELECT
[Measures].[SOMEMEASUREINCUBE] ON 0
,
[BuildName].[BuildName].MEMBERS*
[VersionNo].[VersionNo].MEMBERS*
[BuildDate].[BuildDate].MEMBERS*
[BuildStatus].[BuildStatus].MEMBERS ON 1
FROM TFS_Analysis;
I have a very large block of SQL that I am trying to execute inside of Delphi, against a Microsoft SQL Database. I am getting this:
Multiple-step OLE DB operation generated errors.
Check each OLE DB status value, if available. No work was done.
The script has multiple SQL IF statements followed by BEGIN and END blocks with invocations of stored procedures, declarations of variables, and EXEC inside that. Finally it returns some of the variable values via SELECT @Variable1 AS Name1, @Variable2 AS Name2, ....
The multi-step error above comes in as an OLEException from ADO, not from the Delphi code, and happens after all the stored-procedure executions have occurred. I therefore suspect it fires this OLE exception when it reaches the final stage, the SELECT @Variable1 AS Name1, ... that returns a few variable values for my program to see.
I know about this retired/deprecated MS KB article, and this is unfortunately not my actual issue:
http://support.microsoft.com/kb/269495
In short, that KB article says to fix a registry key and remove "Persist Security Info" from the connection string. That's not my problem. I'm asking this question because I have already found the answer, and I think someone else who gets stuck here might not want to waste several hours hunting for potential causes; I found several after searching for solutions. Anyone who wants to add another answer with different options is welcome, and I'll select yours if it's reproducible; if necessary I'll turn this one into a Community Wiki, because there could be a dozen obscure causes for this "ADO recordset is in a bad mood and is unhappy with your T-SQL" exception.
I have found several potential causes that are listed in various sources of documentation. The original KB article linked in the question suggests removing 'Persist Security Info' from the ADO connection string; however, in a standalone test application with just a TADOConnection and a single TADOQuery, the presence or absence of Persist Security Info had no effect, nor did explicitly setting it to True or False.
What DID fix it was removing this CursorType declaration:
CursorType=ctKeyset
What I have learned is that bidirectional ADO datasets are fine for SELECT * FROM TABLE in ADO but are not so fine for complex SQL scripts.
A potential source of this error is updating a CHAR field with too large a value.
Example: a form has an edit box with its max length property set to 20 characters, and the Oracle database table has the field defined as CHAR(10).
Updating with 10 characters (or fewer) will work fine, while updating with more than 10 characters will cause the 'Multiple-step...' error on ADOQuery.UpdateBatch().
You also have to know that CHAR is always blank-padded to its full declared length, so consider trimming the value from the edit box. CHAR behaves differently from VARCHAR2.
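For illustration, the same length problem can be reproduced directly in SQL (table and column names are hypothetical; on Oracle the failing UPDATE raises a "value too large for column" error, which through ADO batch updates surfaces as the 'Multiple-step' exception):

```sql
CREATE TABLE demo (code CHAR(10));

-- 10 characters: fits the column, works fine
INSERT INTO demo (code) VALUES ('ABCDEFGHIJ');

-- 20 characters: exceeds CHAR(10) and fails
UPDATE demo SET code = 'ABCDEFGHIJKLMNOPQRST';
```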
If you have a query with parameters, check that the number of parameters in the query matches the script!
I had a project in Redmine with more than 600 issues. I moved all the issues to a different project. I had no idea that the move deletes all the data for the custom fields!
So all the custom field values are now lost. I did not backup the database before this action as I really did not think that I was going to do any harm by moving issues as moving is a native function in the UI.
What I noticed, though, is that production.log contains events for all creations and updates. All my 600 issues are in order in the production log. How can I use these log statements to replay the actions? If I can import all the log actions, I can take the custom field values they record back to the original Redmine instance and restore my data.
Entries look like this:
Processing IssuesController#update (for XX.XX.XX.X at 2013-02-07 11:19:54) [PUT]
Parameters: {"_method"=>"put", "authenticity_token"=>"nWNSSRYjHhN0BGb+Ya8M4pYWPPgsfdM=", "issue"=>{"assigned_to_id"=>"", "custom_field_values"=>{"10"=>"", "5"=>"Not translated", "1"=>"fi", "8"=>"http://screencast.com/t/ODknR8K", "9"=>"", "3"=>"", "4"=>""}, "done_ratio"=>"0", "due_date"=>"", "priority_id"=>"4", "estimated_hours"=>"", "start_date"=>"2013-02-07", "subject"=>"1\tInstallation in English", "tracker_id"=>"1", "lock_version"=>"0", "description"=>"Steps:\r\nOpen Nitro\r\n\r\nProblem:\r\nNot localized"}, "controller"=>"issues", "time_entry"=>{"hours"=>"", "activity_id"=>"", "comments"=>""}, "attachments"=>{"1"=>{"description"=>""}}, "id"=>"3876", "action"=>"update", "commit"=>"Submit", "notes"=>""}
I am really hoping that there is a way; any help will be greatly appreciated.
You could use a decent text editor and/or spreadsheet application, do a massive find-and-replace, construct a series of UPDATE SQL commands, and run them directly on the database (TEST FIRST!!):
Extract from the log
Remove unnecessary information
Copy into a spreadsheet
Split the text into columns
Add columns with the necessary SQL commands ("UPDATE ... SET ..." etc.) copied into every row
Join the columns to make one text command per row
Export the joined data to a text file
Run it against a test database as SQL
If all goes well, run it against the production database as SQL
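Using the sample log entry above, and assuming a standard Redmine schema where custom field values live in the custom_values table (column names assumed from that schema), each generated command might look something like this:

```sql
-- One UPDATE per custom field value recovered from the log:
-- customized_id is the issue id ("id"=>"3876"), custom_field_id is the
-- key inside "custom_field_values" (e.g. "1"=>"fi").
UPDATE custom_values
   SET value = 'fi'
 WHERE customized_type = 'Issue'
   AND customized_id   = 3876
   AND custom_field_id = 1;
```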
The log entry following "Parameters:" looks like a regular Ruby hash definition. I'd parse that out and eval it back into a hash variable.
From there you will need to peel off elements and insert them into a database. I'd do that using Sequel, but use what works for you.
Talk to the Redmine support people and get the schema for their tables so you can figure out what data goes where and which database driver is needed.