Wrap EF Core database update in a transaction

My recent database update failed in the middle of a migration, leaving the database inconsistent:
> dotnet ef database update
Applying migration '20200429031727_NewRequisitesAndCountry'.
Failed executing DbCommand (97ms) [Parameters=[], CommandType='Text', CommandTimeout='30']
[...]
Microsoft.Data.SqlClient.SqlException (0x80131904): Subquery returned more than 1 value. This is not
permitted when the subquery follows =, !=, <, <= , >, >= or when the subquery is used as an expression.
The statement has been terminated.
Is there a way to wrap the migration in a transaction? I am interested in doing so both from the command line and programmatically while calling context.Database.Migrate().
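One option for the command-line side is to generate the migration SQL and run it inside an explicit transaction yourself. The sketch below is a template rather than a verified recipe, assuming the SQL Server provider and a script produced by dotnet ef migrations script --idempotent -o migrate.sql; other providers need their own transaction syntax, and this does not cover the programmatic Migrate() route.
-- Wrap the generated migration script in one explicit transaction so a failing
-- statement rolls the whole migration back instead of leaving it half-applied.
SET XACT_ABORT ON;   -- roll back the transaction automatically on any runtime error
BEGIN TRANSACTION;

-- ... paste the script generated by dotnet ef migrations script here ...

COMMIT TRANSACTION;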

Related

Event-Time Temporal Table Join requires both primary key and row time attribute in versioned table, but no row time attribute can be found

I have tried to use a lookup join but I run into this problem:
SELECT
  e.isFired,
  e.eventMrid,
  e.createDateTime,
  r.id AS eventReference_id,
  r.type
FROM Event e
JOIN EventReference FOR SYSTEM_TIME AS OF e.createDateTime AS r
  ON r.id = e.eventReference_id;
[ERROR] Could not execute SQL statement. Reason: org.apache.flink.table.api.ValidationException: Event-Time Temporal Table Join requires both primary key and row time attribute in versioned table, but no row time attribute can be found.
Whether that query will be interpreted by the Flink SQL planner as a temporal join or a lookup join depends on the type of the table on the right-hand side. In this case I guess you haven't used a lookup source, and your time attribute might not be defined correctly.
Temporal (time-versioned) joins require:
- an equality predicate on the primary key of the versioned table
- a time attribute
Lookup joins require:
- a lookup source connector (e.g., JDBC, HBase, Hive, or something custom)
- an equality join predicate
- a processing time attribute used in combination with FOR SYSTEM_TIME AS OF (to prevent needing to update the join results)
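As a rough sketch of what the versioned (right-hand) table needs in the event-time case, the DDL below declares both a primary key and a row time attribute via a WATERMARK. The connector and column names are placeholders rather than taken from your setup, and the left table's createDateTime must itself be an event-time attribute.
-- Sketch of a versioned table: primary key plus row time attribute (watermarked column).
CREATE TABLE EventReference (
    id STRING,
    type STRING,
    update_time TIMESTAMP(3),
    WATERMARK FOR update_time AS update_time - INTERVAL '5' SECOND,
    PRIMARY KEY (id) NOT ENFORCED
) WITH (
    'connector' = '...'   -- e.g. an upsert source such as upsert-kafka
);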

How to use Rails with uppercase column name?

I have the following as part of an AR query:
.having('COUNT(foo.id) > bar.maxUsers')
This generates an error:
ActiveRecord::StatementInvalid:
PG::UndefinedColumn: ERROR: column bar.maxusers does not exist
^
HINT: Perhaps you meant to reference the column "bar.maxUsers".
I am referencing the column bar.maxUsers.
So, apparently AR downcases the query. How to suppress this behavior?
Rails 4.2.10
PostgreSQL
EDIT: SQL:
SELECT ... HAVING COUNT(foo.id) > bar.maxUsers
So it is happening after the to_sql. Maybe from the execute?
This isn't an ActiveRecord or AREL issue, this is just how case sensitivity works in SQL and PostgreSQL.
Identifiers in SQL (such as table and column names) are case-insensitive unless they're quoted. Standard SQL says that unquoted identifiers are folded to upper case, while PostgreSQL folds them to lower case; hence the bar.maxusers in the error message.
The solution is to quote the offending column name:
.having('COUNT(foo.id) > bar."maxUsers"')
Note that you must use double quotes for quoting the identifier as single quotes are only for string literals. Also note that identifier quoting is database-specific: standard SQL and PostgreSQL use double quotes, MySQL uses backticks, SQL Server uses brackets, ...
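To illustrate the folding with a hypothetical table, assuming the column was originally created with quotes as "maxUsers":
-- PostgreSQL folds unquoted identifiers to lower case, so only the quoted form
-- matches a column that was created as "maxUsers".
SELECT maxUsers FROM bar;     -- looked up as maxusers, fails with "column does not exist"
SELECT "maxUsers" FROM bar;   -- double quotes preserve the exact case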

Null / Out of Bounds Error when running Liquibase

I am using the database-migration Grails plugin, but running dbm-update hits fatal errors for some of my SQL when I use Liquibase formatted SQL migrations. I get this error:
liquibase : 'Change Set GraphFunctions.sql::graph_functions_initialize_1::<user> failed. Error: null'
java.lang.ArrayIndexOutOfBoundsException
This happens when I run the code:
--changeset <username>:graph_functions_initialize_1
CREATE OR REPLACE FUNCTION build_trcd(
IN new_parent_id bigint,
IN new_child_id bigint)
RETURNS TABLE(ancestor_id bigint, descendant_id bigint, paths bigint, cost bigint) AS '
SELECT
t1.ancestor_id AS ancestor_id,
t2.descendant_id AS descendant_id,
SUM(t1.paths*t2.paths)::bigint AS paths,
MIN(t1."cost"+t2."cost")+1::bigint AS "cost"
FROM db_set_membership_closure t1, db_set_membership_closure t2
WHERE t1.descendant_id=new_parent_id AND t2.ancestor_id=NEW_child_id
GROUP BY t1.ancestor_id, t2.descendant_id
UNION
SELECT
NEW_parent_id AS ancestor_id,
descendant_id AS descendant_id,
paths AS paths ,
(c."cost" + 1)::bigint AS "cost"
FROM db_set_membership_closure c
WHERE ancestor_id = NEW_child_id
UNION
SELECT
ancestor_id AS ancestor_id,
NEW_child_id AS descendant_id,
paths AS paths,
c."cost" + 1::bigint AS "cost"
FROM db_set_membership_closure c
WHERE descendant_id = NEW_parent_id
UNION VALUES (NEW_parent_id, NEW_child_id,1::bigint,1::bigint);
' LANGUAGE sql;
--rollback drop function build_trcd;
If I don't use formatted-sql then it runs fine. However, if I do that, then I cannot manage a rollback through the Liquibase interface. Does anyone have an insight into what I might change to make this work?
It turns out that SQL changesets containing function declarations fail because they have semicolons in the middle of the CREATE statement. To fix these errors I just had to tell the formatted SQL not to split the statements:
--changeset <username>:graph_functions_initialize_1 splitStatements:false
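For completeness, here is a minimal sketch of how the whole formatted-SQL file fits together with that attribute. The --liquibase formatted sql header is the standard first line of a formatted-SQL file, and the function body is shortened to a placeholder query purely for illustration.
--liquibase formatted sql

--changeset <username>:graph_functions_initialize_1 splitStatements:false
-- With splitStatements:false Liquibase sends the body below as a single statement,
-- so the semicolons inside the function definition are no longer treated as separators.
CREATE OR REPLACE FUNCTION build_trcd(
    IN new_parent_id bigint,
    IN new_child_id bigint)
RETURNS TABLE(ancestor_id bigint, descendant_id bigint, paths bigint, cost bigint) AS '
    SELECT new_parent_id, new_child_id, 1::bigint, 1::bigint;  -- placeholder body
' LANGUAGE sql;
--rollback drop function build_trcd;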

JIRA Database Values Plugin custom sql

The jira-database-values-plugin-10290.properties file is:
# The database connection parameters
database.driver=net.sourceforge.jtds.jdbc.Driver
database.user=...
database.password=...
database.connection.url=jdbc:jtds:sqlserver://localhost:1433/jiradb
# Cache Timeout (= 15 minutes by default). The actual db is queried only once and then the results are kept in the cache for the given timeout. Uncomment the line below to change it.
#cache.timeout=900000
# The SQL Query that will be executed on the database
sql.query=select start_date from date_table where personel like (select reporter from jiraissue where pkey like '${jira.project.key}' )
# The column number (starting from 0) that contains the primary key of the returned data.
primarykey.column.number=0
# The pattern that should be used to render the data in view mode. Use {column_number} as placeholders. You can use HTML, but make sure you close your tags properly!
rendering.viewpattern={0}
# The pattern that should be used to render the data in edit mode. Use {column_number} as placeholders.
rendering.editpattern={0}
# The pattern that should be used to render the data in searcher. Use {column_number} as placeholders.
rendering.searchpattern={0}
When I use a specific project key (for example JRA-621) as below, it works fine:
sql.query=select start_date from date_table where personel like (select reporter from jiraissue where pkey like 'JRA-621' )
but when I try to use the current project key value, as below, it does not work:
sql.query=select start_date from date_table where personel like (select reporter from jiraissue where pkey like '${jira.project.key}' )
What is wrong with my expression? The date_table looks like this:
id personel start_date end_date
== ======== ========== ========
0 u.name 2012-05-05 NULL
1 u.name2 2012-04-02 NULL
...
Thanks

Rails PostgreSQL problems with order statement

Hi
I changed my database from MySQL to PostgreSQL and get an error every time I use a query with an :order statement.
For example the following code works perfectly in MySQL
Hour.sum("working_hours",:conditions=>['project_id=? AND reported_date=?',project,h.reported_date],:order=>"reported_date
But gives me an error in PostgreSQL
PGError: ERROR: column "hours.reported_date" must appear in the GROUP BY clause or be used in an aggregate function
LINE 1: ...rted_date='2010-10-06 00:00:00.000000') ORDER BY reported_d..
: SELECT sum("hours".working_hours) AS sum_working_hours FROM "hours" WHERE (project_id=1 AND reported_date='2010-10-06 00:00:00.000000') ORDER BY reported_date
If I delete the :order option then the query works fine.
I would appreciate any help on this subject.
PostgreSQL adheres more strictly to the SQL standard than MySQL does.
When you aggregate (as SUM does here), any column you ORDER BY must either appear in the GROUP BY clause or be wrapped in an aggregate itself.
Try this:
Hour.sum("working_hours",:conditions=>['project_id=? AND reported_date=?',project,h.reported_date], :order=>"reported_date", :group_by => "working_hours"
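With the grouping in place, the generated SQL should look roughly like the sketch below (table and values taken from the error message above); reported_date now appears in GROUP BY, so ordering by it is allowed.
-- reported_date is grouped, so it may appear in ORDER BY alongside the aggregate.
SELECT reported_date, SUM("hours".working_hours) AS sum_working_hours
FROM "hours"
WHERE project_id = 1 AND reported_date = '2010-10-06 00:00:00.000000'
GROUP BY reported_date
ORDER BY reported_date;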
