Generic Parsing Error

In a 4D application, I have the following lines of code:
Begin SQL
UPDATE Keys
SET Desc = :$desc,
KeyStamp = :$key_stamp
WHERE KeyTitle = :$key_title
End SQL
When trying to run the code, the following error is displayed:
Generic parsing error. Parsing failed in or around the following
substring interval - ( 16, 23 ) - ... SET Desc = ...
Does anyone see the problem with the code? Keys is not a keyword or anything, is it?

Seems like someone forgot that DESC is a reserved keyword (it is short for "descending", as in an ORDER BY clause). The solution is to rename the column and update all references to it; otherwise, the column cannot be referenced in SQL.
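For example, assuming the column were renamed to KeyDesc (a hypothetical name), the block would become:

```sql
Begin SQL
   UPDATE Keys
   SET KeyDesc = :$desc,
       KeyStamp = :$key_stamp
   WHERE KeyTitle = :$key_title
End SQL
```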

Related

EXECUTE FORMAT() USING in postgres is showing error for $

I'll get straight to the code rather than explaining too much:
execute format('
"$1" = select "Source1" from temp_tables._%s;
'::text, (translate("Song_Id_"::text, '-', '_')))
using "Source1__";
The table is dynamically created, and the table name is fine, as I have already used that table to insert some data. If I run this code, the error I am getting is:
ERROR: syntax error at or near "$1"
LINE 1: $1 = select "Source1" from temp_tables._24af1593_3539_49fd_9...
^
QUERY: $1 = select "Source1" from temp_tables._24af1593_3539_49fd_9ef4_29307f301d38;
I have tried another method too, like:
execute
'$1 = select "Source1" from temp_tables._' || (translate("Song_Id_"::text, '-', '_')) ||';'
using "Source1__";
Even this gives the same error.
Note: "Source1__" is a variable of type text declared in the stored procedure where everything else is being executed, too.
This is wrong. EXECUTE accepts only SQL statements, and there is no statement of the form var = SELECT .... Moreover, the USING clause can only pass values in; you cannot pass a reference to a variable through it. The solution is easy: just use the INTO clause.
EXECUTE 'SELECT ... ' INTO target_plpgsql_variable
Please read the related documentation. Unfortunately, some parts of stored procedures are not especially intuitive, because they mix two very different languages. Reading the documentation pays off, because it is hard to find the correct solution without knowing the possibilities and the syntax.
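A minimal sketch of the corrected call, assuming the same names ("Song_Id_" and "Source1__") as in the original procedure:

```sql
-- Let EXECUTE run a plain SELECT and assign its result with INTO
EXECUTE format(
    'SELECT "Source1" FROM temp_tables._%s',
    translate("Song_Id_"::text, '-', '_')
) INTO "Source1__";
```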

Getting error when using timestamp index in dask with snowflake

I am trying to read tables from Snowflake and use merge_asof to perform a point-in-time-correct join. Here is the corresponding code:
next_failure = dd.read_sql_table('vm_next_failure', conn_url, index_col='ts')
errors = dd.read_sql_table('vm_errors', conn_url, index_col='ts')
ndf = dd.merge_asof(next_failure, errors, left_index=True, right_index=True, by="machineid", suffixes=("_l", "_r"), allow_exact_matches=False)
Here is the error I get:
ProgrammingError: (snowflake.connector.errors.ProgrammingError) 252004: Failed processing pyformat-parameters: 255001: Binding data in type (timestamp) is not supported.
[SQL: SELECT vm_errors.ts, vm_errors.machineid, vm_errors.error1, vm_errors.error2, vm_errors.error3, vm_errors.error4, vm_errors.error5
FROM vm_errors
WHERE vm_errors.ts >= %(ts_1)s AND vm_errors.ts <= %(ts_2)s]
[parameters: {'ts_1': Timestamp('2015-01-01 06:00:00'), 'ts_2': Timestamp('2016-01-01 05:00:00')}]
(Background on this error at: http://sqlalche.me/e/13/f405)
Any thoughts on how to workaround this?
Looks like you need to replace Timestamp with TO_TIMESTAMP.
Please run the following query in Snowflake:
select TO_TIMESTAMP('2016-01-01 05:00:00')
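If rewriting the generated SQL is not practical, one possible workaround (an assumption, not verified against Snowflake here) is to hand the connector plain ISO-formatted strings instead of pandas Timestamp objects, since strings can be bound and Snowflake will coerce them to TIMESTAMP:

```python
import pandas as pd

# Hypothetical illustration: Snowflake's connector refuses to bind a
# pandas Timestamp, but the equivalent ISO-formatted string binds fine.
ts = pd.Timestamp('2015-01-01 06:00:00')
ts_str = ts.strftime('%Y-%m-%d %H:%M:%S')
print(ts_str)  # '2015-01-01 06:00:00'
```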

Arel + Rails 4.2 causing problems (bindings being lost)

We recently upgraded from Rails 4.1 to Rails 4.2 and are seeing problems using Arel + ActiveRecord, because we're getting this type of error:
ActiveRecord::StatementInvalid: PG::ProtocolViolation: ERROR: bind message supplies 0 parameters, but prepared statement "" requires 8
Here's the code that is breaking:
customers = Customer.arel_table
ne_subquery = ImportLog.where(
  importable_type: Customer.to_s,
  importable_id: customers['id'],
  remote_type: remote_type.to_s.singularize,
  destination: 'hello'
).exists.not
first = Customer.where(ne_subquery).where(company_id: @company.id)
second = Customer.joins(:import_logs).merge(
  ImportLog.where(
    importable_type: Customer.to_s,
    importable_id: customers['id'],
    remote_type: remote_type.to_s.singularize,
    status: 'pending',
    destination: 'hello',
    remote_id: nil
  )
).where(company_id: @company.id)
Customer.from(
  customers.create_table_alias(
    first.union(second),
    Customer.table_name
  )
)
We figured out how to work around the first part of the query (which hit the same Rails bug of losing bindings) by moving the exists.not to be within Customer.where, like so:
ne_subquery = ImportLog.where(
  importable_type: Customer.to_s,
  importable_id: customers['id'],
  destination: 'hello'
)
first = Customer.where("NOT (EXISTS (#{ne_subquery.to_sql}))").where(company_id: @company.id)
This seemed to work, but we ran into the same issue with this line of code:
first.union(second)
Whenever we run this part of the query, the bindings get lost. first and second are both ActiveRecord objects, but as soon as we "union" them, they lose their bindings and become Arel objects.
We tried cycling through the query and manually replacing the bindings but couldn't get it working properly. What should we do instead?
EDIT:
We also tried extracting the bind values from first and second, and then manually replacing them in the arel object like so:
union.grep(Arel::Nodes::BindParam).each_with_index do |bp, i|
  bv = bind_values[i]
  bp.replace(Customer.connection.substitute_at(bv, i))
end
However, it fails because:
NoMethodError: undefined method `replace' for #<Arel::Nodes::BindParam:0x007f8aba6cc248>
This was a solution suggested in the Rails GitHub repo.
I know this question is a bit old, but the error sounded familiar. I had some notes and our solution in a repository, so I thought I'd share.
The error we were receiving was:
PG::ProtocolViolation: ERROR: bind message supplies 0 parameters, but
prepared statement "" requires 1
So as you can see, our situation is a bit different: we didn't have 8 bind values. However, our single bind value was still being clobbered. I have changed the naming of things to keep it general.
first_level = Blog.all_comments
second_level = Comment.where(comment_id: first_level.select(:id))
third_level = Comment.where(comment_id: second_level.select(:id))
Blog.all_comments is where we have the single bind value. That's the piece we're losing.
union = first_level.union second_level
union2 = Comment.from(
  Comment.arel_table.create_table_alias union, :comments
).union third_level
relation = Comment.from(Comment.arel_table.create_table_alias union2, :comments)
We created a union much like yours, except that we needed to union three different queries.
To restore the lost bind values at this point, we did a simple assignment. In the end, this is a slightly simpler case than yours, but it may be helpful.
relation.bind_values = first_level.bind_values
relation
By the way, here's the GitHub issue we found while working on this. It doesn't appear to have had any updates since this question was posted, though.

wrong number of arguments (1 for 2..3) for Active Record postgresql query (Rails 4/postgresql 9.4) [duplicate]

Right now I am in the middle of migrating from SQLite to PostgreSQL, and I came across this problem. The following prepared statement works with SQLite:
id = 5
st = ActiveRecord::Base.connection.raw_connection.prepare("DELETE FROM my_table WHERE id = ?")
st.execute(id)
st.close
Unfortunately, it is not working with PostgreSQL - it throws an exception at line 2.
I was looking for solutions and came across this:
id = 5
require 'pg'
conn = PG::Connection.open(:dbname => 'my_db_development')
conn.prepare('statement1', 'DELETE FROM my_table WHERE id = $1')
conn.exec_prepared('statement1', [ id ])
This one fails at line 3. When I catch the exception like this:
rescue => ex
ex contains this:
{"connection":{}}
Executing the SQL on the command line works. Any idea what I am doing wrong?
Thanks in advance!
If you want to use prepare like that, then you'll need to make a couple of changes:
The PostgreSQL driver wants to see numbered placeholders ($1, $2, ...), not question marks, and you need to give your prepared statement a name:
ActiveRecord::Base.connection.raw_connection.prepare('some_name', "DELETE FROM my_table WHERE id = $1")
The calling sequence is prepare followed by exec_prepared:
connection = ActiveRecord::Base.connection.raw_connection
connection.prepare('some_name', "DELETE FROM my_table WHERE id = $1")
st = connection.exec_prepared('some_name', [ id ])
The above approach works for me with ActiveRecord and PostgreSQL; your PG::Connection.open version should work if you're connecting properly.
Another way is to do the quoting yourself:
conn = ActiveRecord::Base.connection
conn.execute(%Q{
  delete from my_table
  where id = #{conn.quote(id)}
})
That's the sort of thing that ActiveRecord is usually doing behind your back.
Directly interacting with the database tends to be a bit of a mess with Rails, since the Rails people don't think you should ever do it.
If you really are just trying to delete a row without interference, you could use delete:
delete()
[...]
The row is simply removed with an SQL DELETE statement on the record’s primary key, and no callbacks are executed.
So you can just say this:
MyTable.delete(id)
and you'll send a simple delete from my_table where id = ... to the database.

T-SQL Error in non-executing branch of if..else block

I'm hitting my head against the wall with this one.
We have a stored procedure that is being called in an API that we are developing and the stored procedure has the following code:
if (@StatusCode = 41 and @OperationName != 'convert')
Begin
  EXEC [uspCreateOrg] @RequestID = @_RequestId
End
else
Begin
  EXEC [uspUpsertOrg] @RequestID = @_RequestId
End
Using the profiler, we can see that the 'if' branch is the one that gets executed, but we also see that SQL Server is looking down the 'else' branch, calling into that stored procedure, and throwing an exception. The uspUpsertOrg procedure calls the DBAmp BulkOps, which has the following code in it:
create table #errorlog (line varchar(255))
insert into #errorlog
exec @Result = master..xp_cmdshell @Command
-- print output to msgs
declare @line varchar(255)
declare @printCount int
set @printCount = 0
DECLARE tables_cursor CURSOR FOR SELECT line FROM #errorlog
OPEN tables_cursor
FETCH NEXT FROM tables_cursor INTO @line
WHILE (@@FETCH_STATUS <> -1)
BEGIN
  if @line is not null
  begin
    print @line
    exec SF_Logger @SPName, N'Message', @Line
    set @errorLines = @errorLines + @line
    set @printCount = @printCount + 1
  end
  FETCH NEXT FROM tables_cursor INTO @line
END
deallocate tables_cursor
-- drop temp output table
drop table #errorlog
The exception that gets thrown is that the #errorlog table does not exist. So, in summary, we are getting an exception that a temp table created on the line above the insert does not exist, in a stored procedure that does not even get called... Fun...
When we comment out the call to uspUpsertOrg, everything works as expected. When we change the temp table to a real table, it still fails; but if we create the table outside the procedure and then run the process, it works. In none of these cases does the code actually execute the 'else' branch. It's almost as if SQL Server is looking ahead down all code paths. I know that SQL Server does that kind of thing for optimization, but why would it miss the fact that the table IS being created before use? I've done this kind of thing before and never had problems.
Thanks for the help!
According to this article on Execution Plan Basics, this exact scenario causes a problem for the algebrizer, the component that doesn't execute your code but is responsible for generating the execution plan. Look for the section "When the Estimated Plan is Invalid."
I think this workaround will work for you: Instead of the CREATE statement, use
SELECT CAST('' as VARCHAR(255)) as line INTO #errorlog
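In context, the start of the BulkOps snippet would then look something like this (a sketch; note that SELECT ... INTO seeds the table with one empty row, which you may want to delete before the insert):

```sql
-- Create #errorlog via SELECT ... INTO so the plan generator can resolve it
SELECT CAST('' AS VARCHAR(255)) AS line INTO #errorlog
DELETE FROM #errorlog          -- remove the seed row
INSERT INTO #errorlog
EXEC @Result = master..xp_cmdshell @Command
```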
