SQL Server 2014 CSV Import not recognising end of row/line - stored-procedures

I am trying to bulk insert a CSV file into SQL Server 2014. I have worked out that it is not recognising the end-of-line character in the CSV file.
The error I get is:
Msg 4864, Level 16, State 1, Procedure spTempImport, Line 74 Bulk load
data conversion error (type mismatch or invalid character for the
specified codepage) for row 1, column 51 (Val_0000).
I have read other posts on here and have tried them all and I cannot get it to work.
The SPROC is as follows:
USE [xxxxxxx]
GO
/****** Object: StoredProcedure [dbo].[spTempImport] Script Date: 26/07/2016 19:59:56 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[++++++++]
AS
BEGIN
SET NOCOUNT ON
CREATE TABLE #tmp (
--[BatchID] NVARCHAR(MAX) NOT NULL,
[ID] NVARCHAR(50) NOT NULL,
[Serial] NVARCHAR(30) NOT NULL,
[Date] VARCHAR(15) NOT NULL,
[Val_0030] NVARCHAR(15) NULL,
[Val_0000] NVARCHAR(15) NULL
)
BULK INSERT #tmp
FROM 'C:\Temp\test1.csv'
WITH
(
FIRSTROW = 1,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\r\n'
)
SELECT * FROM #tmp
I have tried using \r, \n, \n\r, \rn, CR\LF, LF\CR, and the various hex codes. I created the file in Excel and saved it as a CSV, and it didn't work. I copied the data into Notepad and it still won't recognise the line feed/carriage return.
What am I doing wrong?
P.S. I have removed some of the columns in the code example; the Val columns are identical apart from the name, and [Val_0000] NVARCHAR(15) NULL is the final column.

I set the columns in the SPROC to VARCHAR(MAX) and it worked.
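Before resorting to wider columns, it can help to confirm which terminator the file actually contains, since Excel and Notepad can produce different endings. A quick sketch in Python (the helper name and file path are illustrative):

```python
# Sketch: inspect which line terminator a CSV actually uses before
# choosing a ROWTERMINATOR for BULK INSERT.
def detect_row_terminator(data: bytes) -> str:
    """Return the dominant line terminator in raw bytes, escaped for display."""
    crlf = data.count(b"\r\n")
    lf = data.count(b"\n") - crlf   # bare LF, excluding those inside CRLF
    cr = data.count(b"\r") - crlf   # bare CR, excluding those inside CRLF
    best = max((crlf, "\\r\\n"), (lf, "\\n"), (cr, "\\r"))
    return best[1] if best[0] else ""

# Usage: open in binary mode so nothing normalises the endings first.
# with open(r"C:\Temp\test1.csv", "rb") as f:
#     print(detect_row_terminator(f.read()))
```

If the file turns out to use a bare line feed, BULK INSERT's hex form of the terminator (for example `ROWTERMINATOR = '0x0a'`) is another option to try.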

Related

Calling a Sybase stored procedure from another stored procedure without displaying called stored procedure results

I am calling a Sybase stored procedure X from another stored procedure Y. Following the answer to a similar question, I create a #tmp_table to hold the results from stored procedure X.
create table #tmp_table(
col1 int,
col2 varchar(100),
...
) exec sp_stored_procedureX 888, 'Parameter2', ...
select * from #tmp_table
The above successfully loads stored procedure X's results into #tmp_table but it shows the results of stored procedure X twice. I guess the first one is from "exec sp_stored_procedureX ..." part and the second one is from "select * from #tmp_table" which I intended. I don't want to display the results from the first "exec sp_stored_procedureX ..." part. How can I store data to #tmp_table without displaying it?
Please kindly let me know if more clarification/information is needed.
Thanks & Regards,
Kyoto
Your syntax is incorrect for a normal table in ASE. But ASE has a special kind of table, an RPC proxy table, that can map the output of a procedure to a table-format output. Maybe that's what you are looking for... and it can also be called from a remote ASE.
Here's a sample --
use omni_rpc
go
create table rmtbl
(
aint int null,
bchr char(10) null,
cchr char(10) null
)
go
insert rmtbl values (11, "b_row1", "c_row1")
insert rmtbl values (22, "b_row2", "c_row2")
insert rmtbl values (33, "b_row3", "c_row3")
insert rmtbl values (44, "b_row4", "c_row4")
insert rmtbl values (55, "b_row5", "c_row6")
go
create proc procImm @Colnames varchar(100), @NameT varchar(20), @nameCol varchar(20), @value char(2)
as
execute ('select ' + @Colnames + ' from ' + @NameT + ' where '
+ @nameCol + ' = ' + @value)
Here @NameT and @Colnames are command parameters, and @value is a search parameter based on the terms defined earlier.
In the local server:
use test
go
sp_addobjectdef myrpc_imm, "THIS.omni_rpc..procImm", "rpc"
go
(return status = 0)
create existing table myrpc_imm
(
NameT varchar(20),
nameCol varchar(20),
value varchar(10)
)
external procedure at "THIS.omni_rpc..procImm"
go
select * from myrpc_imm
where NameT = 'rmtbl' and nameCol = 'aint' and value = '33'
go
NameT nameCol value
-------------------- -------------------- ----------
(0 rows affected)
dbcc traceon(11225)
go
00:00000:00017:2004/04/01 12:18:47.03 server DBCC TRACEON 11225, SPID 17
DBCC execution completed. If DBCC printed error messages, contact a user with
System Administrator (SA) role.
select * from myrpc_imm
where NameT = 'rmtbl' and nameCol = 'aint' and value = '33'
go
NameT nameCol value
-------------------- -------------------- ----------
33 b_row3 c_row3
(1 row affected)

Is there any way to do error handling in snowflake?

I am currently loading data from one Snowflake table to another, doing some datatype conversions during the load.
But when there is any error, the load fails. I need to capture the error rows in a table and continue the load even when errors occur.
I have tried this with a stored procedure, as below, but I am only able to capture the error information:
Please let me know if there is any way to achieve this in Snowflake.
CREATE OR REPLACE PROCEDURE LOAD_TABLE_A()
RETURNS varchar
NOT NULL
LANGUAGE javascript
AS
$$
var result;
var sql_command = "insert into TABLE A"
sql_command += " select"
sql_command += " migration_status,to_date(status_date,'ddmmyyyy') as status_date,"
sql_command += " to_time(status_time,'HH24MISS') as status_time,unique_unit_of_migration_number,reason,"
sql_command += " to_timestamp_ntz(current_timestamp) as insert_date_time"
sql_command += " from TABLE B"
sql_command += " where insert_date_time>(select max(insert_date_time) from TABLE A);"
try {
snowflake.execute({ sqlText: sql_command});
result = "Succeeded";
}
catch (err) {
result = "Failed";
snowflake.execute({
sqlText: `insert into mcs_error_log VALUES (?,?,?,?)`
,binds: [err.code, err.state, err.message, err.stackTraceTxt]
});
}
return result;
$$;
I worked through an example how to send good rows from one table to another while sending bad ones to a separate table. It should be on the Snowflake blog shortly. The key is using multi-table inserts like so:
-- Create a staging table with all columns defined as strings.
-- This will hold all raw values from the load files.
create or replace table SALES_RAW
( -- Actual Data Type
SALE_TIMESTAMP string, -- timestamp
ITEM_SKU string, -- int
PRICE string, -- number(10,2)
IS_TAXABLE string, -- boolean
COMMENTS string -- string
);
-- Create the production table with actual data types.
create or replace table SALES_STAGE
(
SALE_TIMESTAMP timestamp,
ITEM_SKU int,
PRICE number(10,2),
IS_TAXABLE boolean,
COMMENTS string
);
-- Simulate adding some rows from a load file. Two rows are good.
-- Four rows generate errors when converting to the data types.
insert into SALES_RAW
(SALE_TIMESTAMP, ITEM_SKU, PRICE, IS_TAXABLE, COMMENTS)
values
('2020-03-17 18:21:34', '23289', '3.42', 'TRUE', 'Good row.'),
('2020-17-03 18:21:56', '91832', '1.41', 'FALSE', 'Bad row: SALE_TIMESTAMP has the month and day transposed.'),
('2020-03-17 18:22:03', '7O242', '2.99', 'T', 'Bad row: ITEM_SKU has a capital "O" instead of a zero.'),
('2020-03-17 18:22:10', '53921', '$6.25', 'F', 'Bad row: PRICE should not have a dollar sign.'),
('2020-03-17 18:22:17', '90210', '2.49', 'Foo', 'Bad row: IS_TAXABLE cannot be converted to true or false'),
('2020-03-17 18:22:24', '80386', '1.89', '1', 'Good row.');
-- Make sure the rows inserted okay.
select * from SALES_RAW;
-- Create a table to hold the bad rows.
create or replace table SALES_BAD_ROWS like SALES_RAW;
-- Insert good rows into SALES_STAGE and
-- bad rows into SALES_BAD_ROWS
insert first
when SALE_TIMESTAMP_X is null and SALE_TIMESTAMP is not null or
ITEM_SKU_X is null and ITEM_SKU is not null or
PRICE_X is null and PRICE is not null or
IS_TAXABLE_X is null and IS_TAXABLE is not null
then
into SALES_BAD_ROWS
(SALE_TIMESTAMP, ITEM_SKU, PRICE, IS_TAXABLE, COMMENTS)
values
(SALE_TIMESTAMP, ITEM_SKU, PRICE, IS_TAXABLE, COMMENTS)
else
into SALES_STAGE
(SALE_TIMESTAMP, ITEM_SKU, PRICE, IS_TAXABLE, COMMENTS)
values
(SALE_TIMESTAMP_X, ITEM_SKU_X, PRICE_X, IS_TAXABLE_X, COMMENTS)
select try_to_timestamp (SALE_TIMESTAMP) as SALE_TIMESTAMP_X,
try_to_number (ITEM_SKU, 10, 0) as ITEM_SKU_X,
try_to_number (PRICE, 10, 2) as PRICE_X,
try_to_boolean (IS_TAXABLE) as IS_TAXABLE_X,
COMMENTS,
SALE_TIMESTAMP,
ITEM_SKU,
PRICE,
IS_TAXABLE
from SALES_RAW;
-- Examine the two good rows
select * from SALES_STAGE;
-- Examine the four bad rows
select * from SALES_BAD_ROWS;
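The route-on-conversion-failure pattern above can be sketched outside SQL as well; a minimal Python analogue, where the try_to_* helpers mirror the Snowflake functions in name only and are not any real API:

```python
# Sketch: TRY_TO_*-style helpers return None instead of raising, and a
# raw row is routed to "bad" if any conversion failed on a non-null value.
from datetime import datetime
from decimal import Decimal, InvalidOperation

def try_to_timestamp(s):
    try:
        return datetime.strptime(s, "%Y-%m-%d %H:%M:%S")
    except (ValueError, TypeError):
        return None

def try_to_number(s):
    try:
        return Decimal(s)
    except (InvalidOperation, TypeError):
        return None

def route(rows):
    good, bad = [], []
    for ts, price in rows:
        ts_x, price_x = try_to_timestamp(ts), try_to_number(price)
        failed = (ts_x is None and ts is not None) or \
                 (price_x is None and price is not None)
        (bad if failed else good).append((ts, price))
    return good, bad

good, bad = route([("2020-03-17 18:21:34", "3.42"),
                   ("2020-17-03 18:21:56", "1.41"),   # month/day transposed
                   ("2020-03-17 18:22:10", "$6.25")]) # stray dollar sign
```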
Load error information is captured by Snowflake and can be accessed by querying the COPY_HISTORY table function.
https://docs.snowflake.net/manuals/sql-reference/functions/copy_history.html
Within the COPY INTO command you can decide how to proceed with a file if one or more rows fail the load process by using the ON_ERROR parameter.
https://docs.snowflake.net/manuals/sql-reference/sql/copy-into-table.html#copy-options-copyoptions
I recommend you check out try_cast.
https://docs.snowflake.net/manuals/sql-reference/functions/try_cast.html
Also, for your query, I would just use a view, and if performance is an issue, a materialized view.
I think a nice solution is to wrap your SQL calls with a helper method.
For example, let's say that instead of calling snowflake.execute({...}) directly...
You use something like:
EXEC("select * from table1 where x > ?", [param1]);
Inside the EXEC method you can have a try/catch, and you can easily add things like a continue handler or an exit handler where you can put logic to log your errors to a table.
I have assembled a repo with tools and some snippets. Maybe take a look at: https://github.com/orellabac/SnowJS-Helpers
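A sketch of such a wrapper, with the executor injected so the same pattern works inside a Snowflake JavaScript stored procedure (pass snowflake.execute) or in plain Node for testing; EXEC, makeExec, and the logging callback are illustrative names, not a Snowflake API:

```javascript
// Sketch: wrap every SQL call in one helper that catches failures,
// logs them (e.g. to an error table), and lets the load continue.
function makeExec(execute, logError) {
  return function EXEC(sqlText, binds) {
    try {
      return { ok: true, result: execute({ sqlText: sqlText, binds: binds || [] }) };
    } catch (err) {
      logError(err, sqlText);            // e.g. insert into an error table
      return { ok: false, error: err };  // "continue handler": caller decides
    }
  };
}

// Usage with a fake executor, for illustration only:
const errors = [];
const EXEC = makeExec(
  (stmt) => { if (/boom/.test(stmt.sqlText)) throw new Error("bad SQL"); return "rows"; },
  (err, sql) => errors.push(sql)
);
EXEC("select * from table1 where x > ?", [1]); // succeeds
EXEC("select boom");                           // logged, execution continues
```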

Records updating in a weird order when saving to localdb - LINQ

My records update successfully in my controller, but View Data shows weird indexing behaviour.
Now, this would kind of make sense if my ID field wasn't a unique identity, but... scripting to clipboard generates this:
USE [myDb]
GO
/****** Object: Table [dbo].[Contacts] Script Date: 10/25/2018 1:02:01 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Contacts] (
[Id] INT IDENTITY (1, 1) NOT NULL,
[Name] NVARCHAR (255) NOT NULL,
[ContactType] NVARCHAR (255) NOT NULL,
[ContactNumber] BIGINT NOT NULL,
[BirthDate] DATE NOT NULL
);
What am I missing here? This is a small project and it shouldn't really be an issue, because my page loads with the Contact ID and then updates that same ID in my controller... but it'd be nice to know why this is happening, for the future.

iOS: SQLITE INSERT QUERY ERROR

I am constantly receiving a DB error, "no column found", despite having recreated the column and verified it many times.
Below is the table structure:
CREATE TABLE "ContractorTester" ("ContrTestID" VARCHAR NOT NULL ,"Ack" VARCHAR NOT NULL ,"TesterLName" VARCHAR,"TesterFName" VARCHAR,"GaugeName1" VARCHAR,"GaugeMake1" VARCHAR,"TestCrossConLic" VARCHAR,"CCLicExpDate" VARCHAR,"GaugeSerialNum1" VARCHAR,"GaugeCalibrDate1" VARCHAR,"ContrCompanyName1" VARCHAR,"ContrAddr1" VARCHAR,"ContrCity1" VARCHAR,"ContrState1" VARCHAR,"ContrZip1" VARCHAR,"ContrPhone1" VARCHAR,"Lat" DOUBLE,"Log" DOUBLE,"MCreatedDate" VARCHAR,"MUpdatedDate" VARCHAR,"ActLocalCT
" VARCHAR,"ContrTestTranID" VARCHAR PRIMARY KEY )
Below is Insert Query:
INSERT OR REPLACE INTO ContractorTester ('GaugeName1','GaugeMake1','ContrPhone1','ContrTestTranID','TesterFName','ContrAddr1','ContrCity1','ContrZip1','CCLicExpDate','TestCrossConLic','GaugeCalibrDate1','Ack','TesterLName','ContrTestID','ContrState1','ContrCompanyName1','GaugeSerialNum1','ActLocalCT','Log','Lat') VALUES ('TK-99F','MIDWEST','(847) 111-3314','0','Jack','819 Main1','Lake Zurich','60051','2016-04-17T00:00:00.003','XC3673','2015-04-17T00:00:00.003','0','Skirm','5','IL','American Backflow Prevention Inc.','TG0605','1','0','0')
Below is the error:
SQLiteManager: Likely SQL syntax error: INSERT OR REPLACE INTO ContractorTester ('GaugeName1','GaugeMake1','ContrPhone1','ContrTestTranID','TesterFName','ContrAddr1','ContrCity1','ContrZip1','CCLicExpDate','TestCrossConLic','GaugeCalibrDate1','Ack','TesterLName','ContrTestID','ContrState1','ContrCompanyName1','GaugeSerialNum1','ActLocalCT','Log','Lat') VALUES ('TK-99F','MIDWEST','(847) 111-3314','0','Jack','819 Main1','Lake Zurich','60051','2016-04-17T00:00:00.003','XC3673','2015-04-17T00:00:00.003','0','Skirm','5','IL','American Backflow Prevention Inc.','TG0605','1','0','0') [ table ContractorTester has no column named ActLocalCT ]
Exception Name: NS_ERROR_FAILURE
Exception Message: Component returned failure code: 0x80004005 (NS_ERROR_FAILURE) [mozIStorageConnection.createStatement]
Could anyone review this and let me know what is wrong with the above queries?
Thanks in advance.
Replace
INSERT OR REPLACE INTO ContractorTester
with
INSERT OR REPLACE INTO ContractorTester VALUES
Without the VALUES the list in parens is the list of columns to insert into, not the list of values to insert.
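One detail worth noting: in the CREATE TABLE statement above, the quoted identifier "ActLocalCT" appears to contain a line break before its closing quote. SQLite preserves whitespace inside quoted identifiers, so that would create a column whose name is not plain ActLocalCT. A minimal sketch reproducing the symptom:

```python
# Sketch: SQLite keeps whitespace inside quoted identifiers, so a column
# created as "ActLocalCT\n" (stray newline before the closing quote)
# cannot be referenced as plain ActLocalCT.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute('CREATE TABLE t ("ActLocalCT\n" VARCHAR)')  # newline inside the name

try:
    con.execute("INSERT INTO t (ActLocalCT) VALUES ('1')")
    outcome = "ok"
except sqlite3.OperationalError as e:
    outcome = str(e)  # table t has no column named ActLocalCT

con.execute('INSERT INTO t ("ActLocalCT\n") VALUES (\'1\')')  # exact name works
```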

Informix trigger to change inserted values

I would like to change a couple of column values before they get inserted.
I am using Informix as database.
I have a table consisting of 3 columns: Name (NVARCHAR), Type (INT), Plan (NVARCHAR).
Every time a new record is inserted, I would like to check the Name value before inserting it. If the Name starts with an F, I would like to set the Type value to 1 and the Plan Name to "Test"
In short, what I want the trigger to do is:
For every new insertion, first check if Name value starts with F.
If yes, set the Type and Plan to 1 and "Test" then insert.
If no, insert the values as-is.
I have looked up the CREATE TRIGGER statement with BEFORE and AFTER. However, I would like to have a clearer example. My case would probably involve BEFORE though.
The answer from @user3243781 got close, but did not work for me because it returns the error:
-747 Table or column matches object referenced in triggering statement.
This error is returned when a triggered SQL statement acts on the
triggering table, or when both statements are updates, and the column
that is updated in the triggered action is the same as the column that
the triggering statement updates.
So the alternative is to handle the NEW variable directly.
For that you need a procedure created with the trigger-references feature, which means the procedure itself can act on the trigger's NEW row.
Below is my example, which I ran with dbaccess against Informix v11.70.
As far as I remember, this feature is only available in version 11+ of the engine.
create table teste ( Name NVARCHAR(100), Type INT , Plan NVARCHAR(100) );
Table created.
create procedure check_name_values()
referencing new as n for teste ;;
define check_type integer ;;
define check_plan NVARCHAR ;;
if upper(n.name) like 'F%' then
let n.type = 1;;
let n.plan = "Test";;
end if
end procedure ;
Routine created.
;
create trigger trg_tablename_ins
insert on teste
referencing new as new
for each row
(
execute procedure check_name_values() with trigger references
);
Trigger created.
insert into teste values ('cesar',99,'myplan');
1 row(s) inserted.
insert into teste (name) values ('fernando');
1 row(s) inserted.
insert into teste values ('Fernando',100,'your plan');
1 row(s) inserted.
select * from teste ;
name cesar
type 99
plan myplan
name fernando
type 1
plan Test
name Fernando
type 1
plan Test
3 row(s) retrieved.
drop table if exists teste;
Table dropped.
drop procedure if exists check_name_values;
Routine dropped.
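For comparison, the same before-insert rewriting can be approximated in SQLite, which cannot modify NEW directly but can have an AFTER INSERT trigger update the just-inserted row; a sketch for illustration only (table and trigger names are made up, and this is SQLite, not Informix SPL):

```python
# SQLite analogue of the Informix idea: an AFTER INSERT trigger
# immediately rewrites the row when the name starts with F.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE teste (name TEXT, type INTEGER, plan TEXT);
CREATE TRIGGER trg_teste_ins AFTER INSERT ON teste
WHEN upper(NEW.name) LIKE 'F%'
BEGIN
    UPDATE teste SET type = 1, plan = 'Test' WHERE rowid = NEW.rowid;
END;
""")
con.execute("INSERT INTO teste VALUES ('cesar', 99, 'myplan')")
con.execute("INSERT INTO teste VALUES ('Fernando', 100, 'your plan')")
rows = con.execute("SELECT * FROM teste ORDER BY name").fetchall()
# rows: [('Fernando', 1, 'Test'), ('cesar', 99, 'myplan')]
```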
create trigger trg_tablename_ins
insert on tablename
referencing new as new
for each row
(
execute procedure check_name_values
(
new.name,
new.type,
new.plan
)
);
create procedure check_name_values
(
name NVARCHAR,
new_type integer,
new_plan NVARCHAR
)
define check_type integer ;
define check_plan NVARCHAR ;
let check_type = 1;
let check_plan = "Test";
if name like 'F%'
then
insert into tablename (name,type,plan) values (name,check_type,check_plan);
else
insert into tablename (name,type,plan) values (name,new_type,new_plan);
end if ;
end procedure ;
Here is my version, an adaptation of an old example I found in the Informix usenet group.
It is possible to update columns in a trigger statement, but it is not very straightforward: you have to use stored procedures and the INTO clause of the EXECUTE FUNCTION command.
It worked here for IBM Informix Dynamic Server Version 12.10.FC11WE.
drop table if exists my_table;
drop sequence if exists my_table_seq;
create table my_table (
id INTEGER
NOT NULL,
col_a char(32)
NOT NULL,
col_b char(20)
NOT NULL,
hinweis char(64),
uslu char(12)
DEFAULT USER
NOT NULL,
dtlu DATETIME YEAR TO SECOND
DEFAULT CURRENT YEAR TO SECOND
NOT NULL
)
;
create sequence my_table_seq
increment 1
start 1;
drop procedure if exists get_user_datetime();
create function get_user_datetime() returning char(12),datetime year to second;
return user, current year to second;
end function
;
drop trigger if exists ti_my_table;
create trigger ti_my_table insert on my_table referencing new as n for each row (
execute function get_user_datetime() into uslu, dtlu
)
;
drop trigger if exists tu_my_table;
create trigger tu_my_table update on my_table referencing new as n for each row (
execute function get_user_datetime() into uslu, dtlu
)
;
insert into my_table values (my_table_seq.nextval, "a", "b", null, "witz", mdy(1,1,1900)) ;
SELECT *
FROM my_table
WHERE 1=1
;
