I am having trouble loading my data from a CSV file. There are some null values, which I think is where the error is coming from. I have tried doing it a few different ways, but it keeps saying the data was truncated at row 4, which is where the nulls start. I am doing it on the MySQL command line. Please let me know what the error is and how it can be fixed!
This is the first one I tried:
load data infile 'c:/ProgramData/MySQL/MySQL Server 8.0/Uploads/wind.csv'
into table wind fields terminated by ',' lines terminated by '\r\n'
(stationname, year, month, #vfour) SET windspeed = NULLIF(#vfour,'');
This is the second:
load data infile 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/wind.csv'
into table wind fields terminated by ',' lines terminated by '\n';
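For what it's worth, the documented way to map empty CSV fields to NULL in LOAD DATA INFILE is to read the field into a user variable (written with an @ prefix) and convert it in the SET clause. A minimal sketch along the lines of the first attempt, reusing the file path and columns from above (the @wind variable name is just illustrative):
load data infile 'c:/ProgramData/MySQL/MySQL Server 8.0/Uploads/wind.csv'
into table wind
fields terminated by ','
lines terminated by '\r\n'
(stationname, year, month, @wind)
set windspeed = NULLIF(@wind, '');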
I followed the Kylin tutorial and was able to create a Kylin model and cube successfully. The Kylin cube build also completed successfully.
I created a fact table and a lookup table as follows:
create table sales_fact(product_id int,state_id int,location_id string,sales_count int)
row format delimited
fields terminated by ','
lines terminated by '\n'
stored as textfile;
create table state_details(state_id int,state_name string)
row format delimited
fields terminated by ','
lines terminated by '\n'
stored as textfile;
I loaded these tables with the following data:
fact_table
1000,1,AP1,50
1000,2,KA1,100
1001,2,KA1,50
1002,1,AP1,50
1003,3,TL1,100
state_details
1,AP
2,Karnataka
3,Telangana
4,kerala
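For reference, comma-delimited text files like these are normally loaded into such tables with Hive's LOAD DATA statement; a minimal sketch (the local file paths are illustrative, not from the original post):
LOAD DATA LOCAL INPATH '/tmp/sales_fact.txt' INTO TABLE sales_fact;
LOAD DATA LOCAL INPATH '/tmp/state_details.txt' INTO TABLE state_details;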
But when I run a simple query such as:
select sales_count from sales_fact where state_name="Karnataka";
it fails with this error:
Error while executing SQL "select sales_count from sales_fact where state_name="Karnataka" LIMIT 50000": From line 1, column 42 to line 1, column 51: Column 'STATE_NAME' not found in any table
I am not able to find the cause. If anybody has any idea, please tell me.
state_name is not a column of the sales_fact table, so you need to join to state_details. Please try:
select sales_count from sales_fact as f inner join state_details as d on f.state_id = d.state_id where d.state_name='Karnataka';
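With the sample data above, state_id 2 corresponds to Karnataka, so this join should return two rows with sales_count values of 100 and 50.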
I am trying to bulk insert a CSV file into SQL Server 2014. I have managed to work out that it is not recognising the end-of-line character in the CSV file.
The error I get is:
Msg 4864, Level 16, State 1, Procedure spTempImport, Line 74 Bulk load
data conversion error (type mismatch or invalid character for the
specified codepage) for row 1, column 51 (Val_0000).
I have read other posts on here and tried all of their suggestions, but I cannot get it to work.
The SPROC is as follows:
USE [xxxxxxx]
GO
/****** Object: StoredProcedure [dbo].[spTempImport] Script Date: 26/07/2016 19:59:56 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[++++++++]
AS
BEGIN
SET NOCOUNT ON
CREATE TABLE #tmp (
--[BatchID] NVARCHAR(MAX) NOT NULL,
[ID] NVARCHAR(50) NOT NULL,
[Serial] NVARCHAR(30) NOT NULL,
[Date] VARCHAR(15) NOT NULL,
[Val_0030] NVARCHAR(15) NULL,
[Val_0000] NVARCHAR(15) NULL
)
BULK INSERT #tmp
FROM 'C:\Temp\test1.csv'
WITH
(
FIRSTROW = 1,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\r\n'
)
SELECT * FROM #tmp
END
I have tried using \r \n \n\r \rn CR\LF LF\CR and the various hex codes. I created the file in Excel and saved it as a CSV and it didn't work. I copied the data into Notepad and it won't recognise the line feed/carriage return.
What am I doing wrong?
P.S. I have removed some of the columns in the code example; the Val columns are all the same apart from the name, and [Val_0000] NVARCHAR(15) NULL is the final column.
I set the columns in the SPROC to VARCHAR(MAX) and it worked.
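For reference, a minimal sketch of that change, assuming every imported column in the temp table is switched to VARCHAR(MAX) as described:
CREATE TABLE #tmp (
[ID] VARCHAR(MAX) NOT NULL,
[Serial] VARCHAR(MAX) NOT NULL,
[Date] VARCHAR(MAX) NOT NULL,
[Val_0030] VARCHAR(MAX) NULL,
[Val_0000] VARCHAR(MAX) NULL
)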
I am constantly receiving a DB error saying no column was found, even though I have recreated the column and verified it many times.
Below is the table structure:
CREATE TABLE "ContractorTester" ("ContrTestID" VARCHAR NOT NULL ,"Ack" VARCHAR NOT NULL ,"TesterLName" VARCHAR,"TesterFName" VARCHAR,"GaugeName1" VARCHAR,"GaugeMake1" VARCHAR,"TestCrossConLic" VARCHAR,"CCLicExpDate" VARCHAR,"GaugeSerialNum1" VARCHAR,"GaugeCalibrDate1" VARCHAR,"ContrCompanyName1" VARCHAR,"ContrAddr1" VARCHAR,"ContrCity1" VARCHAR,"ContrState1" VARCHAR,"ContrZip1" VARCHAR,"ContrPhone1" VARCHAR,"Lat" DOUBLE,"Log" DOUBLE,"MCreatedDate" VARCHAR,"MUpdatedDate" VARCHAR,"ActLocalCT
" VARCHAR,"ContrTestTranID" VARCHAR PRIMARY KEY )
Below is the insert query:
INSERT OR REPLACE INTO ContractorTester ('GaugeName1','GaugeMake1','ContrPhone1','ContrTestTranID','TesterFName','ContrAddr1','ContrCity1','ContrZip1','CCLicExpDate','TestCrossConLic','GaugeCalibrDate1','Ack','TesterLName','ContrTestID','ContrState1','ContrCompanyName1','GaugeSerialNum1','ActLocalCT','Log','Lat') VALUES ('TK-99F','MIDWEST','(847) 111-3314','0','Jack','819 Main1','Lake Zurich','60051','2016-04-17T00:00:00.003','XC3673','2015-04-17T00:00:00.003','0','Skirm','5','IL','American Backflow Prevention Inc.','TG0605','1','0','0')
Below is the error:
SQLiteManager: Likely SQL syntax error: INSERT OR REPLACE INTO ContractorTester ('GaugeName1','GaugeMake1','ContrPhone1','ContrTestTranID','TesterFName','ContrAddr1','ContrCity1','ContrZip1','CCLicExpDate','TestCrossConLic','GaugeCalibrDate1','Ack','TesterLName','ContrTestID','ContrState1','ContrCompanyName1','GaugeSerialNum1','ActLocalCT','Log','Lat') VALUES ('TK-99F','MIDWEST','(847) 111-3314','0','Jack','819 Main1','Lake Zurich','60051','2016-04-17T00:00:00.003','XC3673','2015-04-17T00:00:00.003','0','Skirm','5','IL','American Backflow Prevention Inc.','TG0605','1','0','0') [ table ContractorTester has no column named ActLocalCT ]
Exception Name: NS_ERROR_FAILURE
Exception Message: Component returned failure code: 0x80004005 (NS_ERROR_FAILURE) [mozIStorageConnection.createStatement]
Please, can anyone review this and let me know what is wrong with the above queries?
Thanks in advance.
Replace
INSERT OR REPLACE INTO ContractorTester
with
INSERT OR REPLACE INTO ContractorTester VALUES
Without VALUES, the list in parentheses is the list of columns to insert into, not the list of values to insert.
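As an additional check, PRAGMA table_info (a built-in SQLite command) lists the exact column names the table was created with:
PRAGMA table_info(ContractorTester);
Any stray whitespace inside a quoted column name in the CREATE TABLE statement would show up in that listing.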
I have the following 3 records stored in a text file...
But the SQL script will need to process 4.1 million records in good time...
//Instruction:
Copy this to a text file named: Deceased.txt
000101001118 IDENTITY NUMBER NOT NUMERIC
0001010061181PERSON DECEASED 19990101OBSTRUCTIVE AIRWAYS SYNDROME BABA NOWEZILE
0001010077097 COERTZEN AZIL CUBITT JONO
I need to write a query (it could be a function or a stored procedure) that does the following:
Takes each record and copies it to a SQL table.
The table will have the following columns:
National_id Errmsg DeceasedDTE DeceasedReason Surname FirstNames
First_Initial Second_Initial Third_Initial FName1 FName2 FName3
Here are the offset values for the first 6 columns only:
,LTRIM(SUBSTRING([TABLE],1,13)) --National_id
,LTRIM(SUBSTRING([TABLE],14,43)) --Errmsg
,LTRIM(SUBSTRING([TABLE],57,8)) --DeceasedDTE
,LTRIM(SUBSTRING([TABLE],65,50)) --DeceasedReason
,LTRIM(SUBSTRING([TABLE],115,45)) --Surname
,LTRIM(SUBSTRING([TABLE],158,50)) --FirstNames
Note: I need to use the FirstNames column to help populate the remaining columns.
Also, FirstNames is separated by spaces, and I need it split up into FName1, FName2 and FName3, with the corresponding first letter of each making up the initial.
I then need to create a script, sent to a .txt file, that creates an insert statement for each record with the following columns: national_id, surname, First_Initial, Second_Initial, Third_Initial, First_Name, Second_Name, Third_Name
E.G.
Set #Insert = "insert into prodmgr.t_unverified (national_id, surname, First_Initial, Second_Initial,Third_Initial, First_Name, Second_Name, Third_Name) values ('"
I used BULK INSERT to try to make this process more efficient, but I have not been successful as yet. The script will need to process millions of records.
Please help :)
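A minimal sketch of one way to put the pieces above together, assuming a single-column staging table, a parsed temp table and a CHARINDEX-based split for up to three first names (the file path, the #Deceased_Raw and #Deceased_Parsed names, the line-ending choice and the splitting logic are illustrative assumptions, not from the original post):
-- Stage each raw fixed-width line into one wide column
CREATE TABLE #Deceased_Raw ([TABLE] VARCHAR(500));

BULK INSERT #Deceased_Raw
FROM 'C:\Temp\Deceased.txt'
WITH (ROWTERMINATOR = '\r\n');   -- assumes Windows line endings

-- Slice each line using the offsets listed above
SELECT
     LTRIM(RTRIM(SUBSTRING([TABLE],1,13)))   AS National_id
    ,LTRIM(RTRIM(SUBSTRING([TABLE],14,43)))  AS Errmsg
    ,LTRIM(RTRIM(SUBSTRING([TABLE],57,8)))   AS DeceasedDTE
    ,LTRIM(RTRIM(SUBSTRING([TABLE],65,50)))  AS DeceasedReason
    ,LTRIM(RTRIM(SUBSTRING([TABLE],115,45))) AS Surname
    ,LTRIM(RTRIM(SUBSTRING([TABLE],158,50))) AS FirstNames   -- RTRIM drops the fixed-width padding
INTO #Deceased_Parsed
FROM #Deceased_Raw;

-- Split FirstNames on spaces into up to three names and their initials
SELECT
     d.National_id
    ,d.Surname
    ,LEFT(n.FName1, 1) AS First_Initial
    ,LEFT(n.FName2, 1) AS Second_Initial
    ,LEFT(n.FName3, 1) AS Third_Initial
    ,n.FName1
    ,n.FName2
    ,n.FName3
FROM #Deceased_Parsed d
CROSS APPLY (SELECT CHARINDEX(' ', d.FirstNames) AS p1) a
CROSS APPLY (SELECT CASE WHEN a.p1 = 0 THEN 0
                         ELSE CHARINDEX(' ', d.FirstNames, a.p1 + 1) END AS p2) b
CROSS APPLY (SELECT
     CASE WHEN a.p1 = 0 THEN d.FirstNames
          ELSE LEFT(d.FirstNames, a.p1 - 1) END AS FName1
    ,CASE WHEN a.p1 = 0 THEN NULL
          WHEN b.p2 = 0 THEN SUBSTRING(d.FirstNames, a.p1 + 1, LEN(d.FirstNames))
          ELSE SUBSTRING(d.FirstNames, a.p1 + 1, b.p2 - a.p1 - 1) END AS FName2
    ,CASE WHEN b.p2 = 0 THEN NULL
          ELSE SUBSTRING(d.FirstNames, b.p2 + 1, LEN(d.FirstNames)) END AS FName3) n;
From that result set, the insert statements for prodmgr.t_unverified can be concatenated per row and written out to a .txt file (for example with bcp or sqlcmd), which keeps the whole process set-based for the 4.1 million records.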