I am having trouble loading my data from a CSV file. There are some null values, which I think is where the error is coming from. I have tried doing it a couple of different ways, but it keeps saying the data was truncated at row 4, which is where the nulls start. I am doing this on the MySQL command line. Please let me know what the error is and how it can be fixed!
This is the first one I tried:
load data infile 'c:/ProgramData/MySQL/MySQL Server 8.0/Uploads/wind.csv'
into table wind fields terminated by ',' lines terminated by '\r\n'
(stationname, year, month, #vfour)
SET windspeed = NULLIF(#vfour,'');
This is the second:
load data infile 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/wind.csv'
into table wind fields terminated by ',' lines terminated by '\n';
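For reference, the usual pattern for turning empty CSV fields into NULL with LOAD DATA reads the field into a user variable (prefixed with @; in the mysql client a # starts a comment) and then maps it in SET. A minimal sketch, assuming the empty values sit in the fourth field of wind.csv:

load data infile 'c:/ProgramData/MySQL/MySQL Server 8.0/Uploads/wind.csv'
into table wind
fields terminated by ','
lines terminated by '\r\n'
(stationname, year, month, @vfour)
set windspeed = nullif(@vfour, '');

If the file instead marks missing values with a literal \N, LOAD DATA already loads those as NULL without needing the variable.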
I have two topics: users (contains all the user info) and transactions (contains all the transactions made by the users, including the sender and receiver ids). All of my topics' data is nested.
The first thing I did was CREATE STREAM, then I created another STREAM to rename those nested fields, because PARTITION BY does not accept nested fields for some reason. Everything works great so far. My question is that I want to partition transactions by sender and receiver id so I can join with users: does KSQL accept PARTITION BY on two columns? Do I need to PARTITION BY two columns to get this working, or do I just need to partition by either sender or receiver?
I have tried this, but it came back with an error. I also tried to add PARTITION BY (sender, receiver) at the end and got another error.
ksql> CREATE STREAM transactions WITH (PARTITIONS=1) AS SELECT * FROM flattentransactions PARTITION BY sender,receiver;
line 1:105: mismatched input ',' expecting ';'
Statement: CREATE STREAM transactions WITH (PARTITIONS=1) AS SELECT * FROM flattentransactions PARTITION BY sender,receiver;
Caused by: line 1:105: mismatched input ',' expecting ';'
Caused by: org.antlr.v4.runtime.InputMismatchException
You need to concatenate the columns first:
CREATE STREAM transactions
WITH (PARTITIONS=1) AS
SELECT X,Y,Z,SENDER + RECEIVER AS MSG_KEY
FROM flattentransactions
PARTITION BY MSG_KEY
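For example, assuming the flattened stream exposes plain sender and receiver columns (plus whatever payload fields you need; the names here are illustrative), the full statement would look something like:

CREATE STREAM transactions_rekeyed
  WITH (PARTITIONS=1) AS
  SELECT sender, receiver, sender + receiver AS msg_key
  FROM flattentransactions
  PARTITION BY msg_key;

The stream is then keyed on the single concatenated column, which is what a later join against a matching key on the users side would use.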
Yesterday I was able to figure out how to use the following query to get rid of duplicate results (see "SQL Query to delete duplicate values in a 3 tables inner join with two different databases" for additional info if necessary):
SELECT
    "AM-Martin".dbo.CpCore_Site.Number,
    "AM-Martin".dbo.CpCore_Site.Latitude,
    "AM-Martin".dbo.CpCore_Site.Longitude,
    "AM-Martin".dbo.CpSm_Face.RetiredOn,
    CAST("AM-Martin_bin".dbo.CpCore_Image.Bytes as Varbinary),
    "AM-Martin".dbo.CpCore_Site.Name,
    "AM-Martin".dbo.CpCore_Site.Zipcode
FROM
    "AM-Martin".dbo.CpCore_Site
    INNER JOIN "AM-Martin".dbo.CpSm_Face on "AM-Martin".dbo.CpSm_Face.SiteId = "AM-Martin".dbo.CpCore_Site.Oid
    INNER JOIN "AM-Martin_bin".dbo.CpCore_Image on "AM-Martin".dbo.CpSm_Face.Oid = "AM-Martin_bin".dbo.CpCore_Image.OwnerId
WHERE
    "AM-Martin".dbo.CpSm_Face.RetiredOn LIKE '%9999%'
    AND "AM-Martin".dbo.CpCore_Site.Number LIKE N'%LA%' OR "AM-Martin".dbo.CpCore_Site.Number LIKE N'%LC%' OR "AM-Martin".dbo.CpCore_Site.Number LIKE N'%BH%'
    AND "AM-Martin".dbo.CpCore_Site.Latitude > 0.0
GROUP BY
    "AM-Martin".dbo.CpCore_Site.Number,
    "AM-Martin".dbo.CpCore_Site.Latitude,
    "AM-Martin".dbo.CpCore_Site.Longitude,
    "AM-Martin".dbo.CpSm_Face.RetiredOn,
    CAST("AM-Martin_bin".dbo.CpCore_Image.Bytes as Varbinary),
    "AM-Martin".dbo.CpCore_Site.Name,
    "AM-Martin".dbo.CpCore_Site.Zipcode;
This query allowed me to use GROUP BY on the images column. Afterwards I took the data, put it into SQL statements, and imported them into my PostgreSQL DB. When I click on an image value I am unable to open it; the message I get in my program (RazorSQL) is:
Unable to display image
I also checked directly in the SQL Server table where I ran the query, and I am unable to open the images there either. I suspect it has to do with this line:
CAST("AM-Martin_bin".dbo.CpCore_Image.Bytes as Varbinary)
Now the issue is that the image values are encoded, and I do not know how to decode them when I transfer them over with SQL statements like the one below:
CREATE TABLE "map" (
number varchar(32) NOT NULL,
latitude float NOT NULL,
longitude float NOT NULL,
retiredon timestamp NOT NULL,
image bytea NOT NULL,
name varchar(256) NOT NULL,
zipcode varchar(11) NOT NULL
);
INSERT INTO map("number", "latitude", "longitude", "retiredon", "image", "name", "zipcode") VALUES ('BH-0001', 34.059858, -118.376056, '9999-12-31 00:00:00.0', decode('FFD8FFE000104A46494600010100000100010000FFDB0043000604050605','hex'), 'NB La Cienega Blvd FS Olympic Blvd NEC', '90035');
I want to render the images on my website, but they appear to be encoded. How do I decode them when I bring them over to PostgreSQL from SQL Server in the SQL statement?
Update
I tried to render the images in my Rails project without the image column and everything works smoothly. As soon as I try to render the image data I get the following error message:
Encoding::UndefinedConversionError
You may want to consider translating your bytes to hex before migrating the images into the database as strings, and then translating them back to byte form before displaying. See this blog for help with the translations.
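For instance, a minimal sketch of that translation (not the exact migration script; the table and column names are taken from the query above): on the SQL Server side, CONVERT with style 2 emits the varbinary column as a plain hex string, which is what PostgreSQL's decode(..., 'hex') expects on the other end.

-- SQL Server: style 2 produces hex digits without the leading 0x
SELECT CONVERT(varchar(max), "AM-Martin_bin".dbo.CpCore_Image.Bytes, 2) AS ImageHex
FROM "AM-Martin_bin".dbo.CpCore_Image;

-- PostgreSQL: turn that hex string back into bytea on insert
-- INSERT INTO map (..., image, ...) VALUES (..., decode('<hex string here>', 'hex'), ...);

Either way the stored value is binary image data, not text, so whatever renders it (RazorSQL, the Rails app) has to treat it as an image rather than try to convert it to a string encoding.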
Alternatively, you could always try hosting your images on AWS and saving the image url as a string in your database.
I followed the Kylin tutorial and was able to create a Kylin model and a Kylin cube successfully. The Kylin cube build also completed successfully.
I created the fact table and a state lookup table as follows:
create table sales_fact(product_id int,state_id int,location_id string,sales_count int)
row format delimited
fields terminated by ','
lines terminated by '\n'
stored as textfile;
create table state_details(state_id int,state_name string)
row format delimited
fields terminated by ','
lines terminated by '\n'
stored as textfile;
I loaded these tables with the following data:
fact_table
1000,1,AP1,50
1000,2,KA1,100
1001,2,KA1,50
1002,1,AP1,50
1003,3,TL1,100
state_details
1,AP
2,Karnataka
3,Telangana
4,kerala
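For reference, a typical way to load delimited files like these into the Hive tables would be something along these lines (the file names and paths here are assumptions):

load data local inpath '/tmp/fact_table.csv' into table sales_fact;
load data local inpath '/tmp/state_details.csv' into table state_details;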
But when I run a simple query such as:
select sales_count from sales_fact where state_name="Karnataka";
I get the following error:
Error while executing SQL "select sales_count from sales_fact where state_name="Karnataka" LIMIT 50000": From line 1, column 42 to line 1, column 51: Column 'STATE_NAME' not found in any table
I am not able to find the cause. If anybody has any idea, please tell me.
state_name is not a column of sales_fact; please try:
select sales_count
from sales_fact as f
inner join state_details as d on f.state_id = d.state_id
where d.state_name = 'Karnataka';
I am constantly receiving a DB error, "no column found", even though I have recreated the column and verified it many times.
Below is the table structure:
CREATE TABLE "ContractorTester" ("ContrTestID" VARCHAR NOT NULL ,"Ack" VARCHAR NOT NULL ,"TesterLName" VARCHAR,"TesterFName" VARCHAR,"GaugeName1" VARCHAR,"GaugeMake1" VARCHAR,"TestCrossConLic" VARCHAR,"CCLicExpDate" VARCHAR,"GaugeSerialNum1" VARCHAR,"GaugeCalibrDate1" VARCHAR,"ContrCompanyName1" VARCHAR,"ContrAddr1" VARCHAR,"ContrCity1" VARCHAR,"ContrState1" VARCHAR,"ContrZip1" VARCHAR,"ContrPhone1" VARCHAR,"Lat" DOUBLE,"Log" DOUBLE,"MCreatedDate" VARCHAR,"MUpdatedDate" VARCHAR,"ActLocalCT
" VARCHAR,"ContrTestTranID" VARCHAR PRIMARY KEY )
Below is Insert Query:
INSERT OR REPLACE INTO ContractorTester ('GaugeName1','GaugeMake1','ContrPhone1','ContrTestTranID','TesterFName','ContrAddr1','ContrCity1','ContrZip1','CCLicExpDate','TestCrossConLic','GaugeCalibrDate1','Ack','TesterLName','ContrTestID','ContrState1','ContrCompanyName1','GaugeSerialNum1','ActLocalCT','Log','Lat') VALUES ('TK-99F','MIDWEST','(847) 111-3314','0','Jack','819 Main1','Lake Zurich','60051','2016-04-17T00:00:00.003','XC3673','2015-04-17T00:00:00.003','0','Skirm','5','IL','American Backflow Prevention Inc.','TG0605','1','0','0')
Below is the error:
SQLiteManager: Likely SQL syntax error: INSERT OR REPLACE INTO ContractorTester ('GaugeName1','GaugeMake1','ContrPhone1','ContrTestTranID','TesterFName','ContrAddr1','ContrCity1','ContrZip1','CCLicExpDate','TestCrossConLic','GaugeCalibrDate1','Ack','TesterLName','ContrTestID','ContrState1','ContrCompanyName1','GaugeSerialNum1','ActLocalCT','Log','Lat') VALUES ('TK-99F','MIDWEST','(847) 111-3314','0','Jack','819 Main1','Lake Zurich','60051','2016-04-17T00:00:00.003','XC3673','2015-04-17T00:00:00.003','0','Skirm','5','IL','American Backflow Prevention Inc.','TG0605','1','0','0') [ table ContractorTester has no column named ActLocalCT ]
Exception Name: NS_ERROR_FAILURE
Exception Message: Component returned failure code: 0x80004005 (NS_ERROR_FAILURE) [mozIStorageConnection.createStatement]
Please can anyone review this and let me know what is wrong in the above queries?
Thanks in advance.
Replace
INSERT OR REPLACE INTO ContractorTester
with
INSERT OR REPLACE INTO ContractorTester VALUES
Without the VALUES the list in parens is the list of columns to insert into, not the list of values to insert.
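If the error persists, it can also be worth checking the column names SQLite actually stored, since a stray character inside a quoted name (note the line break inside the quoted "ActLocalCT" column in the CREATE TABLE above) makes the stored name different from the ActLocalCT used in the INSERT. For example:

PRAGMA table_info(ContractorTester);
-- or
SELECT sql FROM sqlite_master WHERE name = 'ContractorTester';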
I have the following 3 records stored in a text file...
But the SQL script will need to process 4.1 million records in good time...
//Instruction:
Copy this to a text file named: Deceased.txt
000101001118 IDENTITY NUMBER NOT NUMERIC
0001010061181PERSON DECEASED 19990101OBSTRUCTIVE AIRWAYS SYNDROME BABA NOWEZILE
0001010077097 COERTZEN AZIL CUBITT JONO
I need to write a query (it could be a function or a stored procedure) that does the following:
Takes each record and copies it to a SQL table.
The table will have the following columns:
National_id Errmsg DeceasedDTE DeceasedReason Surname FirstNames
First_Initial Second_Initial Third_Initial FName1 FName2 FName3
Here are the offset values for the first 6 columns only...
,LTRIM(SUBSTRING([TABLE],1,13)) --National_id
,LTRIM(SUBSTRING([TABLE],14,43)) --Errmsg
,LTRIM(SUBSTRING([TABLE],57,8)) --DeceasedDTE
,LTRIM(SUBSTRING([TABLE],65,50)) --DeceasedReason
,LTRIM(SUBSTRING([TABLE],115,45)) --Surname
,LTRIM(SUBSTRING([TABLE],158,50)) --FirstNames
Notes: I need to use the FirstNames column to help populate the remaining columns.
Also, FirstNames is separated by spaces, and I need it split up into FName1, FName2 and FName3, with the corresponding first letter of each making up the initial.
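A rough sketch of one way to do that split in T-SQL (dbo.Deceased is a placeholder name for wherever the FirstNames column ends up; the logic takes the text up to the first space, then up to the next space, then the remainder):

SELECT
     FirstNames
    ,FName1 = LEFT(fn, CHARINDEX(' ', fn + ' ') - 1)
    ,FName2 = CASE WHEN CHARINDEX(' ', fn) > 0
                   THEN LEFT(rest, CHARINDEX(' ', rest + ' ') - 1)
                   ELSE '' END
    ,FName3 = CASE WHEN CHARINDEX(' ', rest) > 0
                   THEN LTRIM(SUBSTRING(rest, CHARINDEX(' ', rest) + 1, 50))
                   ELSE '' END
    ,First_Initial = LEFT(fn, 1)
FROM dbo.Deceased                                                                          -- hypothetical table
CROSS APPLY (SELECT fn   = LTRIM(RTRIM(FirstNames))) AS a                                  -- trimmed FirstNames value
CROSS APPLY (SELECT rest = LTRIM(SUBSTRING(fn, CHARINDEX(' ', fn + ' ') + 1, 100))) AS b;  -- everything after the first name

Second_Initial and Third_Initial follow the same pattern, taking the first character of the FName2 and FName3 expressions (or of the stored columns once they are populated).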
I then need to create a script, sent to a .txt file, that creates an insert statement for each record with the following columns: national_id, surname, First_Initial, Second_Initial, Third_Initial, First_Name, Second_Name, Third_Name.
E.G.
Set #Insert = "insert into prodmgr.t_unverified (national_id, surname, First_Initial, Second_Initial,Third_Initial, First_Name, Second_Name, Third_Name) values ('"
I used BULK INSERT to make this process more efficient, but have not been successful as yet. The script will need to process millions of records.
Please help :)
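A rough sketch of the overall flow, assuming the file is bulk-loaded into a one-column staging table first and then sliced with the offsets listed above (the staging table name, file path and column widths are assumptions):

-- 1. Staging table: one wide column holding each raw fixed-width line
CREATE TABLE dbo.Deceased_Staging ([TABLE] varchar(300));

-- 2. Load the whole file in one pass (ROWTERMINATOR may need to be '\r\n' depending on the file)
BULK INSERT dbo.Deceased_Staging
FROM 'C:\Deceased.txt'              -- hypothetical path
WITH (ROWTERMINATOR = '\n');

-- 3. Slice each line into columns using the offsets from the question
INSERT INTO dbo.Deceased (National_id, Errmsg, DeceasedDTE, DeceasedReason, Surname, FirstNames)   -- dbo.Deceased is a placeholder target
SELECT
     LTRIM(SUBSTRING([TABLE],   1, 13))   -- National_id
    ,LTRIM(SUBSTRING([TABLE],  14, 43))   -- Errmsg
    ,LTRIM(SUBSTRING([TABLE],  57,  8))   -- DeceasedDTE
    ,LTRIM(SUBSTRING([TABLE],  65, 50))   -- DeceasedReason
    ,LTRIM(SUBSTRING([TABLE], 115, 45))   -- Surname
    ,LTRIM(SUBSTRING([TABLE], 158, 50))   -- FirstNames
FROM dbo.Deceased_Staging;

The FName1/FName2/FName3 and initial columns can then be filled with the name-splitting expressions sketched above, and the prodmgr.t_unverified insert statements generated from that table.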