Why is SQL CE Toolbox failing to parse my CSV file?

SQLCEToolbox looks very promising.
I installed it, opened VS 2012, selected Tools | SQL Server Compact Toolbox, and quickly and easily set up a connection to my .sdf database, which I had (also quickly and easily) created in WebMatrix.
I then right-clicked the database in SQLCEToolbox's treeview and selected Import from CSV file.
Note: one must explicitly specify the column names on the first line of the file (line/row 0) and explicitly include the ID/primary key/identity values (if the table has such a column).
Once I fixed the various formatting errors in my CSV file, it seemed to import well, but then I got this error message:
Error Code: 80004005
Message : The data was truncated while converting from one data type to another. [ Name of function(if known) = ]
Minor Err.: 25920
Source : SQL Server Compact ADO.NET Data Provider
My data is in the format:
1,SomeCategory,SomeName,SomeURL,Longitude,Latitude
As an example, here's a sample line:
3,ArtGallery, Henry Miller Memorial Library,http://www.henrymiller.org/, -121.7537834,36.2207945
Except for the int ID column, the columns are all nvarchar(50). I then realized that some of my URLs were longer than 50 characters, so in WebMatrix I changed that column to nvarchar(128). On trying the import again, that first problem was apparently solved, but now I get:
Error Code: 80004005
Message : The column cannot be modified. [ Column name = ID ]
Minor Err.: 25004
Source : SQL Server Compact ADO.NET Data Provider
Err. Par. : ID
Okay, I reason: since the ID column is updated automatically, I shouldn't have it in my CSV file at all. So I changed the file to this:
Category,Name,URL,Longitude,Latitude
ArtGallery, Henry Miller Memorial Library,http://www.henrymiller.org/, -121.7537834,36.2207945
...but when I tried importing the records that way (with "ID" removed from row 0 and the corresponding value removed from each line/record), I get:
Error Code: 80004005
Message : The data was truncated while converting from one data type to another. [ Name of function(if known) = ]
Minor Err.: 25920
Source : SQL Server Compact ADO.NET Data Provider
I tried to post a question at http://sqlcetoolbox.codeplex.com/, but doing so simply took me to a generic CodePlex area rather than a SQL CE Toolbox-specific one.
So what might the problem be now? (The revised row 0 and a sample row are shown above.) The longest URL is only 98 characters.
What's wrong with my data or methodology?
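One way to narrow this down before importing is to validate the file yourself, since the truncation error (minor error 25920) does not say which row or column is at fault. Below is a minimal sketch in TypeScript (Node); the file name places.csv is a placeholder and the column widths mirror the table described above. It also checks the field count per row, because a stray comma inside a name or URL would shift every later field into the wrong column, which can produce exactly this kind of conversion error:
// Sketch: flag CSV fields that exceed the declared nvarchar widths,
// and rows whose field count is off (e.g. a stray comma inside a URL).
// The file name and widths are assumptions based on the question.
import { readFileSync } from "fs";

// Category, Name, URL (widened to 128), Longitude, Latitude
const maxLengths = [50, 50, 128, 50, 50];

const lines = readFileSync("places.csv", "utf8")
  .split(/\r?\n/)
  .filter((l) => l.length > 0);

// Row 0 is the header line, so start reporting from row 1.
lines.slice(1).forEach((line, i) => {
  const fields = line.split(",");
  if (fields.length !== maxLengths.length) {
    console.log(`row ${i + 1}: expected ${maxLengths.length} fields, got ${fields.length}`);
    return;
  }
  fields.forEach((field, col) => {
    if (field.length > maxLengths[col]) {
      console.log(`row ${i + 1}, col ${col}: ${field.length} chars > ${maxLengths[col]}`);
    }
  });
});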

Related

Could not upgrade RDL on Report

I'm getting an error when importing a report (as a text file) into the database (NAV 2017 CU11):
[21431012] Could not upgrade RDL on Report XXXXXX Report Name.
', hexadecimal value 0x14, is an invalid character. Line 8693, position 64.
[0] The import stopped at line 2810411.
The thing is that I only get this error when trying to import this object inside the Docker container. The DB and service tier are hosted inside the same container.
The same object works/imports fine on a regular PC, and the NAV versions are exactly the same on the PC and in the container.
Moreover, there is not a single place in the text file where I could find the mentioned 0x14 character.
As far as I know, "Line 8693, position 64" refers to the RDLC part of the text file. At that position stands the perfectly legitimate letter й (Russian).
It must have something to do with encoding, but I have found no way to adjust it inside the container.
Maybe it's the language settings/localization/locale inside the container? Maybe this will help...
Cheers
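One way to chase down the 0x14 complaint is to scan the exported object file at the byte level, outside any editor, and list every XML-illegal control character with its line and column. If nothing turns up on the PC, that supports the encoding theory: the container may be decoding the file with a different default codepage than the PC does. A minimal sketch in TypeScript (Node); the file name is a placeholder:
// Sketch: list every XML-illegal control byte with its line/column,
// for comparison with the reported "Line 8693, position 64".
// Note: the column here counts bytes, so in multi-byte (UTF-8) text
// it can differ slightly from .NET's character-based position.
import { readFileSync } from "fs";

const bytes = readFileSync("REP50000.txt"); // placeholder file name

let line = 1;
let col = 1;
for (const b of bytes) {
  // XML 1.0 allows tab (0x09), LF (0x0A) and CR (0x0D), but no other C0 controls.
  if (b < 0x20 && b !== 0x09 && b !== 0x0a && b !== 0x0d) {
    console.log(`0x${b.toString(16).padStart(2, "0")} at line ${line}, col ${col}`);
  }
  if (b === 0x0a) {
    line += 1;
    col = 1;
  } else {
    col += 1;
  }
}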

Cypher statement returns (no changes, no rows)

I've watched Nicole White's awesome YouTube video "Using LOAD CSV in the Real World" and decided to re-create the Neo4j data using the same method.
I've cloned her git repo on this subject and have been working through the example on the community edition of Neo4j on my Mac.
I'm stepping through the load.cql file one command at a time, pasting each command into the command window.
Things are going pretty well: I've got a bunch of nodes created. To deal with null values for sub_products and sub_issues in the master file, I created two other CSV files, sub_issues.csv and sub_products.csv, as described in the video.
But when I try reading either of these files, I get "(no changes, no rows)".
Somehow I get the impression there is something wrong.
Below is the actual command sequence I used for the incremental load.
// Load.
USING PERIODIC COMMIT
LOAD CSV WITH HEADERS
FROM 'file:///Volumes/microSD/neo4j-complaints/sub_issue.csv' AS line
WITH line
WHERE line.`Sub-issue` <> '' AND
line.`Sub-issue` IS NOT NULL
MATCH (complaint:Complaint { id: TOINT(line.`Complaint ID`) })
MATCH (complaint)-[:WITH]->(issue:Issue)
MERGE (subIssue:SubIssue { name: UPPER(line.`Sub-issue`) })
MERGE (subIssue)-[:IN_CATEGORY]->(issue)
CREATE (complaint)-[:WITH]->(subIssue)
;
Strip out some of the later statements and do a RETURN identifier1, identifier2, etc., to see what the engine is doing at each step; see the sketch below.
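For example, keeping just the LOAD CSV, the WHERE, and the first MATCH, and returning what survives, shows whether the rows are filtered out before the MERGE/CREATE statements ever run. Below is a minimal sketch using the official neo4j-driver package from TypeScript (the bolt URL and credentials are placeholders; the embedded Cypher can just as well be pasted into the browser). The OPTIONAL MATCH on Issue reveals whether the second MATCH is what eliminates everything:
import neo4j from "neo4j-driver";

const driver = neo4j.driver("bolt://localhost:7687", neo4j.auth.basic("neo4j", "password"));

// Diagnostic: same filter and first MATCH as the load script, but the
// write clauses are replaced with a RETURN so we can see what survives.
const diagnostic = `
  LOAD CSV WITH HEADERS
  FROM 'file:///Volumes/microSD/neo4j-complaints/sub_issue.csv' AS line
  WITH line
  WHERE line.\`Sub-issue\` <> '' AND line.\`Sub-issue\` IS NOT NULL
  MATCH (complaint:Complaint { id: TOINT(line.\`Complaint ID\`) })
  OPTIONAL MATCH (complaint)-[:WITH]->(issue:Issue)
  RETURN line.\`Sub-issue\` AS subIssue, complaint.id AS id, issue
  LIMIT 10`;

async function main(): Promise<void> {
  const session = driver.session();
  try {
    const result = await session.run(diagnostic);
    if (result.records.length === 0) {
      console.log("Nothing survives WHERE + MATCH on Complaint - check the CSV path and headers");
    }
    for (const rec of result.records) {
      console.log(rec.get("subIssue"), rec.get("id"), rec.get("issue"));
    }
  } finally {
    await session.close();
    await driver.close();
  }
}

main().catch(console.error);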

Breeze allegedly saving a value that doesn't make it onto the database?

Immediately after my datacontext.saveChanges() call I stop the code and use the Chrome DevTools inspector to look at the XHR request that just ran. This is what I see:
$type: "Breeze.WebApi.SaveResult, Breeze.WebApi"
Entities: [{$id:2, $type:pdb.productMaster, PDB, Id:1912, ProductCode:a18, Description:a18t, GroupId:116}]
0: {$id:2, $type:pdb.productMaster, PDB, Id:1912, ProductCode:a18, Description:a18t, GroupId:116}
$id: "2"
$type: "pdb.productMaster, PDB"
ProductCode: "a18"
ProductVersions: null
Description: "a18t"
GroupId: 116
Id: 1912
Errors: null
I then go straight to SQL Management Studio and look at the just-added record:
Id ProductCode Description GroupId
1912 a18 a18t 1
Yup, every single record I'm adding has GroupId set to 1! It is an int field, and I can manually set it to 116 in SQL Management Studio.
The response from the XHR request also shows no errors and says GroupId is 116.
I have absolutely no idea what is going on here! Breeze error? It's totally bizarre.
I'd start by looking at the metadata for your "productMaster" entity type. You can use Breeze's MetadataStore.getEntityType method and then inspect the data and navigation properties. My guess is that this will give you more insight into the issue, possibly a mismatch between your client and server code.
You can also create a save interceptor on the server and inspect what the server-side entity looks like when it is materialized just before the save.
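A minimal sketch of that first check, assuming the breeze-client package, a placeholder service endpoint, and that the type is registered as "ProductMaster" (a guess based on the $type in the save payload); run it after metadata has been fetched:
// Sketch: dump how the client metadata maps the entity's properties, to
// spot a client/server mismatch (e.g. GroupId shadowed by a navigation
// property's foreign key). Endpoint and type name are assumptions.
import { EntityManager, EntityType } from "breeze-client";

const manager = new EntityManager("/breeze/pdb"); // placeholder endpoint

async function inspect(): Promise<void> {
  await manager.fetchMetadata();
  const productType = manager.metadataStore.getEntityType("ProductMaster", true) as EntityType;
  if (!productType) {
    console.log("Type not found - check the type name and namespace");
    return;
  }
  for (const p of productType.dataProperties) {
    console.log(p.name, p.dataType.name, "isPartOfKey:", p.isPartOfKey);
  }
  for (const n of productType.navigationProperties) {
    console.log("nav:", n.name, "->", n.entityType.name, "fks:", n.foreignKeyNames.join(","));
  }
}

inspect().catch(console.error);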

REXML :: RuntimeError (entity expansion has grown too large)

After upgrading to Ruby 1.9.3-p392 today, REXML throws a RuntimeError when attempting to retrieve an XML response over a certain size. Everything works fine and no error is thrown when receiving fewer than 25 XML records, but once a certain response length threshold is reached, I get this error:
Error occurred while parsing request parameters.
Contents:
RuntimeError (entity expansion has grown too large):
/.rvm/rubies/ruby-1.9.3-p392/lib/ruby/1.9.1/rexml/text.rb:387:in `block in unnormalize'
I realize this was changed in the most recent Ruby version:
http://www.ruby-lang.org/en/news/2013/02/22/rexml-dos-2013-02-22/
As a quick fix, I've raised REXML::Document.entity_expansion_text_limit to a larger number, and the error goes away.
Is there a less risky solution?
This issue arises when you send too much content in a single XML response.
To fix it, restrict the data in each individual node to under 10 KB: instead of sending the whole content, send truncated data and provide a separate link for viewing the full content (a sketch of this approach follows below).
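A hypothetical sketch of that truncate-and-link approach on the producing side (TypeScript; the element names, limit, and URL scheme are illustrative, not taken from the question):
// Sketch: cap each node's text at a byte budget below REXML's 10,240-byte
// default and expose a link to the full content instead. All names here
// are illustrative.
const NODE_LIMIT = 8 * 1024;

function escapeXml(s: string): string {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

function recordToXml(id: number, body: string): string {
  if (Buffer.byteLength(body, "utf8") <= NODE_LIMIT) {
    return `<record id="${id}"><body>${escapeXml(body)}</body></record>`;
  }
  // Byte-based cut; may clip the final multi-byte character, which is
  // acceptable for a preview.
  const truncated = Buffer.from(body, "utf8").subarray(0, NODE_LIMIT).toString("utf8");
  return `<record id="${id}">` +
    `<body truncated="true">${escapeXml(truncated)}</body>` +
    `<fullContentUrl>/records/${id}/body</fullContentUrl>` +
    `</record>`;
}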
The error is raised from the file below:
ruby-2.1.2/lib/ruby/2.1.0/rexml/text.rb
# Unescapes all possible entities
def Text::unnormalize( string, doctype=nil, filter=nil, illegal=nil )
  sum = 0
  string.gsub( /\r\n?/, "\n" ).gsub( REFERENCE ) {
    s = Text.expand($&, doctype, filter)
    if sum + s.bytesize > Security.entity_expansion_text_limit
      raise "entity expansion has grown too large"
    else
      sum += s.bytesize
    end
    s
  }
end
The limit in ruby-2.1.2/lib/ruby/2.1.0/rexml/text.rb defaults to 10240 bytes, which means 10 KB of expanded entity text per node.
REXML also defaults to allowing only 10,000 entity substitutions per document, so the maximum amount of text that can be generated by entity substitution is around 98 megabytes (10,000 substitutions × 10,240 bytes = 102,400,000 bytes ≈ 98 MiB; see https://www.ruby-lang.org/en/news/2013/02/22/rexml-dos-2013-02-22/).
That sounds like a LOT of XML. Do you really need all of it? Maybe you can request only certain fields from the remote server? One option might be to try another XML parser (Nokogiri, for example); another is to use something other than XML as the transport (JSON? binary?).

Multiple-step OLE DB operation generated errors [duplicate]

Dim NorthWindOledbConnection As String = "Provider=SQLOLEDB;DataSOurce=SARAN-PC\SQLEXPRESS;Integrated Security=ssp1;InitialCatalog=Sara"
Dim rs As New ADODB.Recordset()
rs.Open("select * from SecUserPassword", NorthWindOledbConnection, ADODB.CursorTypeEnum.adOpenStatic, ADODB.LockTypeEnum.adLockBatchOptimistic)
I tried to run the above code in Visual Studio 2008, and it shows the following error:
"Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done"
Firstly, don't use ADO in VB.NET. Use ADO.NET.
Other than that, create a proper Connection object instead of passing around a string.
And fix your connection string. It's SSPI, not SSP1. And it's Data Source, not DataSOurce. And it's Initial Catalog, not InitialCatalog.
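Put together, the corrected string (keeping the server and database names from the question) would be:
Provider=SQLOLEDB;Data Source=SARAN-PC\SQLEXPRESS;Integrated Security=SSPI;Initial Catalog=Sara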
You are using a very, very old way of accessing a database, one that dates back to Visual Basic 6 and earlier.
Use ADO.NET instead of the old ADO. For example, you can use the following code, which is "similar" to the code you are using (though not the best way to access data in VS 2008):
Imports System.Data
Imports System.Data.OleDb

Dim con As New OleDbConnection( ** Your Connection String ** )
con.Open()
Dim command As New OleDbCommand("select * from SecUserPassword", con)
command.CommandType = CommandType.Text
Dim reader As OleDbDataReader = command.ExecuteReader()
While reader.Read()
    System.Console.Write(reader( ** Your Table Field Name ** ).ToString())
End While
con.Close()
To see how to build a correct connection string, see http://www.connectionstrings.com/
If you are accessing a SQL Server database, you can also use the System.Data.SqlClient namespace instead of OleDb (for example, System.Data.SqlClient.SqlConnection instead of OleDbConnection) for better performance with SQL Server.
The link below is an article that gives a great breakdown of the six scenarios in which this error message can occur:
Scenario 1 - Error occurs when trying to insert data into a database
Scenario 2 - Error occurs when trying to open an ADO connection
Scenario 3 - Error occurs inserting data into Access, where a field name has a space
Scenario 4 - Error occurs inserting data into Access, when using adLockBatchOptimistic
Scenario 5 - Error occurs inserting data into Access, when using Jet.OLEDB.3.51 or ODBC driver (not Jet.OLEDB.4.0)
Scenario 6 - Error occurs when using a Command object and Parameters
http://www.adopenstatic.com/faq/80040e21.asp
Hope it helps others who may be facing the same issue.
