I am new to FileNet. Is it possible to write a SQL statement to retrieve the file size of a document from the FileNet Oracle database? What would the field name be?
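For reference, a sketch of the kind of query this would be, assuming a P8-style object store where document versions are stored in a DocVersion table with a content_size column (in bytes); verify the actual table and column names against your own object store schema, since direct database access is generally discouraged in favor of the Content Engine API:

-- Assumed P8-style names; check them against your object store database.
SELECT object_id, content_size
FROM DocVersion;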
We have user data (Avro files) validated and ingested into HDFS using Schema Registry (the data keeps evolving), and we use Greenplum with PXF to access the HDFS data. We created one external table, but when querying the HDFS data we get this error:
warehouse=# select * from user;
ERROR: Record has 151 fields but the schema size is 152 (seg1 slice1 192.168.1.17:6001 pid=6582)
CONTEXT: External table user
warehouse=#
The user HDFS files were ingested using different schema versions, and the Greenplum external table was created with fields from all the schema versions.
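For reference, a minimal sketch of the kind of external table definition involved, with placeholder column names and HDFS path; the real definition lists the union of fields across all the schema versions:

-- Placeholder columns; the real definition covers every schema version.
CREATE EXTERNAL TABLE user_ext (
    id    bigint,
    name  text,
    email text
)
LOCATION ('pxf://data/user?PROFILE=hdfs:avro')
FORMAT 'CUSTOM' (FORMATTER='pxfwritable_import');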
I am working on a Rails engine which takes an Excel file as input and then saves it to a temp folder in the app. Then, when I click on validate, it validates the file, and if there are no errors it saves the file to the database. I am able to get the columns of the Excel file, but my problem is that an incoming Excel file can have different column names than the ones my code expects.
Also, I am not generating any migration for the engine. The DB table will be provided by the app, and my engine will only parse the data, validate it, and then, if the Excel file is valid, save all the data in the table.
I am getting confused about how to set up a mapping such that it will save the data to the DB.
For example: my engine has a model upload.rb where I am doing all the validation for the Excel file and then saving it. Now suppose my app has a model employee.rb which has columns first name, last name, employee id, etc., while my Excel file can have any alias or different column name for the data, e.g. fname, lname, etc.
Suppose I have the validation in my engine model; my engine then has no idea about the mapping, it will only validate and save the data to the DB. So the mapping needs to be set in my app, as sketched below.
Can anyone tell me how I can set the mapping so that whenever my engine is mounted it will parse the Excel file and save it to the app's DB table as per the mapping set?
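For illustration, a minimal sketch of what such a mapping could look like, with hypothetical names throughout (MyUploadEngine, column_mapping, target_model): the engine exposes configuration accessors, and the host app fills them in from an initializer.

require "active_support/core_ext/module/attribute_accessors"

# In the engine (lib/my_upload_engine.rb): configuration the host app can set.
module MyUploadEngine
  mattr_accessor :column_mapping, :target_model

  def self.setup
    yield self
  end
end

# In the host app (config/initializers/my_upload_engine.rb): map the
# spreadsheet's header aliases to the app model's real attributes.
MyUploadEngine.setup do |config|
  config.target_model = "Employee"
  config.column_mapping = {
    "fname"       => :first_name,
    "lname"       => :last_name,
    "employee id" => :employee_id
  }
end

# In the engine's upload model, translate each parsed spreadsheet row
# before validating/saving, e.g.:
#   attrs = row.transform_keys { |h| MyUploadEngine.column_mapping[h.downcase] }
#   MyUploadEngine.target_model.constantize.create!(attrs)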
Any help will be appreciated.
I need to generate an XML file from a table with 500,000 rows and 200 columns, using a FireDAC query and a TClientDataSet and writing it out with ClientDataSet.SaveToFile:
ClientDataSet.Close;
Query.SQL.Clear;
Query.SQL.Add('select * from something');
ClientDataSet.Open;
// dfXML makes SaveToFile write an XML data packet rather than binary.
ClientDataSet.SaveToFile('destination_folder.xml', dfXML);
But saving the file with SaveToFile raises an insufficient-memory error.
How could I save this file? Is there a way to write the already-loaded ClientDataSet to multiple files?
Best regards.
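A minimal sketch of one common workaround for the memory error above: fetch and save the data in batches instead of one huge packet. This assumes the ClientDataSet is wired to the FireDAC query through a TDataSetProvider (as the snippet above implies), a backend that supports OFFSET ... FETCH paging (e.g. Firebird 3+), and a hypothetical id ordering column:

procedure ExportInBatches(Query: TFDQuery; ClientDataSet: TClientDataSet;
  const BaseName: string; BatchSize: Integer);
var
  Offset, Part: Integer;
begin
  Offset := 0;
  Part := 1;
  repeat
    ClientDataSet.Close;
    // Page through the source table; adjust the paging syntax to your backend.
    Query.SQL.Text := Format(
      'select * from something order by id ' +
      'offset %d rows fetch next %d rows only', [Offset, BatchSize]);
    ClientDataSet.Open;
    if ClientDataSet.IsEmpty then
      Break;
    // Each batch becomes its own, much smaller XML file.
    ClientDataSet.SaveToFile(Format('%s_%.3d.xml', [BaseName, Part]), dfXML);
    Inc(Offset, BatchSize);
    Inc(Part);
  until False;
end;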
I am importing data from dBase into SQL and am using the following connection string to read the data:
"Provider=Microsoft.Jet.OLEDB.4.0;" + #"Data Source=D:\GS\Source;"
+ "Extended Properties=dbase 5.0;User ID=Admin;Password=;"
The data is read successfully, but I find that it includes DELETED rows too. Per dBase, a deleted row should have an asterisk as its first character; however, the row is neither excluded nor does the asterisk come through as a column.
So how do we load the data without these deleted records?
Actually, deleted records are internally marked with a flag and not an "*" that you can query on. However, VFP does have a function to test this deleted flag, but it is only really applicable when running against a single table and not multiple tables/joins, as it won't know which table you are asking about. Ex:
select * from YourTable where not deleted()
That being said, VFP has some other "environment" setting commands that MAY work via OleDB, but I've never actually tried THIS.
Once you have your connection, and it's open, running the following via ExecuteNonQuery might help:
// SET DELETED ON tells VFP to skip rows flagged as deleted.
OleDbCommand oCmdTest = new OleDbCommand( "SET DELETED ON", YourConnection );
oCmdTest.ExecuteNonQuery();
oCmdTest = new OleDbCommand( "Select * from YourTable", YourConnection );
Execute it into a data table result set as you are, and you should be good.
One other part: I would not use the Jet OleDb provider, but the Microsoft VFP OleDb driver instead.
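A minimal sketch of the full flow with the VFP provider, assuming the VFPOLEDB driver is installed and reusing the folder path from the question; the table name is a placeholder:

using System.Data;
using System.Data.OleDb;

// VFPOLEDB treats the folder of .dbf files as a database of free tables.
string connStr = @"Provider=VFPOLEDB.1;Data Source=D:\GS\Source;";
using (OleDbConnection conn = new OleDbConnection(connStr))
{
    conn.Open();
    // Hide rows flagged as deleted for the rest of this session.
    new OleDbCommand("SET DELETED ON", conn).ExecuteNonQuery();
    DataTable result = new DataTable();
    using (OleDbDataAdapter da =
        new OleDbDataAdapter("select * from YourTable", conn))
    {
        da.Fill(result); // deleted rows no longer appear
    }
}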
In dBase, after a DELETE command, PACK has to be given so that the deleted records are removed permanently.
If you have access to dBase, USE the file concerned and then give the PACK command.
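For illustration, at the dBase/xBase prompt (the table name is a placeholder; PACK requires exclusive use of the table):

USE YourTable EXCLUSIVE
PACK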
Hi,
My question may be a simple one for you.
I have to import a CSV file into my Delphi application. This file contains 3 columns, and I want to match the columns to one of my datasets (a TQuery connected to a Firebird table) and show it in a grid.
My question is: is it possible to use the CSV file as a table that can be accessed by a SQL query and joined to a DB table?
I have tried a TTable with the TableType property set to ttASCII. It loads the file; however, it loads the contents into a single field,
i.e., Fields[0].AsString gives '11,12,abc.txt'.
I want this in different fields,
i.e.,
Fields[0].AsString = '11'
Fields[1].AsString = '12'
Fields[2].AsString = 'abc.txt'
Hope you understand my requirement. Kindly take a look and let me know your thoughts.
Thanks and Regards,
Vijesh V.Nair
System Analyst
Vijesh, you have to create a schema definition file to access a text file from a TTable component. The name of the schema file must be the same as that of the text file but with the .SCH extension.
In this link you can find more information about the format of the schema file: Using The ASCII Driver With Comma-delimited Files. You can also check the BDE32.HLP file.
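For illustration, a minimal sketch of such a schema file for a comma-delimited file named data.txt with the three columns above (so the schema file would be data.sch; field names and widths are placeholders):

[DATA]
FILETYPE = VARYING
CHARSET = ascii
DELIMITER = "
SEPARATOR = ,
Field1 = Column1,Char,10,00,00
Field2 = Column2,Char,10,00,10
Field3 = FileName,Char,40,00,20

With the schema file next to data.txt, the ttASCII TTable exposes each column as its own field, so Fields[0], Fields[1], and Fields[2] return '11', '12', and 'abc.txt' respectively.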