Get field type of DBF file - OleDb

I am reading the field types of a DBF file (which I also create) using an OleDb reader. The reader always returns the decimal type, whether the field was defined as Numeric, Float, or Double. Why is that? Why doesn't the reader return the same type as stored in the DBF?

Different providers may map the physical field types of a DBF to different OLEDB field types.
Another issue is the SQL result-type "guessomatic" that kicks in if you select expressions (select amt * 0.01 from foo) instead of naked fields and do not cast the result to some type explicitly (select cast(amt * 0.01 as currency) from foo). Again, different providers/engines may give different results in such a situation; some, like the VFP9 provider, allow explicit casts; some do not.
You may get better results if you switch to a different provider. For example, the Visual FoxPro 9.0 OLEDB provider - which is freely available for download - maps the Fox DBF field types 'I' (integer), 'B' (double), 'Y' (currency) and 'N' (numeric) faithfully to the corresponding OLEDB/ADO field types adInteger, adDouble, adCurrency and adNumeric. On the other hand, the Fox provider understands only the Fox field types and legacy DBF types corresponding to Clipper and dBASE 3 or thereabouts. It cannot read tables produced by newer dBASE versions.
Several of the freely available OLEDB providers understand some DBF dialect or other; which one is most suitable would depend on the type of your DBF files.
The DBF file structure is fairly simple; depending on your application it might make sense to pull the meta data directly from the field definitions in the DBF header.
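If you go the header route, the layout is simple: after the 32-byte file header, field descriptors are fixed 32-byte records terminated by a 0x0D byte. A minimal sketch in Python (offsets per the classic dBASE III/FoxPro layout; the helper name is my own):

```python
import struct

def dbf_field_types(path):
    """Read field name/type/length/decimals directly from a DBF header.

    Field descriptors are 32-byte records starting at offset 32,
    terminated by a 0x0D byte. Layout per descriptor:
      bytes 0-10  field name (NUL-padded ASCII)
      byte  11    field type code ('C', 'N', 'F', 'I', 'B', 'Y', ...)
      byte  16    field length
      byte  17    decimal count
    """
    fields = []
    with open(path, "rb") as f:
        header = f.read(32)
        # bytes 8-9: total header size (little-endian uint16)
        header_size = struct.unpack("<H", header[8:10])[0]
        pos = 32
        while pos < header_size:
            desc = f.read(32)
            if not desc or desc[0] == 0x0D:   # descriptor terminator
                break
            name = desc[:11].split(b"\x00")[0].decode("ascii")
            ftype = chr(desc[11])
            length = desc[16]
            decimals = desc[17]
            fields.append((name, ftype, length, decimals))
            pos += 32
    return fields
```

This bypasses the provider's type mapping entirely: you see the raw type codes ('N', 'F', 'B', 'I', 'Y') exactly as they are stored in the file.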


HL7 ADT Message parsing: date ranges

Note:
This question is not asking for advice on which library to use; I'm rolling my own.
I'm reading through the HL7 v2.5.1 spec in order to make a parsing engine for iOS and Windows.
My question is related to the Name Validity Range component in the Patient Name field (PID-5). But I think it applies generally to all DR (Date Range) components.
In Chapter 3: Patient Administration, on page 75, the following information is listed:
Components: {...omitted...} ^ <Name Validity Range (DR)> ^
{...omitted...}
Subcomponents for Name Validity Range (DR):
<Range Start Date/Time (TS)> & <Range End Date/Time (TS)>
Subcomponents for Range Start Date/Time (TS):
<Time (DTM)> & <Degree of Precision (ID)>
Subcomponents for Range End Date/Time (TS):
<Time (DTM)> & <Degree of Precision (ID)>
I understand how the fields, components and subcomponents are structured and how their separators are used... or at least I think I do. However, the above information confuses me as to how the data would be expressed. I have searched, but cannot find a suitable message sample for this kind of data. Based on my understanding of the HL7 data structures, here's how the data would be encoded:
PID|||01234||JONES^SUSIE^Q^^^^^^^199505011201&M&199505011201&M^199505011201&M&199505011201&M
The problem here, of course, is that having subcomponents embedded in subcomponents leaves you unsure exactly how to parse the data and what data goes where.
I did look into Chapter 2: Control, Appendix A and found this text on page 160:
Note: DR cannot be legally expressed when embedded within another data type. Its use is constrained to a segment field.
So, it appears that the standard listed for PID-5 is invalid. I haven't seen any messages from my system that even generate this information, so it may be a moot point for my particular case, but I don't like developing solutions with known holes. Has anybody encountered this "in the wild"?
An item with the DR data type can be subdivided and carry a precision subcomponent when the item is itself a field, e.g. ARQ-11 (Requested Start Date/Time Range).
It can be subdivided into range start and range end subcomponents, but not into precision subcomponents, when the DR item is already part of another data type, as in your example PID-5.
Patient name is an XPN data type, which is a composite data type. That basically means it can be a combination of primitives (like ST) and other composites, as shown here.
Now, you are looking at XPN.10, the 10th component, which is of the DR data type; DR in turn is a combination of two primitive DTM values, start and end, as two subcomponents. Subcomponents are separated by &.
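The mechanics can be sketched in a few lines (illustrative Python helper using the default HL7 encoding characters; the sample value is made up, with the DR reduced to its two DTM subcomponents):

```python
def parse_field(field, comp_sep="^", sub_sep="&"):
    """Split one HL7 field into components, each into subcomponents.

    With only two separator levels available below the field, a DR made
    of two TS values cannot also expose each TS's own DTM/precision
    parts -- which is why the standard forbids DR embedded in another
    data type.
    """
    return [comp.split(sub_sep) for comp in field.split(comp_sep)]

pid5 = "JONES^SUSIE^Q^^^^^^^19950501&19950601"
components = parse_field(pid5)
# components[9] is XPN.10 (Name Validity Range): start and end DTM
```

Here components[9] comes back as ['19950501', '19950601'], and there is simply no separator left over to split each of those into a time and a degree of precision.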

Delphi + Firebird (using DBExpress): aggregate issue with TFMTBCDField

I have a table with a NUMERIC(15,2) field in my database. In Delphi, I use the trio TSQLDataSet + TDataSetProvider + TClientDataSet. This field is created as a TFMTBCDField. The problem is that I need to use an aggregate field in my ClientDataSet to sum the values, and for some reason it is not calculated correctly. However, if I set the field type to DOUBLE PRECISION in the database (Delphi then creates the field as a TFloatField in the dataset), the aggregate works as expected.
PS: I've read somewhere that it is better to use NUMERIC than DOUBLE PRECISION in Firebird.
I'm using Delphi XE7 and Firebird 2.5.
How do you deal with this issue?
Thanks.

CSV Type Provider: convert to JSON

I am using the CSV Type Provider to read data from a local CSV file.
I want to export the data as JSON, so I am taking each row and serializing it using the Json.NET library with JsonConvert.SerializeObject(x).
The problem is that each row is modeled as a tuple, meaning that the column headers do not become property names when serializing. Instead I get Item1="...", Item2="..." etc.
How can I export to Json without 'hand-rolling' a class/record type to hold the values and maintain the property names?
The type provider's job is compile-time type safety. The code that is actually compiled maps the nice accessors to tupled values at compile time (for performance reasons, presumably), so at run time the JSON serializer sees only tuples.
AFAIK there is no way around hand-rolling records. (That is unless we eventually get type providers that are allowed to take types as parameters which would allow a Lift<T>-type provider or the CSV type provider implementation is adjusted accordingly.)
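The workaround, whichever language it ends up in, amounts to re-pairing each row's values with the column headers before serializing. The idea is language-neutral; a sketch in Python, where csv.DictReader does exactly that pairing (the function name is my own):

```python
import csv
import json

def csv_to_json(path):
    """Serialize CSV rows as JSON objects keyed by the header names.

    This is the runtime step the erased type provider cannot do for
    you: attach the header names back onto each tuple of values.
    """
    with open(path, newline="") as f:
        reader = csv.DictReader(f)   # pairs the header row with each data row
        return json.dumps(list(reader))
```

The F# equivalent would be to zip msft.Headers with each row's values into a dictionary and hand that to JsonConvert.SerializeObject instead of the raw row.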

Can a Type Provider be passed into a function as a parameter

I am learning F# and the FSharp.Data library. I have a task in which I need to read 20 CSV files. Each file has a different number of columns, but the records share the same shape: keyed on a date string, with all the remaining columns holding float values. I need to do some statistical calculations on the float columns before persisting the results to the database. I got all the plumbing working:
read in the CSV via the FSharp.Data CSV type provider,
use reflection to get the type of each column field together with the header names; these are fed into a pattern match, which selects the relevant calculation logic,
SqlBulkCopy the result,
but I ended up with 20 functions (one per CSV file).
The solution is far from acceptable. I thought I could create a generic top-level function as the driver to loop through all the files, but after days of attempts I am getting nowhere.
The FSharp.Data CSV type provider has the following pattern:
type Stocks = CsvProvider<"../docs/MSFT.csv">
let msft = Stocks.Load("http://ichart.finance.yahoo.com/table.csv?s=MSFT")
msft.Data |> Seq.map(fun row -> do something with row)
...
I have tried:
let mainfunc (typefile:string) (datafile:string) =
let msft = CsvProvider<typefile>.Load(datafile)
....
This doesn't work: CsvProvider complains that typefile is not a valid constant expression. I am guessing the type provider needs the sample file to deduce the column types at compile time, so the type inference cannot be deferred until mainfunc is called with the relevant information.
I then tried to pass the Type into the mainfunc as a parameter
neither
let mainfunc (typeProvider:CsvProvider<"../docs/MSFT.csv">) =
....
nor
let mainfunc<typeProvider:CsvProvider<"../docs/MSFT.csv">> =
....
worked.
I then tried to pass the msft value from
type Stocks = CsvProvider<"../docs/MSFT.csv">
let msft = Stocks.Load("http://ichart.finance.yahoo.com/table.csv?s=MSFT")
into a mainFunc. According to IntelliSense, msft has the type CsvProvider<...> and msft.Data has the type seq<CsvProvider<...>>. I tried declaring an input parameter with the explicit type of each of these two, but neither compiles.
Can anyone please help and point me in the right direction? Am I missing something fundamental here? Any .NET type or class can be used in an F# function to explicitly specify a parameter type, but can I do the same with a type from a type provider?
If the answer to the above question is no, what are the alternatives to make the logic generic enough to handle 20 files, or even 200 different files?
This is related to Type annotation for using a F# TypeProvider type e.g. FSharp.Data.JsonProvider<...>.DomainTypes.Url
Even though IntelliSense shows you CsvProvider<...>, to reference the msft type in a type annotation you have to use Stocks, and for msft.Data, instead of CsvProvider<...>.Row, you have to use Stocks.Row.
If you want to do something dynamic, you can get the column names with msft.Headers, and you can get the types of the columns using Microsoft.FSharp.Reflection.FSharpType.GetTupleElements(typeof<Stocks.Row>) (this works because the row is erased to a tuple at runtime).
EDIT:
If the formats are incompatible, and you're dealing with dynamic data that doesn't conform to a common format, you might want to use CsvFile instead (http://fsharp.github.io/FSharp.Data/library/CsvFile.html), but you'll lose all the type safety of the type provider. You might also consider using Deedle instead (http://bluemountaincapital.github.io/Deedle/)
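If you do take the dynamic route, the whole 20-file problem collapses into one generic loop over headers and values. A sketch of that shape in Python (illustrative only; it assumes, as in the question, that the first column is a date key and every remaining column is a float, and it computes a per-column mean as a stand-in for the real statistics):

```python
import csv

def column_stats(path):
    """Compute a per-column mean for CSVs whose first column is a date
    key and whose remaining columns are floats, however many there are."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        headers = next(reader)[1:]          # skip the date-key column
        sums = [0.0] * len(headers)
        n = 0
        for row in reader:
            for i, v in enumerate(row[1:]):
                sums[i] += float(v)
            n += 1
        return {h: s / n for h, s in zip(headers, sums)}
```

One such function handles all 20 (or 200) files, at the cost of the compile-time safety the type provider was giving you.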

Why is my query returning the wrong string type?

According to the official Firebird documentation, columns containing Unicode strings (what SQL Server calls NVARCHAR) should be declared as VARCHAR(x) CHARACTER SET UNICODE_FSS. So I did that, but when I query the table with DBExpress, the result I get back is a TStringField, which is AnsiString only, not the TWideStringField I was expecting.
How do I get DBX to give me a Unicode string result from a Unicode string column?
With Firebird, your only option is to set the whole database connection to a Unicode character set, for example utf8.
That way, all the VARCHAR columns will produce fields of type TWideStringField, regardless of the particular character set declared when creating each column.
Set this connection property before creating any persistent fields, if that applies to your case.
It looks like the driver does not support the UNICODE_FSS charset: my first step was to create a new project, set the property, and then create some fields. IMHO it's better to declare the whole database as utf8 (or another charset supported by the driver) in the CREATE DATABASE statement, and then match that charset in the Delphi connection to avoid string conversions.
