QuickBooks - migrate IIF import to QODBC

I'm trying to migrate my current logic for importing invoices into QuickBooks over to the QODBC tool, and I'm having trouble finding how the columns of the IIF file relate to the columns of the database.
Here is the header of my IIF file:
!TRNS TRNSID TRNSTYPE DATE ACCNT NAME CLASS AMOUNT DOCNUM MEMO CLEAR TOPRINT NAMEISTAXABLE ADDR1 ADDR2 ADDR3 ADDR4 ADDR5 PONUM DUEDATE TERMS OTHER1
!SPL SPLID TRNSTYPE DATE ACCNT NAME CLASS AMOUNT DOCNUM MEMO CLEAR QNTY PRICE INVITEM PAYMETH TAXABLE VALADJ SERVICEDATE OTHER2 EXTRA
!ENDTRNS
Into which tables and columns should I insert data to generate an invoice with the same data I have now?
Is there any documentation that describes all of these relationships?
Thanks in advance!

I would suggest trying the "bait and sync" technique.
Enter something unique in the customer, note, address, item, etc. in the front end or in the source from which you create the IIF file. Once the data is entered, export it to an IIF file.
Once you have that, you know which fields of the IIF file link to what.
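If you would rather not eyeball the export, here is a minimal sketch in Python of how the bait could be located automatically (the file name invoice_export.iif and the bait string ZZZ-BAIT-123 are placeholders; IIF files are tab-delimited, and the !TRNS / !SPL header rows carry the column names):
# Minimal sketch: report which IIF column ends up holding the "bait" value.
headers = {}
with open("invoice_export.iif", encoding="latin-1") as f:
    for line in f:
        fields = line.rstrip("\r\n").split("\t")
        if fields and fields[0].startswith("!"):
            headers[fields[0][1:]] = fields      # header rows: !TRNS, !SPL, ...
        elif "ZZZ-BAIT-123" in line:
            names = headers.get(fields[0], [])
            for idx, value in enumerate(fields):
                if "ZZZ-BAIT-123" in value:
                    col = names[idx] if idx < len(names) else f"column {idx}"
                    print(f"{fields[0]} row: {col} = {value}")
Once you know which IIF column the bait lands in, you can search for the same bait value in the corresponding QODBC table to tie the two sides together.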
You can then refer to the QODBC Invoice / InvoiceLine table schema and relationships.
Refer:
http://qodbc.com/schema.htm
You then need to do the field mapping, generate the SQL statements, and execute the inserts.
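As an illustration of that last step, here is a minimal sketch in Python using pyodbc (the DSN name "QuickBooks Data" and the customer, item, date, and document-number values are placeholders, and the column names should be verified against the schema reference above; FQSaveToCache is the QODBC mechanism for batching the lines of one invoice, see the KB articles referenced below):
# Minimal sketch: one INSERT into InvoiceLine per IIF !SPL row.
import datetime
import pyodbc
conn = pyodbc.connect("DSN=QuickBooks Data", autocommit=True)
cur = conn.cursor()
# (item full name, quantity, rate) -- roughly INVITEM, QNTY, PRICE in the IIF
lines = [
    ("Consulting", 2, 75.00),
    ("Travel", 1, 120.00),
]
for i, (item, qty, rate) in enumerate(lines):
    is_last = (i == len(lines) - 1)
    cur.execute(
        "INSERT INTO InvoiceLine "
        "(CustomerRefFullName, RefNumber, TxnDate, "
        " InvoiceLineItemRefFullName, InvoiceLineQuantity, "
        " InvoiceLineRate, FQSaveToCache) "
        "VALUES (?, ?, ?, ?, ?, ?, ?)",
        "My Customer",               # NAME on the !TRNS row
        "INV-1001",                  # DOCNUM
        datetime.date(2020, 1, 15),  # DATE
        item, qty, rate,
        0 if is_last else 1,         # keep caching until the last line saves the invoice
    )
cur.close()
conn.close()
The comments show one possible mapping from the IIF columns (!TRNS / !SPL) onto the InvoiceLine columns; double-check every column name against the schema reference before relying on it.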
References:
http://support.flexquarters.com/esupport/index.php?/Knowledgebase/Article/View/2389/0/how-to-create-invoices-using-qodbc
http://support.flexquarters.com/esupport/index.php?/Knowledgebase/Article/View/2810/44/how-to-insert-invoice-using-excel---vba

How to remove records from a file having the same values at a location?

I have a sequential file with a record length of 11. It contains a field starting at the 9th position and running through the 11th position, of PIC 9(03). I want to delete all the records that have the same data at that location. This needs to be done using JCL only. Any utility can be used, but it must be supported in Micro Focus COBOL. See the example below:
Example File:
Rob ,d,012
Mike ,h,013
Kim ,g,014
Bob ,k,014
Wiz ,t,015
In the above example I want to delete the rows for Kim and Bob, as they have the same value at that location (014), and the final output should be:
Rob ,d,012
Mike ,h,013
Wiz ,t,015
Try these statements in the SYSIN DD of the DFSORT utility.
SORT FIELDS=COPY * copy the records, no sorting needed
OMIT COND=(9,3,ZD,EQ,014) * drop records whose positions 9-11 equal 014
Micro Focus uses the mfsort utility, which emulates all the major functions of IBM's DFSORT product.
mfsort option copy
use input-file [record definition]
[org organization] [key structure]
give output-file [record definition]
[org organization] [key structure]
omit cond (9,3,nu,eq,014)
More details about mfsort can be found in the Micro Focus documentation.
Check out Nsort from Ordinal, and also SyncSort (now Precisely); these are COTS utilities that can drop duplicate records.
Sort by the id field (positions 9-11), and add up a count of records per id.
Then use the OUTFIL statement to drop every record whose id occurred more than once:
RECORD TYPE=F,LENGTH=11
INREC FIELDS=(1,11,C'0001') * Append a dummy count field of '0001'
SORT FIELDS=(9,3,BI,A) * Sort by the "id" field
SUM FIELDS=(12,4,ZD) * Sum the dummy counts; one record per "id" survives
OUTFIL FILES=OUT,BUILD=(1,11),OMIT=(12,4,CH,NE,C'0001') * Drop ids that occurred more than once
END
This should work with DFSORT or SYNCSORT; I'm not sure about mfsort.

Mapping Values in QlikView

In QlikView, I have an Excel sheet that I use to map USERNAME to a TEAM value. But every time I refresh the dashboard, new USERNAME values come up, and since they are not in the Excel sheet, these USERNAME values show up as their own values in the TEAM column. How can I make any USERNAME that is not in the Excel sheet show up as 'Unidentified' (or another value) under the TEAM column instead of showing up as its own separate value?
First of all, when posting a question here, if possible always include the source code so everybody has a clearer picture of your problem. Just saying.
On the topic ...
Use a mapping load in this case and supply the third parameter of ApplyMap. For example:
TeamMapping:
Mapping Load
    UserName,
    Team
From [User_to_Team_Mapping.xlsx] (ooxml, embedded labels, table is [Sheet1]);
Transactions:
Load
    Id,
    Amount,
    ApplyMap( 'TeamMapping', User, 'Unidentified') as Team
From Transactions.qvd (qvd);
The third parameter of ApplyMap is the default value returned when the lookup value is not found in the mapping table (TeamMapping).

(Ruby on Rails) How to import data from XML into a relation table?

I have 3 tables. One of them is a relation table that stores the relationship between the other two: Courses, Staff, and Courses-Staff (this is the relation table). There was no primary key in my Courses table, so I want to let Rails set it (id) for me. But when I import the data from an XML file into my SQLite tables, I don't know the id of each entry in my Courses table. How can I store the primary keys of these two tables in my Courses-Staff table?
You don't really explain what you've tried, so it's hard for anyone to answer the question. If your XML is structured like this:
<course>
<something>bla</something>
<somethingElse>blabla</somethingElse>
<staffIDs>
<staffID>124233</staffID>
<staffID>123532</staffID>
</staffIDs>
</course>
<staff>
<staffMember>
<staffID>124233</staffID>
<staffName>Jim</staffName>
</staffMember>
<staffMember>
<staffID>123532</staffID>
<staffName>Sophia</staffName>
</staffMember>
</staff>
Then make sure that you parse and create the Staff records first; then, when you're parsing the Courses, you can do something like this (pseudocode):
courses.each do |c|
  course = Course.create!(...)   # pseudocode: create the course first so Rails assigns its id
  staffIDs.each do |s|
    course.staff << Staff.find_by_external_id(s)   # adds the link row to the join table
  end
end

Reading Informix-SE audit trail log tables

INFORMIX-SQL 7.32 (SE):
I've created an audit trail "a_trx" for my transaction table to know who added or updated rows in this table and when, with a snapshot of the row contents. According to the documentation, an audit table is created with the same schema as the table being audited, plus the following audit info header columns prefixed:
table a_trx
a_type char(2) {record type: aa = added, dd = deleted,
rr = before update image, ww = after update image.}
a_time integer {internal time value.}
a_process_id smallint {Process ID that changed record.}
a_usr_id smallint {User ID that changed record.}
a_rowid integer {Original rowid.}
[...] {Same columns as table being audited.}
So I proceeded to generate a default perform screen for a_trx, but could not locate a_trx in my table selection. I aborted and ls'd the .dbs directory; I did not see a_trx.dat or a_trx.idx, but I found a_trx, which appears to be in .dat format according to my disk editor utility. Is there any other method for accessing this .dat clone, or do I have to trick the engine by renaming it to a_trx.dat, creating an .idx companion for it, and tweaking SYSTABLES, SYSCOLUMNS, etc. to be able to access this audit table like any other table? And what is the internal time value of a_time, the number of seconds since 12/31/1899?
The audit logs are not C-ISAM files; they are plain log files. IIRC, they are created with '.aud' as a suffix. If you get to choose the suffix, then you would create it with a '.dat' suffix, making sure the name does not conflict with any table name.
You should be able to access them as if they were a table, but you would have to create a table (data file) and the index file to match the augmented schema, and then arrange for the '.aud' file to refer to the same location as the '.dat' file - presumably via a link or possibly a symbolic link. You can specify where the table is stored in the CREATE TABLE statement in SE.
The time is a Unix time stamp - the number of seconds since 1970-01-01T00:00:00Z.
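So it decodes with any standard Unix-epoch conversion; for example, in Python (the value 1234567890 is just a hypothetical a_time):
import datetime
# a_time is seconds since 1970-01-01T00:00:00Z, so a hypothetical value of
# 1234567890 decodes as:
print(datetime.datetime.fromtimestamp(1234567890, tz=datetime.timezone.utc))
# -> 2009-02-13 23:31:30+00:00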

On SubSonic 3, how to get the name of a field from a table, and the max length, type, etc.

How do I get the string name of a field from a table in SubSonic 3?
In SubSonic version 2 I used TableName.columns.Field.
Also, in SubSonic 2 I was able to get the max length of a string field.
How can I do that in SubSonic 3?
I don't have the code in front of me, but you can access some of the table info, such as the column names, by using [TableName]Table.columns; replace [TableName] with the name of your table.
I ended up concluding that we need to change the .tt files and create those fields ourselves.
Of course, it might have been better if they existed in the default templates, but they do not.
Here is the page on how to create the DAL with T4:
http://subsonicproject.com/docs/Creating_Your_Own_DAL
