Header and Detail Records
Moderators: chulett, rschirm, roy
RE: Don't know, how do you get around this?
I don't know the answer to your question, but to attack even the basic problem: how do we skip rows that are not data? For example, one of the files I have to deal with has some human-readable header content, such as the number of transactions in the file. I know that it always takes up N lines, and that these are the first N lines of the file, but how do I skip those lines at the top and start parsing from a pre-specified line number?
The reason I ask is that you said you thought there would be only one header at the top in your case (which sounded like, if that were so, you would be OK).
Thanks!
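(Editorial aside, not from the thread: the skip-the-first-N-lines idea above can be sketched outside DataStage. This is a rough Python illustration under the assumption that N is known in advance; the function name and N are made up.)

```python
# Sketch: skip the first N human-readable header lines of a file
# and parse only from line N+1 onward. N_HEADER_LINES is an
# assumed, pre-specified value for illustration.
N_HEADER_LINES = 3

def data_lines(path, skip=N_HEADER_LINES):
    with open(path) as f:
        for line_no, line in enumerate(f, start=1):
            if line_no <= skip:
                continue  # discard the human-readable header block
            yield line.rstrip("\n")
```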
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
This is probably most easily solved with three separate targets containing some sortable keys to allow the data to be merged subsequently. What form is your source data in?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Premium Member
- Posts: 1255
- Joined: Wed Feb 02, 2005 11:54 am
- Location: United States of America
Hello Ray,
I have solved part of the problem, but not the whole thing yet. The source is a fixed-width flat file, and each transaction consists of header, detail, and tender records: one header record, one or more detail records, and one tender record.
The way I'm implementing my job to achieve varying record length target file format is as follows:
1. I have written a routine 'VarRecLen' which takes in all the target fields as its arguments.
2. It concatenates the fields required for the header record and then concatenates with char(13):char(10) {for DOS}.
3. It then concatenates the above string (in 2.) with the fields required for the detail record and then concatenates with char(13):char(10) {for DOS}.
4. Finally, it concatenates the fields required for the tender record in the same way.
5. The most important point though is that my target is a sequential file with no delimiter and no quote character.
6. So, my job has one source seq. file, one transformer (where I call the routine) and the target seq. file.
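(Editorial aside, not from the thread: the concatenation in steps 2-4 can be sketched in Python. The CR/LF terminator follows the description above; the function name and field lists are illustrative.)

```python
# Sketch of the VarRecLen idea: build one output string containing
# a header, a detail, and a tender record, each terminated by CR/LF
# (char(13):char(10) in DataStage BASIC, i.e. a DOS line ending).
CRLF = "\r\n"

def var_rec_len(header_fields, detail_fields, tender_fields):
    header = "".join(header_fields) + CRLF
    detail = "".join(detail_fields) + CRLF
    tender = "".join(tender_fields) + CRLF
    return header + detail + tender
```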
But I have one last major hurdle: each transaction may have multiple 'detail' records. I'm able to generate one header, one detail, and one tender record of varying record lengths, but I'm stuck on how to generate those multiple detail records when I encounter the same transaction number more than once.
I hope my explanation is comprehensive and decipherable.
Zillion thanks!
Whale.
Last edited by I_Server_Whale on Fri Dec 15, 2006 5:34 pm, edited 1 time in total.
-
- Premium Member
- Posts: 1255
- Joined: Wed Feb 02, 2005 11:54 am
- Location: United States of America
Hi amsh76(Amit),
If it were not for you, I would not have reached this far. But how do I check the current input (transaction number) against the previous input (transaction number)?
The logic may be something like this (In the constraints):
IF NEW.transaction.no = OLD.transaction.no
THEN strip the header and tender, load only detail record.
ELSE continue regular process.
I hope that will do it. But how do I implement this in a DataStage job?
Thanks much!
Whale.
Last edited by I_Server_Whale on Fri Dec 15, 2006 6:05 pm, edited 1 time in total.
-
- Participant
- Posts: 232
- Joined: Sat May 07, 2005 2:49 pm
- Location: USA
Hi NaveenD,
It's simple. Use stage variables as shown below:
Code: Select all
Derivation    StageVar
NEWVAL        OLDVAL
INPUTVAL      NEWVAL
Then use both of the above variables, which hold the old transaction number (OLDVAL) and the new transaction number (NEWVAL), to check your condition.
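(Editorial aside, not from the thread: the trick relies on stage variables being evaluated top to bottom, so OLDVAL is assigned the previous row's NEWVAL before NEWVAL is refreshed from the input. A rough Python sketch of that per-row behaviour, with made-up values:)

```python
# Sketch of stage-variable evaluation order: for each input row,
# OLDVAL takes the *previous* NEWVAL first, then NEWVAL takes the
# current input value, giving an (old, new) pair per row.
def old_new_pairs(values):
    pairs = []
    newval = None  # stage variables start from some initial value
    for inputval in values:
        oldval = newval      # OLDVAL <- NEWVAL (previous row's value)
        newval = inputval    # NEWVAL <- INPUTVAL (current row's value)
        pairs.append((oldval, newval))
    return pairs
```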
Thanks,
Naveen
-
- Premium Member
- Posts: 1255
- Joined: Wed Feb 02, 2005 11:54 am
- Location: United States of America
Hi pnchowdary,
I'm using stage variables, but still not getting the right result. I'm using six stage variables:
1. HEADER ---> calls the header routine:
Code: Select all
trial.header(DSLink3.f1, DSLink3.f2, DSLink3.f3, DSLink3.f4, DSLink3.f5, DSLink3.f6)
2. DETAIL ---> calls the detail routine:
Code: Select all
trial.detail(DSLink3.f7, DSLink3.f8, DSLink3.f9, DSLink3.f10)
3. TENDER ---> calls the tender routine:
Code: Select all
trial.tender(DSLink3.f11, DSLink3.f12, DSLink3.f13)
4. OLDTRAN ---> mapped to the TRANSACTION NUMBER field in the source file:
Code: Select all
DSLink3.f1
5. NEWTRAN ---> equal to OLDTRAN.
6. RESULT:
Code: Select all
If (OLDTRAN <> NEWTRAN) Then (HEADER : DETAIL : TENDER) Else DETAIL
I'm passing the sixth stage variable (RESULT) to the single field in my target file. So, when the old transaction number is not equal to the new one, I should load a new header, a new detail, and a new tender record.
But when I run the job, even though I have duplicate TRANSACTION NUMBERs in the source file, it loads only the multiple SKUs into the target, not the header and tender records. That means only the ELSE part of the logic above is executing.
Any help is very much appreciated.
Thanks!
Whale.
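(Editorial aside, not from the thread: the intended RESULT behaviour can be simulated outside DataStage. This sketch assumes the old/new stage variables are sequenced so that the old value is captured before the new one is refreshed; record contents are made-up placeholders.)

```python
# Sketch of the RESULT logic: on a transaction-number break, emit
# HEADER : DETAIL : TENDER; on a repeated transaction number, emit
# DETAIL only. Each input row is (tran_no, header, detail, tender).
def build_output(rows):
    out = []
    newtran = None
    for tran_no, header, detail, tender in rows:
        oldtran = newtran          # previous row's transaction number
        newtran = tran_no          # current row's transaction number
        if oldtran != newtran:
            out.append(header + detail + tender)  # new transaction
        else:
            out.append(detail)                    # repeat: detail only
    return out
```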
Last edited by I_Server_Whale on Fri Dec 15, 2006 6:06 pm, edited 1 time in total.
-
- Premium Member
- Posts: 1255
- Joined: Wed Feb 02, 2005 11:54 am
- Location: United States of America
Got it. Thanks a lot.
Whale.
Last edited by I_Server_Whale on Fri Dec 15, 2006 6:06 pm, edited 1 time in total.