Complex flat file stage help needed

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

dsinfosys13
Participant
Posts: 9
Joined: Thu Oct 30, 2003 2:58 am

Complex flat file stage help needed

Post by dsinfosys13 »

Can we handle variable-length COBOL records (record format VB) that contain an OCCURS clause using the CFF stage?

Can I use an index to handle these repeating values without flattening the OCCURS clause?
gouraram
Participant
Posts: 6
Joined: Tue Sep 30, 2003 9:32 am

Post by gouraram »

Hi,
I am having the same problem: a variable-length file (we receive the file from a third party). I tried to use the CFF stage with the OCCURS and DEPENDING ON clauses, but it is not working. When I import the copybook, it expands the copybook to the maximum number of occurrences, which is not what I want.

Thanks in advance for your help.

Sreenivas
ariear
Participant
Posts: 237
Joined: Thu Dec 26, 2002 2:19 pm

Post by ariear »

Well,
I don't remember whether the CFF stage supports ODO (OCCURS DEPENDING ON), which is the usual way variable-format records are processed. I do know that DataStage Enterprise MVS Edition (the former XE/390) supports it. The mechanism is to expand the copybook to its maximum size (the data definition cannot be fitted dynamically for each row), and it is then the programmer's responsibility to process only the OCCURS elements up to the DEPENDING ON field value (roughly sketched at the end of this post).
No, there is no use for an index in the CFF stage; this is DataStage, not COBOL.
I'll try to take a look at the manual later on :roll:
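
For what it's worth, here is a minimal DataStage BASIC sketch of that "process only up to the DEPENDING ON value" idea, written as a transform routine. The routine and argument names, and the assumption that each flattened occurrence is passed in separately, are mine for illustration only.

* Hypothetical transform routine GetOccurrence2(ItemCnt, ItemAmt2):
* return the second flattened occurrence only when the ODO count
* says it is actually populated, otherwise return 0.
If ItemCnt >= 2 Then Ans = ItemAmt2 Else Ans = 0

You would call something like this once per flattened occurrence from a transformer and default the unused ones.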
gouraram
Participant
Posts: 6
Joined: Tue Sep 30, 2003 9:32 am

Post by gouraram »

Hi,
One more thing: we have the ODO in the middle of the record, and after it there are regular fields that we need, so I can't simply max out the OCCURS.

Sreenivas
ariear
Participant
Posts: 237
Joined: Thu Dec 26, 2002 2:19 pm

Post by ariear »

So, I know this situation too, and it's not easy to implement even with XE/390. I'll describe what I've done with it: a series of transformers and routines (external routines) that 'substring' the record using a formula for the variable part (occurs entry length * ODO field value; there were a couple of arrays in there) plus the fixed parts (before and after the ODO). A rough sketch of the idea follows below.
I'm not sure it can be done in a server job using the CFF stage.
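
To make that concrete, here is a minimal DataStage BASIC sketch of such a routine. The layout (a 20-byte header, 30-byte repeating elements) and the name ExtractFixedTail are assumptions for illustration only, not the actual formula from that job.

* Hypothetical transform routine ExtractFixedTail(InRec, OccCount).
* Assumed layout: a 20-byte fixed header, then OccCount repeating
* 30-byte elements, then the fixed tail we want to keep.
HeaderLen = 20
ElemLen = 30
VarLen = ElemLen * OccCount
* The fixed tail starts just past the header and the variable part.
TailStart = HeaderLen + VarLen + 1
TailLen = Len(InRec) - HeaderLen - VarLen
Ans = InRec[TailStart, TailLen]

In the job, a routine like this would sit in a transformer derivation that is fed the raw record and the ODO count column.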
dickfong
Participant
Posts: 68
Joined: Tue Apr 15, 2003 9:20 am

Post by dickfong »

As far as I know, ODO is supported only by mainframe jobs. Even if the CFF stage for server supported it, I would suggest not using it for large volumes of data because the performance is really unacceptable.
dsinfosys13
Participant
Posts: 9
Joined: Thu Oct 30, 2003 2:58 am

Erratic Behaviour of CFF stage.

Post by dsinfosys13 »

Having explored the properties tab, I get the feeling that the CFF stage can handle a COBOL file with COMP-3, OCCURS and REDEFINES properly. It does read values correctly when I cross-check using the View Data button, most of the time.

However, when I have a COMP-3 field inside an OCCURS group, the data is not read properly; the COMP-3 fields come up as junk. This is hindering our design decisions.

Sometimes fields using REDEFINES also do not show the correct value. I would appreciate comments from DataStage experts on this stage.

Rgds CP
gouraram
Participant
Posts: 6
Joined: Tue Sep 30, 2003 9:32 am

Post by gouraram »

Hi,
I was able to solve this ODO problem using the substring function available in the transformer. Since the record tells us how many times the OCCURS group repeats, I was able to substring the data out and move it to a fixed-length column.
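
As a hedged illustration of that approach, here is a small DataStage BASIC transform routine along the same lines; the starting offset, element length and maximum occurrence count are hypothetical.

* Hypothetical transform routine FixedOccursBlock(RawRec, OccCount).
* The repeating group is assumed to start at byte 21, with 15-byte
* elements and a maximum of 5 occurrences.
ElemLen = 15
MaxOccs = 5
VarPart = RawRec[21, OccCount * ElemLen]
* Pad with spaces up to the maximum so the output column stays a fixed width.
Ans = VarPart : Space((MaxOccs - OccCount) * ElemLen)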
nitecryvl
Premium Member
Posts: 27
Joined: Tue Jun 15, 2004 8:42 am

Re: Erratic Behaviour of CFF stage.

Post by nitecryvl »

dsinfosys13 wrote: Having explored the properties tab, I get the feeling that the CFF stage can handle a COBOL file with COMP-3, OCCURS and REDEFINES properly. It does read values correctly when I cross-check using the View Data button, most of the time.

However, when I have a COMP-3 field inside an OCCURS group, the data is not read properly; the COMP-3 fields come up as junk. This is hindering our design decisions.

Sometimes fields using REDEFINES also do not show the correct value. I would appreciate comments from DataStage experts on this stage.

Rgds CP
I did encounter an error with REDEFINES in the CFF stage: if there is a comment block between the REDEFINES and the field being redefined, the CFF stage does not know how to handle it.

Example:

05 AAA PIC S9(11)V9(03) COMP-3.
************************************************
* Comment Block *
************************************************
05 BBB REDEFINES AAA PIC S9(11)V9(04) COMP-3.

The data for all columns after the comment block appeared as garbage; once I removed the comment block, everything lined up. This may not be your problem, but look at where comments are placed in your copybooks. It may be why your REDEFINES fields are incorrect.

Regards,
Vince Lee
ETL Developer
Zions Bank