CFF stage reading Mainframe Dataset that has S9(S) Comp -3

kravids
Participant

CFF stage reading Mainframe Dataset that has S9(S) Comp -3

Post by kravids »

Hi all,
I am having trouble reading data from a mainframe dataset using the Complex Flat File (CFF) stage. The dataset has an S9(S) COMP-3 field (DataStage version 8.1.1).
I have read all the related posts and followed everything, but it still did not work for me. Can you please specify:
1. What should the record type be: fixed length, variable, or variable block?
2. What should the final delimiter, field delimiter, and quote be for such datasets?

Any help with this issue would be appreciated.
ray.wurlod
Participant

Post by ray.wurlod »

Welcome aboard.

If you import the "table definition" from a relevant COBOL copybook, then all of the required metadata will be there for you.

As a general rule field sizes are fixed in mainframe data, so there is no field delimiter and no need for quotes. The record delimiter will depend upon how the data file arrives on your server, but is likely to be either None or (for variable records) UNIX-style. But check with the database owner.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
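For illustration only, here is a minimal Python sketch (not DataStage code; the copybook fields are hypothetical) of how fixed record lengths fall out of a copybook's PIC and USAGE clauses, which is why no field delimiters or quotes are needed:

# Illustrative sketch: storage sizes implied by simple COBOL PIC/USAGE clauses.
def storage_bytes(digits_or_chars, usage="DISPLAY"):
    """Return the number of bytes a simple elementary field occupies."""
    if usage == "DISPLAY":            # one byte per character or digit
        return digits_or_chars
    if usage == "COMP-3":             # packed decimal: two digits per byte plus a sign nibble
        return digits_or_chars // 2 + 1
    if usage == "COMP":               # binary: 1-4 digits -> 2 bytes, 5-9 -> 4, 10-18 -> 8
        return 2 if digits_or_chars <= 4 else 4 if digits_or_chars <= 9 else 8
    raise ValueError(usage)

# Hypothetical copybook: PIC X(10), PIC S9(5) COMP-3, PIC S9(9) COMP.
copybook = [("CUST-NAME", 10, "DISPLAY"),
            ("CUST-BAL",   5, "COMP-3"),
            ("CUST-ID",    9, "COMP")]

record_length = sum(storage_bytes(n, usage) for _, n, usage in copybook)
print(record_length)   # 10 + 3 + 4 = 17 bytes per fixed-length record

If the file size on the server is not an exact multiple of the computed record length, the copybook does not match the file, or the transfer altered the data.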
kravids
Participant

Reading Mainframe Data through CFF Stage

Post by kravids »

ray.wurlod wrote:Welcome aboard.

If you import the "table definition" from a relevant COBOL copybook, then all of the required metadata will be there for you.

As a general rule field sizes are fixed in mainframe data, so there is no field delimiter and no need for quotes. The record delimiter will depend upon how the data file arrives on your server, but is likely to be either None or (for variable records) UNIX-style. But check with the database owner.
Thanks for your reply.
I am getting a heap size limit message:
1. The current soft limit on the data segment (heap) size (1610612736) is less than the hard limit (9223372036854775807), consider increasing the heap size limit

I am also unsure whether I should flatten all arrays, flatten selective arrays, or leave them as-is when loading the table definition, and whether group columns should be included when loading the table definition.
ArndW
Participant

Post by ArndW »

Was "S9(S) COMP-3" a typographical error? Can you do a "view data" from your DataStage designer (I am assuming the HEAP error is a runtime one)? Also, could you cut-and-paste the complete actual error message?
kravids
Participant

CFF stage reading Mainframe Dataset that has S9(5) Comp -3

Post by kravids »

ArndW wrote:Was "S9(S) COMP-3" a typographical error? Can you do a "view data" from your DataStage designer (I am assuming the HEAP error is a runtime one)? Also, could you cut-and-paste the complete actual error message?
Sorry, it was a typo: it is S9(5) COMP-3 data that I am reading from the mainframe and writing to a Sequential File stage via the CFF stage.


main_program: The current soft limit on the data segment (heap) size (1610612736) is less than the hard limit (9223372036854775807), consider increasing the heap size limit
chulett
Charter Member

Post by chulett »

So... consider increasing the heap size limit, then. Talk to your System Admin for the 'how' of that.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ArndW
Participant

Post by ArndW »

What is your actual error - the heap size message is a warning, not a fatal error and might have nothing at all to do with your real problem.
kravids
Participant

Extracting Mainframe Data with COMP-3 and Loading into a Sequential File

Post by kravids »

ArndW wrote:What is your actual error - the heap size message is a warning, not a fatal error and might have nothing at all to do with your real problem. ...
I am using DataStage 8.1 and I am unable to read the COBOL data and write it into sequential files.
I am not flattening the arrays, I am removing the group columns, and I am keeping Field Defaults as None.
I FTP the mainframe data (in binary format) to the Unix server and read it there; I also tried ASCII mode, and that did not work at all. I either get the heap warning or an import error, and I cannot get the data from the Complex Flat File stage into the Sequential File stage. When I tried reading just one column it was successful, but with the entire record layout it does not work at all. Are there any limitations in the Complex Flat File stage? Are there any recommendations or parameters I should follow when loading the table definition and reading the data? I have been struggling with this issue and would be grateful for any steps or suggestions.
ArndW
Participant

Post by ArndW »

Again, the heap message is a warning and not an error.

Write a test job with just the CFF stage and output to a PEEK stage. As you've done already, get it to read and display one column successfully using "view data". Once that is working, add columns until you get an error and then post that information to the thread.
mukejee
Participant

Post by mukejee »

[quote="ArndW"]Again, the heap message is a warning and not an error.

Write a test job with just the CFF stage and output to a PEEK stage. As you've done already, get it to read and display one column successfully using "view data". Once that is working, add columns until you get an error and then post that information to the thread.[/quote]

Hi, I am new to DS 8.1.I face a similar kind of probem. The Mainframe file which has the columns as COMP datatype - not able to view the proper value when i give it as BINARY datatype in CFF stage.

The CFF file gives proper output in DS7.5.3.Please help me out in ths regard.

Thanks in Advance!!
developeretl
Participant

Post by developeretl »

mukejee wrote:
ArndW wrote:Again, the heap message is a warning and not an error.

Write a test job with just the CFF stage and output to a PEEK stage. As you've done already, get it to read and display one column successfully using "view data". Once that is working, add columns until you get an error and then post that information to the thread.
Hi, I am new to DS 8.1 and I face a similar kind of problem. For a mainframe file whose columns are COMP fields, I am not able to view the proper values when I define them as BINARY in the CFF stage.

The same CFF job gives proper output in DS 7.5.3. Please help me out in this regard.

Thanks in advance!

When you FTP from the mainframe to the Unix server, confirm whether the transfer is in binary or ASCII mode. Try setting the record delimiter to Unix Newline and view the data again.
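As a hedged illustration of why the transfer mode matters (Python, not DataStage code; the value and codepages are made up for demonstration): a COMP-3 field holds raw packed nibbles, and the character translation an ASCII-mode transfer performs rewrites those bytes:

# Hypothetical illustration of binary vs ASCII transfer of a packed field.
packed = bytes([0x12, 0x34, 0x5C])            # S9(5) COMP-3 value +12345 in 3 bytes

binary_mode = packed                          # binary FTP: bytes arrive unchanged
ascii_mode = packed.decode("cp037").encode("latin-1", "replace")  # translated as if it were EBCDIC text

print(binary_mode.hex())                      # 12345c -> still a valid packed value
print(ascii_mode.hex())                       # different bytes -> the packed value is destroyed

A quick sanity check is whether the transferred file's size equals the expected record length multiplied by the record count; an ASCII-mode transfer usually changes the size as well as the bytes.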
battaliou
Participant

Post by battaliou »

I've seen this problem on 8.1 as well. Try reading the data as a char and then do a conversion on it. You will probably be able to see the data as a string with a sign overpunch on its last character, e.g. '}' would be the digit 0 with a negative sign. I'd be interested to hear if anyone else is struggling to upgrade CFF to 8.1.
3NF: Every non-key attribute must provide a fact about the key, the whole key, and nothing but the key. So help me Codd.
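A minimal Python sketch of that workaround (not DataStage code; the sign-overpunch mapping shown is the common ASCII zoned-decimal convention and should be verified against your own data):

# Decode a field read as CHAR whose last character carries the sign overpunch.
OVERPUNCH = {"{": (1, 0), "}": (-1, 0)}
OVERPUNCH.update({c: (1, i + 1) for i, c in enumerate("ABCDEFGHI")})   # +1 .. +9
OVERPUNCH.update({c: (-1, i + 1) for i, c in enumerate("JKLMNOPQR")})  # -1 .. -9

def decode_zoned(text: str) -> int:
    """Decode a zoned-decimal string such as '1234}' -> -12340."""
    sign, digit = OVERPUNCH.get(text[-1], (1, None))
    if digit is None:                  # plain trailing digit: value is positive
        return int(text)
    return sign * int(text[:-1] + str(digit))

print(decode_zoned("1234E"))   # 12345
print(decode_zoned("1234}"))   # -12340

In DataStage itself the equivalent conversion would typically be done in a Transformer stage after reading the column as Char.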
mukejee
Participant

Solution ---

Post by mukejee »

Hi, I got the solution for the problem:

In the CFF stage, on the Record Options tab, set the Byte Order to Big-endian.

This works!
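To illustrate what that option changes, a hedged Python sketch (not DataStage code; the field and value are hypothetical): a COMP (binary) field written on the mainframe stores its most significant byte first, so its raw bytes must be read as big-endian:

# How byte order affects the interpretation of a 4-byte COMP field holding 12345.
import struct

raw = bytes([0x00, 0x00, 0x30, 0x39])   # PIC S9(9) COMP, value 12345, big-endian on disk

print(struct.unpack(">i", raw)[0])      # big-endian    -> 12345 (correct)
print(struct.unpack("<i", raw)[0])      # little-endian -> 959447040 (garbage)

Interpreting the same bytes with the wrong byte order produces plausible-looking but wrong numbers rather than an error, which matches the symptom of not seeing proper values.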
ray.wurlod
Participant

Post by ray.wurlod »

Only because your data are Big-Endian.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.