CFF stage reading Mainframe Dataset that has S9(S) Comp -3
Moderators: chulett, rschirm, roy
Hi All,
I am having trouble reading data from a mainframe dataset using the Complex Flat File stage. The dataset has S9(S) COMP-3 fields. (DataStage version 8.1.1.)
Can you please help me with this issue? I have read all the posts and followed everything, but it still didn't work for me.
Can you please specify:
1. What would the record type be: fixed length, variable, or variable block?
2. What would the final delimiter, field delimiter, and quote be for such datasets?
Can anyone please help me with this issue?
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Welcome aboard.
If you import the "table definition" from a relevant COBOL copybook, then all of the required metadata will be there for you.
As a general rule field sizes are fixed in mainframe data, so there is no field delimiter and no need for quotes. The record delimiter will depend upon how the data file arrives on your server, but is likely to be either None or (for variable records) UNIX-style. But check with the database owner.
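To make the COMP-3 part concrete, here is a minimal sketch (plain Python, not DataStage code) of how an S9(5) COMP-3 field is laid out and decoded: five digit nibbles plus a trailing sign nibble, packed into 3 bytes. The example byte values are assumptions for illustration.

```python
# Sketch: decoding an S9(5) COMP-3 (packed-decimal) field by hand.
# S9(5) COMP-3 occupies ceil((5 + 1) / 2) = 3 bytes: five digit nibbles
# plus a trailing sign nibble (0xC or 0xF = positive, 0xD = negative).

def unpack_comp3(raw: bytes) -> int:
    """Decode a packed-decimal (COMP-3) byte string into a Python int."""
    nibbles = []
    for byte in raw:
        nibbles.append(byte >> 4)      # high nibble
        nibbles.append(byte & 0x0F)    # low nibble
    sign = nibbles.pop()               # last nibble carries the sign
    value = int("".join(str(d) for d in nibbles))
    return -value if sign == 0x0D else value

# 12345 stored as S9(5) COMP-3: 0x12 0x34 0x5C
print(unpack_comp3(b"\x12\x34\x5C"))  # 12345
print(unpack_comp3(b"\x12\x34\x5D"))  # -12345
```

This is also why fixed-length fields need no delimiters: the copybook picture clause fully determines each field's byte width.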
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Reading Mainframe Data through CFF Stage
Thanks for your reply.
ray.wurlod wrote:
Welcome aboard.
If you import the "table definition" from a relevant COBOL copybook, then all of the required metadata will be there for you.
As a general rule field sizes are fixed in mainframe data, so there is no field delimiter and no need for quotes. The record delimiter will depend upon how the data file arrives on your server, but is likely to be either None or (for variable records) UNIX-style. But check with the database owner.
I am getting a Heap Size Limit error:
1.The current soft limit on the data segment (heap) size (1610612736) is less than the hard limit (9223372036854775807), consider increasing the heap size limit
I also have a doubt whether we need to flatten all the arrays, flatten selective arrays, or leave them As Is when loading the table definitions into the sequential files, and whether we should include group columns or not while loading the table definition.
Was "S9(S) COMP-3" a typographical error? Can you do a "view data" from your DataStage designer (I am assuming the HEAP error is a runtime one)? Also, could you cut-and-paste the complete actual error message?
CFF stage reading Mainframe Dataset that has S9(5) Comp -3
ArndW wrote:
Was "S9(S) COMP-3" a typographical error? Can you do a "view data" from your DataStage designer (I am assuming the HEAP error is a runtime one)? Also, could you cut-and-paste the complete actual error message?

Sorry, it was S9(5) COMP-3 data I was reading from the mainframe to a Sequential file through the CFF stage.
main_program: The current soft limit on the data segment (heap) size (1610612736) is less than the hard limit (9223372036854775807), consider increasing the heap size limit
What is your actual error? The heap size message is a warning, not a fatal error, and might have nothing at all to do with your real problem.
Extracting Mainframe data with COMP-3 & loading into Seq F
ArndW wrote:
What is your actual error - the heap size message is a warning, not a fatal error and might have nothing at all to do with your real problem. ...

I am using DataStage 8.1 and I am unable to read the COBOL data and write it into sequential files. I am not flattening the arrays, I am removing the group columns, and I am keeping Field Defaults as None.
I am doing an FTP of the mainframe data (in binary format) to the UNIX server and then reading the data. I tried with ASCII as well and it didn't work at all. I am getting the heap warning, and if not that, an import error. I am unable to read the data from the Complex Flat File stage into the Sequential stage. I tried reading just one column and it was successful, but when I tried with the entire record it didn't work at all. Are there any limitations on the Complex Flat File stage? Are there any recommendations or parameters I should follow while loading and reading the data? Can you please help me? I am trying hard with this issue and will be thankful for any steps and suggestions.
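One thing worth noting about binary versus ASCII transfer: ASCII-mode FTP translates every byte from EBCDIC to its ASCII equivalent, which is fine for DISPLAY fields but destroys COMP-3 fields, since their bytes are nibble pairs, not characters. A small Python sketch (using the cp037 EBCDIC codec as an assumed stand-in for the FTP translation table) shows the damage:

```python
# Sketch: why ASCII-mode FTP corrupts COMP-3 fields. Text-mode transfer
# translates each byte from EBCDIC to ASCII; packed-decimal bytes are
# not characters, so the translation mangles them irreversibly.
# cp037 is an assumed stand-in for the FTP translation table.

packed = b"\x12\x34\x5C"  # 12345 stored as S9(5) COMP-3 (example bytes)

# What an ASCII-mode transfer effectively does to these bytes:
translated = packed.decode("cp037").encode("latin-1")

print(packed.hex())      # 12345c
print(translated.hex())  # no longer valid packed decimal
```

For example, EBCDIC 0x5C is the character `*`, so the byte holding the digit 5 and the sign nibble comes out as 0x2A. This is why the transfer must be binary.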
Again, the heap message is a warning and not an error.
Write a test job with just the CFF stage and output to a PEEK stage. As you've done already, get it to read and display one column successfully using "view data". Once that is working, add columns until you get an error and then post that information to the thread.
[quote="ArndW"]Again, the heap message is a warning and not an error.
Write a test job with just the CFF stage and output to a PEEK stage. As you've done already, get it to read and display one column successfully using "view data". Once that is working, add columns until you get an error and then post that information to the thread.[/quote]
Hi, I am new to DS 8.1 and face a similar kind of problem. For a mainframe file that has columns of COMP datatype, I am not able to view the proper values when I define them as BINARY datatype in the CFF stage.
The same CFF job gives proper output in DS 7.5.3. Please help me out in this regard.
Thanks in advance!
-
- Participant
- Posts: 89
- Joined: Sat Jul 24, 2010 11:33 pm
mukejee wrote:
Hi, I am new to DS 8.1 and face a similar kind of problem. For a mainframe file that has columns of COMP datatype, I am not able to view the proper values when I define them as BINARY datatype in the CFF stage. The same CFF job gives proper output in DS 7.5.3. Please help me out in this regard.
When you do the FTP from the mainframe to the UNIX server, confirm whether it is in binary format or ASCII format. Try keeping the record delimiter as UNIX Newline and then try to view the data.
I've seen this problem on 8.1 as well. Try reading the data as a CHAR and then do a conversion on it. You will probably be able to see the data as a string with the sign overpunched onto its last character, e.g. '}' would be a -0. I'd be interested to hear if anyone else is struggling to upgrade CFF to 8.1.
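To illustrate the char-then-convert approach, here is a sketch in Python of decoding a zoned-decimal string whose sign is overpunched onto the last character. The mapping assumed here is the common ASCII rendering (`{` through `I` for +0..+9, `}` through `R` for -0..-9); check it against your own translated data.

```python
# Sketch: converting a zoned-decimal value read as CHAR, where the sign
# is "overpunched" onto the last character. Assumed ASCII overpunch map:
#   '{','A'..'I' -> +0..+9    '}','J'..'R' -> -0..-9

POSITIVE = {c: i for i, c in enumerate("{ABCDEFGHI")}
NEGATIVE = {c: i for i, c in enumerate("}JKLMNOPQR")}

def from_overpunch(text: str) -> int:
    """Decode a zoned-decimal string with a trailing overpunch sign."""
    last = text[-1]
    if last in POSITIVE:
        return int(text[:-1] + str(POSITIVE[last]))
    if last in NEGATIVE:
        return -int(text[:-1] + str(NEGATIVE[last]))
    return int(text)  # plain unsigned digits, no overpunch

print(from_overpunch("1234E"))  # 12345
print(from_overpunch("1234N"))  # -12345
print(from_overpunch("000}"))   # 0 (the negative zero from the post)
```

In a DataStage job the same conversion would be done in a Transformer; the Python here is only to show the mapping.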
3NF: Every non-key attribute must provide a fact about the key, the whole key, and nothing but the key. So help me Codd.
Solution ---
Hi, got the solution for the problem:
At the CFF stage, on the Record Options tab, set the Byte Order to Big-endian.
This works!
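For anyone wondering why the Byte Order setting matters: COMP (binary) fields in a mainframe file are big-endian two's-complement integers, so reading them with the wrong byte order scrambles the values. A quick Python sketch (the byte values are just an example):

```python
import struct

# Sketch: a COMP field such as S9(9) COMP is a 4-byte big-endian
# two's-complement integer in the mainframe file. Reading it with the
# wrong byte order yields garbage, which the Big-endian setting avoids.

raw = b"\x00\x00\x30\x39"             # 12345 in big-endian byte order

big = struct.unpack(">i", raw)[0]     # mainframe (big-endian) reading
little = struct.unpack("<i", raw)[0]  # wrong (little-endian) reading

print(big)     # 12345
print(little)  # 959447040 -- garbage
```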