Link Partitioner/Link Collector

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

JDionne
Participant
Posts: 342
Joined: Wed Aug 27, 2003 1:06 pm

Post by JDionne »

kcbland wrote:You have an extra line in the file, is what you are saying. You have an extra carriage return/line feed somewhere. Either the data has it embedded in it, or the concatenation is somehow introducing it.

You can configure the Sequential File stage to throw away incomplete rows, or you can configure it to complete incomplete rows, whereby you have to introduce a transformer constraint to throw them away. This at least moves you forward. You still have to find out which file is bad. Look at the link statistics from the job that produces the individual files to get the row counts, then go to the produced files and verify which one is wrong. If one of them is wrong, you have just proven that the concatenation is not the problem but the data is. If the files match the link statistics, then the concatenation of the files is the issue (very unlikely).
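One way to act on this outside DataStage is a small script that counts the physical rows in each produced file, so you can compare them against the link statistics. This is only a sketch: the sample file and its contents below are stand-ins, not the actual load files from the job.

```python
import os
import tempfile

def count_rows(path):
    # Read in binary so carriage returns are not silently translated,
    # which could hide a line-ending mismatch.
    with open(path, "rb") as f:
        return sum(1 for _ in f)

# Build a small stand-in load file to demonstrate; in practice you would
# point count_rows at each individual output file and compare the result
# with the row count reported in the job's link statistics.
tmp = tempfile.NamedTemporaryFile(delete=False, suffix=".txt")
tmp.write(b"row1\r\nrow2\r\nrow3\r\n")
tmp.close()

rows = count_rows(tmp.name)
print(f"{tmp.name}: {rows} rows")
os.unlink(tmp.name)
```

If a file's count differs from its link statistics, that file is the one to inspect.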
OK, first: where are the settings for throwing away incomplete rows? I don't remember that one from the training classes :)
Second, you're saying to count the number of rows actually in the files and check whether it matches what DS says it output.
What bothers me is that if I load the files by themselves, they do not error. If there were an extra character in a file, I would imagine it would fail to load at that point too... this is what is confusing me about your theories...
Jim
Sure I need help....But who doesn't?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

JDionne wrote:OK, first: where are the settings for throwing away incomplete rows? I don't remember that one from the training classes :)
Jim
In the Sequential File stage go to the link properties (Outputs tab), select the Columns tab, then scroll the grid to the right using the horizontal scroll bar at its base. There you will find the rules for incomplete rows (missing columns) and other abnormal situations, such as embedded line-termination characters, right near the part where you add the textual description of what's in each column.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

JDionne wrote: this is what is confusing me about your theories...
Jim
Your issue has been open for weeks. In the meantime, I've answered about 100 posts and done some work myself. My theories are meant to guide you in what to look for in your design. Some other time you will have characters in your data (delimiters, control characters, carriage returns, etc.) that will mess with you and cause similar problems. So now you know some of the possibilities for these types of errors.

It's also difficult to troubleshoot designs from a forum. In the past I had you send me a job to look at, which I did. I made a design suggestion for you that is a solid practice in what we do. If your job is not working, I again extend that offer to look at the job design and see if I can identify what's wrong/different.

I have recommended that if you have three streams that create three files, you verify that the row counts in the link statistics match the row counts in the files. Obviously, if you concatenate the files and have a larger overall row count than expected, you have to verify the individual files. As a next step, please post the command you are using to concatenate the files; you could be introducing an error there. There are about ten more ways you could be introducing problems (one file is DOS and another Unix, no trailing LF, embedded delimiters, control characters, quoting is off, definitions differ between files, etc.).
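Several of the pitfalls in that list (DOS vs. Unix endings, no trailing LF, stray control characters) are visible in the raw bytes of each file. A minimal sketch of such a check, with made-up sample data standing in for the real load files:

```python
def diagnose(data):
    # data: the raw bytes of one load file
    crlf = data.count(b"\r\n")
    lf_only = data.count(b"\n") - crlf
    return {
        "crlf_lines": crlf,                 # DOS-style line endings
        "lf_only_lines": lf_only,           # Unix-style line endings
        "mixed_endings": crlf > 0 and lf_only > 0,
        "has_trailing_newline": data.endswith(b"\n"),
        "has_ctrl_z": b"\x1a" in data,      # stray EOF byte
    }

dos_file = b"a,b\r\nc,d\r\n"    # DOS endings, properly terminated
unix_file = b"e,f\ng,h"         # Unix endings, NO trailing newline
print(diagnose(dos_file))
print(diagnose(unix_file))
```

Running `diagnose(open(path, "rb").read())` on each file before concatenating would flag a mixed-format or unterminated file immediately.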
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
JDionne
Participant
Posts: 342
Joined: Wed Aug 27, 2003 1:06 pm

Post by JDionne »

kcbland wrote:
JDionne wrote: this is what is confusing me about your theories...
Jim
Your issue has been open for weeks. In the meantime, I've answered about 100 posts and done some work myself. My theories are meant to guide you in what to look for in your design. Some other time you will have characters in your data (delimiters, control characters, carriage returns, etc.) that will mess with you and cause similar problems. So now you know some of the possibilities for these types of errors.

It's also difficult to troubleshoot designs from a forum. In the past I had you send me a job to look at, which I did. I made a design suggestion for you that is a solid practice in what we do. If your job is not working, I again extend that offer to look at the job design and see if I can identify what's wrong/different.

I have recommended that if you have three streams that create three files, you verify that the row counts in the link statistics match the row counts in the files. Obviously, if you concatenate the files and have a larger overall row count than expected, you have to verify the individual files. As a next step, please post the command you are using to concatenate the files; you could be introducing an error there. There are about ten more ways you could be introducing problems (one file is DOS and another Unix, no trailing LF, embedded delimiters, control characters, quoting is off, definitions differ between files, etc.).
I wasn't trying to be disrespectful or attack you; I meant no harm by what I said. I was just trying to work through a problem. That's how my team does it: we throw ideas up and we shoot them down until we find the one that holds water. I'll choose my words more wisely next time.

This is the command I use to combine the files:
copy D:\ETL_Processes\JOC\ETL_Load_Files\LD_JOC_STG_Tbl1.txt + D:\ETL_Processes\JOC\ETL_Load_Files\LD_JOC_STG_Tbl2.txt + D:\ETL_Processes\JOC\ETL_Load_Files\LD_JOC_STG_Tbl3.txt + D:\ETL_Processes\JOC\ETL_Load_Files\LD_JOC_STG_Tbl4.txt + D:\ETL_Processes\JOC\ETL_Load_Files\LD_JOC_STG_Tbl5.txt + D:\ETL_Processes\JOC\ETL_Load_Files\LD_JOC_STG_Tbl6.txt D:\ETL_Processes\JOC\ETL_Load_Files\LD_JOC_STG_Tbl.txt
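For what it's worth, two things about that command may be worth checking (both are guesses about the environment, not a diagnosis). First, cmd.exe's `copy` with `+` defaults to ASCII mode when concatenating, which can append a Ctrl-Z (0x1A) end-of-file byte to the destination; `copy /b` forces a pure byte-level copy. Second, if any source file lacks a trailing CRLF, its last row fuses with the first row of the next file. A small sketch of that second effect, using made-up row data:

```python
# Simulate byte-level concatenation when the first file is missing its
# trailing CRLF: the last row of file A and the first row of file B
# fuse into a single physical line.
file_a = b"a1\r\na2"            # no trailing CRLF after the last row
file_b = b"b1\r\nb2\r\n"

combined = file_a + file_b
rows = [r for r in combined.split(b"\r\n") if r]
print(rows)  # three rows instead of four; b'a2b1' is the fused row
```

Either effect would produce a combined file whose row structure differs from the sum of its parts even though each file loads cleanly on its own.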

I set the incomplete-column setting to discard and warn, and I received no warnings, so that tells me there are no incomplete columns.
Any other thoughts?
Jim
Sure I need help....But who doesn't?
JDionne
Participant
Posts: 342
Joined: Wed Aug 27, 2003 1:06 pm

Post by JDionne »

Unfortunately no one has any idea what's going on here, and I have run out of time, so I am reverting to the old job with collectors and partitioners. We have just upgraded to version 7, so I am hoping that will help keep this job running smoothly. Thanks for all of your input.
Jim
Sure I need help....But who doesn't?
shawn_ramsey
Participant
Posts: 145
Joined: Fri May 02, 2003 9:59 am
Location: Seattle, Washington. USA

Post by shawn_ramsey »

JDionne wrote:Unfortunately no one has any idea what's going on here, and I have run out of time, so I am reverting to the old job with collectors and partitioners. We have just upgraded to version 7, so I am hoping that will help keep this job running smoothly. Thanks for all of your input.
Jim
Jim,

In my opinion you should have stayed with that approach. Please look at my comments earlier on using the containers.
Shawn Ramsey

"It is a mistake to think you can solve any major problems just with potatoes."
-- Douglas Adams
JDionne
Participant
Posts: 342
Joined: Wed Aug 27, 2003 1:06 pm

Post by JDionne »

shawn_ramsey wrote:
JDionne wrote:Unfortunately no one has any idea what's going on here, and I have run out of time, so I am reverting to the old job with collectors and partitioners. We have just upgraded to version 7, so I am hoping that will help keep this job running smoothly. Thanks for all of your input.
Jim
Jim,

In my opinion you should have stayed with that approach. Please look at my comments earlier on using the containers.
The DS job isn't the problem, though I will go back and look at the container comments. It's combining the files after DS that fails; DataStage does its part fine. Thanks again for all the support.
Jim
Sure I need help....But who doesn't?
Post Reply