Error: 140015 Parser failed

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

srikie
Participant
Posts: 58
Joined: Thu Oct 14, 2004 4:19 pm

Error: 140015 Parser failed

Post by srikie »

Hi,
I am trying to generate code for a mainframe job, and I got the above error. Can anyone help me debug this message, please?
My job contains:

complex flat file ----> transformer ----> link collector ----> fixed flat file
(there are 9 links from the transformer to the link collector)
Thanks in advance
Srikie
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Hi,
I'm not a 390 expert, but in general:
AFAIK the link collector is not meant to be used this way.
The link collector expects fluid input from all links according to the method used, i.e. round robin.
So try another design.
If you describe what you need to accomplish, people might be able to advise on ways to get the job done.

IHTH,
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
srikie
Participant
Posts: 58
Joined: Thu Oct 14, 2004 4:19 pm

Post by srikie »

roy wrote:Hi,
I'm not a 390 expert, but in general:
AFAIK the link collector is not meant to be used this way.
The link collector expects fluid input from all links according to the method used, i.e. round robin.
So try another design.
IHTH,
All input links to the collector are fluid links. I want the source record to split into 9 separate records based on a date field, and all these records go into one target table. Also, in 390 the link collector doesn't ask for any algorithm like round robin. I have played around with it, but ultimately I realised it is not the link collector's fault: I used a lookup stage after the link collector, which references data from a relational stage and populates a sequential file. I think
the problem is with the lookup and sequential file combination. I have yet to get a solution or identify the exact problem.
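Roughly, what the transformer needs to do (sketched in Python just for illustration; the field names are placeholders, not the real column names):

Code:

# One output record per populated date field (placeholder names).
DATE_FIELDS = ["DATE_01", "DATE_02", "DATE_03"]   # ... up to 9 in the real job

def split_record(record):
    for field in DATE_FIELDS:
        if record.get(field):                     # constraint on each output link
            yield {"ACCT_KEY": record["ACCT_KEY"], "EVENT_DATE": record[field]}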
Thanks
Srikanth
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

Hello Srikanth,

The way a mainframe link collector works is that it copies all of the rows from the first input link to the output link, then it copies all of the rows from the second input link and appends to the output link, and so on ...
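In other words (a rough Python sketch of the append semantics only, not the generated COBOL):

Code:

# Illustrative only: the collector appends each input link's rows in turn.
def collect(input_links):
    output = []
    for link_rows in input_links:   # link 1 first, then link 2, and so on
        output.extend(link_rows)    # all rows from one link before the next
    return output

print(collect([["a1", "a2"], ["b1"], ["c1", "c2"]]))
# -> ['a1', 'a2', 'b1', 'c1', 'c2']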

Given the way that it works, it probably isn't a good idea to have multiple links from the same transformer to the same link collector. Perhaps that's the cause of your "parser" error. You should probably report that behavior to Ascential as the Designer client likely shouldn't allow you to build such a job structure.

Here's a possible solution: Replace the link collector with a flat file stage. In a mainframe job you can have multiple input links to a single flat file stage (note that this is fundamentally different than a server job's flat file stage).

Mike
srikie
Participant
Posts: 58
Joined: Thu Oct 14, 2004 4:19 pm

Post by srikie »

Hi Mike,
I don't think it is the link collector's problem, because it works properly if I use a relational stage. In fact, it also worked well when I used a flat file stage after the link collector. The problem appears when I send this file to a lookup stage and use another flat file as the target for the lookup.
I am not sure if the problem is with the link collector, the lookup, or the combination.
Thanks
srikie
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

Is your problem in this part of your job flow?

Code:

Link Collector --> Lookup --> Flat File
If so, try this structure:

Code:

Link Collector --> Transformer --> Lookup --> Transformer --> Flat File
I always recommend using a transformer between any two stages that would allow it. It costs nothing in terms of performance, and provides maximum flexibility for debugging and future maintenance.

Mike
srikie
Participant
Posts: 58
Joined: Thu Oct 14, 2004 4:19 pm

Post by srikie »

I will try inserting a transformer in between them.
But it works well for a relational stage though?
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

When you use a passive stage (e.g. relational, flat file), you introduce a processing boundary. All of the stages that come after a passive stage must wait for all of the rows to be written to the passive stage. The generated code (and therefore job behavior) will be significantly different when using an intermediate passive stage versus using an intermediate active stage.
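Roughly speaking (a Python sketch of the difference only, not the generated code):

Code:

# Active stage: rows can flow through one at a time (pipelined).
def transformer(rows):
    for row in rows:
        yield row.strip()      # downstream sees each row as it arrives

# Passive stage: everything must land before downstream starts.
def flat_file(rows):
    landed = list(rows)        # the whole file is written first
    return landed              # only then can the next stage read it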

Mike
srikie
Participant
Posts: 58
Joined: Thu Oct 14, 2004 4:19 pm

Post by srikie »

But it works well for a relational stage though?
I meant

link collector ---> lookup stage ---> bulk_load stage

works well. It works even for a relational stage, I think.
It only doesn't work for the flat file stage, and all of them are passive stages.
Thanks
Srikie
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

I assume by "bulk_load" stage, you must have meant the DB2 Load Ready Flat File stage. In theory replacing this final stage with a simple fixed-width flat file stage should work as well. Of course, the code generation engine will go down a different path, so this might be a bug in the code generation engine. It would be worthwhile to report this to your support provider.

I still suspect that the root of the problem might actually lie in the way that you have used the link collector. I would suggest a test where you replace the link collector with a transformer and use 1 of your 9 possible flows to see if the end part of your job functions as you expect.

The link collector was new in version 7.x, so I wouldn't be surprised to find some bugs related to it and its interaction with other stage types.

Mike
srikie
Participant
Posts: 58
Joined: Thu Oct 14, 2004 4:19 pm

Post by srikie »

Mike,
I tried generating the code with

MFF ---> Transformer ---> flat file (multiple links here)

It gives me the same error! It's got nothing to do with the link collector. Do you foresee a problem elsewhere?
Thanks
srikie
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

It looks like you've pared it down to a fairly simple job.

You'll have to do some detective work to isolate the issue.

Create a copy of the job and systematically remove/change items in the job design. Some suggestions on where to look: 1) transformer derivations that are more than simple passthroughs (don't forget to check stage variable derivations and constraint expressions as well), 2) try replacing the MFF with a CFF or fixed-width flat file.

Good luck,
Mike
srikie
Participant
Posts: 58
Joined: Thu Oct 14, 2004 4:19 pm

Post by srikie »

Mike,
Hurray! :D I finally solved the problem. The problem is with a timestamp attribute. I don't yet know exactly what the issue is, but when I removed that attribute, the code generated fine. It's something to do with the flat file and timestamp combination. If anyone knows anything about it, please enlighten me.
A really hard way to learn such lessons.
Thanks
Srikie
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

I don't think TIMESTAMP is a valid data type for a fixed-width flat file (which might explain why the code generation engine had trouble parsing it). How did you populate the column definitions? Did you drag a link from a DB2 Load Ready Flat File stage (or a Relational stage)?

Mike
srikie
Participant
Posts: 58
Joined: Thu Oct 14, 2004 4:19 pm

Post by srikie »

Yeah, I did it that way, and I also typed it manually in the table definition, which also didn't work. I think Timestamp is supported in the flat file stage, but what I found out is that its length has to be specified as 26; we can't leave the length blank like we can for date.
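For what it's worth, an external timestamp in the form YYYY-MM-DD-HH.MM.SS.NNNNNN is exactly 26 characters, which is presumably where the 26 comes from:

Code:

# 10 (date) + 1 + 8 (time) + 1 + 6 (fractional seconds) = 26 characters
ts = "2004-10-14-16.19.00.000000"
print(len(ts))   # 26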
Thanks
srikie