Hi all,

I am new to DataStage and this forum, and I have a question. I have a one-time, straight load (no transformations) of a file coming from the mainframe with 7 million rows; the source file is binary with a record length of 316. My design is source (CFF) -> Transformer -> target database (Oracle 9). The first time I ran it for a million rows it was fine, but when I ran it for the full 7 million, the monitor showed the target had received 52 million rows and the job was still running. The Director showed no warnings or errors. When I replaced the database target with a flat file, the job ran fine for all 7 million rows. Can anybody tell me the reason behind this?
Thank you
Kris
getting looped with CFF
Thanks, jenkinsrob, for your reply.

No, I did not monitor the job. I started it before going home, and when I came back in the morning it was still running; it had been running for nearly 12 hours. In the morning I tried running it for about 1 million rows, and again the target count went past 1 million. Nobody else is working with that table.
KRIS