SIGSEGV error occurred while loading the data

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

AmeyJoshi14
Participant
Posts: 334
Joined: Fri Dec 01, 2006 5:17 am
Location: Texas

SIGSEGV error occurred while loading the data

Post by AmeyJoshi14 »

Hi,

We are facing a problem while loading data into an Oracle table. The job is aborting with a "SIGSEGV" error. We are using the following options on the target stage:

Write Method=Load
Write Mode=Truncate

We are facing this problem only for one particular Oracle database server (for all the other servers this option works fine). The strange thing is that when we use Write Method=Upsert, the data is loaded into the table, but somehow the above-mentioned option does not work.

We have opened a ticket with IBM. They asked us to run the job on a single node, but even after changing the configuration file to one node we still get the SIGSEGV error. They also asked us to send them the loader log file, but the job is not generating any log file (the .ctl file is empty).
I have also tried to run the job with the following environment variables:

Code: Select all

APT_MONITOR_SIZE=100000
APT_DISABLE_COMBINATION=True
APT_EXECUTION_MODE=One process
But no luck... :(

Please note: I have searched for the above-mentioned error and have gone through every related post, but again no luck...
http://findingjobsindatastage.blogspot.com/
Theory is when you know all and nothing works. Practice is when all works and nobody knows why. In this case we have put together theory and practice: nothing works. and nobody knows why! (Albert Einstein)
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Apart from running on one node, can you try using "APT_ORA_WRITE_FILES" to keep the Oracle loader files, and see whether the load can be run outside of DataStage without a problem?
AmeyJoshi14
Participant
Posts: 334
Joined: Fri Dec 01, 2006 5:17 am
Location: Texas

Post by AmeyJoshi14 »

ArndW wrote:Apart from running on one node, can you try using "APT_ORA_WRITE_FILES" to keep the Oracle loader files, and see whether the load can be run outside of DataStage without a problem? ...
Thanks for the reply!!

I have set the parameter
APT_ORA_WRITE_FILES=/home/dir/abc/sqlldr.txt

Now the problem is that the job has been running for almost 20 minutes for just a single record (I am selecting only one record from the source). It is still in the running state :(
http://findingjobsindatastage.blogspot.com/
Theory is when you know all and nothing works. Practice is when all works and nobody knows why. In this case we have put together theory and practice: nothing works. and nobody knows why! (Albert Einstein)
sanjay
Premium Member
Premium Member
Posts: 203
Joined: Fri Apr 23, 2004 2:22 am

Post by sanjay »

Amey

Are you able to run sqlldr through the command line? The Load write method internally calls sqlldr.

Thanks
Sanjay
AmeyJoshi14 wrote:
ArndW wrote:Apart from running on one node, can you try using "APT_ORA_WRITE_FILES" to keep the Oracle loader files, and see whether the load can be run outside of DataStage without a problem? ...
Thanks for the reply!!

I have set the parameter
APT_ORA_WRITE_FILES=/home/dir/abc/sqlldr.txt

Now the problem is that the job has been running for almost 20 minutes for just a single record (I am selecting only one record from the source). It is still in the running state :(
laknar
Participant
Posts: 162
Joined: Thu Apr 26, 2007 5:59 am
Location: Chennai

Post by laknar »

If the source is a file, this is most likely a file issue.

Can you please post the source file format and the job design? Also, do you have any constraints in the Transformer stage?
AmeyJoshi14
Participant
Posts: 334
Joined: Fri Dec 01, 2006 5:17 am
Location: Texas

Post by AmeyJoshi14 »

laknar wrote:If the source is a file, this is most likely a file issue.

Can you please post the source file format and the job design? Also, do you have any constraints in the Transformer stage?
Thanks for the reply!!
But the job is pretty simple. Our source is Oracle...
We are testing whether we are able to load from one server to another. The problem is that we are unable to use the Load option for this specific Oracle database server.

Regards,
Amey
http://findingjobsindatastage.blogspot.com/
Theory is when you know all and nothing works. Practice is when all works and nobody knows why. In this case we have put together theory and practice: nothing works. and nobody knows why! (Albert Einstein)
AmeyJoshi14
Participant
Posts: 334
Joined: Fri Dec 01, 2006 5:17 am
Location: Texas

Post by AmeyJoshi14 »

sanjay wrote:Amey

Are you able to run sqlldr through the command line? The Load write method internally calls sqlldr.

Thanks
Sanjay
AmeyJoshi14 wrote:
ArndW wrote:Apart from running on one node, can you try using "APT_ORA_WRITE_FILES" to keep the Oracle loader files, and see whether the load can be run outside of DataStage without a problem? ...
Thanks for the reply!!

I have set the parameter
APT_ORA_WRITE_FILES=/home/dir/abc/sqlldr.txt

Now the problem is that the job has been running for almost 20 minutes for just a single record (I am selecting only one record from the source). It is still in the running state :(
Thanks for your reply!!

Actually, even if I use Write Mode=Append, the job runs for more than 30 minutes for a single record. :(
From the command prompt I have tried to load the data, and the data is inserted into the particular table (with the same user ID).
I really cannot understand why DataStage is not able to load the data... :cry:

Regards,
Amey!
http://findingjobsindatastage.blogspot.com/
Theory is when you know all and nothing works. Practice is when all works and nobody knows why. In this case we have put together theory and practice: nothing works. and nobody knows why! (Albert Einstein)
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

During that 30 minute wait time you should have a DBA monitor your session and find out what is happening.
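For example, something along these lines (run by a DBA; the user name is a placeholder for whatever schema DataStage connects as) will show what the load session is actually waiting on:

Code: Select all

-- Hypothetical monitoring query (Oracle 10g+): list active sessions for the load user
-- together with the event each one is currently waiting on.
select s.sid,
       s.serial#,
       s.username,
       s.module,
       s.status,
       s.event,
       s.seconds_in_wait
  from v$session s
 where s.username = 'ETL_USER'   -- placeholder: the user DataStage connects as
   and s.status   = 'ACTIVE';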
sanjay
Premium Member
Premium Member
Posts: 203
Joined: Fri Apr 23, 2004 2:22 am

Post by sanjay »

Amey

Is it creating the control file and the SQL*Loader log in the scratch area? If yes, please verify that the control file is in the correct format.


Thanks
Sanjay

AmeyJoshi14 wrote:
sanjay wrote:Amey

Are you able to run sqlldr through the command line? The Load write method internally calls sqlldr.

Thanks
Sanjay
AmeyJoshi14 wrote: Thanks for the reply!!

I have set the parameter
APT_ORA_WRITE_FILES=/home/dir/abc/sqlldr.txt

Now the problem is that the job has been running for almost 20 minutes for just a single record (I am selecting only one record from the source). It is still in the running state :(
Thanks for your reply!!

Actually, even if I use Write Mode=Append, the job runs for more than 30 minutes for a single record. :(
From the command prompt I have tried to load the data, and the data is inserted into the particular table (with the same user ID).
I really cannot understand why DataStage is not able to load the data... :cry:

Regards,
Amey!
sanjay
Premium Member
Premium Member
Posts: 203
Joined: Fri Apr 23, 2004 2:22 am

Post by sanjay »

Amey

Try setting APT_DISABLE_COMBINATION to True in the job to check whether you get the actual error instead of the SIGSEGV error.

Thanks
Sanjay

sanjay wrote:Amey

Is it creating the control file and the SQL*Loader log in the scratch area? If yes, please verify that the control file is in the correct format.


Thanks
Sanjay

AmeyJoshi14 wrote:
sanjay wrote:Amey

Are you able to run sqlldr through the command line? The Load write method internally calls sqlldr.

Thanks
Sanjay
Thanks for your reply!!

Actually, even if I use Write Mode=Append, the job runs for more than 30 minutes for a single record. :(
From the command prompt I have tried to load the data, and the data is inserted into the particular table (with the same user ID).
I really cannot understand why DataStage is not able to load the data... :cry:

Regards,
Amey!
AmeyJoshi14
Participant
Posts: 334
Joined: Fri Dec 01, 2006 5:17 am
Location: Texas

Post by AmeyJoshi14 »

Hi,

I have already tried this parameter, but it is still giving me the SIGSEGV error. :(
I also tried running the job with Write Mode=Append; that too aborts with the same error message (SIGSEGV). :(

Thanks for the reply!!
http://findingjobsindatastage.blogspot.com/
Theory is when you know all and nothing works. Practice is when all works and nobody knows why. In this case we have put together theory and practice: nothing works. and nobody knows why! (Albert Einstein)
AmeyJoshi14
Participant
Posts: 334
Joined: Fri Dec 01, 2006 5:17 am
Location: Texas

Post by AmeyJoshi14 »

Hi,

We are still facing the same issue...
The problem is really strange: only for this one particular server are we unable to use Write Method=Load with Write Mode=Truncate, while for the rest of the servers we can use this option... :?
In fact, when we use Write Method=Upsert, records are inserted into the table. :roll:
As stated earlier, we have opened a ticket with IBM. They advised us to run the job on a single node, which we did, but the job again aborts with the error message:

Code: Select all

ora_ent_tgt,0: Operator terminated abnormally: received signal SIGSEGV
where ora_ent_tgt is the target table stage.
Is there any specific parameter or privilege that we need to set before running the job? Also, for the XMETA tablespace we have set the quota to unlimited:

Code: Select all

select * from dba_ts_quotas where tablespace_name='XMETA'
Please note: the particular user who is running the job has the INSERT privilege. :)
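For example, a check along these lines can confirm the grant (the table and user names below are placeholders, not our real ones):

Code: Select all

-- Hypothetical privilege check: confirm the load user holds INSERT on the target table
select grantee, privilege
  from dba_tab_privs
 where table_name = 'TARGET_TABLE'   -- placeholder: the target table name
   and grantee    = 'ETL_USER';      -- placeholder: the user running the load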
Again thanks all for the time and effort :)

Best Regards,
Amey
http://findingjobsindatastage.blogspot.com/
Theory is when you know all and nothing works. Practice is when all works and nobody knows why. In this case we have put together theory and practice: nothing works. and nobody knows why! (Albert Einstein)
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

When you say this is happening on one server, is it only this table that is affected, or all tables in that particular database?

Try the following:
1.) Set a constraint before the SQL*Loader so that no rows are written, and run the job.
2.) Set a trace on the database (if possible for the ETL Oracle user id) and check its sessions; a sketch of one way to do this is shown below.
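For point 2, a minimal sketch of enabling trace on the ETL session (assumes Oracle 10g or later, DBA privileges, and placeholder SID/serial# values taken from v$session):

Code: Select all

-- Hypothetical trace setup: find the ETL session, then enable SQL trace on it.
select sid, serial#, username, program
  from v$session
 where username = 'ETL_USER';        -- placeholder: the user DataStage connects as

begin
  dbms_monitor.session_trace_enable(
    session_id => 123,               -- placeholder: SID from the query above
    serial_num => 456,               -- placeholder: serial# from the query above
    waits      => true,
    binds      => false);
end;
/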
AmeyJoshi14
Participant
Posts: 334
Joined: Fri Dec 01, 2006 5:17 am
Location: Texas

Post by AmeyJoshi14 »

Sainath.Srinivasan wrote:When you say this is happening on one server, is it only this table that is affected, or all tables in that particular database?
This is happening for all the tables on that particular server :(
That is, we are not able to use the Load and Truncate options for any of the tables in that particular database :x

I have also tried the Delete option and it works as well... the problem is only with the Load and Truncate combination... :?

Also, in the source table I am selecting no rows at all (through the where clause), but even then the job aborts with the 'SIGSEGV' error... :x :x

Thanks for the reply!!

Best Regards,
http://findingjobsindatastage.blogspot.com/
Theory is when you know all and nothing works. Practice is when all works and nobody knows why. In this case we have put together theory and practice: nothing works. and nobody knows why! (Albert Einstein)
Post Reply