Search found 21 matches

by dinthat
Tue Jun 03, 2008 9:04 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Reading and FTPing BLOB using DataStage
Replies: 1
Views: 1236

Reading and FTPing BLOB using DataStage

Hi All,

My requirement is very simple. I have one Oracle table as source, which has one column with data type BLOB, and this column stores images. Now I need to read this column using DataStage and FTP these images to my UNIX server as image files.

Can anybody help me to solve this?
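In case DataStage itself cannot move the LOBs, here is a hedged sketch of the same task done outside DataStage with Python's cx_Oracle and ftplib. The table and column names (image_tbl, img_id, img_data), the .jpg extension, and all connection details are assumptions, not anything from an actual job:

```python
import io

def export_blobs(cursor, ftp, query="SELECT img_id, img_data FROM image_tbl"):
    """Read each BLOB row and upload it via FTP as <img_id>.jpg."""
    cursor.execute(query)
    uploaded = []
    for img_id, blob in cursor:
        # cx_Oracle returns a LOB object; .read() yields the raw bytes.
        data = blob.read() if hasattr(blob, "read") else blob
        remote_name = f"{img_id}.jpg"
        ftp.storbinary(f"STOR {remote_name}", io.BytesIO(data))
        uploaded.append(remote_name)
    return uploaded

def main():
    # Driver import and connection details are placeholders.
    import cx_Oracle
    from ftplib import FTP
    conn = cx_Oracle.connect("user/password@ORCL")
    ftp = FTP("my-unix-server")
    ftp.login("ftpuser", "ftppass")
    try:
        export_blobs(conn.cursor(), ftp)
    finally:
        ftp.quit()
        conn.close()

if __name__ == "__main__":
    main()
```

FTP transfers the bytes in binary mode via storbinary, so the image files arrive unmodified.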
by dinthat
Sat Apr 12, 2008 6:03 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle commit interval
Replies: 11
Views: 7784

Hi Sudhindra,

Thanks for your reply.

Can you please explain how you solved this issue using a PL/SQL stored procedure? I think it will be helpful for everyone facing the same issue.
by dinthat
Sat Apr 05, 2008 11:17 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle commit interval
Replies: 11
Views: 7784

Hi All,

I tried all the above methods to commit records in Oracle after each row, but none of them work in my case. :x Can anybody on DSXchange help me? :roll:

Or can't DataStage do that? Or does nobody in this world have such a requirement yet? :?:
by dinthat
Thu Apr 03, 2008 12:23 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Restart DataStage Services failed
Replies: 3
Views: 1837

Hi,

We solved the issue...

The problem was with disk space and an open port.

We killed the process that was using the port to release it, and we freed some space in the /dspx00 directory.

Now I am able to start the Engine. :lol:
by dinthat
Thu Apr 03, 2008 10:43 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Restart DataStage Services failed
Replies: 3
Views: 1837

Thanks ArndW for the prompt reply. I am using the dsadm user id. My /dspx00 directory is full. Please find the bdf command output: /dev/vgDStage/lvol1 62914560 62914560 0 100% /dspx00 And when I do a netstat -a | grep dsrpc I get one connection as ESTABLISHED. tcp 0 0 hpus40.dsrpc 172.29.30.2...
by dinthat
Thu Apr 03, 2008 10:20 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Restart DataStage Services failed
Replies: 3
Views: 1837

Restart DataStage Services failed

Hi All, I stopped the DataStage services using the command uv -admin -stop. But when I try to restart the services after a few minutes, it shows an error as follows: chmod: can't access /dspx00/Ascential/DataStage/DSEngine/dstmp.080403.185943B I searched for the file dstmp.080403.185943B in /ds...
by dinthat
Sun Mar 30, 2008 3:28 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Gregorian Date to Hijri
Replies: 2
Views: 1319

Gregorian Date to Hijri

Hi All,

In my job the input is a Gregorian date, and I want to convert it into the corresponding Hijri date. Can anybody suggest a method to do this?
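For what it's worth, the arithmetic of the tabular (Kuwaiti) Islamic calendar is simple enough to sketch in Python; inside a job the same integer arithmetic could be ported into a routine. All names below are my own, and the tabular result can drift a day or two from sighting-based calendars such as Umm al-Qura:

```python
def gregorian_to_jdn(year, month, day):
    """Julian Day Number for a proleptic Gregorian date."""
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return (day + (153 * m + 2) // 5 + 365 * y
            + y // 4 - y // 100 + y // 400 - 32045)

def gregorian_to_hijri(year, month, day):
    """Convert a Gregorian date to a tabular Hijri (year, month, day)."""
    jd = gregorian_to_jdn(year, month, day)
    l = jd - 1948440 + 10632      # days since the Hijri epoch, shifted one cycle
    n = (l - 1) // 10631          # completed 30-year cycles (10631 days each)
    l = l - 10631 * n + 354
    j = (((10985 - l) // 5316) * ((50 * l) // 17719)
         + (l // 5670) * ((43 * l) // 15238))
    l = (l - ((30 - j) // 15) * ((17719 * j) // 50)
         - (j // 16) * ((15238 * j) // 43) + 29)
    month_h = (24 * l) // 709
    day_h = l - (709 * month_h) // 24
    year_h = 30 * n + j - 30
    return year_h, month_h, day_h
```

For example, the epoch date 19 July 622 (proleptic Gregorian) maps to 1 Muharram 1 AH.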
by dinthat
Thu Mar 27, 2008 4:22 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle commit interval
Replies: 11
Views: 7784

chulett wrote:You must have the parameter setup as an Integer with that error... perhaps just change it to a String? :? ...
There is no option to change the type to String. I tried that as well... :(

Kindly guide me on how to set the data type for APT_ORAUPSERT_COMMIT_TIME_INTERVAL to String in my job.
by dinthat
Wed Mar 26, 2008 5:21 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle commit interval
Replies: 11
Views: 7784

Hi Ray,

The spelling is correct...
And I tried to set the value by double-clicking on the default value grid and selecting $UNSET from the default value dialog. But the result is the same...
by dinthat
Wed Mar 26, 2008 3:57 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle commit interval
Replies: 11
Views: 7784

If the time interval is asserted the row interval is ignored. Bring both environment variables into your job as job parameters, and give the special default value of $UNSET to the time interval parameter. In this way the row interval parameter will be able to "do its thing". Hi Ray, Thanks...
by dinthat
Wed Mar 26, 2008 1:48 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle commit interval
Replies: 11
Views: 7784

Oracle commit interval

Hi All, Good Morning. I have an issue while using the two environment variables APT_ORAUPSERT_COMMIT_TIME_INTERVAL and APT_ORAUPSERT_COMMIT_ROW_INTERVAL. In my job I am upserting (update first, then insert) data into an Oracle stage. My requirement is that I have to commit records after each insert/update. ...
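If the environment variables refuse to cooperate, the per-row logic itself is easy to state; here is a hedged cx_Oracle-style sketch of update-first-then-insert with a commit per row, done outside the Oracle stage (the table my_tbl and its columns are made-up names):

```python
# Placeholder SQL: update first, fall back to insert.
UPDATE_SQL = "UPDATE my_tbl SET val = :v WHERE id = :i"
INSERT_SQL = "INSERT INTO my_tbl (id, val) VALUES (:i, :v)"

def upsert_per_row(conn, rows):
    """Update first; insert when no row matched; commit after every row."""
    cur = conn.cursor()
    for row_id, val in rows:
        cur.execute(UPDATE_SQL, v=val, i=row_id)
        if cur.rowcount == 0:      # nothing updated, so the row is new
            cur.execute(INSERT_SQL, i=row_id, v=val)
        conn.commit()              # one commit per input row
```

Committing every row is slow on millions of rows, so this only makes sense when the per-row commit really is a hard requirement.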
by dinthat
Sat Mar 15, 2008 11:36 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reject records with referential integrity constraint
Replies: 14
Views: 6437

So the conclusion of this post is "Prevention is better than cure..." :wink:

Thank you all for your valuable replies...
by dinthat
Thu Mar 13, 2008 8:42 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reject records with referential integrity constraint
Replies: 14
Views: 6437

Thank you kcbland for the detailed explanation.

Anyway, I handled the parent key violation using a lookup. I am marking this topic as a workaround. But I am still searching for a solution to reject those records on the target side...
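For later readers, the lookup workaround boils down to splitting rows on whether the foreign key already exists in the parent table before the load; a minimal sketch (the column name fk is illustrative, not from the actual job):

```python
def split_by_parent_keys(rows, parent_keys):
    """Pre-load lookup: pass rows whose FK exists in the parent set, reject the rest."""
    passed, rejected = [], []
    for row in rows:
        # Rows with no matching parent key go down the reject path.
        (passed if row["fk"] in parent_keys else rejected).append(row)
    return passed, rejected
```

The rejected list plays the role of a reject link: it can be written to a file for investigation while the passed rows load cleanly.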
by dinthat
Thu Mar 13, 2008 7:57 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reject records with referential integrity constraint
Replies: 14
Views: 6437

Hi Ray, thank you for your reply. But why do you say that rejecting records in the target will never be a better solution? Will it slow down the job? I am inserting 33 million records into the target, and of those only 60 records violate the parent key constraint. So to identify these 60 records I am perf...