Search found 14 matches
- Fri May 16, 2014 8:50 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Decimals in RCP
- Replies: 12
- Views: 6382
- Thu May 15, 2014 3:01 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Decimals in RCP
- Replies: 12
- Views: 6382
- Thu May 15, 2014 2:38 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Decimals in RCP
- Replies: 12
- Views: 6382
- Wed May 14, 2014 11:23 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Decimals in RCP
- Replies: 12
- Views: 6382
Thanks for the reply. I don't want to do any transformations; I just need to write the data from Teradata view(s) into a sequential file. We have many fields with data type decimal(18,0) and decimal(11,2). The file becomes large because of these extra zeros, and no one likes to see a lot of preceding zeros ...
- Mon May 12, 2014 9:54 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Decimals in RCP
- Replies: 12
- Views: 6382
Decimals in RCP
Hi, I have a generic RCP-enabled job (Teradata Connector -> Copy stage -> Sequential File stage) to extract data from Teradata into a text file. It accepts a SELECT query at run time and creates the file. The issue I am facing is with decimal columns, where DataStage adds zeros before and after. I would...
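DataStage itself has no Python runtime, but the padding behavior the thread describes (fixed-scale decimals written as "000000000000123.40") can be sketched language-neutrally. The helper name below is hypothetical; it is only a minimal illustration of stripping leading and trailing zero padding without losing significant digits:

```python
from decimal import Decimal

def strip_padding(raw: str) -> str:
    """Strip zero padding from a fixed-scale decimal string.

    e.g. '000000000000123.40' -> '123.4', '0000000042' -> '42'.
    """
    d = Decimal(raw)
    # normalize() drops trailing zeros; formatting with 'f' avoids
    # scientific notation for values like Decimal('4.2E+1')
    return format(d.normalize(), "f")

print(strip_padding("000000000000123.40"))  # -> 123.4
print(strip_padding("0000000042"))          # -> 42
```

In a generic RCP job the equivalent trimming would usually be pushed into the run-time SELECT or a downstream formatting step rather than done per-column in the job design.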
- Tue Dec 11, 2012 1:19 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Problem with separator in timestamp field
- Replies: 4
- Views: 3047
Re: Problem with separator in timestamp field
DataStage won't automatically validate the delimiters in time/date/timestamp fields. If you want to validate them, read the field as a string, validate it in a Transformer, and then reject the row or convert it to a timestamp accordingly.
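The read-as-string-then-validate step would live in a Transformer in DataStage; as a minimal sketch of the same idea (hypothetical helper name, assumed `YYYY-MM-DD HH:MM:SS` format), strict parsing rejects any value whose delimiters don't match the expected pattern:

```python
from datetime import datetime

def validate_timestamp(value: str, fmt: str = "%Y-%m-%d %H:%M:%S"):
    """Return a datetime if value matches fmt exactly, else None.

    strptime enforces the literal separators in fmt, so a row like
    '2012/12/11 13:19:00' is rejected when '-' is expected.
    """
    try:
        return datetime.strptime(value, fmt)
    except ValueError:
        return None

good = validate_timestamp("2012-12-11 13:19:00")  # parsed
bad = validate_timestamp("2012/12/11 13:19:00")   # rejected -> None
```

Rows returning `None` would go down the reject link; the rest continue as proper timestamps.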
- Tue Dec 11, 2012 1:13 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: left join with between date condition
- Replies: 5
- Views: 4240
- Tue Dec 11, 2012 12:19 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Address Shuffle
- Replies: 8
- Views: 4560
Re: Address Shuffle
I will try to explain with an example. Suppose your record looks like "Name Address State", and you want to shuffle Name and Address while keeping State. Answer: split the record into two streams: 1) Name + State, 2) Address + State. Now add a new column "Order" to both streams and us...
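The two-stream split with a random "Order" column can be sketched outside DataStage as well. This is only an illustration of the technique with made-up sample data, not the poster's actual job: names and addresses are separated, shuffled within each state, and re-joined positionally so every name gets some address from the same state:

```python
import random
from collections import defaultdict

records = [
    ("Alice", "12 Oak St", "TX"),
    ("Bob", "34 Elm St", "TX"),
    ("Carol", "56 Pine St", "CA"),
    ("Dave", "78 Maple St", "CA"),
]

# Stream 1: Name keyed by State; Stream 2: Address keyed by State
names_by_state = defaultdict(list)
addrs_by_state = defaultdict(list)
for name, addr, state in records:
    names_by_state[state].append(name)
    addrs_by_state[state].append(addr)

# The random "Order" column: shuffle addresses within each state,
# then re-join positionally, so each name is paired with a
# (possibly different) address from the same state.
shuffled = []
for state in names_by_state:
    random.shuffle(addrs_by_state[state])
    for name, addr in zip(names_by_state[state], addrs_by_state[state]):
        shuffled.append((name, addr, state))
```

In the DataStage version, the shuffle would be a sort on the generated "Order" column followed by a positional re-join of the two streams.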
- Tue Dec 11, 2012 11:54 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to read special characters in DataStage
- Replies: 1
- Views: 5363
Re: How to read special characters in DataStage
If you are seeing these special characters as '?' in your target table/file (don't use the DataStage Data Viewer to validate), try changing the NLS settings (try AMERICAN_AMERICA.AL32UTF8 or AMERICAN_AMERICA.WE8PC850, or the NLS setting of your database).
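The '?' symptom comes from a character-set mismatch rather than from the data itself: a client-side charset that cannot represent a character silently substitutes a replacement. A tiny illustration of the mechanism (not DataStage-specific):

```python
# A narrow charset that can't represent 'é' substitutes '?',
# which is why the target shows '?' instead of the real character.
text = "café"
narrow = text.encode("ascii", errors="replace")
print(narrow.decode("ascii"))  # caf?

# A superset charset such as UTF-8 round-trips the text cleanly,
# which is what switching NLS to AL32UTF8 aims for.
wide = text.encode("utf-8")
print(wide.decode("utf-8"))  # café
```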
- Tue Dec 11, 2012 11:49 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job hanging at CDC stage
- Replies: 2
- Views: 2497
- Tue May 15, 2012 8:16 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Parse xml which has hierarchical data
- Replies: 8
- Views: 9993
Re: Parse xml which has hierarchical data
Try setting the element at the 5th level as the KEY (repetition element).
- Sun Sep 19, 2010 1:08 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Commit after some records in ODBC stage
- Replies: 6
- Views: 3678
Re: Commit after some records in ODBC stage
Hi, I am using the ODBC stage and I want to commit after every 1000 records. What environment variable should be used, and what is the use of Array Size? If I commit after every 1000 records and the job gets aborted, then when I start it again, will it start from the beginning or from where it aborte...
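The batched-commit pattern behind the question can be sketched with a generic database API. This uses sqlite3 purely as a stand-in for an ODBC connection; the constant name and table are made up for illustration. Note the restart semantics it implies: batches committed before an abort persist, so a plain re-run from the start would either need the target cleared first or a checkpoint of the last committed row:

```python
import sqlite3

COMMIT_EVERY = 1000  # analogous to a transaction-size setting

conn = sqlite3.connect(":memory:")  # stand-in for an ODBC connection
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")

rows = ((i, f"row-{i}") for i in range(2500))
pending = 0
for row in rows:
    conn.execute("INSERT INTO target VALUES (?, ?)", row)
    pending += 1
    if pending == COMMIT_EVERY:
        conn.commit()   # rows committed so far survive an abort
        pending = 0
conn.commit()           # flush the final partial batch

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
```

Array size, by contrast, controls how many rows are sent to the database per round trip, which is a throughput setting independent of the commit interval.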
- Sun Sep 19, 2010 12:56 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Archiving file sets
- Replies: 2
- Views: 1941
Re: Archiving file sets
I have to archive a large File Set and keep the last 7 days' data. As I understand it, a .fs file is just a link to where the actual data is stored, and this file can be moved around at will, as the location reference is absolute and not relative. So I can just rename the .fs file and move it to an...
- Sun Sep 19, 2010 10:34 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Looking for design help
- Replies: 3
- Views: 1847
Re: Looking for design help
Hi, you can use a multi-node config file if the table has row-level locking; in that case, use hash partitioning on the key. My suggestion is to avoid hitting the database multiple times for the same record. Try to create a single record (unique key) instead of hitting the table again and ag...
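The "single record per unique key" advice amounts to de-duplicating before the lookup and fanning results back out afterwards. A minimal sketch with made-up column names and a hypothetical `lookup` stand-in for the real database call:

```python
# Collapse repeated keys to one lookup each, then fan results back out.
rows = [
    {"cust_id": 1, "amount": 10},
    {"cust_id": 2, "amount": 20},
    {"cust_id": 1, "amount": 30},
]

unique_keys = {r["cust_id"] for r in rows}

def lookup(cust_id):
    # Hypothetical stand-in for the real database lookup
    return {"cust_name": f"name-{cust_id}"}

cache = {k: lookup(k) for k in unique_keys}  # one hit per unique key

enriched = [{**r, **cache[r["cust_id"]]} for r in rows]
```

Here the database is queried twice (once per unique key) instead of three times, and the savings grow with the number of duplicate keys.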