Apologies for the delayed response.
Tried that, Minhajuddin, with no positive results. It still loads the newline character into the target DB column.
Search found 221 matches
- Fri May 09, 2008 12:28 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Newline character getting loaded into DB Column
- Replies: 15
- Views: 7410
- Fri May 09, 2008 12:26 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Unable to generate a node map
- Replies: 18
- Views: 16634
Apologies for the delayed response. Got pulled into a few other unnecessary distractions... I added the GRID variable for host files and it still wouldn't give the desired results. However, when I added that and enabled the following variable, $APT_IMPORT_PATTERN_USES_FILESET = True, I got another error ...
- Wed May 07, 2008 3:38 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Newline character getting loaded into DB Column
- Replies: 15
- Views: 7410
The source file is generated on a Windows box and, when viewed in a binary editor, contains the CR+LF combination (0D 0A). The file resides on a network-mounted drive and is read by a DataStage job running on a Linux grid server. The last column is declared as VARCHAR(60) and the schema file has ...
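For a Windows-generated (CR+LF) file read by a job on a Unix engine, the usual PX approach is to declare the DOS record delimiter in the schema file so the CR never lands in the last column. A minimal sketch, with illustrative field and delimiter values — the exact property names should be verified against the import/export operator documentation:

```
record {record_delim_string='\r\n', delim='|', final_delim=end}
(
  LAST_COL: string[max=60];
)
```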
- Wed May 07, 2008 1:22 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Newline character getting loaded into DB Column
- Replies: 15
- Views: 7410
Hi ArndW, Thanks for the follow-up. My original design was Seq. File ===> Transformer ===> ODBC Stage, with a reject link down to a Rejects File. I replaced the final ODBC stage with a target Seq File stage and made sure it is using the correct NLS map (UTF-8). The output generated matches the input file's contents. Tha...
- Tue May 06, 2008 11:53 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Newline character getting loaded into DB Column
- Replies: 15
- Views: 7410
Newline character getting loaded into DB Column
Hi Group, I have an RCP DB Loading job that reads from an input sequential file and parses it using a supplied schema file and then loads into a target database table (SQL Server 2005). The server is NLS-Enabled and the NLS map is set to UTF-8 at the job level. This is working well in a Windows DS 8...
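A quick way to sanity-check the symptom in the topic title, outside DataStage, is to inspect the raw bytes of a record and normalize Windows line endings before the file is read. A minimal sketch; the helper name and sample record are illustrative, not part of the original job:

```python
# Hypothetical helper: replace Windows CR+LF (0D 0A) line endings with
# bare LF (0A) so the trailing 0x0D byte cannot end up appended to the
# last column's value when the file is parsed on a Unix engine.

def normalize_crlf(data: bytes) -> bytes:
    """Return `data` with every CR+LF pair collapsed to a single LF."""
    return data.replace(b"\r\n", b"\n")

# A record as produced on the Windows box:
windows_record = b"field1|field2|last value\r\n"
print(normalize_crlf(windows_record))  # b'field1|field2|last value\n'
```

Running this over the source file (or piping it through `dos2unix` on the Linux host) makes it easy to confirm whether the stray newline originates in the file itself rather than in the job.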
- Tue May 06, 2008 11:26 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Unable to generate a node map
- Replies: 18
- Views: 16634
Hi lstsaur, Thanks for reviewing my query. The following 4 Grid params are the ones that were suggested to be added to all our PX jobs in the GRID. I have created a parameter set by the name APT_GRID_PARAMS in my project for this purpose. Here are the values for these entries in the log file. APT_GR...
- Mon May 05, 2008 11:39 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Unable to generate a node map
- Replies: 18
- Views: 16634
- Mon May 05, 2008 5:19 am
- Forum: General
- Topic: An easy way to view delimited files....
- Replies: 7
- Views: 1970
- Mon May 05, 2008 4:59 am
- Forum: General
- Topic: Multiple job compile "Hiding" in a fresh session
- Replies: 0
- Views: 692
Multiple job compile "Hiding" in a fresh session
Hi Group, I had imported about 15 new jobs into a QA project and tried to do a multi-compile. Here are the steps. 1. Open a Designer client afresh (Very important step) 2. Import jobs from an exported ".dsx" file. 3. Go to "repository" menu and look for "Multiple Job Compile...
- Mon May 05, 2008 4:39 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Unable to generate a node map
- Replies: 18
- Views: 16634
Unable to generate a node map
Hi, I have a simple job that reads wildcard-pattern-based files from a folder and loads them into a target database. I have the File Name Column property set to store the file name in the target database. This works fine in a Windows DataStage Server environment; of course with help from the gurus here h...
- Mon Apr 07, 2008 8:06 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Testing Unicode settings via Seq File stage
- Replies: 1
- Views: 1369
Ray, Thanks for the insight. Appreciate it. I got this job working..... Did a step-by-step revisit/review and found that I hadn't set the encoding to the correct format in the target stage. Right now, I tried various options - set the encoding at the job level - set the encoding at the stage level -...
- Sun Apr 06, 2008 5:55 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ODBC Reject link to sequential file
- Replies: 3
- Views: 3232
To follow up on my earlier post..... The data analysis showed year values up to 2199. I then tried a few tests with year values ranging from 2000 to 2090 and finally arrived at a tipping point of 2079-June-06. Searching for this range on the web, the following MSDN page http://msdn2.microsoft.com/en-u...
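The tipping point observed above matches the documented upper bound of SQL Server's SMALLDATETIME type, which only covers 1900-01-01 through 2079-06-06; values past that range are rejected by the driver. A small validation sketch (the function name is illustrative):

```python
from datetime import date

# SQL Server SMALLDATETIME only covers this documented range; dates
# outside it fail on insert, which explains rows with year values up
# to 2199 landing on the reject link.
SMALLDATETIME_MIN = date(1900, 1, 1)
SMALLDATETIME_MAX = date(2079, 6, 6)

def fits_smalldatetime(d: date) -> bool:
    """True if `d` can be stored in a SMALLDATETIME column."""
    return SMALLDATETIME_MIN <= d <= SMALLDATETIME_MAX

print(fits_smalldatetime(date(2079, 6, 6)))  # True  (the tipping point)
print(fits_smalldatetime(date(2199, 1, 1)))  # False (would be rejected)
```

Pre-filtering source rows with a check like this, or widening the target column to DATETIME, avoids the fatal reject entries.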
- Wed Apr 02, 2008 11:44 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ODBC Reject link to sequential file
- Replies: 3
- Views: 3232
- Mon Mar 31, 2008 10:42 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ODBC Reject link to sequential file
- Replies: 3
- Views: 3232
ODBC Reject link to sequential file
My job design is as follows. This is a PX, RCP-enabled job: Sequential File ===> Transformer ===> ODBC Enterprise, with a reject link down to a Rejects (Seq File). In one particular file that has 75K records, the data gets loaded but I see a "Fatal" log entry: ODB_AsCollected,0: [DataDirect][ODBC SQL Server Dri...
- Thu Mar 20, 2008 10:01 pm
- Forum: General
- Topic: NFS Drive access related question
- Replies: 9
- Views: 3694
Ray, Thanks for the inputs. The network folder got mounted through a different request from another teammate on our Linux server. For now I am going to use that for my development. I need to check with the folks here to find out what settings were made to the mount options. When I have some spare ti...