Search found 250 matches
- Mon Dec 04, 2006 11:31 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to avoid first five rows from the source in Importing
- Replies: 6
- Views: 2700
Re: How to avoid first five rows from the source in Importin
Hi, so how can I avoid the first five rows of the source while importing... Use the Filter option in the Sequential File stage. If you know the number of rows in your file (200 in this case) you can give Tail -195 in the filter, and this will read only the last 195 records from the file. But is the number o...
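The filter advice above can be sketched with standard Unix tail. A variant worth knowing is `tail -n +K`, which starts output at line K and therefore skips the first five rows without needing to know the total row count (the file paths below are made up for illustration):

```shell
# Equivalent idea to the Sequential File stage Filter option.
# tail -n +6 starts output at line 6, i.e. skips the first five rows,
# regardless of how many rows the file has.
seq 1 200 > /tmp/source.txt            # sample 200-row source file
tail -n +6 /tmp/source.txt > /tmp/filtered.txt
wc -l < /tmp/filtered.txt              # 195 rows remain
```

Unlike `tail -195`, this form keeps working if the source row count changes between runs.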
- Mon Dec 04, 2006 3:32 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: warnings when changing the data type
- Replies: 6
- Views: 2240
- Mon Dec 04, 2006 1:55 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sql server
- Replies: 6
- Views: 2348
Does it mean that your Master Source file has data for all the 3 types in varying metadata - say type1 has 137 columns, type2 has 83 columns and type3 has 67 columns? - and all this in a single file? You might want to read it as a single column and then split it into different columns using the Colu...
- Mon Dec 04, 2006 1:48 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: UNION ALL v/s FUNNEL Design Issue
- Replies: 6
- Views: 13378
Can you please tell me what this is: "sequential funnel, if your requirement is that, otherwise go for continuous funnel". Inside the Funnel stage, you have options for Continuous, Sequential or Sort. Continuous Funnel combines records as they arrive (i.e. no particular order); Sort Funnel combines the i...
- Mon Dec 04, 2006 1:44 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: file name
- Replies: 9
- Views: 3362
- Mon Dec 04, 2006 12:13 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Override Fully Qualified Table Name
- Replies: 8
- Views: 2908
I believe the question was with regard to the parameterization of the Schema name from the SQL builder. Nageshsunkoji wrote: You can parameterize Servername, Schemaname, username and password, and I don't think the database name is required in the parameterisation.
We have jobs with all parameters other than table name.
- Mon Dec 04, 2006 10:47 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Override Fully Qualified Table Name
- Replies: 8
- Views: 2908
- Mon Dec 04, 2006 10:40 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sql server
- Replies: 6
- Views: 2348
Re: Sql server
Working: sequential file --- transformer --- sequential file2 ---- job1; seqfile2 --- transformer --- sqlserver ---- job2. To make it one job, can you suggest some tips: sequential file --- transformer --- sqlserver ---- job1. Not sure if I am missing something here - but can't you simply rep...
- Mon Dec 04, 2006 10:30 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Override Fully Qualified Table Name
- Replies: 8
- Views: 2908
- Mon Dec 04, 2006 9:45 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sequencer : Job not visible when under 2 levels down....
- Replies: 4
- Views: 2481
- Mon Dec 04, 2006 9:17 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Socket closed error, when parallel jobs run in sequence
- Replies: 4
- Views: 3982
Hi, I think these are all because of the resource allocation to the jobs for monitoring the data. We have environment variables called APT_MONITOR_SIZE and APT_MONITOR_TIME. These variables control the monitoring functionality in DataStage. Override the default settings with values: set APT_MONI...
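These variables can be exported in the job's environment (or set as project/job parameters in Administrator) before the run. The values below are illustrative only, not recommendations; tune them for your environment:

```shell
# Illustrative settings only.
# APT_MONITOR_SIZE: emit a job-monitor update every N rows processed.
# APT_MONITOR_TIME: emit a job-monitor update at least every N seconds.
export APT_MONITOR_SIZE=10000
export APT_MONITOR_TIME=10
```

Larger values reduce monitoring overhead at the cost of less frequent row-count updates in Director.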
- Mon Dec 04, 2006 9:01 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DataStage GRID
- Replies: 6
- Views: 4386
- Mon Dec 04, 2006 8:58 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DataStage GRID
- Replies: 6
- Views: 4386
By passing the same configuration file to the parameter in the second job. santoshkumar wrote: how can I overcome that?
Since the dataset was created with a different configuration - if you have to read it - you would need a config file that will have the nodes which were used to create the dataset.
Aneesh
- Sun Dec 03, 2006 10:10 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: xml output - not writing to file
- Replies: 10
- Views: 3624
Re: xml output - not writing to file
Please provide more details. Does the log give any warnings? vij wrote: I am using the XML Output stage, in which I am writing the output to a file by specifying its path, but the file is not created in the specified path...
- Sun Dec 03, 2006 10:07 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DataStage GRID
- Replies: 6
- Views: 4386
Why don't you use the same configuration file that created the dataset for the second job? The issue might not exactly be because you are using a different config file. It might be because - for the second job - you are trying to use a config file that does not include the nodes that were used to cre...
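For reference, a parallel engine configuration file lists the nodes a job can use; a dataset can only be read with a configuration that includes the nodes it was written on. A minimal single-node sketch is below (the hostname and paths are made up for illustration):

```
{
    node "node1"
    {
        fastname "etlhost"
        pools ""
        resource disk "/ds/data" {pools ""}
        resource scratchdisk "/ds/scratch" {pools ""}
    }
}
```

Passing this file's path via APT_CONFIG_FILE (as a job parameter) lets both jobs run against the same node set.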