Do you want to do this from the IFD or externally to Design Studio?
Most SFTP apps I've seen run through Windows, but I use PuTTY to do command-line SFTP within a batch file run from a map.
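As a sketch of that approach, the snippet below builds a psftp (PuTTY's command-line SFTP client) invocation and the command file it executes. The host, user, and file names are placeholders; psftp's `-b` (batch file), `-pw`, and `-batch` options are real.

```python
import subprocess
from pathlib import Path

def build_psftp_command(host, user, password, script_path):
    """Build the argument list for PuTTY's psftp in batch mode.

    -b runs the commands in script_path; -batch disables interactive
    prompts so the map doesn't hang waiting for input on errors.
    """
    return ["psftp", f"{user}@{host}",
            "-pw", password,
            "-b", str(script_path),
            "-batch"]

def write_transfer_script(script_path, remote_dir, local_file):
    """Write the command file that psftp will execute line by line."""
    commands = [f"cd {remote_dir}", f"put {local_file}", "quit"]
    Path(script_path).write_text("\n".join(commands) + "\n")

# Example usage (placeholder host/credentials):
# write_transfer_script("upload.scr", "/inbound", "out.dat")
# subprocess.run(build_psftp_command("sftp.example.com", "mapuser",
#                                    "secret", "upload.scr"), check=True)
```

The same command could be called directly from a batch file launched by the map; wrapping it in a script just makes the transfer steps easier to maintain.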
Search found 198 matches
- Fri Feb 17, 2006 3:44 am
- Forum: IBM® DataStage TX
- Topic: Deploy through specific SFTP application
- Replies: 1
- Views: 3113
- Thu Feb 16, 2006 7:41 am
- Forum: IBM® DataStage TX
- Topic: Newbie questions on Error Handling
- Replies: 4
- Views: 4157
Lisa, 1. You can define validation rules in the type tree and use the REJECT function in a map rule to reject any records that fail validation, or you can test values in your map rule and send error records to an error file. 2. You usually use the REJECT function to reject records that fail validat...
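The second approach (testing values and routing failures to an error file) can be sketched in Python terms; this is a toy illustration with made-up record fields and validation rules, not WTX map-rule syntax.

```python
def is_valid(record):
    """Toy validation rule: record needs a non-empty id and a numeric amount."""
    return bool(record.get("id")) and isinstance(record.get("amount"), (int, float))

def split_records(records):
    """Route passing records to the output, failures to an error list."""
    good, errors = [], []
    for rec in records:
        (good if is_valid(rec) else errors).append(rec)
    return good, errors

records = [{"id": "A1", "amount": 10.5},
           {"id": "", "amount": 3},       # fails: empty id
           {"id": "B2", "amount": "x"}]   # fails: non-numeric amount
good, errors = split_records(records)
# errors would then be written to the error file card
```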
- Mon Feb 06, 2006 9:41 am
- Forum: Site/Forum
- Topic: Revealing Premium content for the post made upto 28 Feb 06.
- Replies: 9
- Views: 6075
This sounds like a money-grabbing scheme. 'Experts' don't have to post answers or solutions. Who is getting all the cash?
At least www.tek-tips.com is still free.
- Tue Jan 31, 2006 3:13 am
- Forum: IBM® DataStage TX
- Topic: Run map -- passing database variable
- Replies: 2
- Views: 3107
- Wed Sep 14, 2005 2:47 am
- Forum: IBM® DataStage TX
- Topic: Inserting records into different tables from a single Input
- Replies: 1
- Views: 2467
- Tue Jul 05, 2005 2:07 am
- Forum: IBM® DataStage TX
- Topic: Send map parameters using Echo or other + Call external map
- Replies: 2
- Views: 2998
- Thu Jun 30, 2005 2:04 am
- Forum: IBM® DataStage TX
- Topic: WorkFlow processing
- Replies: 3
- Views: 3547
It depends on what you have FetchAs set to in the input card. If it's Integral then you process IP1, IP2, OP1, OP2. If you set FetchAs to Burst then it processes the number of records specified in the burst size for the input card with the burst, all the records for inputs set to Integral, then OP1, OP2, and then...
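The difference between the two fetch settings can be illustrated with a toy Python simulation; the card names, record values, and the `run_map` helper are made up for illustration and are not WTX itself.

```python
def fetch_burst(records, burst_size):
    """Yield successive chunks of burst_size records, as a Burst card would."""
    for i in range(0, len(records), burst_size):
        yield records[i:i + burst_size]

def run_map(burst_input, integral_input, burst_size):
    """Each map execution sees one burst chunk plus ALL records
    of the input card that is set to Integral."""
    executions = []
    for chunk in fetch_burst(burst_input, burst_size):
        executions.append({"IP1": chunk, "IP2": list(integral_input)})
    return executions

runs = run_map(["r1", "r2", "r3"], ["hdr"], burst_size=2)
# Two executions: ["r1", "r2"] then ["r3"], each paired with
# the whole integral input ["hdr"]
```

With FetchAs set to Integral on both cards there would be a single execution covering all records at once.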
- Tue Jun 28, 2005 8:45 am
- Forum: IBM® DataStage TX
- Topic: FTP Connector, do not concatenate
- Replies: 7
- Views: 4750
How would you be able to define multiple files in one card without concatenating the data? If you want to read multiple files in one go you need multiple cards. I don't understand why you would need multiple files defined in one card. If you want an unknown number of input files, you need to use the...
- Tue Jun 28, 2005 2:05 am
- Forum: IBM® DataStage TX
- Topic: FTP Connector, do not concatenate
- Replies: 7
- Views: 4750
- Fri Mar 18, 2005 3:45 am
- Forum: IBM® DataStage TX
- Topic: Details Mercator Execution flow
- Replies: 5
- Views: 4825
It's a data transformation tool. It gets data from an input source and sends it to an output target, which can have a different format. E.g. read messages from a JMS queue, change the structure to a SWIFT message and send it to SWIFT; read a database table, reformat the data to a new layout and send i...
- Thu Mar 17, 2005 10:53 am
- Forum: IBM® DataStage TX
- Topic: Details Mercator Execution flow
- Replies: 5
- Views: 4825
That's assuming PDF files can be read by non-humans. Type trees are held in files with the extension .mtt, map sources in files with the extension .mms, and compiled maps in files with the extension .mmc. Mercator maps can be run under the command server, one at a time, or under the event server and c...
- Wed Sep 22, 2004 2:59 am
- Forum: IBM® DataStage TX
- Topic: space pad character on OS/390 = ASCII space
- Replies: 2
- Views: 3299
- Mon Aug 16, 2004 2:35 am
- Forum: IBM® DataStage TX
- Topic: db2 adaptor 32k limit
- Replies: 1
- Views: 2828
- Wed Mar 03, 2004 9:39 am
- Forum: IBM® DataStage TX
- Topic: Error : Job is being accessed by another
- Replies: 3
- Views: 3326
- Fri Nov 14, 2003 3:22 am
- Forum: IBM® DataStage TX
- Topic: Partner Manager Update Map?
- Replies: 4
- Views: 3987
Hi Jim, haven't seen you around for a while. I think your problem is with version 6.0; I remember scrapping that version as there were numerous database adapter problems. I would recommend upgrading to 6.5.2 or 6.7.1. Your .mdq settings look right. Are you using the -Update command for your PUT/DBLOOKUP/Adapter C...