Search found 38 matches

by mattias.klint
Mon Jan 05, 2009 9:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Timestamp in sequential file name.
Replies: 1
Views: 1370

Re: Timestamp in sequential file name.

I used the macro (one of the predefined job parameters) #DSJobWaveNo# (it increments per job run and resets on recompile) in both the filename and the transformer.

It worked perfectly!
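For readers outside DataStage, a minimal sketch of the idea in Python (purely illustrative; the counter file and output name below are made up, since DataStage substitutes #DSJobWaveNo# internally): a counter that grows on every run is baked into the output file name, so each run writes a distinct file.

    # Illustrative sketch only: DataStage substitutes #DSJobWaveNo# itself;
    # this just mimics a per-run counter appearing in a file name.
    from pathlib import Path

    counter_file = Path("wave_no.txt")      # hypothetical stand-in for the job's wave number
    wave_no = int(counter_file.read_text()) + 1 if counter_file.exists() else 1
    counter_file.write_text(str(wave_no))

    output_name = f"export_{wave_no}.txt"   # e.g. export_1.txt, export_2.txt, ...
    print(f"Sequential file for this run: {output_name}")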
by mattias.klint
Mon Jan 05, 2009 7:23 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Timestamp in sequential file name.
Replies: 1
Views: 1370

Timestamp in sequential file name.

Hello, I need to get a parameter with a timestamp to be used within the job and to build the file name of a sequential file. The timestamp could be the time the job started, for example, put into a parameter. I can't use sequencers; the job is started from the SAP listener. I can use environmental ...
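A hedged sketch of one way the calling side could supply such a value, assuming the job exposes a string parameter (FileTimestamp below is a hypothetical name) and that the dsjob command line is available to the caller: the wrapper computes the timestamp once and passes it in, so the same value can drive both the transformer logic and the sequential file name.

    # Sketch under assumptions: the project, job and parameter names are hypothetical,
    # and dsjob must be on the PATH of the machine running this wrapper.
    import subprocess
    from datetime import datetime

    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")   # e.g. 20090105_093700

    subprocess.run([
        "dsjob", "-run",
        "-param", f"FileTimestamp={timestamp}",             # hypothetical job parameter
        "MyProject", "MyParallelJob",                        # hypothetical project and job
    ], check=True)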
by mattias.klint
Fri Aug 15, 2008 2:01 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lookup Not Happening
Replies: 5
Views: 2611

I think I have the same problem. I'm doing a lookup on a table in a DB2 database. The field that I'm trying to do the lookup with is a varchar(16) and the data is '111222333' (9 characters). It fails. If I hardcode the data in the previous transformer it works. (Not using the DSLink.xxxxx input.) If ...
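A frequent cause of this symptom is padding: the key coming down the link carries trailing spaces that the hard-coded literal does not, so the lookup never matches. A small Python sketch of the mismatch and the usual trim fix (illustrative only; it does not reproduce the DB2 lookup itself):

    # Illustrative only: shows why '111222333' padded out to 16 characters
    # will not match the bare literal until it is trimmed.
    reference_key = "111222333".ljust(16)    # value as it may arrive, padded to the varchar(16) width
    incoming_key = "111222333"

    print(reference_key == incoming_key)             # False: trailing spaces differ
    print(reference_key.rstrip() == incoming_key)    # True once both sides are trimmed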
by mattias.klint
Mon Aug 11, 2008 1:19 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: WebSphere MQ connectors
Replies: 8
Views: 3217

MQConnectorPX

Thanks for your answer. It doesn't seem to work; I'm supposed to do this inside the MQConnectorPX stage, right? Using a parameter? Unfortunately I cannot put any more time into this, my boss says no, he has some kind of workaround on the MQ-server side, I think they made two different queues in t...
by mattias.klint
Fri Aug 08, 2008 8:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: WebSphere MQ connectors
Replies: 8
Views: 3217

Hi, thanks

I'm not able to install any other applications because of the client. I'm not able to connect to MQ, but I'm probably not doing it right.

But what about the port on the MQ server being changed from 1414 to 1421? Don't I have to specify that somewhere in DS? (See the sketch below.)

/M
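A hedged sketch of where the new port typically has to appear for an MQ client connection: the channel, host and port travel together in a connection name of the form CHANNEL/TCP/host(port), for example via the MQSERVER environment variable (the channel and host below are hypothetical).

    # Sketch only: builds the MQSERVER-style client connection string that
    # would need the new port; it does not actually talk to the queue manager.
    import os

    channel = "SYSTEM.DEF.SVRCONN"        # hypothetical channel name
    host = "mqserver.example.com"         # hypothetical MQ host
    port = 1421                           # new listener port instead of the old 1414

    os.environ["MQSERVER"] = f"{channel}/TCP/{host}({port})"
    print(os.environ["MQSERVER"])         # SYSTEM.DEF.SVRCONN/TCP/mqserver.example.com(1421)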
by mattias.klint
Fri Aug 08, 2008 3:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: WebSphere MQ connectors
Replies: 8
Views: 3217

I don't know how to do that; if you have time to guide me I would be grateful.

thanks,
Mattias
by mattias.klint
Fri Aug 08, 2008 3:20 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: WebSphere MQ connectors
Replies: 8
Views: 3217

WebSphere MQ connectors

Hello, I cannot connect to the queue manager and get error 2059. I'm using client mode. I have checked that all parameters are correct; I hardcoded them. It worked perfectly before the port was changed on the MQ server. Am I supposed to change this somewhere? I cannot find that configuration. Tha...
by mattias.klint
Mon May 12, 2008 2:14 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Warning message
Replies: 5
Views: 6750

converting Unicode

How did it go? (I realize that you might not see this ;-) I hope it's OK that I hijack this thread. I have a similar problem. I tried setting all my fields to Unicode and not Unicode. I extract my data from an IDOC in UTF-8 and insert it into a seq. file with ISO-8859-1. main_program: Invalid charac...
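The "Invalid character" message usually means the UTF-8 source contains characters that simply do not exist in ISO-8859-1. A small Python sketch of that failure mode (the sample string is made up):

    # Illustrative: a UTF-8 string with a character outside ISO-8859-1
    # cannot be converted to the target charset without an error or a substitution.
    text = "Mättias owes 5€"          # the euro sign has no ISO-8859-1 code point

    try:
        text.encode("iso-8859-1")
    except UnicodeEncodeError as err:
        print("Invalid character for ISO-8859-1:", err)

    print(text.encode("iso-8859-1", errors="replace"))   # b'M\xe4ttias owes 5?'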
by mattias.klint
Wed Apr 23, 2008 3:42 am
Forum: General
Topic: Generate on output in a Column Export stage
Replies: 0
Views: 2115

Generate on output in a Column Export stage

Hi, I'm trying to generate a default value from a Column Export stage in my parallel job. I have one column coming in and I need two going out, one with a default value, "DS rocks" for example. In the Edit Meta Data page for the newly specified column I have the Field level option "Genera...
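For what the post is trying to achieve, a small Python sketch of the equivalent transformation (the column names are hypothetical and the constant is taken from the post; this is not the Column Export stage's implementation):

    # Illustrative: one incoming column goes out together with a second,
    # generated column that always carries a default value.
    incoming_rows = [{"payload": "row one"}, {"payload": "row two"}]   # hypothetical input column

    outgoing_rows = [
        {"payload": row["payload"], "generated": "DS rocks"}           # constant default on output
        for row in incoming_rows
    ]

    for row in outgoing_rows:
        print(row)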
by mattias.klint
Mon Apr 21, 2008 1:44 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job Root created and cant delete it
Replies: 9
Views: 5889

ROOT jobs suck

Sorry to hijack your thread, but I have exactly the same problem. Did you ever stumble on a solution? I can't delete the project, so my ROOT job is lying there bothering me.

Thanks,
Mattias
by mattias.klint
Mon Feb 18, 2008 8:42 am
Forum: General
Topic: VERTICAL PIVOT V8
Replies: 8
Views: 4291

Works perfectly, thank you very much!

Now I will hit the aggregator...
by mattias.klint
Mon Feb 18, 2008 8:07 am
Forum: General
Topic: VERTICAL PIVOT V8
Replies: 8
Views: 4291

I tried your first suggestion and it gives me a sequence of numbers, 1 to 6. I would like to get the first 6 rows to be one group, and the following 6 to be another group. This way I will be able to aggregate my rows that look like this with the desired group column included: "1","Grou...
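A minimal sketch of the grouping idea in Python (illustrative only; in the job this would be a stage-variable or derivation expression): integer division of a running row counter by the group size puts the first six rows into group 0 and the next six into group 1.

    # Illustrative: derive a group key from a running row number so that
    # every 6 consecutive rows share the same group value.
    GROUP_SIZE = 6

    rows = ["A", "B", "C", "1", "2", "3", "Z", "X", "Y", "4", "5", "6"]

    for row_number, value in enumerate(rows):      # row_number starts at 0
        group = row_number // GROUP_SIZE           # 0 for the first 6 rows, 1 for the next 6
        print(group, value)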
by mattias.klint
Mon Feb 18, 2008 7:45 am
Forum: General
Topic: VERTICAL PIVOT V8
Replies: 8
Views: 4291

Creating a grouping column would be perfect, just the way you describe. You also point out that it's easy with stage variables. Sorry, but it doesn't seem to be easy for me. Can you please give me a short description? It would be very helpful.

Thx,
Mattias
by mattias.klint
Mon Feb 18, 2008 6:43 am
Forum: General
Topic: VERTICAL PIVOT V8
Replies: 8
Views: 4291

Thanks! Is it possible to do it without one stage variable for each column? I might need more columns in the future. It's always good to be prepared :-) If you have the time to make the description a little more detailed I would be very grateful. It's been some time since I developed in DS. Please. Th...
by mattias.klint
Sun Feb 17, 2008 2:21 pm
Forum: General
Topic: VERTICAL PIVOT V8
Replies: 8
Views: 4291

VERTICAL PIVOT V8

Hello! Here is my problem. Only one column, only one record on each row: Heading, A, B, C, 1, 2, 3, Z, X, Y, 4, 5, 6. I need to get this into rows: A B C 1 2 3 and Z X Y 4 5 6. My example data is very accurate; this is the exact way my data is structured. There are many solutions to this but they all have a key colum...
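A hedged sketch of the pivot as a whole (Python, purely to illustrate the target shape; the real job would presumably use a grouping column plus a pivoting or aggregating stage): the single input column is cut into consecutive slices of six and each slice becomes one output row.

    # Illustrative: pivot one column of values into rows of 6 columns each,
    # matching the layout described in the post (header row excluded).
    values = ["A", "B", "C", "1", "2", "3", "Z", "X", "Y", "4", "5", "6"]
    ROW_WIDTH = 6

    pivoted = [values[i:i + ROW_WIDTH] for i in range(0, len(values), ROW_WIDTH)]
    for row in pivoted:
        print(" ".join(row))        # "A B C 1 2 3" then "Z X Y 4 5 6"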