Folder stage Issue

Posted: Wed Jan 23, 2013 10:32 am
by neena
Hi everyone, I have migrated a server job from Windows to a Linux DataStage server. The job was working fine on the Windows server. Please see the job structure below.

Folder Stage ---> Transformer ---> Command Stage

When I run/reset the same job on the Linux server, I get the following error. No issue occurred while compiling the job. I am wondering if there is anything else that needs to be done. Please let me know.

Error: ds_loadlibrary: error in dlopen of command.dll - command.dll: cannot open shared object file: No such file or directory.

I have made sure the folder structure is correct and the files are also available.

At first I thought maybe the stages were missing, but I can see both the Folder stage and the Command stage in the palette.

I tried searching the forum but didn't have much luck.

Posted: Wed Jan 23, 2013 11:57 am
by neena
I have done a couple of tests and concluded that the issue is not with the Folder stage but with the Command stage.
I removed the Command stage, ran the job, and everything worked fine. I am still checking.

Posted: Wed Jan 23, 2013 12:39 pm
by neena
I came across this link, which explains that the Command stage can only be used on a Windows server.

I exported the entire project from Windows to Linux; I am guessing that's why the Command stage still appears in the Linux DataStage palette, even though it really cannot be used there.

viewtopic.php?p=248561

Posted: Wed Jan 23, 2013 12:44 pm
by chulett
You are correct - Windows only for the Command task. You should be able to swap it for an Execute Command task.

Posted: Wed Jan 23, 2013 1:15 pm
by ray.wurlod
Possibly an External Source stage will be what you require.

Posted: Wed Jan 23, 2013 1:23 pm
by chulett
Sorry, my advice was incorrect - I was thinking we were talking about a Sequence job. And the External Source stage is only for PX (parallel) jobs. What exactly is the stage doing for you in the job?

Posted: Wed Jan 23, 2013 1:28 pm
by ray.wurlod
Ditto. What output do you require? Perhaps you can solve it via the Filter command in a Sequential File stage, which reads the stdout of the Filter command rather than any particular file.
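For anyone landing on this thread later, the stdout-as-file idea can be pictured in plain shell (this is just an illustration of the concept, not DataStage syntax - "printf" stands in for whatever command you would put in the Filter box):

```shell
# The stage reads the Filter command's stdout as if it were the file's
# contents; the downstream link would see these three rows.
printf 'row1\nrow2\nrow3\n' | wc -l
```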

Posted: Wed Jan 23, 2013 3:28 pm
by neena
The Command stage is not doing anything; it was just put there to terminate the flow so that the job could run. The command listed was "ls". (I am just tweaking a job someone else developed.)

So I removed the Command stage, added a Sequential File stage, and pointed the data flow at the black hole (/dev/null). Now the job is working fine.
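For anyone curious why this works, here is a quick shell sanity check of the /dev/null trick (outside DataStage): writes always succeed and nothing is retained, which is why the job runs cleanly while the target "file" stays empty.

```shell
# Writing to /dev/null always succeeds; the data is simply discarded.
echo "dummy row" > /dev/null && echo "write ok"
# Reading it back yields nothing: zero lines.
wc -l < /dev/null
```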

Anyway, I really appreciate all your suggestions.

Posted: Wed Jan 23, 2013 7:31 pm
by ray.wurlod
:idea:
Annotate your job design to warn people against trying a View Data on the Sequential File stage that accesses /dev/null.

Posted: Thu Jan 24, 2013 7:44 am
by neena
Thanks so much Ray/Chulett. This is a really great forum.

I forgot to mention the annotation. Thanks for the idea, Ray.

Posted: Thu Jan 24, 2013 7:04 pm
by ray.wurlod
:idea: :idea:
I sometimes even hide the Sequential File stage in a local container when I'm appending to /dev/null (as well as keeping the annotation with it).