First Column Analysis results in error

This forum contains ProfileStage posts and now focuses on newer versions of InfoSphere Information Analyzer.

Moderators: chulett, rschirm

Novak
Participant
Posts: 97
Joined: Mon May 21, 2007 10:08 pm
Location: Australia


Post by Novak »

Hi,

I am very new to Information Analyzer, and was hoping somebody could help me with the issue I am facing.

With all the metadata properly imported, I executed the very first Column Analysis, but it resulted in an error. The very limited error message says:
"PXTaskRunner.java:98 BaseProfile1259195571293: Job status -> 1:Job failed, please verify that the job is submitted correctly. DS job Numer is RT_SC32. Detailed Log: "

The DataStage Director does not contain any logs for the executed job, even though I chose the option of retaining the script when executing the analysis job.

All of this is being run on Windows 2003 with DB2. Also, to prove that the environment itself is sound: DataStage jobs with a Transformer stage run fine.

Regards,

Novak
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Look in the RT_SC32 directory within the ANALYZERPROJECT project directory on the server.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.

Post by Novak »

Hi Ray,

I actually deleted the previous job and lost RT_SC32, but any other job I run ends the same way; below are the contents of the RT_SC** folder.

Aside from OshExecuter.sh, the only other file in there is OshScript.osh, and its contents are:

# OSH / orchestrate script for Job BaseProfile_country_by99432922 compiled at 12:37:15 26 NOV 2009
# baseprofile
pxbridge
-XMLProperties '<?xml version=\'1.0\' encoding=\'UTF-16\'?>
<Properties version=\'1.0\'>
<Common>
<Context type=\'int\'>1</Context>
<Variant type=\'string\'><![CDATA[3.5]]></Variant>
<DescriptorVersion type=\'string\'><![CDATA[1.0]]></DescriptorVersion>
<PartitionType type=\'int\'>-1</PartitionType>
</Common>
<Connection>
<DataSource type=\'string\'><![CDATA[REDBANK]]></DataSource>
<Username type=\'string\'><![CDATA[]]></Username>
<Password type=\'protectedstring\'><![CDATA[]]></Password>
</Connection>
<Usage>
<GenerateSQL type=\'bool\'><![CDATA[0]]></GenerateSQL>
<SQL>
<SelectStatement type=\'string\'><![CDATA[select
"BY"
from DB2INST1."COUNTRY"]]>
<Tables>
<Table type=\'string\'><![CDATA[DB2INST1."COUNTRY"]]></Table>
</Tables>
<Columns>
<Column type=\'string\'><![CDATA[BY]]></Column>
</Columns>
</SelectStatement>
</SQL>
<Transaction>
<RecordCount type=\'int\'>2000</RecordCount>
<EndOfWave type=\'int\'>0</EndOfWave>
</Transaction>
<Session>
<IsolationLevel type=\'int\'><![CDATA[1]]></IsolationLevel>
<AutocommitMode type=\'int\'>0</AutocommitMode>
<ArraySize type=\'int\'>2000</ArraySize>
<SchemaReconciliation>
<FailOnSizeMismatch type=\'bool\'><![CDATA[0]]></FailOnSizeMismatch>
<FailOnTypeMismatch type=\'bool\'><![CDATA[0]]></FailOnTypeMismatch>
</SchemaReconciliation>
<PassLobLocator type=\'bool\'><![CDATA[0]]></PassLobLocator>
<CodePage type=\'int\'><![CDATA[0]]></CodePage>
</Session>
</Usage>
</Properties>'
-source 0 '{
DSSchema=\'record (
BY: ustring[8];
)\'
}'
-connector '{
name=ODBCConnector,
variant=3.5,
version=1.0,
library=ccodbc
}'
> pxbridge_source.v
;
fd_compute
< pxbridge_source.v
> fd_compute_1.v
> fd_compute_2.v
;
ca_properties
< fd_compute_1.v
> ca_properties.v
;
copy
< fd_compute_2.v
;
generator
-schema
record (
DomainValueFlagDate:date {function=rundate};
)
< ca_properties.v
> generator.v
;
modify
'DistinctValue:ustring[max=512]=DomainValue;
InferredDataType:ustring[max=20]=InferredDataType;
ODBCType:ustring[max=20]=ODBCType;
GeneralFormat:ustring[max=50]=GeneralFormat;
FrequencyCount=count;
PropertyLength=Length;
PropertyPrecision=Precision;
PropertyScale=Scale;'
< generator.v
> modify.v
;
dataset_by_key
-file_pattern C:/IBM/InformationServer/Server/Projects/ANALYZERPROJECT/RT_SC43/ca_col_$KEY_VALUE.ds
-output_count 2
-key fieldNumber
-ifNotFound fail
< modify.v
;
# End of OSH code


I don't see anything wrong with this OSH, but that proves nothing.

This job was run against a different schema and table whose metadata I had just imported, so the connection is valid. Also, the job status gets registered in DataStage Director, which proves that the user is credentialled to use DataStage. Still, there is no log for any of the aborted jobs in Director...

Regards,

Novak

Post by ray.wurlod »

Yes, I thought there might have been a log stored in RT_SC32. Obviously not. I need to think about it for a while.

Post by Novak »

Ok,

I am pretty certain I have this solved, and the resolution is below. I could be slightly wrong as well, as so many different things were tried and I may have lost track of some of the steps.

1. The lack of log and failure information.
Having upgraded from version 8.0 to 8.1, my DSParams file was changed. After RTLogging and ORLogging were updated to 1 and 0 respectively, my logs started getting populated (in DataStage Director as well). This link viewtopic.php?t=129971&highlight=no+log helped me resolve it.

2. After some more information was provided, I was finally able to work out what was causing the "Job failed, please verify that the job is submitted correctly." error message to appear. It was in fact the ODBC connection created to the source database (still DB2, but different from the IA database). Once the ODBC driver type was changed from NON-WIRE to WIRE, the job worked like a charm.
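For the logging fix in point 1, the change amounts to editing two flags in the project's DSParams file. A minimal sketch of that edit in Python, assuming the common one-per-line `NAME=value` layout (check your own DSParams before scripting edits like this, and back it up first):

```python
import re

def set_dsparams_logging(text, rtlogging="1", orlogging="0"):
    """Flip the RTLogging/ORLogging flags in the contents of a
    DSParams file.  Assumes `NAME=value` entries, one per line;
    verify against your actual DSParams before relying on this."""
    text = re.sub(r"(?m)^RTLogging=\S*", "RTLogging=" + rtlogging, text)
    text = re.sub(r"(?m)^ORLogging=\S*", "ORLogging=" + orlogging, text)
    return text
```

With RTLogging=1 the runtime writes job logs the traditional way, which is what makes them visible in DataStage Director again.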

Like I said, many different things were tried, and I may be mistaken in thinking that the above were the resolutions. But until somebody corrects me, or I go back and try to replicate the same problem/solution scenario, this is what I would suggest for people having the same issue.
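For the driver fix in point 2: on UNIX-style installs the DSN-to-driver mapping lives in an .odbc.ini file (on Windows it sits in the ODBC Administrator/registry instead, so this does not apply directly there). A hedged sketch that scans .odbc.ini-style text for DB2 DSNs whose driver does not look like a wire-protocol one; the driver/description strings are assumptions based on the DataDirect "DB2 Wire Protocol" naming, and yours may differ:

```python
import configparser

def db2_non_wire_dsns(odbc_ini_text):
    """Return DSN names from .odbc.ini-style text whose Driver or
    Description mentions DB2 but not 'Wire'.  Heuristic only: driver
    naming conventions vary by vendor and release."""
    cp = configparser.ConfigParser()
    cp.read_string(odbc_ini_text)
    suspects = []
    for dsn in cp.sections():
        blob = (cp[dsn].get("Driver", "") + " "
                + cp[dsn].get("Description", "")).lower()
        if "db2" in blob and "wire" not in blob:
            suspects.append(dsn)
    return suspects
```

Any DSN this flags is a candidate for the NON-WIRE to WIRE change described above.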

And obviously, Ray, as you suggested, after the job was "submitted" properly, the RT_SC** folder was populated with many more files.

Regards,

Novak