
Schema File Processing only 100 Records

Posted: Wed Feb 09, 2011 9:58 am
by Aquilis
Hello ,
We are implementing the schema file approach for processing nearly 50 CSV files, using 50 different schema files.
Even when a CSV file contains 1 million records, the schema file propagates only 100 records. We tried multiple scenarios:

Code: Select all

1. More than 100 records in the CSV file: processes only the first 100 records.
2. CSV has exactly 100 records: processes successfully.
3. Fewer than 100 records: processes successfully.
What could be the issue here? Are any patches available for this?
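
For reference, a minimal sketch (run outside DataStage; the file name is only a placeholder) that can confirm the raw CSV really contains more than 100 records:

Code: Select all

# Rough sanity check, run outside DataStage, to confirm the raw CSV really
# contains more than 100 records. The file name below is only a placeholder.

def count_records(path):
    """Count non-empty lines in a delimited file (assumes no embedded newlines)."""
    with open(path, "r", encoding="utf-8", errors="replace") as f:
        return sum(1 for line in f if line.strip())

if __name__ == "__main__":
    path = "sample_extract.csv"  # placeholder: substitute the actual CSV file
    print(path, count_records(path))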

Working with 8.1.x has become horrible nowadays; we are regularly surprised by issues. We have installed Fix Pack 1 and Fix Pack 2.
Is Fix Pack 3 also on the market?

Posted: Wed Feb 09, 2011 11:21 am
by ray.wurlod
Even Fix Pack 4 is on the market.

Can you post the actual schema file? Can you please also post the score from one run of the job with more than 100 records? There may be a limiter there. What stage type are you using to read the CSV files?

Posted: Thu Feb 10, 2011 7:28 am
by Aquilis
The CSV files are tab delimited. We are using a Sequential File stage to read the records, and it reads all of them, but the Column Import stage to which the schema file is assigned propagates only 100 records.

Code: Select all

record
{final_delim=end, delim='\t', null_field='', quote=none, padchar=' ',print_field}
( SAMPLE_ID: string[max=20];
  SAMPLE_TYPE: nullable string[max=20] {null_field=''};
  SAMPLE_CREATED_DATE: nullable timestamp {timestamp_format="%dd-%mmm-%yyyy %hh:%nn:%ss",null_field='', default="1999-12-31 12:59:50"};
  SAMPLE_RECEIVED_DATE: nullable timestamp {timestamp_format="%dd-%mmm-%yyyy %hh:%nn:%ss",null_field='', default="1999-12-31 12:59:50"};
  SAMPLE_STATUS: nullable string[max=20] {null_field=''};
  SAMPLE_STATUS_DATE: nullable timestamp {timestamp_format="%dd-%mmm-%yyyy %hh:%nn:%ss",null_field='', default="1999-12-31 12:59:50"};
  SAMPLE_DISPOSITION: nullable string[max=20] {null_field=''};
  TESTING_ORGANISATION: nullable string[max=20] {null_field=''};
  SITE_ID: nullable string[max=20] {null_field=''};
  GSK_SPEC_ID: nullable string[max=20] {null_field=''};
  GSK_SPEC_VERSION: nullable string[max=20] {null_field=''};
  STUDY_ID: nullable string[max=20] {null_field=''};
  CONDITION_ID: nullable string[max=20] {null_field=''};
  TIMEPOINT_ID: nullable string[max=20] {null_field=''};
  INSPECTION_LOT_ID: nullable string[max=20] {null_field=''};
  BATCH_ID: nullable string[max=20] {null_field=''};
  REAGENT_CONTAINER_ID: nullable string[max=20] {null_field=''};
  INSTRUMENT_ID: nullable string[max=20] {null_field=''};
  COMPLAINT_REFERENCE: nullable string[max=20] {null_field=''};
  REDLANE_FLAG: nullable string[max=20] {null_field=''};
  REASON_FOR_CANCELLATION: nullable string[max=2000] {null_field=''};
)

I couldn't get much more information from the score dump than what is shown below:

Code: Select all

Occurred: 11:50:04        On date: 10/02/2011           Type: Info
Event: ReadFile,0: Import complete; 105366 records imported successfully, 0 rejected.

Occurred: 11:50:06        On date: 10/02/2011           Type: Info
Event: Column_Import,0: Field import complete; 100 records converted successfully, 0 rejected.

Occurred: 11:50:06        On date: 10/02/2011           Type: Info
Event: Load_Table,0: Records inserted: 100 (...)
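
As an additional check outside the job, here is a quick sketch (the file path is a placeholder) that flags any tab-delimited line whose field count differs from the 21 columns defined in the schema above:

Code: Select all

# Quick diagnostic, separate from the DataStage job, that flags any
# tab-delimited line whose field count differs from the 21 columns defined
# in the schema file posted above. The file path is a placeholder.

EXPECTED_FIELDS = 21  # number of columns in the schema file

def check_field_counts(path):
    with open(path, "r", encoding="utf-8", errors="replace") as f:
        for lineno, line in enumerate(f, start=1):
            fields = line.rstrip("\r\n").split("\t")
            if len(fields) != EXPECTED_FIELDS:
                print("line %d: expected %d fields, found %d"
                      % (lineno, EXPECTED_FIELDS, len(fields)))

if __name__ == "__main__":
    check_field_counts("sample_extract.csv")  # placeholder path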

Posted: Thu Feb 10, 2011 8:26 am
by Sreenivasulu
You have probably compiled the job in debug mode with the row limit set to 100 :)

Can you please check this setting in the job properties?

Regards
Sreeni

Posted: Fri Feb 11, 2011 1:56 am
by Aquilis
My bad, how could I have missed this? Tracing mode was enabled!!!