
Limit fetch in Universe Query

Posted: Thu May 07, 2009 1:31 pm
by ramnishgupta
I have a job where I need to read the log file from DataStage and write it to a text file. Is there a way, in UniVerse SQL, to fetch only the first 1000 rows from the RT_LOGxxx file?

Currently I have a counter in a Transformer and stop writing to the text file once the counter reaches 1000. This works, but the job still has to read the entire log file.
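(As a side note, a counter stage variable isn't strictly needed for this; a Transformer output constraint on the built-in @INROWNUM system variable does the same thing. A minimal sketch, with 1000 as the limit described above:

@INROWNUM <= 1000

Either way, the Transformer still reads every input row; the constraint only stops rows being written downstream.)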

I need to put a limit in place because, when there is some DB issue, the log file can grow to more than 2GB, and I do not want to read the entire log; the first 100 or so entries will tell me what went wrong.

Thanks

Posted: Thu May 07, 2009 2:20 pm
by asorrell
Use the Limits tab of the job (when you run it) and set "Stop stages after N rows".

-or-

>SELECT * FROM VOC FIRST 5;
NAME.......... TYPE DESC..........................

STAT V Verb - Produce the total and
average of values in named
field in a file
MONETARY K Keyword - NLS locale category
NLS.MAP.TABLES F F
OPTIM.SCAN K Keyword - SET.SQL Environment
SET.MODE V Verb - Display/modify the mode
of a file.

Sample of 5 records listed.
>
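Applied to a job's log table, the same FIRST syntax should do what you want. An untested sketch, where RT_LOG123 stands in for your actual RT_LOGxxx file name:

>SELECT * FROM RT_LOG123 FIRST 1000;

You could redirect the output to a file from the operating-system shell, or capture it from whatever mechanism you are using to issue the query.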

Posted: Thu May 07, 2009 4:42 pm
by ray.wurlod
If your log grows to 2GB it will become corrupted and unusable. If this is likely, you need to resize that particular log table to use 64-bit internal addressing before it gets corrupted.
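From the TCL prompt, the resize would look something like the following (again RT_LOG123 stands in for the actual RT_LOGxxx name; the asterisks keep the existing file type, modulo and separation):

>RESIZE RT_LOG123 * * * 64BIT

Check your UniVerse version's documentation for the exact RESIZE options; 64-bit file support is not available on every release or platform.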

This may not be a universal panacea: the log table can also become corrupted if a write to it is blocked because the disk is full, and that becomes more likely once the theoretical maximum file size rises to 1PB or higher.