Read an IMPORT STREAM file from bottom to top

scoop3030

New Member
Hello,

I'm trying to read a file line by line starting with the last line first. Each line is put into a temp-table and then output to the page. I would like to start with the most recent line first due to page load times becoming extremely slow once the page size gets above 400MB.

Any suggestions?

Thanks.
 

RealHeavyDude

Well-Known Member
You don't say anything about your Progress/OpenEdge version and OS you're running on ...

How many lines are in the file? How many bytes are in one line? How many rows do you create in the temp-table?

There is a limit on how much you can store in a temp-table in memory. If you exceed that limit the temp-table spills to disk, and that's probably why you see the performance loss. The file into which the temp-table contents get stored is one of the temporary files created for each Progress/OpenEdge client session; if those are on a mapped network drive it gets even worse ...

Regards, RealHeavyDude.
 

scoop3030

New Member
Sorry about the missing info. The Progress/OpenEdge version is v10 and the OS is Windows. The file is an activity log, so it can have any number of lines. I'm not sure of the number of bytes in one line. There is one row in the temp-table for each line in the file. The user can enter the number of lines they want to see to track errors and whatnot. So what I'm trying to do is have it read the most recent records, the ones at the bottom, first, and then read back to the number specified by the user. File size would then be taken out of the equation.
 

TomBascom

Curmudgeon
There is no easy way to do this.

On the other hand... if you do not actually need to read the entire file, you could use the SEEK family of functions and statements to position yourself at the end of the file and then back up. If, for instance, the user wants to see the last 10 lines and you think an average line has 40 bytes in it, you could seek to the end and then seek to the end minus 10 x 40 (actually I'd probably go back twice that far, just for safety's sake) and then process lines into your temp-table normally.
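A rough ABL sketch of what Tom describes, with the file name, the line counts, and the temp-table fields all made up for illustration (SEEK and IMPORT work as shown; check your exact 10.x release for INT64 support):

```
DEFINE VARIABLE iWant AS INTEGER   NO-UNDO INITIAL 10.  /* lines the user asked for   */
DEFINE VARIABLE iAvg  AS INTEGER   NO-UNDO INITIAL 40.  /* guessed avg bytes per line */
DEFINE VARIABLE iPos  AS INT64     NO-UNDO.
DEFINE VARIABLE cLine AS CHARACTER NO-UNDO.

DEFINE TEMP-TABLE ttLine NO-UNDO
    FIELD lineTxt AS CHARACTER.

INPUT FROM VALUE( "activity.log" ).   /* hypothetical log file */

/* jump to the end, then back up twice the estimated span for safety */
SEEK INPUT TO END.
iPos = MAXIMUM( 0, SEEK( INPUT ) - ( iWant * iAvg * 2 ) ).
SEEK INPUT TO iPos.

/* unless we are at the very start, we probably landed mid-line:
   discard the partial line */
IF iPos > 0 THEN IMPORT UNFORMATTED cLine.

REPEAT:    /* the block ends automatically at end-of-file */
    IMPORT UNFORMATTED cLine.
    CREATE ttLine.
    ttLine.lineTxt = cLine.
END.

INPUT CLOSE.

/* ttLine now holds roughly the last iWant lines (plus the safety
   margin) -- trim it to exactly iWant rows before displaying. */
```

Because the byte estimate is a guess, you may read a few more lines than requested; trimming the temp-table afterwards is cheaper than reading the whole file.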
 

scoop3030

New Member
I actually went back and tried the SEEK function/statement and got it to work. It improved times by about 10 seconds, and if the file was too big I limited the amount of data read.
 