Question: Report optimization

Adlen

New Member
Hello people.

I started working with Progress a few months ago, and now I have an optimization to do on an old report.
This report brings back information about sales, and it can be filtered by date, item, and store number. These three fields make up the unique index.
At the moment I create temp-tables for the filters, one for the selected items and another for the stores; after that I use a FOR EACH on the sales table to bring in the records needed.
The problem is that when these filters aren't filled in, the report runs very slowly. Currently the tables hold 300 stores and around 80,000 items, and when the report is generated for all stores and/or all items, it takes quite a long time.
I thought about separating the four cases (selected stores and selected items, selected stores and all items, all stores and selected items, all stores and all items) into different queries, but that sounded kind of dumb.
I would like to hear some opinions on how I should handle this, as I'm out of ideas...

Here's the code where I build the sales temp-table.
Code:
DO dDateAux = dDateIni TO dDateEnd:
    FOR EACH ttStore NO-LOCK,
        EACH ttItens NO-LOCK,
        EACH sales NO-LOCK
        WHERE sales.coditem = ttItens.coditem
          AND sales.coduni  = ttStore.coduni
          AND sales.datref  = dDateAux:
        CREATE ttSales.
        BUFFER-COPY sales TO ttSales.
    END.
END.

Currently using Windows 7 (64-bit) and Progress 10.2B.

Thanks in advance.
 

Cringer

ProgressTalk.com Moderator
Staff member
Without knowing your sales data or your indexes, I'd guess that starting the query with FOR EACH sales NO-LOCK and then joining to ttStore and ttItens might be a better approach. At the moment you're doing up to 24,000,000 queries against sales (300 stores × 80,000 items), and if that's a bad index you could be reading a LOT of records. Might as well do a single table scan and have done with it.
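
Something along these lines, perhaps (a rough sketch reusing the temp-table and field names from your post; I'm assuming ttItens and ttStore are indexed on coditem and coduni, and that dDateIni/dDateEnd are what feeds your DO loop):
Code:
/* Inverted query: one bracketed scan of sales on the date range,
   joined to the selection temp-tables instead of the other way round. */
FOR EACH sales NO-LOCK
    WHERE sales.datref >= dDateIni
      AND sales.datref <= dDateEnd,
    FIRST ttItens WHERE ttItens.coditem = sales.coditem,
    FIRST ttStore WHERE ttStore.coduni  = sales.coduni:
    CREATE ttSales.
    BUFFER-COPY sales TO ttSales.
END.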
 

tamhas

ProgressTalk.com Sponsor
Cringer is pointing you down the right track, and congratulations on using TTs in this context, since they are a key part of being on the right track in this kind of reporting. Separate your thinking about filling the TTs from what the report needs to do. Make the fill of the TTs efficient according to how the data is stored and indexed on disk, and then index the TTs to make the reporting efficient.
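
For instance, the TT definition might look something like this (a sketch only; the report-side index here is an assumption based on the fields you've mentioned):
Code:
/* ttSales inherits the fields of sales, but carries its own index
   chosen for the report's access pattern rather than the disk's. */
DEFINE TEMP-TABLE ttSales NO-UNDO LIKE sales
    INDEX idxReport coduni coditem datref.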

Of course, a report on 24M entries might take a while anyway! Not to mention potentially producing a lot of output!

Depending on the selection criteria, you may need to do some logical testing of the parameters and pick the right search strategy for collecting the data according to the indices available.
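
In rough outline (the flags and internal procedures here are hypothetical, just to show the shape):
Code:
/* Hypothetical dispatch on the filter parameters: each branch fills
   ttSales by whichever strategy brackets best for that case. */
IF lAllStores AND lAllItens THEN
    RUN fillByDateScan.   /* one bracketed pass over sales by datref    */
ELSE IF NOT lAllItens THEN
    RUN fillByItemList.   /* drive the query from the ttItens selection */
ELSE
    RUN fillByStoreList.  /* drive the query from the ttStore selection */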
 

Adlen

New Member
First of all, thanks for the directions. I'm not used to joins; I'm really learning about them now that you both have mentioned them.

Currently, for tests, I'm using a two-month period for the date field, so the sales table actually holds around 150,000 sales in that time.
That said, it's really quite a lot of data. All of it is summarized, and some calculations are made to reduce the output to sums of sales among other things, but that's not the problem, at least for now.

About indexes: the items table has a unique index on the coditem field.
The same goes for the store table, which has a unique index on the coduni field.
For the sales table, I'm currently using another unique index containing those two fields (coditem and coduni) plus one more for the date, named datref.
Based on this, I was able to supply values for the whole index, so Progress should access these records directly, right?
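
(For what it's worth, one way to confirm which index the compiler actually picks is to compile with XREF and look at the SEARCH lines; the file names here are made up:)
Code:
/* The .xrf listing's SEARCH entries show which sales index each
   FOR EACH resolves to; WHOLE-INDEX flags a full table scan. */
COMPILE salesreport.p XREF "salesreport.xrf".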

I suppose the real problem is the number of records in each iteration. I can reduce some of the records, but even so, normal usage of this report involves around 150 stores and 100 items. That, by my calculation, is 15k store/item combinations against the 150k sales, inside the "DO dDateAux = dDateIni TO dDateEnd" loop covering around 90 days, each with direct access thanks to the unique index. So around 1,350,000 accesses.
Even though that's well below the 24M case, populating the two temp-tables plus the final query (with no processing) takes around 4 minutes. I think that's still kind of slow...

I'm looking into the joining tip now. Thanks for the help up to this point; I'll report back with any progress.
Any tips, directions, and corrections are always welcome.
 

tamhas

ProgressTalk.com Sponsor
If you come back without having found the solution yourself, then post a list of the indices on each table of interest and an indication of the summarization you need. One of the things TTs are very good at is zooming through a bunch of records on whatever index makes them efficient to select (order not mattering), and then aggregating the information from those records into TT records, producing a much smaller number of records that can then be indexed on whatever you want, including the accumulated total, e.g., stores ranked by total sales or whatever.
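
As a sketch of that pattern (the summary temp-table and the amount field are invented for illustration):
Code:
/* Roll ttSales up into one record per store, then report from the
   much smaller summary table in ranked order. ttSales.amount stands
   in for whatever the sales-value field is really called. */
DEFINE TEMP-TABLE ttStoreTotal NO-UNDO
    FIELD coduni   LIKE sales.coduni
    FIELD totSales AS DECIMAL
    INDEX idxCoduni IS UNIQUE coduni.

FOR EACH ttSales:
    FIND ttStoreTotal WHERE ttStoreTotal.coduni = ttSales.coduni NO-ERROR.
    IF NOT AVAILABLE ttStoreTotal THEN DO:
        CREATE ttStoreTotal.
        ttStoreTotal.coduni = ttSales.coduni.
    END.
    ttStoreTotal.totSales = ttStoreTotal.totSales + ttSales.amount.
END.

/* Stores ranked by total sales. */
FOR EACH ttStoreTotal BY ttStoreTotal.totSales DESCENDING:
    DISPLAY ttStoreTotal.coduni ttStoreTotal.totSales.
END.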
 