post dump & reload

geoffconnolly

New Member
Hi, I have dumped and reloaded a 14 GB database. The 8 largest tables were binary dumped and loaded; the rest were done the normal way. The 8 tables were reloaded into a separate schema area for each table. However, I got a scatter factor of 4 for one of the tables when I ran a tabanalys on them the day after the reload. All the indexes for the eight large tables are contained in the same schema area, and the block size for the database is 8K.

I expected all scatter factors to be 1. Can anybody shed light on this? Could the fact that I ran the tabanalys on the database during uptime (with live processing occurring) have caused this?

Geoff

ron

Member
We dump/reload 140 GB databases fairly regularly ... but always use binary. Once you set up the scripts, binary is easy ... you just "let it happen".

Our experience is that tables become completely "organised" by a dump/reload. The only thing that immediately comes to mind is to ask whether the table you are talking about is highly volatile. If records are inserted and deleted in large numbers, then it won't take long after a dump and reload for things to become fragmented again. (You can check on this using the information in the VSTs.)
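A rough sketch of that VST check, assuming the broker was started with a -tablerangesize large enough to cover your table numbers (the _TableStat and _File names are standard Progress VSTs, but exact field names can vary a little by version):

```
/* Hedged sketch: report create/delete activity per table from the
   _TableStat VST, as a rough measure of how volatile each table is.
   Assumes -tablerangesize covers the tables you care about.        */
FOR EACH DICTDB._TableStat NO-LOCK:
    FIND DICTDB._File NO-LOCK
         WHERE DICTDB._File._File-Number = DICTDB._TableStat._TableStat-Id
         NO-ERROR.
    IF AVAILABLE DICTDB._File THEN
        DISPLAY DICTDB._File._File-Name                     FORMAT "x(20)"
                DICTDB._TableStat._TableStat-Create LABEL "Creates"
                DICTDB._TableStat._TableStat-Delete LABEL "Deletes".
END.
```

A table showing large create and delete counts since the reload would explain the scatter factor creeping back up.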

Ron.