BI file blowing

drunkahol

New Member
A schema update to our test database is blowing out the BI file when we try to apply it.

We've split the .df file and added more BI extents. However, the BI still spits the dummy around 7 hours into processing the big new index. Reducing the time to apply the update would also be good, as taking the live system down for that long is very difficult for us.

We're thinking about applying the indexes as INACTIVE and then rebuilding them afterwards. Another option would be to apply our updates without the BI file being touched at all.

As I recall, using -i does not prevent the BI file from growing. Is it possible to start a session that will not touch the BI file at all during this schema update?

Running on RedHat 8.0 with Progress 9.1D07.
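
For the record, the inactive-index route we're considering would look something like this in the delta (table/index/field names made up):

ADD INDEX "idx-big-new" ON "big-table"
  AREA "Schema Area"
  INACTIVE
  INDEX-FIELD "some-field" ASCENDING

followed by an offline rebuild once the delta has loaded:

proutil testdb -C idxbuild

The rebuild doesn't log index key changes through the BI the way an online build does (which is also why it isn't crash-safe, so the backup matters).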

Thanks everyone
 

jongpau

Member
I seriously doubt that such a thing (applying a schema update without touching the BI) would be a good idea. If something goes wrong, Progress has no way to roll back the schema changes it has made so far, and your database will be seriously damaged...

Maybe you could:
- Create an empty database with the structure you would like it to have and the extents it is supposed to have in order to hold the data
- Compile your application against the new database (saving the r-code into a new directory structure so you do not overwrite the existing app)
- Load the data from the old db into the new one with a little program that does a direct copy from db to db, scoping a transaction to each record instead of around the complete load (that should keep your BI nice and small; see the sketch below)
- Make sure the copy process completed OK
- Change the parameters of the application so it uses the new database instead of the old one
- Change the PROPATH of the application so it uses the new r-code that matches the new database schema

Not sure if this would work for you, but it is worth a try. Since you are on version 9, creating the "copy" program should be fairly easy using dynamic queries and buffers, meaning a single procedure should be able to copy across your entire database without you having to worry about table names etc... Doing this will also allow you to check how long the process takes, so you can work out whether it is a worthwhile solution for your upgrade.
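
Something along these lines should do for the per-table copy (untested sketch; it assumes the two databases are connected with the logical names olddb and newdb, which you'd adjust):

/* copytable.p - copy one table record by record from olddb to newdb,
   scoping a transaction to each record so the BI stays small. */
DEFINE INPUT PARAMETER pcTable AS CHARACTER NO-UNDO.

DEFINE VARIABLE hSrc AS HANDLE NO-UNDO.
DEFINE VARIABLE hTgt AS HANDLE NO-UNDO.
DEFINE VARIABLE hQry AS HANDLE NO-UNDO.

CREATE BUFFER hSrc FOR TABLE "olddb." + pcTable.
CREATE BUFFER hTgt FOR TABLE "newdb." + pcTable.

CREATE QUERY hQry.
hQry:SET-BUFFERS(hSrc).
hQry:QUERY-PREPARE("FOR EACH " + pcTable + " NO-LOCK").
hQry:QUERY-OPEN().

REPEAT:
    hQry:GET-NEXT().
    IF hQry:QUERY-OFF-END THEN LEAVE.
    DO TRANSACTION:                      /* one record per transaction */
        hTgt:BUFFER-CREATE().
        hTgt:BUFFER-COPY(hSrc).
        hTgt:BUFFER-RELEASE().
    END.
END.

hQry:QUERY-CLOSE().
DELETE OBJECT hQry.
DELETE OBJECT hSrc.
DELETE OBJECT hTgt.

Drive it from a FOR EACH olddb._file WHERE NOT olddb._file._file-name BEGINS "_" loop and it will walk the whole database.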

HTH
 
pauloconnor

Split Delta

Hi,

Had the same problem when applying new indexes to quite a few large tables within the one delta.

Ended up splitting the delta into 3 or 4 sections and applying them one by one.

Regards,
Paul
 

drunkahol

New Member
jongpau:

The thought of doing a schema update without first having a full and verified backup of the database wouldn't even enter my mind. The schema update is a one-way operation: it either works, or you restore the backup and it works the next time, or the next time, or the next time . . . Failure and rollback are not options. The problem with using -i is that the BI file still grows for some mysterious reason (I'm remembering this from some time in the past, so I don't know if it is fixed in the latest releases). Creating bespoke load programs etc. is also out of the window for us, due to the complexity and the timescale with which this has to go to the live machine.

pauloconnor:

We have already split the delta file into parts; it is a single index addition that is causing the problems, and that part can't be split any further.

Kirmik:

Adding BI extents is what we've reluctantly decided to do. I say reluctantly because this operation is feasible on the development machine, but rather more difficult on the live machines (due to lack of disk space and the downtime window available). I'd hoped to do exactly the same on the development and live servers. I can see some poor schmuck applying this update on Christmas day (the only large enough downtime window). Just hope it's not me :)
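
The extent addition itself is at least easy to script. Something like this, with the database shut down (paths and sizes invented; .st sizes are in KB):

# bi-add.st -- one fixed 500 MB BI extent plus a variable overflow
b /disk2/bi f 512000
b /disk2/bi

prostrct add testdb bi-add.st

It's finding the disk space and the downtime that hurts, not the commands.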

Thanks very much for all the replies folks.
 

Bent

New Member
Applying LargeFiles

If you're on an Enterprise license you could consider enabling large files with proutil's EnableLargeFiles option.
*** NB: this is not reversible, and afterwards the db can no longer be started with a Workgroup license.
But if you're short on disk space you still have a problem.
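
The command itself is a one-liner, run with the database shut down I believe (db name invented):

proutil testdb -C EnableLargeFiles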
Bent
 

Kirmik

Member
I feel your pain.

To lighten the load a bit, prepare for the big day by automating as much as you possibly can with scripts. Adding extents doesn't have to be time-consuming if you do your analysis first.
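
Even the update run itself can be one script, roughly like this (every path and name here is invented, adjust to your setup):

#!/bin/sh
# upgrade.sh -- sketch of an automated update run
set -e
DB=/db/live/testdb

proshut $DB -by                       # stop the broker
probkup $DB /backup/testdb.bck        # full offline backup first
prostrct add $DB bi-add.st            # extra BI extents for the big build
_progres $DB -b -1 -p prodict/load_df.p -param delta1.df   # load the delta (index INACTIVE) single-user
proutil $DB -C idxbuild all           # rebuild indexes ("all" does everything;
                                      # use the interactive menu for just the new one)
proserve $DB                          # bring the broker back up

Then the Christmas-day job is mostly watching it run.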
I'll be in Edinburgh on Xmas day so I'll spare a thought for the poor sod who has to work :(
 