Dump-n-Load best practices

Our Windows Server 2008 box has 8 GB of memory and plenty of disk space.

Our DB is 7 GB and the BI file is approximately 300 MB.

What are the best switches to use when running the DumpTables.r file?

It's been suggested that we run with -i -B 120000; would more be better? How long should this typically take to complete? As for the size of the dump files, should they exceed the size of the .db?

Thanks,

--tj
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
I don't think you have provided enough information to make specific recommendations. I don't know your hardware, your Progress version or license, whether your OS or DB is 32-bit or 64-bit, your DB configuration, your DB block size, your BI settings, your DB structure, etc. etc.

Also, I have no idea what DumpTables.r is. I typically dump and load DBs with Progress tools.

It's been suggested that we run with -i -B 120000; would more be better?

It is suggested by whom? If your DB block size is 8 KB then -B 120,000 is 0.92 GB of buffer pool; if 4 KB, then it's half that. Given your hardware you could go a bit higher than that, provided your OS and database are 64-bit and your server has no workload other than this database (which you also didn't mention...).

I don't think no-integrity will help your dump, as you are not concerned about transaction throughput in a dump. No-integrity can help with a load, but take backups as appropriate and make sure you understand the implications of using -i (no crash recovery). For some other general pointers, read Dumping and Loading (Chapter 16) in the Database Administration manual.
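To make that concrete, here is a minimal sketch of how the session might be started, assuming an 8 KB block size, a 64-bit install, a dedicated server, and "mydb" as a placeholder database name:

  # roughly 1.9 GB buffer pool for the dump session (250,000 buffers x 8 KB blocks)
  proserve mydb -B 250000

  # for the load only: -i can speed writes, but it disables crash recovery,
  # so take a backup first and be prepared to restore from it on any failure
  proserve mydb -B 250000 -i

The 250,000 figure is just an assumption sized against 8 GB of RAM; tune it to what your server can actually spare.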

How long should this typically take to complete?
I can't tell you because it depends on a lot of factors: server speed, disk speed and fill rate, file system performance, index efficiency, DB structure, etc. Your testing will give you this answer. You will be testing, right?

As for the size of the dump files, should they exceed the size of the .db?
Again, I don't know. You could have a 7 GB database that contains 7 KB of data. It depends on your structure, data density in your blocks, fixed vs. variable extents, etc. Test and you will see.

I suggest you run a dbanalys report and then analyze the data in a spreadsheet. If you are going to be changing your structure, you should be doing this anyway. Even if you aren't, you should still have a "before" and "after" picture so you can assess the effects of the D&L.
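If it helps, a sketch of generating that report, again with "mydb" as a placeholder database name:

  # write the analysis report to a text file you can pull into a spreadsheet
  proutil mydb -C dbanalys > mydb_dbanalys.txt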

Once you have your data in a spreadsheet you can sort your tables by size and decide how you will dump and load. You may have a few large tables that you want to binary dump/load, and do the rest via the Data Dictionary. It is a small database, so it may not matter. It is up to you, depending on your SLA with the business, i.e. how much downtime you are allowed and how well your server and storage perform.
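As a rough sketch of the binary route, with "mydb", "newdb", "customer", and /dumpdir all placeholders:

  # binary dump of one large table into a dump directory
  proutil mydb -C dump customer /dumpdir

  # binary load of the resulting .bd file into the new database
  proutil newdb -C load /dumpdir/customer.bd

  # rebuild the indexes once the binary loads are done
  proutil newdb -C idxbuild all

Test the whole sequence, and time it, on a copy before you touch production.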
 