How to count how many tables are in a DB?

rainylsh

Member
Hello, everybody.
I want to dump a DB's .df file and then load it into a new DB.
But how do I know whether the definitions for all of the tables were dumped and loaded?
 

rainylsh

Member
"where not _hidden" means don't count hidden tables? I want count all tables include hidden. can i count all tables if not use "not _hidden"?
 

Casper

ProgressTalk.com Moderator
Staff member
The hidden tables are the tables which hold the database definitions, views, SQL objects, etc. These are database-specific things. If you create a new empty database, the hidden tables are already present, so you don't dump and load them.

Maybe a better way to count the tables without counting the system tables:
Code:
define variable i as integer no-undo.
for each _file no-lock where _file._file-num > 0
                         and _file._file-num < 32768:
    i = i + 1.
end.
display i label "Number of tables".
This way you will also count tables which have been made hidden but are not system tables.

casper.
 

tamhas

ProgressTalk.com Sponsor
Count the ones with _file-num > 0 and < 32768 without regard to whether they are hidden.

And it is simple enough to make a list of the dumped files, compare it with this list, and let us know what the differences are. Then we can help you understand what is in each list.
 

tamhas

ProgressTalk.com Sponsor
The numeric range excludes schema files and system tables. Just select those below 0 or over 32768 to see what is there.

If you have tables inside the range which are hidden, then those would not show up in the list selected with * but would show up with your for each. You can try producing lists such as "inside the numeric range and hidden" to see what you get, but ultimately it might just be easier to export the list of table names to a text file and the list of dump file names to another text file, edit off the .d, and do a sort and compare to see what the differences are.
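If it helps, a minimal sketch of that export might look like the code below (the output file name tablenames.txt is just a placeholder). List your dump directory into a second file, strip the .d extensions, then sort and compare the two.

Code:
/* Sketch: write the name of every user table to a text file. */
output to "tablenames.txt".
for each _file no-lock where _file._file-num > 0
                         and _file._file-num < 32768:
    put unformatted _file._file-name skip.
end.
output close.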

Also, do you have any .e files in your dump directory? If so, those are errors in the dump and you should see what they say.
 

rainylsh

Member
Thanks for the reminder.
One .e file existed in my dump dir. It contained some unknown characters; I reassigned them and then dumped the table again. There is no .e file now.
 

rainylsh

Member
I checked the table numbers: only 1513 tables have a number between 0 and 32768, but 1604 have a number < 32768. The tables whose number is < 0, whose names begin with "_", are the hidden tables.
Do these tables not need to be dumped? But I can dump them. For example, "_user", the user definition table, holds the users' IDs and passwords; I think it should be dumped.
 

Casper

ProgressTalk.com Moderator
Staff member
The table with file number 1513 was made by whoever created the database. So questions regarding whether tables are important should be answered from a functionality point of view.

Some questions I have are:
  • Are you dumping and loading the tables as an exercise, or are we talking about an actual production environment?
  • How big is the database? I never use a dd dump; I always use a binary dump.
  • What is the actual reason you are doing this?
It sounds to me like you don't have enough knowledge of the database and of Progress to be working on a production database.

Normal procedure for dumping/loading a database is:
  1. Design the new structure of the database based on measurements and output from, for example, dbanalys/ProTop/promon/the OS.
  2. Make a backup of the database.
  3. Dump the .df.
  4. Dump the _user table.
  5. Dump the sequence current values.
  6. (binary) Dump the data (see the command sketch after this list).
  7. Create the new database according to the new structure.
  8. Load the .df, _user and sequence current values.
  9. (binary) Load the data.
  10. Rebuild the indexes.
  11. Check if everything went well (compare the number of records before the dump with the number of records after the load).
  12. Make a backup of the database / enable AI.
  13. Change DB parameters if necessary.
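As a rough illustration of steps 6, 9 and 10, the binary dump/load commands look something like this (the database and table names here are placeholders; check the proutil documentation for your version):

Code:
# step 6: binary dump of one table into a dump directory
proutil olddb -C dump customer /dumpdir

# step 9: binary load of that table into the new database
proutil newdb -C load /dumpdir/customer.bd

# step 10: rebuild all indexes after the binary load
proutil newdb -C idxbuild all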
Regards,

Casper.
 

tamhas

ProgressTalk.com Sponsor
With the exception of _user, the tables with _file-num < 0 are metaschema tables and are not normally exported, since their contents are managed by the .df file. In fact, it would be a bad thing to dump and load them, since they have connections by RECID which could not be relied on in a load.

The difference I was asking about is whether you have made a list of what got dumped and compared it with the list from _file. If you do that and look at the properties of the files that were not dumped, and vice versa, then you will have clues as to what is causing the difference. If this doesn't make it apparent, then bring back the lists and we can help you look at the characteristics.
 

rainylsh

Member
To answer your questions:
1. Just an exercise; if it goes OK, I'll do it in the actual production environment.
2. About 7 GB.
3. The database has been running for about 2 years, so I think it needs to be processed.

And I have some questions:
1. How do I dump/load the data with the binary method?
2. How do I dump/load the data with dd?

I dumped the data using Progress Data Administration, and loaded it using a command (maybe the binary method you mentioned: proutil live -C bulkload live.fd).

Sorry, I meant that there are 1513 tables in _file where _file-num is between 0 and 32768, not that _file-num is 1513.
I really don't have enough knowledge of dumping and loading, so I am posting my questions here.
 

Casper

ProgressTalk.com Moderator
Staff member
1. Just an exercise; if it goes OK, I'll do it in the actual production environment.

OK, then I worry somewhat less about you ;-).

3. The database has been running for about 2 years, so I think it needs to be processed.
This can hardly be called a good reason for doing a reload of your database.
Is there any 'real' indication that you need to change something in your current database structure?
Like maybe a large scatter factor on the large tables, or fragmentation, or other issues you have encountered?

1. How do I dump/load the data with the binary method?
proutil <dbname> -C dump ....
proutil <dbname> -C load....

2. How do I dump/load the data with dd?
- Use the Data Dictionary (obviously :)).
- Make a program which dumps the .df, the _user table and the sequence current values.
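As a rough illustration of such a program, a sketch might look like the code below. This is only an illustration: the Data Dictionary's own dump adds a trailer to its .d files which this does not, DYNAMIC-CURRENT-VALUE depends on your Progress version, and the .df itself is normally dumped from the Data Dictionary (or, I believe, via prodict/dump_df.p). Verify against your release before relying on it.

Code:
/* Sketch: dump _user records and sequence current values to text files. */
output to "_user.d".
for each _User no-lock:
    export _User.            /* all fields of the _User record */
end.
output close.

output to "_seqvals.d".
for each _Sequence no-lock:
    export _Sequence._Seq-Name
           dynamic-current-value(_Sequence._Seq-Name, ldbname("DICTDB")).
end.
output close.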

If these answers raise lots of questions for you, then I think it is best that you hire a consultant or get some DBA training.

If you want to do a reload of your database because of problems you are experiencing, then you first need to think about the best way to solve those problems. Most of the reasons I know of for people to do a reload are performance reasons. If that is also the case for you, then it is better to redesign the database structure in such a way that you minimize these problems. You do this by:
- examining the database (see the example commands below)
- monitoring the database
- monitoring the OS
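For example (again only a sketch; the database name is a placeholder):

Code:
# table/index analysis report (record counts, fragmentation, scatter)
proutil mydb -C dbanalys > mydb.dbanalys.txt

# interactive database monitor
promon mydb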

Furthermore, you need to have knowledge of the behaviour of the data within the database:
- are there small but very active tables?
- are there tables which fragment a lot?
- are there some very big tables?
- etc.

With this knowledge and measurements you can propose a new structure for your database.
Do some tests with this new structure and finally do the actual reload in the production environment.

What is the current structure of your database?
- Do you already have storage areas?
- Do you use Type II storage areas?
- Did you separate indexes from data?
- What version of Progress do you have?
- What OS are you using?

Regards,

Casper.
 

rainylsh

Member
You are right, I really need DBA training. But there is no training available in China (Mainland). I'm studying by myself, so I can't get complete knowledge. What a pity!
But luckily, I have a consultant. With his help, I have done a successful dump & load on a test DB. But I want to know more about dump & load, so I post my questions here when something is not clear to me. For example, "binary": I know the command proutil <dbname> -C dump / load...., but I wanted to know whether that is what you meant. And I was using "DD" to dump the .d & .df files.
I think we don't know how much fragmentation there is if we don't process the DB files.
The DB files after processing are about 300 MB bigger than before (7 GB total). Does the database need to be processed? If not, when will it need to be?
 

tamhas

ProgressTalk.com Sponsor
I have to say that your questions don't make it seem like you are getting much mentoring from your consultant ...

Once upon a time, many, many years ago, people thought that periodic dump and load was a piece of good DBA practice. But these days, that is almost certainly not the case. There are reasons to dump and load, just not as a part of routine maintenance without specific indications. If you have issues, then it is the issues we should be exploring, not the dump and load since it is unlikely that the dump and load will fix anything. If it is just a question of thinking it is something that you should do, then there is a lot you should be learning about.

Check out:
http://www.dbappraise.com/ for on-going, proactive monitoring of your system.
http://www.wss.com/ for the top DB and performance consultants.
http://www.greenfieldtech.com/ for ProTop and articles on DBA issues.
 