Disable database CRC check?

D.Cook

Member
We've all done it: compiled a program with the local db then sent it to a client site only to receive an error because their database structure is somehow different.
** CRC for <tablename> does not match CRC in <r-code file>. Try recompiling. (1896)

I don't know about every other database system, but I'm pretty sure this normally doesn't happen elsewhere. If a new field is added but not referenced in a SQL SELECT statement, for example, the query simply doesn't select it.

So my question is firstly, does anyone know of a secret startup parameter or ini setting that might disable this check? I realise it's quite probably not possible due to the way r-code is compiled, but am curious to know.

And secondly if not, does anyone have any tips for dealing with this: procedure-wise, code-wise or otherwise?
We generally keep a local copy of a database built with the same .df as the client db, but things happen and somehow these become mismatched..

And you know, it would be really nice to be able to add new fields to a database table without recompiling everything..
 

RealHeavyDude

Well-Known Member
You are talking about one of the core features of the ABL. Although Progress has lifted restrictions on online schema changes over time and introduced distinct schema changes for tables and their indexes, as soon as you make a schema change that changes the CRC you need to re-compile every ABL procedure that makes a static reference to that object. That is the way it is, always has been, and probably always will be.

For one, when it is feasible, you can work with dynamic references instead of static ones. Dynamic queries and buffers are resolved at run time, whereas static queries and table references are resolved at compile time. Dynamic handle-based objects were introduced with Progress V9 a long time ago. They do have their drawbacks: creating them consumes additional processing time, and you need to take care of deleting them when you are done. But they are easy to master. Personally I use dynamic queries and buffers in most cases, though not in all; there are still valid use cases where static FOR EACH constructs are much better.
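To make that concrete, here is a minimal sketch of the dynamic approach, assuming a hypothetical Customer table with a Name field (table and field names are placeholders, and error handling is omitted):

```
/* Dynamic buffer + query: the table and field names are resolved at
   run time, so the compiler embeds no CRC for Customer in the r-code */
define variable hBuffer as handle no-undo.
define variable hQuery  as handle no-undo.

create buffer hBuffer for table "Customer".
create query hQuery.
hQuery:set-buffers(hBuffer).
hQuery:query-prepare("for each Customer no-lock").
hQuery:query-open().

repeat:
    hQuery:get-next().
    if hQuery:query-off-end then leave.
    display hBuffer:buffer-field("Name"):buffer-value format "x(30)".
end.

hQuery:query-close().

/* dynamic handles are not garbage-collected: clean them up */
delete object hQuery.
delete object hBuffer.
```

Note the explicit DELETE OBJECT statements at the end; that is the "you need to take care of them" part.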

Other than that, you are bound to the CRC. The only way we deploy schema changes is with delta definition files; nothing in our environment allows schema changes to be deployed any other way. We also use a dedicated CRC database, to which the schema changes made by individual developers are deployed once they have passed a quality toll gate. When we build a new release, we create a single delta data definition file from it and do a central build overnight that checks out all source code from our repository and compiles it against the CRC database.

It takes restrictive processes and discipline from the developers, but since we enforced that process we have not had any CRC issues.

Heavy Regards, RealHeavyDude.
 

D.Cook

Member
Thanks for the feedback. Yep, I guess dynamic queries are the only answer, but unfortunately we lose the great features of simple static program code, not to mention a lot of legacy code that can't be changed overnight.

I guess we have a lot of room for improvement in our database upgrade, code versioning, and deployment processes. But as a small company we will never get as organised as you have!
 

RealHeavyDude

Well-Known Member
Please don't get me wrong, but I think that following a process for how software is developed, packaged and deployed has nothing to do with the size of the company or development team. It is about having a defined way of doing things to get reproducible and stable results, and, of course, discipline. The lack of it, in my humble opinion, is just like bad code that gets rushed out with the intention of being fixed later, where later means never.

Did you ever get mad when you had to wade through software that does not meet your idea of quality, or stumble over an undocumented change that cost you days fixing a problem that should never have been introduced into the software in the first place?

Heavy Regards, RealHeavyDude.
 

GregTomkins

Active Member
@RHD: not to hijack this thread, but I think it's interesting, seeing as you are obviously someone who knows what they're doing, that you prefer dynamic over static. I could paraphrase your response as 'CRC is an awesome feature of the ABL which I avoid using'.

I/we generally take the exact opposite approach: static as much as possible, dynamic only if there is a clear need for it. It seems to me that the static features of the ABL are tied with temp-tables for being the #1 strength of ABL over Java and C#.
 

LarryD

Active Member
This is probably in the category of "duh", but we've scripted our customer updates. It would take a little work, but it's actually quite simple.

Here is our process:

1) Zip up the new/all source code and the delta .df's (if any), then send this to the customer (we use various methods to get it to our customers)

2) A script is executed on the customer's server (with the db down) that does the following:

- Unzip the source code and delta .df's
- load the delta .df (running prodict/load_df.p)
- perform directory search(es) to save to a file the list of .p's and other appropriate extensions (this also takes care of custom programs/modifications on the customer site)
- run a program that compiles all the source listed in the file
- restart the db (or db's), appserver, background processes etc. (scripted)

All of the above is logged for review (if there is a problem). Just a suggestion if you're looking for a way to address CRC issues. I'm sure there are other methods that would work just as well.
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
We've all done it: compiled a program with the local db then sent it to a client site only to receive an error because their database structure is somehow different.
** CRC for <tablename> does not match CRC in <r-code file>. Try recompiling. (1896)

I don't know about every other database system, but I'm pretty sure this normally doesn't happen elsewhere. If a new field is added but not referenced in a SQL SELECT statement, for example, the query simply doesn't select it.

So my question is firstly, does anyone know of a secret startup parameter or ini setting that might disable this check? I realise it's quite probably not possible due to the way r-code is compiled, but am curious to know.

And secondly if not, does anyone have any tips for dealing with this: procedure-wise, code-wise or otherwise?
We generally keep a local copy of a database built with the same .df as the client db, but things happen and somehow these become mismatched..

And you know, it would be really nice to be able to add new fields to a database table without recompiling everything..

You didn't mention your Progress version. If it is 10.1C or later then this will be relevant for you: http://communities.progress.com/pcom/message/163324. It is a recent discussion on PSDN re table CRCs. Adding fields to a table changes the table CRC but doesn't necessarily invalidate your code (i.e. require a recompile). If you add fields at the end of the table (higher r-pos than existing fields) then the compiler should allow the old code to run; see the linked discussion for further details.
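For illustration, an incremental .df that only appends a field to the end of a table might look like the fragment below. The table and field names are hypothetical; the POSITION value is the r-pos being discussed, and a trailing addition like this is the case where old r-code can keep running:

```
ADD FIELD "Notes" OF "Customer" AS character
  DESCRIPTION "Free-form notes (example field)"
  FORMAT "x(60)"
  POSITION 15
  MAX-WIDTH 120
  ORDER 140
```

Dropping or renaming an existing field, by contrast, always forces a recompile of code that references the table statically.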

I'm with RHD on this. Use a schema-holder database locally for each client, and be disciplined about updating it. Use it to build the incremental .df you send to the client so its schema remains in lockstep with theirs. If you are deploying encrypted source then compile it against your schema DB before deploying code to their site. Then you can have confidence of a clean compile at the client site. As for disabling the CRC check, even if it were possible it would be more of a hindrance than a help. I wouldn't try it. Sometimes an error message is the lesser of two evils; it protects you from something worse.
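One way to script the "build the incremental .df" step is to run prodict/dump_inc.p in batch mode. The sketch below assumes the DUMP_INC_* environment variables supported by the batch interface in newer OpenEdge releases; db names and paths are placeholders, and the exact interface varies by version, so check the documentation for yours:

```
# Compare the new schema (first db connected) against the client's
# current schema (second db) and write a delta .df.
# Variable names follow the documented batch interface of dump_inc.p.
export DUMP_INC_DFFILE=delta.df        # output delta .df file
export DUMP_INC_CODEPAGE=ISO8859-1     # codepage for the dump
export DUMP_INC_INDEXMODE=active       # create new indexes as active

pro newschema -db oldschema -b -p prodict/dump_inc.p
```

Keeping this scripted is what makes the "lockstep" discipline cheap enough to actually follow.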
 

D.Cook

Member
Regarding client schema change procedures, yes, I agree that a defined procedure and discipline are important regardless of company size. And this is one area that takes a significant amount of time to define and/or script. I'm all for defining procedures and sticking to them, but getting other people to agree and follow them is often the hard part. Hence it usually becomes a lower priority compared to more pressing matters.

Anyway, thanks Larry for sharing about your script.

You didn't mention your Progress version. If it is 10.1C or later then this will be relevant for you: http://communities.progress.com/pcom/message/163324. It is a recent discussion on PSDN re table CRCs. Adding fields to a table changes the table CRC but doesn't necessarily invalidate your code (i.e. require a recompile). If you add fields at the end of the table (higher r-pos than existing fields) then the compiler should allow the old code to run; see the linked discussion for further details.
Yep, this is the main issue I have with the CRC check -- adding a new field to a table. I didn't know about this. Yes, all our clients are on 10.1C or above, so that is good news. Thanks for the info!
 

jmac13

Member
Larry, have you got an example of the scripts you use to import the .df's, or could you point me in the right direction? We haven't got an automated system at all at the moment and I would like to go down this route.


Thanks
 

LarryD

Active Member
jmac,

Below is a snippet of the code that loads the delta .df's. We only use Linux servers, but this would be applicable to Windows too. Our db name is "ot". We have some other logging and validation checks, but this is the part that loads the delta .df.

Code:
/* b_updtddf.p (snippet): load the delta .df matching the installed
   Progress version; assumes stream upd is already defined and open
   for logging */
if proversion begins "9"
and search("otv9.ddf") <> ?
then do:
    put stream upd unformatted
        "Loading schema change version ot v9.x" skip.
    RUN prodict/load_df.p ("otv9.ddf").
end.
else
if search("ot.ddf") <> ?
then do:
    put stream upd unformatted
        "Loading schema change version ot oe10.x" skip.
    RUN prodict/load_df.p ("ot.ddf").
end.

We have some rules in place for doing these updates.

The code above is compiled on our dev system, sent out with each update, and installed in a directory in our PROPATH before the script runs it. When we send out the delta .df's, they always go into a specific directory on the customer's system. Since we have customers using V9 and OE10, there are certain things available in our OE10 software that are not available in V9, hence the two .ddf's (thank God we got rid of all our V7/V8 customers; we are working on getting everyone on at least OE10.2B, with OE11 coming soon).

Here is the snippet of the Linux script to execute the above:

Code:
echo "Loading data definitions"
cd $updt_dir
pro $db_dir/ot -rx -bp sysadm/b_updtddf.p

The above is part of our application update script that does all the work: shutting down the db, installing the source, loading the delta .df, creating the list of procedures to recompile, recompiling the entire application, then restarting the system.
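For reference, the "recompile everything in the list" step could be sketched in ABL roughly like this. The list file name and r-code directory are placeholders, not Larry's actual code:

```
/* Batch recompile driver: reads one source path per line from a
   list file, compiles each with SAVE, and logs failures using the
   COMPILER system handle */
define variable cSource as character no-undo.

input from "compile.list".
repeat:
    import unformatted cSource.
    if cSource = "" then next.
    compile value(cSource) save into "/app/rcode".
    if compiler:error then
        put unformatted "FAILED: " cSource skip.
end.
input close.
```

Running this against the freshly updated database is what guarantees the deployed r-code CRCs match the customer's schema.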

As Rob and RHD noted, there are other ways to handle this. What we do has worked well for us, but regardless, whatever method you use requires strict rules and guidelines in order to be successful for future updates/upgrades.

I should probably note that we went this route because virtually all of our customers have customized source code of some sort, and for a standard update of our base product sending out r-code would mean customizing each customer's update. That would be neither feasible nor cost-effective.
 

RealHeavyDude

Well-Known Member
@GregTomkins:

When working with Progress/OpenEdge the CRC is something you need to cope with. I am rather neutral about it; I don't think it is bad or good, it has its pros and cons. Sometimes, when I start to get mad at it because it keeps me from doing something the way I would like to, I find out that I should not have done it that way anyway.

Additionally, I would not recommend that anybody use dynamic buffers and queries just for the sake of avoiding the CRC. Over time I have learned that dynamic code gives me much more flexibility and, in many cases, generic, re-usable code, and that is why I prefer it over static code most of the time. Being freed from the CRC is just one aspect, not the driver, for using them. I do think that static programming still has its place and is a big strength of the ABL. So for me it is not either/or, it is AND.

Heavy Regards, RealHeavyDude.
 

jmac13

Member
Thanks LarryD, I'll have a play around with it and see how I get on.

Might have more questions :p so we'll see how I go. Thanks again.
 