Answered: Saving a structured procedure takes too much time

Hello guys,

I'm working on a structured procedure that weighs around 191ko, but it takes too much time to save.

I'm running OE 10.02B (I know it's not the latest version; we plan to update to OE 11.7).

Is there some key point that could cause this enormous save time for my structured procedure?


Best Regards,

- BobyIsProgress -
 

TomBascom

Curmudgeon
Your units are very confusing. Is “191ko” supposed to be 191 kilograms? Are you chiseling it onto stone tablets to save it?

10.02b? Are you sure? Maybe that is actually 10.2b? There is about 10 years between 10.0 and 10.2, so you might not be quite in the Stone Age if it is 10.2.

What is your hardware environment? A shared networked stone cutting drive is going to be much, much slower than a local laser blaster. You could also be having issues with interference from any anti-moss tools. If your workstation is the same vintage as your OE you might also check the hamster wheels. Those old PCs are very underpowered for modern workloads. Memory is also a likely issue if the workstation is old. Laying down sediments takes time and you usually don’t have nearly enough mud and sand to work with.
 
By 191Ko I mean kilo-octets, i.e. kilobytes.
10.2B, sorry for the confusing information.

I'm running it on a virtual server with near cutting-edge technology, on Windows Server 2012 R2. And I only have this issue saving this program: I can save much heavier programs in the blink of an eye, but not this one.

I've attached the program.
 

Attachments

  • pxBIIndicJmoins1.p
    118.6 KB

TomBascom

Curmudgeon
Darn! I was really hoping for a network-attached stone chiseler.

Sorry, but I have no idea why this particular procedure should be slow to save. One thing you might try is to open it with notepad, make a change (in a comment) and save it. That would establish if the issue is within Progress or is with the OS/hardware.
 

Stefan

Well-Known Member
Do you have any custom procedure firing upon save that is attempting to parse / clean up the .w?
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
@Rob Fitzpatrick In fact yes, as I said to Tom, I made a mistake with the Progress version. I'm using 10.2B.
Is that actually what it says in the version file? In other words, you have no service pack installed? If you are dealing with a bug in a tool, it is important to know your service pack, if any.
Also by "saving" I mean the action of File -> Save File.
You haven't told us *where* you are doing this; it is relevant. Progress has several different development tools and they may do different things during a save. Procedure editor? App Builder? OpenEdge Architect?
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
One simple step you could take is to install SP08 and see if the issue persists. Even if it does, I'd say you're better off having those hundreds of bug fixes.

I haven't used App Builder either, so I don't know what it does, or can be configured to do, during a save.

It's also possible this isn't an App Builder issue at all. It could be some kind of environment issue. Is a network involved in any way, e.g. saving the file to a remote share? That location could be the problem. Or maybe the location on the disk where App Builder is writing something other than the source (r-code? logs? temp files?) is problematic. If the clusters involved are failing, the disk might have to retry several times to get a successful write. In that case, relocating the files might help.

It could be a problem with your OS. I recall we had significant performance problems in our OE application when we first went to Windows 7. I tracked it down to a feature called offline files which was enabled by default. It wasted a lot of time trying to search a file cache which was actually empty. I disabled the feature and performance returned to normal. I found that problem with Process Monitor. You may want to run that during a problematic file save and see where the time is spent. In particular, see if there are any two operations where the time delta (which can be added as a separate column) is particularly large.
 
The file has "compile on save" set. Could it be related to compilation? If you remove this flag, is it still slow?
I just tested it and it's the compilation of the program that takes time.

FWIW 191 KiB is, IMHO, kind of ridiculously large.
Yes, it's large, but I build many temp-tables to output values to txt files. The aim is to output a lot of data around our production activity and feed it into Power BI. This program runs as a batch.

I chose to write it all in one program. Maybe splitting it into many programs would be more efficient, but I'm not sure I would gain any performance?
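
For reference, the pattern described above (filling temp-tables and exporting them to delimited text files for Power BI) can be sketched in ABL roughly like this. The temp-table, field, and file names here are made up for illustration; they are not from the attached program:

```abl
/* Hypothetical sketch: export one temp-table to a semicolon-delimited
   text file as a Power BI-friendly extract. Names are illustrative. */
DEFINE TEMP-TABLE ttIndic NO-UNDO
    FIELD cSite AS CHARACTER
    FIELD dtDay AS DATE
    FIELD dQty  AS DECIMAL.

/* ... FOR EACH loops over the database tables fill ttIndic here ... */

OUTPUT TO VALUE("indic.txt").
FOR EACH ttIndic:
    EXPORT DELIMITER ";" ttIndic.  /* one delimited line per record */
END.
OUTPUT CLOSE.
```

Note that having many such temp-table definitions and export loops in one source file grows the compile unit, but it should not by itself make a ~191 KB program pathologically slow to compile.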
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
I just tested it and it's the compilation of the program that takes time.
I can save much heavier programs in the blink of an eye. But not this one.
The question remains why the compilation of this program takes an "enormous" amount of time, relative to a larger program. Though you haven't confirmed the size of all of the source involved.
Is 191 KB the source size with all of the include files (and their dependencies, if any)?

Try to think of what is different about this particular program, compared with larger ones that compile much more quickly. Does it read its source from a different location than others (e.g. a network share versus a local directory)? Is the r-code saved to a different location? Does it connect to a database that is in a different place, e.g. a slower database server on a slower network? Or a larger number of databases? Or databases with much larger schema?

Does it take a similarly long time to do Compile | Check Syntax? I would expect that to be similar to a compile, apart from saving r-code. If it does, try tracing it with Process Monitor and see where the time is spent. If it doesn't, then trace the save operation.
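
To take the tooling out of the picture entirely, the compile can also be timed from a plain Procedure Editor session. A minimal sketch, assuming the source file is on the PROPATH (the file name is the one attached above; everything else is illustrative):

```abl
/* Hypothetical sketch: time a compile of the slow program outside
   App Builder, to separate compiler time from editor overhead. */
DEFINE VARIABLE iStart AS INTEGER NO-UNDO.

iStart = ETIME(TRUE).    /* reset the millisecond timer */

COMPILE VALUE("pxBIIndicJmoins1.p") SAVE.   /* SAVE also writes r-code */

MESSAGE "Compile took" ETIME "ms" VIEW-AS ALERT-BOX.
```

If this standalone compile is fast, the time is being spent in whatever App Builder does around the save; if it is equally slow, the compiler itself (or its I/O, which Process Monitor can show) is the bottleneck.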
 