Dumping data into CSV format (urgent) ...

TomBascom

Curmudgeon
Since you have a relatively modern, albeit somewhat out-of-date, release, you can use a dynamic query to handle the table name. There are several threads that go into how to do that in great detail. (The "development" forum might be a better place to look, though...)
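The bare bones of the idea look something like this (a minimal sketch only, assuming the database is already connected; the table name and output path are just placeholders, extent (array) fields are ignored, and character values are not quoted):

Code:
DEFINE VARIABLE cTable  AS CHARACTER NO-UNDO INITIAL "customer".  /* placeholder table name */
DEFINE VARIABLE hBuffer AS HANDLE    NO-UNDO.
DEFINE VARIABLE hQuery  AS HANDLE    NO-UNDO.
DEFINE VARIABLE iField  AS INTEGER   NO-UNDO.

OUTPUT TO VALUE("c:\temp\" + cTable + ".csv").

/* the table name is only known at run time, so use a dynamic buffer and query */
CREATE BUFFER hBuffer FOR TABLE cTable.
CREATE QUERY hQuery.
hQuery:ADD-BUFFER(hBuffer).
hQuery:QUERY-PREPARE("FOR EACH " + cTable + " NO-LOCK").
hQuery:QUERY-OPEN().

DO WHILE hQuery:GET-NEXT():
    DO iField = 1 TO hBuffer:NUM-FIELDS:
        PUT UNFORMATTED hBuffer:BUFFER-FIELD(iField):BUFFER-VALUE
            (IF iField < hBuffer:NUM-FIELDS THEN "," ELSE "").
    END.
    PUT SKIP.
END.

hQuery:QUERY-CLOSE().
DELETE OBJECT hQuery.
DELETE OBJECT hBuffer.
OUTPUT CLOSE.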
 

vinsym

New Member
It has been ages since I wrote this bit :) and the code worked well on Progress 8. I'm not sure whether it will work in OpenEdge. It will require customization in terms of database name, hostname, etc.

Best of luck



/* Purpose: Code to extract data from a given database */
/* Pre-connect the database before executing this code */
/* Define Variables */
DEF TEMP-TABLE ttTables NO-UNDO
    FIELD ttTable AS CHAR
    FIELD ttProg  AS CHAR
    FIELD ttFile  AS CHAR.
DEF VAR cDbName AS CHAR NO-UNDO.
DEF VAR cDbFile AS CHAR NO-UNDO.
DEF VAR cProgDir AS CHAR NO-UNDO.
DEF VAR cDumpDir AS CHAR NO-UNDO.
DEF VAR cDumpExt AS CHAR NO-UNDO.
DEF VAR cDelimiter AS CHAR NO-UNDO.
DEF VAR cDbConnectString AS CHAR NO-UNDO.
DEF VAR cUseConnectString AS CHAR NO-UNDO.
DEF VAR i AS INT NO-UNDO.
DEF VAR cErrString AS CHAR NO-UNDO.
ASSIGN cDbName          = "dbname":U
       cDbFile          = "<path>\dbname.db":U
       cDbConnectString = "-1":U
       /* Code to extract gets generated in this folder */
       cProgDir         = "c:\progms\":U
       /* csv dumps get generated in this folder */
       cDumpDir         = "c:\tpdata\":U
       cDumpExt         = ".csv":U
       cDelimiter       = ",":U
       /* Change the hostname as required */
       cUseConnectString = "-S <dbname> -N tcp -H <hostname>".
/* Disconnect and re-establish the connection */
DO WHILE CONNECTED(LDBNAME(1)):
    DISCONNECT VALUE(LDBNAME(1)) NO-ERROR.
END.
CONNECT VALUE(cDbFile) VALUE(cUseConnectString) NO-ERROR.
/* Stop here if the connection fails */
IF NOT CONNECTED(cDbName) THEN DO:
    DO i = 1 TO ERROR-STATUS:NUM-MESSAGES:
        ASSIGN cErrString = cErrString + STRING(ERROR-STATUS:GET-NUMBER(i)) + " ":U
                          + ERROR-STATUS:GET-MESSAGE(i) + "~n":U.
    END.
    MESSAGE "Connect failed" SKIP cErrString VIEW-AS ALERT-BOX TITLE cDbFile.
    RETURN.
END.
CREATE ALIAS DICTDB FOR DATABASE VALUE(cDbName) NO-ERROR.
/* Build the list of user tables to extract */
FOR EACH DICTDB._file WHERE DICTDB._file._File-Number > 0
                        AND NOT DICTDB._file._file-name BEGINS "SYS":U NO-LOCK:
    CREATE ttTables.
    ASSIGN ttTables.ttTable = DICTDB._file._file-name.
END.
/* Loop through the tables to generate the extract code */
FOR EACH ttTables:
    RUN GenProgtoExtract.
END.
QUIT.
RETURN.
PROCEDURE GenProgtoExtract:
    /* Assign the tablename and extract file */
    ASSIGN ttTables.ttProg = cProgDir + ttTables.ttTable + ".p":U
           ttTables.ttFile = cDumpDir + ttTables.ttTable + cDumpExt.
    /* Initiate program output file and redirect the output to this file */
    OUTPUT TO VALUE(ttTables.ttProg).
    PUT UNFORMATTED
        "find _file no-lock where _file-name = '" ttTables.ttTable "'. ~n"
        "OUTPUT TO " ttTables.ttFile ". ~n"
        "for each _field no-lock of _file break by _order: ~n"
        "if not last( _order ) then ~n"
        "put trim(_field-name) + ', '. ~n"
        "else ~n"
        "put _field-name skip. ~n"
        "end. ~n"
        "FOR EACH " ttTables.ttTable " NO-LOCK: ~n"
        "EXPORT DELIMITER '" cDelimiter "' " ttTables.ttTable ". ~n"
        "END. ~n"
        "OUTPUT CLOSE. ~n"
        "RETURN.".
    OUTPUT CLOSE.
    /* Execute code to generate the extract -- call this bit if you have to automate the task */
    /* RUN VALUE(ttTables.ttProg). */
    RETURN.
END PROCEDURE.
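If you want the CSV files produced in the same session, one option is to run each generated program straight away, either by uncommenting the RUN line inside the procedure or with a small loop just before the QUIT (a sketch; running .p source like this needs a compiler licence on that machine):

Code:
/* optional: run each generated program right away (place just before the QUIT) */
FOR EACH ttTables:
    RUN VALUE(ttTables.ttProg).
END.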
 

TomBascom

Curmudgeon
You must have a compiler license to run this code. If you do, then great, it will run on any release of Progress.

Because it generates a .p for every table, you could also pre-compile it on a development machine and distribute the r-code.
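A sketch of what that pre-compile step could look like, assuming the generator above has already been run on the development machine so that ttTables and cProgDir are populated:

Code:
/* compile every generated .p and save the r-code next to the source */
FOR EACH ttTables:
    COMPILE VALUE(ttTables.ttProg) SAVE INTO VALUE(cProgDir).
END.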

Dynamic queries have the advantage of allowing you to write a single program that will accept the table name as a parameter. But the code is a bit more complex ;)
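As a rough sketch of that calling convention (dumptable.p is a hypothetical name here; its body would be the kind of dynamic CREATE BUFFER / CREATE QUERY code shown in the next post):

Code:
/* one generic program, the table name passed in as a parameter;          */
/* dumptable.p is hypothetical - it takes a table name and an output file */
DEFINE VARIABLE cTables AS CHARACTER NO-UNDO INITIAL "customer,order,order-line".
DEFINE VARIABLE i       AS INTEGER   NO-UNDO.

DO i = 1 TO NUM-ENTRIES(cTables):
    RUN dumptable.p (INPUT ENTRY(i, cTables),
                     INPUT "c:\temp\" + ENTRY(i, cTables) + ".csv").
END.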
 

Casper

ProgressTalk.com Moderator
Staff member
Well then, a dynamic example:
Code:
DEFINE VARIABLE cDir AS CHARACTER  NO-UNDO.
DEFINE VARIABLE hQuery AS HANDLE     NO-UNDO.
DEFINE VARIABLE hBuffer AS HANDLE     NO-UNDO.
DEFINE VARIABLE iTmp AS INTEGER    NO-UNDO.
DEFINE VARIABLE iTmp2 AS INTEGER     NO-UNDO.
 
DEFINE STREAM sOut.
 
/* comment this line to dump all tables, or change the table name to dump another table */
&Scoped-define TABLE-NAME customer
 
/* change this if you want a different delimiter than the semicolon */
&SCOPED-DEFINE Delim ~';~'
 
/* comment/uncomment either of the next two statements to put field names or field labels in the first row */
/* if you don't want either of them comment both */
&SCOPED-DEFINE FieldNames *
&SCOPED-DEFINE FielLabels *
 
/* output directory */
ASSIGN cDir = 'C:\temp\'.
 
FOR EACH _file WHERE   &IF DEFINED(TABLE-NAME)
                       &THEN
                       _file._file-name = '{&TABLE-NAME}' AND
                       &endif
                       _file._hidden = NO NO-LOCK:
 
    OUTPUT STREAM sOut TO VALUE(cDir + _file._file-name + '.csv').
 
    
    CREATE BUFFER hBuffer FOR TABLE _file._file-name.
    
    CREATE QUERY hQuery.
 
    hQuery:ADD-BUFFER(hBuffer).
 
    hQuery:QUERY-PREPARE("FOR EACH " + hBuffer:NAME + " NO-LOCK").
 
    hQuery:QUERY-OPEN().
 
    hQuery:GET-FIRST().
 
    &IF DEFINED(FieldNames) > 0 &THEN
        DO iTmp = 1 TO hBuffer:NUM-FIELDS:
            PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):NAME {&Delim}.
        END.
        PUT STREAM sOut SKIP.
    &ELSEIF DEFINED(FielLabels) > 0 &THEN
        DO iTmp = 1 TO hBuffer:NUM-FIELDS:
            PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):LABEL {&Delim}.
        END.
        PUT STREAM sOut SKIP.    
    &ENDIF
    
    DO WHILE NOT hQuery:QUERY-OFF-END:
 
        DO  iTmp = 1 TO hBuffer:NUM-FIELDS:
            IF hBuffer:BUFFER-FIELD(iTmp):EXTENT = 0
            THEN DO:
                IF hbuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
                THEN PUT STREAM sOut UNFORMATTED '"' hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE '"' {&Delim}.
                ELSE PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE {&Delim}.
            END.
            ELSE DO iTmp2 = 1 TO hBuffer:BUFFER-FIELD(iTmp):EXTENT: /* it is an extent field so we walk through the extents */
                IF hBuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
                THEN PUT STREAM sOut UNFORMATTED '"' hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2) '"' {&Delim}.
                ELSE PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2) {&Delim}.
            END.
        END.
        PUT STREAM sOut SKIP.
        
        hQuery:GET-NEXT().
    END.
 
    /* cleanup and close */
    hQuery:QUERY-CLOSE().
    DELETE OBJECT hBuffer.
    DELETE OBJECT hQuery.
    OUTPUT STREAM sOut CLOSE.
 
END.

Casper
 

heindhoven

New Member
Hello,

I'm new to ProgressTalk and this is my first post!:)
I tried the dynamic table-dump script, and in the Procedure Editor it works!
However, when I run it from a .bat file, I get the error message:

"Could not create buffer object for table ... (7334)"
What am I doing wrong?

Using Progress 9.1.E
 

heindhoven

New Member
...well,
the batch file contains
c:\etcetc..\prowin32.exe -b -pf c:\test\param.pf -p c:\test\export.p

and the param.pf file contains

-db tbsdb

Is that not enough to connect? What should the correct command line be?
 

Casper

ProgressTalk.com Moderator
Staff member
Sorry, that was a stupid answer. It won't run, since it wouldn't know _file.
Did you change the script? What is the table name mentioned in the error?
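In the meantime, you could put a guard at the top of export.p so that a missing database connection fails with a clearer message than error 7334 (just a minimal sketch):

Code:
/* bail out early if the batch session has no database connected */
IF NUM-DBS = 0 THEN DO:
    MESSAGE "No database connected - check the -db parameter / .pf file.".
    RETURN.
END.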

Casper.
 

heindhoven

New Member
...to be honest, I changed it a bit :blush:. I wanted the script to export only some tables, not just one and not all of them. In the next sample only one table is exported, but more can be entered. The table name in the error is the one in the script, ms120.

DEFINE VARIABLE cDir      AS CHARACTER NO-UNDO.
DEFINE VARIABLE hQuery    AS HANDLE    NO-UNDO.
DEFINE VARIABLE hBuffer   AS HANDLE    NO-UNDO.
DEFINE VARIABLE iTmp      AS INTEGER   NO-UNDO.
DEFINE VARIABLE iTmp2     AS INTEGER   NO-UNDO.
DEFINE VARIABLE cBufNames AS CHARACTER NO-UNDO INIT "ms120".
DEFINE VARIABLE iBufNum   AS INTEGER   NO-UNDO.

DEFINE STREAM sOut.

/* change this if you want a different delimiter than the semicolon */
&SCOPED-DEFINE Delim ~';~'

/* comment/uncomment either of the next two statements to put field names or field labels in the first row */
/* if you don't want either of them comment both */
&SCOPED-DEFINE FieldNames *
/* &SCOPED-DEFINE FielLabels * */

/* output directory */
ASSIGN cDir = 'C:\temp\'.

DO iBufNum = 1 TO NUM-ENTRIES(cBufNames):
    CREATE QUERY hQuery.
    CREATE BUFFER hBuffer FOR TABLE ENTRY(iBufNum, cBufNames).

    hQuery:ADD-BUFFER(hBuffer).
    hQuery:QUERY-PREPARE("FOR EACH " + hBuffer:NAME + " NO-LOCK").
    hQuery:QUERY-OPEN().
    hQuery:GET-FIRST().

    OUTPUT STREAM sOut TO VALUE(cDir + hBuffer:NAME + '.csv').

    &IF DEFINED(FieldNames) > 0 &THEN
        DO iTmp = 1 TO hBuffer:NUM-FIELDS:
            PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):NAME {&Delim}.
        END.
        PUT STREAM sOut SKIP.
    &ELSEIF DEFINED(FielLabels) > 0 &THEN
        DO iTmp = 1 TO hBuffer:NUM-FIELDS:
            PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):LABEL {&Delim}.
        END.
        PUT STREAM sOut SKIP.
    &ENDIF

    DO WHILE NOT hQuery:QUERY-OFF-END:

        DO iTmp = 1 TO hBuffer:NUM-FIELDS:
            IF hBuffer:BUFFER-FIELD(iTmp):EXTENT = 0
            THEN DO:
                IF hBuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
                THEN PUT STREAM sOut UNFORMATTED '"' hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE '"' {&Delim}.
                ELSE PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE {&Delim}.
            END.
            ELSE DO iTmp2 = 1 TO hBuffer:BUFFER-FIELD(iTmp):EXTENT: /* it is an extent field so we walk through the extents */
                IF hBuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
                THEN PUT STREAM sOut UNFORMATTED '"' hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2) '"' {&Delim}.
                ELSE PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2) {&Delim}.
            END.
        END.
        PUT STREAM sOut SKIP.

        hQuery:GET-NEXT().
    END.

    /* cleanup and close */
    hQuery:QUERY-CLOSE().
    DELETE OBJECT hBuffer.
    DELETE OBJECT hQuery.
    OUTPUT STREAM sOut CLOSE.
END.
 

Casper

ProgressTalk.com Moderator
Staff member
I understand why you did that.

Your code, however, works fine for me. I must say that I don't have a clue why you would get that error, maybe a typo in the list of tables you use?

Casper.
 

heindhoven

New Member
No, it even fails with just one table, and it's certainly typed correctly.
I think you were right that it has something to do with not having a connection to the database, maybe the .pf file I used.
Did you try to run this export.p from a .bat file?
If so, can you post the command line you used? And the .pf file?
 

Casper

ProgressTalk.com Moderator
Staff member
Works fine for me:

Code:
set DLC=C:\dlc102a
set PATH=%DLC%\BIN;%PATH%
set LIB=%DLC%\LIB;%LIB%
mbpro C:\OpenEdge102a\WRK\sports2000\sports2000 -p c:\temp\export.r

Casper.
 

heindhoven

New Member
Casper,

It works!:)
Finally! I've been trying and trying again, and kept getting the "could not create buffer" error. The main difference is that I used the prowin32 command, while you are using the mbpro command.
But maybe we had an incorrect path or something like that.
Many Thanks!

FYI, we are trying to get about 120 tables from Progress to Microsoft SQL 2008 with minimal action on the Progress side, because of a lack of Progress knowledge. I think export to CSV is, for that reason, the easiest way to get the data from Progress into SQL 2008.
I know ODBC is also an option, but then more action on (and knowledge of) the Progress side is required, isn't it?
 

heindhoven

New Member
...well, the export script works, and all selected tables are exported to CSV files with a semicolon as delimiter, BUT... next problem:
all rows in each CSV file end with a semicolon, like this:

rf502_obj;rf502_code;rf502_desc;
1763394,2027;10;"Contact person";
1763395,2027;20;"Report at";


Normally a CSV file only separates the fields by putting a semicolon between them, with no semicolon at the end, but here there is one... :mad:

Is there a way to change the export script so there are no semicolons at the end of each row, only between the fields?

Thanks in advance!
 

StuartT

Member
Try changing this:
DO iTmp = 1 TO hBuffer:NUM-FIELDS:
    IF hBuffer:BUFFER-FIELD(iTmp):EXTENT = 0
    THEN DO:
        IF hBuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
        THEN PUT STREAM sOut UNFORMATTED '"' hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE '"' {&Delim}.
        ELSE PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE {&Delim}.
    END.
    ELSE DO iTmp2 = 1 TO hBuffer:BUFFER-FIELD(iTmp):EXTENT: /* it is an extent field so we walk through the extents */
        IF hBuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
        THEN PUT STREAM sOut UNFORMATTED '"' hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2) '"' {&Delim}.
        ELSE PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2) {&Delim}.
    END.
END.
PUT STREAM sOut SKIP.

To
DO iTmp = 1 TO hBuffer:NUM-FIELDS:
    IF hBuffer:BUFFER-FIELD(iTmp):EXTENT = 0
    THEN DO:
        IF hBuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
        THEN DO:
            IF iTmp <> hBuffer:NUM-FIELDS THEN
                PUT STREAM sOut UNFORMATTED '"' hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE '"' {&Delim}.
            ELSE
                PUT STREAM sOut UNFORMATTED '"' hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE '"'.
        END. /* iTmp <> num-fields */
        ELSE DO:
            IF iTmp <> hBuffer:NUM-FIELDS THEN
                PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE {&Delim}.
            ELSE
                PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE.
        END.
    END.
    ELSE DO iTmp2 = 1 TO hBuffer:BUFFER-FIELD(iTmp):EXTENT: /* it is an extent field so we walk through the extents */
        IF hBuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
        THEN PUT STREAM sOut UNFORMATTED '"' hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2) '"' {&Delim}.
        ELSE PUT STREAM sOut UNFORMATTED hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2) {&Delim}.
    END.
END.
PUT STREAM sOut SKIP.

With this you are simply putting out a delimiter whenever you are processing any field except the last one.
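For what it's worth, an alternative (an untested sketch, reusing the same variable names and the {&Delim} define from the export script) is to build each row into a character variable and chop off the single trailing delimiter at the end. That also covers the case where the last field of a table is an extent field, which the version above would still leave ending in a delimiter. It does not treat unknown (?) values specially. The block below would replace the whole DO WHILE NOT hQuery:QUERY-OFF-END loop:

Code:
DEFINE VARIABLE cLine AS CHARACTER NO-UNDO.  /* add near the other DEFINEs */

DO WHILE NOT hQuery:QUERY-OFF-END:
    cLine = "".
    DO iTmp = 1 TO hBuffer:NUM-FIELDS:
        IF hBuffer:BUFFER-FIELD(iTmp):EXTENT = 0 THEN
            cLine = cLine
                  + (IF hBuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
                     THEN '"' + hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE + '"'
                     ELSE STRING(hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE)) + {&Delim}.
        ELSE DO iTmp2 = 1 TO hBuffer:BUFFER-FIELD(iTmp):EXTENT:
            cLine = cLine
                  + (IF hBuffer:BUFFER-FIELD(iTmp):DATA-TYPE = "CHARACTER"
                     THEN '"' + hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2) + '"'
                     ELSE STRING(hBuffer:BUFFER-FIELD(iTmp):BUFFER-VALUE(iTmp2))) + {&Delim}.
        END.
    END.
    /* one delimiter is always left hanging at the end of the row - chop it off */
    IF cLine > "" THEN cLine = SUBSTRING(cLine, 1, LENGTH(cLine) - 1).
    PUT STREAM sOut UNFORMATTED cLine SKIP.
    hQuery:GET-NEXT().
END.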
 

heindhoven

New Member
Wow, thanks! That works perfectly!
At first I replaced the wrong part of the code, but when I corrected it, it worked!
Many Thanks!:)
 