Question Import from a dynamic CSV

anandknr

Member
I am trying to read a CSV file which can have 'n' number of columns. My requirement is to read the first column and then, based on its value, consider or ignore that row. At the end, I will create a new CSV file which has the same columns as the first one, but only with the selected rows.

Issue I am facing - There is no fixed number of columns in the CSV I am uploading, so I can't use a temp-table. I have to read the entire line and then parse the first column from it. A CSV created from Windows will have carriage returns (^M), and because of this it won't read the last line. See this url. Please note that if we import to a temp-table this is not an issue.

Example - I am attaching a sample CSV (input.txt) and a program (filterRecords.p) to read it. It will then create the output CSV. Since this site won't allow a .csv attachment, I renamed it with a .txt extension. Also, I am not sure if this site will change the ^M characters in input.txt, so I added a screenshot of what my input.txt looks like in my vim editor.

I am testing this on Unix.
 

Attachments

  • vim-input.png (5.4 KB)
  • filterRecords.p (608 bytes)
  • input.txt (158 bytes)

Cringer

ProgressTalk.com Moderator
Staff member
If the file really isn't that big, I would use COPY-LOB to put it into a LONGCHAR variable. That way you can parse it more easily and you're not reliant upon IMPORT etc.
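For example, the COPY-LOB route might look roughly like this. The file names and the "Somevalue" filter are placeholders, and REPLACE/ENTRY on a LONGCHAR need a reasonably recent OpenEdge version:

Code:
/* Sketch: read the whole file into a LONGCHAR, strip Windows
   carriage returns, then filter line by line on the first column. */
DEFINE VARIABLE lcFile AS LONGCHAR  NO-UNDO.
DEFINE VARIABLE cLine  AS CHARACTER NO-UNDO.
DEFINE VARIABLE iLine  AS INTEGER   NO-UNDO.

COPY-LOB FROM FILE "/tmp/infile.csv" TO lcFile.   /* hypothetical path */

/* remove the ^M characters up front so the last line is not lost */
lcFile = REPLACE(lcFile, CHR(13), "").

OUTPUT TO VALUE("/tmp/outfile.csv").              /* hypothetical path */
DO iLine = 1 TO NUM-ENTRIES(lcFile, CHR(10)):
    cLine = ENTRY(iLine, lcFile, CHR(10)).
    IF cLine = "" THEN NEXT.                      /* skip trailing blank line */
    IF ENTRY(1, cLine, ",") = "Somevalue" THEN    /* placeholder filter */
        PUT UNFORMATTED cLine SKIP.
END.
OUTPUT CLOSE.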
 
Code:
DEF VAR campos AS CHAR EXTENT 1000 NO-UNDO.
.....code....
 IMPORT DELIMITER "," campos.
/*validate first column*/ IF campos[1] ... THEN NEXT.
.....code....
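One wrinkle with the IMPORT DELIMITER approach above: when a Windows-created file is read on Unix, the trailing ^M may end up appended to the last field of each row, so it may be worth stripping CHR(13) from the fields before writing them back out. A minimal sketch (the 1000-column limit, file path, and filter value are assumptions):

Code:
DEFINE VARIABLE campos AS CHARACTER EXTENT 1000 NO-UNDO.
DEFINE VARIABLE i      AS INTEGER   NO-UNDO.

INPUT FROM VALUE("/tmp/infile.csv").   /* hypothetical path */
REPEAT:
    campos = "".                       /* clears all extents */
    IMPORT DELIMITER "," campos.
    /* strip any stray carriage return from each field */
    DO i = 1 TO EXTENT(campos):
        campos[i] = REPLACE(campos[i], CHR(13), "").
    END.
    IF campos[1] = "Somevalue" THEN NEXT.   /* placeholder filter */
    /* ... write the row back out ... */
END.
INPUT CLOSE.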
 

LarryD

Active Member
Unless there is something else you need to do that I'm not aware of, I think you are overcomplicating this: read a file, write out another based on some condition on the input line?
Here is a simple example that reads a line no matter how many columns it has, and writes it out based on a condition:
Code:
def stream a.
def stream b.
def var myline as char no-undo.
input stream a from value("/tmp/infile.csv").
output stream b to value("/tmp/outfile.csv").
repeat:
   myline = "".
   import stream a unformatted myline.
   if entry(1,myline,",") = "Somevalue"
   then 
      put stream b unformatted
              /* assuming you want to strip out the <cr>*/
              replace(myline,chr(13),"")
              skip.
end.
output stream b close.
input stream a close.
 