[Answered] Error reading JSON data into a temp-table

Potish

Member
I have a simple program to read data from a JSON file into Progress temp-tables. The program works well most of the time, but I am running into some data that causes the following error:

"Unable to convert JSON to native data type for field 'name' in temp-table 'ttregions'. (15363)"

Sample data causing this error is below:

Code:
  {
    "code": "11b5",
    "country": "sk",
    "kind": "province",
    "name": "Ko\u0161ick\u00fd kraj",
    "created_at": "2014-12-08T06:00:00Z",
    "updated_at": "2014-12-08T06:00:00Z"
  },
According to the vendor that supplied the JSON files, the error has to do with Unicode encoding, and the style they use to generate the data conforms to the RFC 4627 JSON specification.

My program is very basic and looks as follows:
Code:
define temp-table ttregions
field code       as character
field country    as character
field kind       as character
field name       as character
field created_at as datetime-tz
field updated_at as datetime-tz.

DEFINE VARIABLE cSourceType    AS CHARACTER NO-UNDO.
DEFINE VARIABLE cFile          AS CHARACTER NO-UNDO.
DEFINE VARIABLE cReadMode      AS CHARACTER NO-UNDO.
DEFINE VARIABLE lRetOK         AS LOGICAL   NO-UNDO.
DEFINE VARIABLE httHPROregions AS HANDLE    NO-UNDO.

httHPROregions = temp-table ttHPROregions:handle.

ASSIGN  cSourceType = "file"
        cFile       = "c:\json\regions-0.json"
        cReadMode   = "empty".

lRetOK = httHPROregions:read-json(cSourceType, cFile, cReadMode).
Any recommendations on how to work with these characters?
 

Cecil

19+ years Progress programming and still learning.
What's your session codepage set to?

The code works okay for me; name becomes: Košický kraj.

I have the startup parameters -cpinternal UTF-8 & -cpstream UTF-8.
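
If you're not sure what a session is actually running with, you can check from the ABL itself via the SESSION handle attributes (a minimal sketch):
Code:
/* Display the codepages the current session is using. */
MESSAGE "CPINTERNAL:" SESSION:CPINTERNAL SKIP
        "CPSTREAM:  " SESSION:CPSTREAM
        VIEW-AS ALERT-BOX INFORMATION.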

BTW: small typo in your code when referencing the temp-table (it is defined as ttregions, but the handle references ttHPROregions).
 

Potish

Member
The session codepage parameters on my machine are set to -cpinternal ISO8859-1 & -cpstream ISO8859-1

I started a session with prowin32 -cpinternal UTF-8 -cpstream UTF-8 and the JSON files loaded without error in that session.
 

Cecil

The session codepage parameters on my machine are set to -cpinternal ISO8859-1 & -cpstream ISO8859-1
ISO8859-1 is Latin-1, and I believe it cannot represent those special characters (such as "š", U+0161). Try changing your codepage if possible.
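
A quick way to see the problem — a sketch only, and it assumes the session itself was started with -cpinternal UTF-8 so the string literal is stored correctly in the first place — is to convert the string to ISO8859-1 with CODEPAGE-CONVERT and look at what comes back:
Code:
/* Sketch: show that "š" (U+0161) has no mapping in ISO8859-1. */
DEFINE VARIABLE cUTF8   AS CHARACTER NO-UNDO.
DEFINE VARIABLE cLatin1 AS CHARACTER NO-UNDO.

cUTF8 = "Košický kraj".

/* Convert from UTF-8 to ISO8859-1. Characters with no mapping in
   the target codepage come back substituted; READ-JSON raises
   error 15363 in the same situation instead of substituting. */
cLatin1 = CODEPAGE-CONVERT(cUTF8, "ISO8859-1", "UTF-8").

MESSAGE "UTF-8:  " cUTF8  SKIP
        "Latin-1:" cLatin1
        VIEW-AS ALERT-BOX INFORMATION.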
 

TheMadDBA

Active Member
There is a lot of overlap between the two codepages, but there are certain characters that will not map/convert properly.
 

Cecil

If you are dealing with just Eastern European languages, you might be able to use codepage ISO-8859-2 if Unicode (UTF-8) is not possible.

http://en.wikipedia.org/wiki/ISO/IEC_8859-2
 

Potish

Member
-cpinternal UTF-8 -cpstream UTF-8 worked for the files I needed to process. I was able to load all the files into temp-tables successfully. Thank you for the suggestions.
 