Marita Locklear

I have a large file which I need to save into a JSON file format and read into Google Maps. I would rather not build my JSON using a defined text field. However, when I read the data back from the hold file with format JSON, it removes the duplicate rows. I need these duplicate rows since I am weighting them with a score value after loading. Is there a reason it is doing this? Is there a way, or some extra code I can put into the hold file FORMAT JSON line, which will prevent this from occurring?

Thanks In Advance

Yes, with any other format all records show up. I am on 8.2.06. If I run this code it says all lines run, but when I display it in the JavaScript or print it to a new hold file, it does not include the duplicate records.

Same if I use PCHOLD FORMAT JSON and print to a file instead of a hold file.

Possible causes:

Output was directed to a destination such as a file or printer

Detail:

-SET &OFFENDER_ID = 209568 ;
TABLE FILE WEBF_POSITIONS_SELECT
PRINT
LAT
LONG
WEIGHT
WHERE P0001 EQ 209568;
ON TABLE HOLD AS H1 FORMAT JSON
END
-RUN

0 NUMBER OF RECORDS IN TABLE= 2471 LINES= 2471

TABLE FILE H1
PRINT *
ON TABLE HOLD AS H2
END
-EXIT

0 NUMBER OF RECORDS IN TABLE= 2460 LINES= 2460

I did reproduce the problem. If the duplicate records were not contiguous in the input file (in my case an Excel spreadsheet), then all records were saved. When I changed the duplicates to be one after the other in the file, it would ignore the second record. The strange part was that it said it saved 4 records in the JSON file, but when I opened it in TextPad there were only 3, so I suspect that in your case the original hold file only has 2460 lines. You definitely need to open a case with tech support.

  • 3 weeks later...
I might have this wrong (it's been a while, a long while), but I believe if you change your TABLE to a TABLEF command it will keep the duplicates. TABLEF tells the WF server to bypass the FOCSORT space. You can still use BYs and ACROSSes, but without FOCSORT the server is simply grabbing records (inbound) and spitting out those same rows 1-for-1 on the outbound side. Bear in mind that, depending on what your data source is (for example Oracle or SQL Server), the same row set may not necessarily be handed to the WF process in the same order run over run.
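For what it's worth, here is a minimal, untested sketch of the TABLEF variant of the earlier request, assuming the same file and field names from the original post:

-* Same request as before, but TABLEF bypasses FOCSORT,
-* so inbound rows should pass through 1-for-1.
TABLEF FILE WEBF_POSITIONS_SELECT
PRINT
LAT
LONG
WEIGHT
WHERE P0001 EQ 209568;
ON TABLE HOLD AS H1 FORMAT JSON
END
-RUN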

I thought of another method to accomplish this as well (it's Friday and still early). Another way would be to simply DEFINE a column that unconditionally and mindlessly builds a row counter, and then reference that counter in a standard TABLE (not TABLEF) request:

DEFINE FILE JSON_INPUT
COUNTER/I9 WITH real_column = LAST COUNTER + 1;
END
TABLE FILE JSON_INPUT
PRINT
JSON_STRING
BY Real_By_Columns (if you want to see the real sort-value rows next to each other)
BY COUNTER (plain or NOPRINT, depending on whether you want to see the value of COUNTER)
ON TABLE HOLD AS JSON_OUTPUT
END

I use this technique all the time in staging-table type applications, especially if I'm loading data from an outside vendor and I have NO idea what their key structure is, or whether they've included dups, etc. I just want to ensure that I capture everything and don't try to figure it out during the raw data capture phase. I usually use this on the inbound side, but you can also turn it around and use it on the outbound or data export side as well.
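As a rough, untested illustration only, here is how that counter idea might be applied to the original request in this thread (LAT is used as the WITH field purely to anchor the DEFINE to a real column; any real column would do):

-* Hypothetical sketch: add a row counter so duplicate LAT/LONG/WEIGHT rows
-* remain distinct when held as FORMAT JSON.
DEFINE FILE WEBF_POSITIONS_SELECT
COUNTER/I9 WITH LAT = LAST COUNTER + 1;
END
TABLE FILE WEBF_POSITIONS_SELECT
PRINT
LAT
LONG
WEIGHT
BY COUNTER NOPRINT
WHERE P0001 EQ 209568;
ON TABLE HOLD AS H1 FORMAT JSON
END
-RUN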

  • 1 month later...

Charles,

Thank you for continuing to offer such great suggestions. I tried TABLEF with no apparent change; it is still dropping those dups. The other method is sort of what we used to fix the issue: my ETL guy just added a field to my stored procedure to make each row unique, and we are working it that way.

I opened a case with IBI because, well, it should not be dropping dups when storing and reading as a JSON file. But we have a working workaround. Thank you.

  • 3 months later...

Hi Marita,

I was just browsing through the release notes for 8207.21 and ran across this note; perhaps your problem is fixed in the .21 release:

Fixed: Missing duplicate rows in PCHOLD FORMAT JSON. (200911017, 200619051)

Those are the case numbers shown in parentheses. Maybe one of those cases was yours.

Just passing it along.

Toby
