We have a JSON file that we are ultimately trying to output to a CSV file. Unfortunately, I have not been able to get the combination of the JSON Parse tool and the Crosstab tool to work, because the nested values in the input file are inconsistent. Please see the attached sample of JSON_Name values from the JSON Parse tool.
For example, some records have a blank location value, while others have nested city, state, and country values in the location (e.g., location.city, location.state, and location.country). Also, the categories are nested, and their number varies considerably from record to record, as the attached sample shows.
Is there a reliable way to output inconsistent nested values in JSON to consistent fields in a CSV file? This is the desired kind of output (forgive the formatting):
domain,company_name,ticker,city,state,country,categories,domain_1,first_seen_at,last_seen_at,url,title,behind_firewall
ABC.com,ABC Corp.,NYSE:ABC,New York,New York,United States,databases,hyperdata.com,2020-05-31T20:01:29Z,2021-02-12T01:14:40Z,http://www.hyperdata.com/products-and-services/hyperdata-database/,Hyperdata Database,TRUE
ABC.com,ABC Corp.,NYSE:ABC,New York,New York,United States,data_management,hyperdata.com,2020-05-31T20:01:29Z,2021-02-12T01:14:40Z,http://www.hyperdata.com/products-and-services/hyperdata-database/,Hyperdata Database,TRUE
ABC.com,ABC Corp.,NYSE:ABC,New York,New York,United States,enterprise_resource_planning,tap.com,2020-05-31T10:39:51Z,2020-12-07T23:31:52Z,https://help.tap.com/erp2005_ehp_08/helpdata/en/a/0123456789/frameset.htm,Tap Plant Maintenance,TRUE
ABC.com,ABC Corp.,NYSE:ABC,New York,New York,United States,operations,tap.com,2020-05-31T10:39:51Z,2020-12-07T23:31:52Z,https://help.tap.com/erp2005_ehp_08/helpdata/en/b/0123456789/frameset.htm,Tap Plant Maintenance,TRUE
ABC.com,ABC Corp.,NYSE:ABC,New York,New York,United States,integrated_development_environments,netbrains.com,2021-01-19T06:04:39Z,2021-01-19T06:04:57Z,https://www.netbrains.com/go/,GoLand,TRUE
Please let me know if I can provide any additional information, and thank you so much in advance!