Hi Family,
I am facing a tricky issue with formatting API data.
1. The data is not being formatted correctly. This is how I am receiving the data now:
data_parameters_asset_id | data_parameters_asset_key | data_parameters_columns_0 | data_parameters_columns_1 | data_parameters_end | data_parameters_format | data_parameters_interval | data_parameters_order | data_parameters_start | data_parameters_timestamp_format | data_schema_description | data_schema_first_available | data_schema_last_available | data_schema_metric_id | data_schema_minimum_interval | data_schema_source_attribution_0_name | data_schema_source_attribution_0_url | data_schema_source_attribution_1_name | data_schema_source_attribution_1_url | data_schema_source_attribution_2_name | data_schema_source_attribution_2_url | data_schema_source_attribution_3_name | data_schema_source_attribution_3_url | data_schema_values_schema_circulating_marketcap | data_schema_values_schema_timestamp | status_elapsed | status_timestamp | JSON_Name | JSON_ValueString |
51f8ea5e-f426-4f40-939a-db7e05495374 | usdt | timestamp | circulating_marketcap | 2020-05-01T00:00:00Z | json | 1d | ascending | 2019-01-01T00:00:00Z | unix-milliseconds | The circulating marketcap is the price of the asset multiplied by the circulating supply. If no price is found for an asset because no trades occurred, the last price of the last trade is used. After 30 days with no trades, a marketcap of 0 is reported until trading resumes. | 2015-02-17T00:00:00Z | mcap.circ | 1d | Messari | https://messari.io | Kaiko | https://www.kaiko.com/ | Coinmetrics Community Data | https://coinmetrics.io/community-network-data/ | Public BigQuery | https://console.cloud.google.com/bigquery?p=bigquery-public-data&d=crypto_ethereum&page=dataset | The circulating marketcap in US dollars. | Time in milliseconds since the epoch (1 January 1970 00:00:00 UTC) | 10 | 2020-05-28T01:07:04.084044342Z |
I want the data to be:
status__elapsed | status__timestamp | data__parameters__asset_key | data__parameters__asset_id | data__parameters__start | data__parameters__end | data__parameters__interval | data__parameters__order | data__parameters__format | data__parameters__timestamp_format | data__parameters__columns | data__schema__metric_id | data__schema__description | data__schema__values_schema__timestamp | data__schema__values_schema__circulating_marketcap | data__schema__minimum_interval | data__schema__first_available | data__schema__last_available | data__schema__source_attribution__name | data__schema__source_attribution__url | data__values__001 | data__values__002 |
8 | 2020-05-27T23:34:52.018414694Z | usdt | 51f8ea5e-f426-4f40-939a-db7e05495374 | 2019-01-01T00:00:00Z | 2020-05-01T00:00:00Z | 1d | ascending | json | unix-milliseconds | timestamp | mcap.circ | The circulating marketcap is the price of the asset multiplied by the circulating supply. If no price is found for an asset because no trades occurred, the last price of the last trade is used. After 30 days with no trades, a marketcap of 0 is reported until trading resumes. | Time in milliseconds since the epoch (1 January 1970 00:00:00 UTC) | The circulating marketcap in US dollars. | 1d | 2015-02-17T00:00:00Z | null | Messari | https://messari.io | 1.5463E+12 | 2620842097 |
circulating_marketcap | Kaiko | https://www.kaiko.com/ | 1.54639E+12 | 2651080714
Coinmetrics Community Data | https://coinmetrics.io/community-network-data/ | 1.54647E+12 | 2651794623
Public BigQuery | https://console.cloud.google.com/bigquery?p=bigquery-public-data&d=crypto_ethereum&page=dataset | 1.54656E+12 | 2656980637 |
How can I rearrange this?
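For context, the double-underscore column names in the desired layout can be produced by flattening the nested JSON before building the table. A minimal sketch in plain Python, assuming the response is a nested dict; the trimmed `payload` below is a hypothetical stand-in for the real API response:

```python
def flatten(obj, parent="", sep="__"):
    """Recursively flatten nested dicts into {path: value} pairs,
    joining key levels with a double underscore."""
    items = {}
    for key, value in obj.items():
        path = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            items.update(flatten(value, path, sep))
        else:
            items[path] = value
    return items

# Hypothetical, trimmed stand-in for the real response body.
payload = {
    "status": {"elapsed": 8, "timestamp": "2020-05-27T23:34:52Z"},
    "data": {
        "parameters": {"asset_key": "usdt", "interval": "1d"},
        "schema": {"metric_id": "mcap.circ"},
    },
}

flat = flatten(payload)
# flat now has keys like "data__parameters__asset_key" and "status__elapsed"
```

Each flat dict then becomes one output row, with the keys as column headers.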
2. If I have more than one link to run through the API, I want the results to be correctly identified, e.g. whether a row is for USDT or DAI, via the "data__parameters__asset_key" column.
How do I do it?
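One common pattern for this: tag each response with its asset key at fetch time, so every row stays identifiable after the results are combined. A hedged sketch; `fetch_metric` below is a hypothetical placeholder for the real API call:

```python
def fetch_metric(asset_key):
    # Placeholder for the real HTTP call, e.g. something like
    # requests.get(f"https://.../assets/{asset_key}/metrics/...").json()
    return {"data": {"values": [[1546300800000, 2620842097]]}}

rows = []
for asset_key in ["usdt", "dai"]:
    response = fetch_metric(asset_key)
    for ts, mcap in response["data"]["values"]:
        rows.append({
            "data__parameters__asset_key": asset_key,  # tag the source asset
            "timestamp": ts,
            "circulating_marketcap": mcap,
        })
# every combined row now carries the asset key it came from
```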
Thanks,
Harshad
Hi,
I'm trying to wrap my head around your expected results. Am I right that the URLs are the unique sources you queried, and the data values are the values attributed to them?
Ultimately you will have to deal with them separately and piece them back together at the end.
I created a quick example based on that assumption; here is the workflow:
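Alongside the workflow, the same split-then-rejoin idea can be sketched in Python. The `record` below is a hypothetical, trimmed version of one flattened response; the one-per-row fields, the source_attribution list, and the values list are kept as separate tables sharing `asset_id` as the join key:

```python
# Hypothetical trimmed record; real responses have many more fields.
record = {
    "asset_id": "51f8ea5e-f426-4f40-939a-db7e05495374",
    "source_attribution": [
        {"name": "Messari", "url": "https://messari.io"},
        {"name": "Kaiko", "url": "https://www.kaiko.com/"},
    ],
    "values": [[1546300800000, 2620842097]],
}

# One-row table of scalar fields.
header = {"asset_id": record["asset_id"]}

# One row per attribution source, each carrying the join key.
sources = [{"asset_id": record["asset_id"], **s}
           for s in record["source_attribution"]]

# One row per (timestamp, marketcap) pair, same join key.
values = [{"asset_id": record["asset_id"],
           "timestamp": ts,
           "circulating_marketcap": mcap}
          for ts, mcap in record["values"]]
# The three tables can then be joined back on asset_id at the end.
```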
Is this close to what you have in mind?
Cheers,
Seinchyi