
I can run the query below to get the data in the given format:

SELECT to_timestamp(unnest(
         ARRAY[[1725042600, 1725043500]
              ,[1725041700, 1725042600]
              ,[1725043500, 1725044400]
              ,[1725040800, 1725041700]]));

      to_timestamp
------------------------
 2024-08-30 18:30:00+00
 2024-08-30 18:45:00+00
 2024-08-30 18:15:00+00
 2024-08-30 18:30:00+00
 2024-08-30 18:45:00+00
 2024-08-30 19:00:00+00
 2024-08-30 18:00:00+00
 2024-08-30 18:15:00+00
(8 rows)

I am using Postgres version 9.6, and below is the schema:

test_database=> \d fact_completeness_count_requests
                    Table "public.fact_completeness_count_requests"
         Column          |            Type             | Collation | Nullable | Default
-------------------------+-----------------------------+-----------+----------+---------
 request_id              | character varying(64)       |           | not null |
 event_type              | character varying(255)      |           | not null |
 technology              | character varying(255)      |           |          |
 vendor                  | character varying(255)      |           |          |
 name                    | character varying(255)      |           | not null |
 dataset_metadata        | json                        |           | not null |

I am using the query below to retrieve the raw data, and I want to produce the expected output shown in the first query above. I am not sure how to use the array and to_timestamp() functions with this query. Is it possible to achieve this with Postgres version 9.6?

select request_id
     , dataset_metadata ->> 'intervals_epoch_seconds' as epoch_seconds
from   fact_completeness_count_requests;

 31319ad1-e848-4ec3-9c3e-967981e2ae45-0  | [[1725048000, 1725051600]]
 7a05cc38-5303-417d-88ce-fe3a604570d2    | [[1725055200, 1725056100]]
 ae6c2b09-8a95-4ac0-9846-6e76071579af    | [[1725050700, 1725051600], [1725049800, 1725050700], [1725048900, 1725049800], [1725048000, 1725048900]]
  • I am a little unsure what you want as the result. Can you add CREATE TABLE and INSERT INTO statements for the table, and the wanted result? Commented Aug 31, 2024 at 13:01

2 Answers


unnest() returns all elements of an array, no matter the nesting level. There is no immediate counterpart for a JSON array, not even in current Postgres versions (Postgres 16 at the time of writing).

One path to a solution is to convert the JSON array to a Postgres array and then apply the simple power of unnest().

Postgres 10 or later

SELECT t.request_id, to_timestamp(unnest(j.intervals_epoch_seconds)) AS ts
FROM   fact_completeness_count_requests t
     , json_to_record(t.dataset_metadata) AS j(intervals_epoch_seconds float[]);

fiddle

The applicable variant of to_timestamp() takes float as input, so coerce the output from json_to_record() to float[] right away.
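For intuition, the same epoch-to-timestamp conversion can be sketched in Python (a parallel to Postgres's to_timestamp(double precision), not part of the SQL above):

```python
from datetime import datetime, timezone

# datetime.fromtimestamp() also accepts a float epoch value,
# mirroring to_timestamp(double precision) in Postgres.
ts = datetime.fromtimestamp(1725042600.0, tz=timezone.utc)
print(ts)  # 2024-08-30 18:30:00+00:00
```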

Postgres 9.6

The above fails in Postgres 9.6. Quoting the release notes of Postgres 10:

  • Make json_populate_record() and related functions process JSON arrays and objects recursively (Nikita Glukhov)

    With this change, array-type fields in the destination SQL type are properly converted from JSON arrays, and composite-type fields are properly converted from JSON objects. Previously, such cases would fail because the text representation of the JSON value would be fed to array_in() or record_in(), and its syntax would not match what those input functions expect.

A hackish workaround is to cast the array literal after manually exchanging delimiters [] to Postgres array delimiters {}:

SELECT request_id
     , to_timestamp(unnest(translate(dataset_metadata ->> 'intervals_epoch_seconds', '[]', '{}')::float[])) AS ts
FROM   fact_completeness_count_requests;

fiddle

There are many ways this can break if the JSON input does not match the given format.
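The textual delimiter swap can be sketched in Python to make the fragility explicit (a hypothetical helper name, not part of the SQL answer):

```python
def json_array_to_pg_literal(s: str) -> str:
    """Mimic translate(s, '[]', '{}') in Postgres: swap JSON array
    brackets for Postgres array braces. Purely textual, so any '[' or
    ']' inside string values would be mangled as well."""
    return s.translate(str.maketrans('[]', '{}'))

print(json_array_to_pg_literal('[[1725048000, 1725051600]]'))
# {{1725048000, 1725051600}}
```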



The easiest way I found to extract data from JSON is to use json_to_record().

select request_id
     , attempt
     , intervals_epoch_seconds[attempt:attempt][1:2] interval
     , to_timestamp(intervals_epoch_seconds[attempt][1]) startdate
     , to_timestamp(intervals_epoch_seconds[attempt][2]) enddate
     , to_timestamp(intervals_epoch_seconds[attempt][2]) - to_timestamp(intervals_epoch_seconds[attempt][1]) attempt_duration
from (select '31319ad1-e848-4ec3-9c3e-967981e2ae45-0' request_id, '{"intervals_epoch_seconds":[[1725048000, 1725051600]]}'::json dataset_metadata
            union all
            select '7a05cc38-5303-417d-88ce-fe3a604570d2' request_id, '{"intervals_epoch_seconds":[[1725055200, 1725056100]]}'::json dataset_metadata
            union all
            select 'ae6c2b09-8a95-4ac0-9846-6e76071579af' request_id, '{"intervals_epoch_seconds":[[1725050700, 1725051600], [1725049800, 1725050700], [1725048900, 1725049800], [1725048000, 1725048900]]}'::json dataset_metadata
        ) mydataset
inner join json_to_record(dataset_metadata) as t(intervals_epoch_seconds int[]) on true
inner join generate_subscripts(intervals_epoch_seconds,1) attempt on true; 
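The per-interval expansion that generate_subscripts() performs can be sketched in Python (illustrative only; the sample data and column names follow the SQL above):

```python
from datetime import datetime, timezone

# One row's intervals_epoch_seconds array, as in the sample data.
intervals = [[1725049800, 1725050700], [1725048900, 1725049800]]

# enumerate(..., start=1) plays the role of generate_subscripts(..., 1):
# it yields one subscript per sub-array, which indexes into the array.
for attempt, (start, end) in enumerate(intervals, start=1):
    startdate = datetime.fromtimestamp(start, tz=timezone.utc)
    enddate = datetime.fromtimestamp(end, tz=timezone.utc)
    print(attempt, startdate, enddate, enddate - startdate)
```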
  • This fails in Postgres 9.6. It would need at least Postgres 10, where json_to_record() and friends process JSON arrays recursively. Postgres 9.6 raises ERROR: malformed array literal: "[[1725048000, 1725051600]]" DETAIL: "[" must introduce explicitly-specified array dimensions. because it naively expects array dimensions after a leading '[' in the array literal. See: stackoverflow.com/q/12011569/939860 Commented Sep 3, 2024 at 13:48
