How to parse json efficiently in Oracle 18c?

Submitted by 妖精的绣舞 on 2020-01-06 06:55:53

Question


I am trying to parse a large JSON document using the JSON_OBJECT_T and JSON_ARRAY_T APIs. It works fine, but I would like expert suggestions on whether it is efficient or not.

My JSON file and parsing code are below.

SampleJson

Code

SET SERVEROUTPUT ON;
DECLARE
   l_clob clob;
   l_time timestamp;

   l_json json_object_t;
   l_stops_array json_array_t;
   l_stops_arr json_array_t;

   routeInfoObj json_object_t;
   routeStopArr json_array_t;
BEGIN
   SELECT LOG_CLOB INTO l_clob FROM ITV_DEV_LOGS WHERE LOG_ID = 1435334;
   l_time := systimestamp;

   l_json := json_object_t.parse( l_clob );

   dbms_output.put_line( 'Parsing Time: ' || extract(second from( systimestamp - l_time ) ) );
   l_stops_array := l_json.get_array('data');
   DBMS_OUTPUT.PUT_LINE('Data array: '||l_stops_array.get_size);
   FOR i in 0..l_stops_array.get_size-1 loop
      l_stops_arr := TREAT(l_stops_array.get(i) AS JSON_OBJECT_T).get_array('routedStops');
      DBMS_OUTPUT.PUT_LINE('stops array: '||l_stops_arr.get_size);
      FOR j in 0..l_stops_arr.get_size - 1 loop
         routeInfoObj := TREAT(l_stops_arr.get(j) AS JSON_OBJECT_T).get_object('routingInfo');
         DBMS_OUTPUT.PUT_LINE('Stop : ' || routeInfoObj.get_number('stop'));
         routeStopArr := TREAT(l_stops_arr.get(j) AS JSON_OBJECT_T).get_array('routedJobs');
         FOR k in 0..routeStopArr.get_size - 1 loop
            DBMS_OUTPUT.PUT_LINE('JobRef : ' || TREAT(routeStopArr.get(k) AS JSON_OBJECT_T).get_string('jobRef'));
            -- update query to update stop value to respective jobRef
         end loop;
      end loop;
   end loop;
END;
/

It's working fine, but is there a way to improve this implementation? This is just a sample JSON, and the count of objects inside may go up to 2000. Also, instead of updating records one by one, is there a way to update all records in one statement?


Answer 1:


You can use json_table() to turn the JSON value into a relational representation. That in turn can be used within a MERGE statement.

E.g. the following query:

select j.*
from itv_dev_logs
  cross join json_table(log_clob, '$.data.routedStops[*]'
                        columns stop_id integer path '$.stopId',
                                zone_ltf integer path '$.zoneLTF', 
                                info_stop_nr integer path '$.routingInfo.stop',
                                info_route_ref varchar(20) path '$.routingInfo.routeRef',
                                info_eta varchar(20) path '$.routingInfo.eta',
                                info_eta_dt timestamp path '$.routingInfo.etaDateTime',
                                info_stop_time number path '$.routingInfo.stopTime'
                        ) j
where log_id = 1435334;

Returns something like this:

STOP_ID       | ZONE_LTF | INFO_STOP_NR | INFO_ROUTE_REF | INFO_ETA | INFO_ETA_DT             | INFO_STOP_TIME | INFO_DIST_PREV_STOP | INFO_BREAK_TIME | INFO_BREAK_DURATION
--------------+----------+--------------+----------------+----------+-------------------------+----------------+---------------------+-----------------+--------------------
1554383571432 |        1 |            1 | R119           | 11:01    | 2019-04-16 11:01:00.000 |           0.08 |                0.27 | 00:00           | 00:00              
1554383571515 |        1 |            2 | R119           | 11:07    | 2019-04-16 11:07:00.000 |           0.08 |                0.34 | 00:00           | 00:00              
1554383571601 |        1 |            3 | R119           | 11:13    | 2019-04-16 11:13:00.000 |           0.08 |                   0 | 00:00           | 00:00              
1554383571671 |        1 |            4 | R119           | 11:19    | 2019-04-16 11:19:00.000 |           0.08 |                   0 | 00:00           | 00:00              
1554383571739 |        1 |            5 | R119           | 11:25    | 2019-04-16 11:25:00.000 |           0.08 |                   0 | 00:00           | 00:00              

That could be used as the source of a MERGE statement to update your target table:

merge into your_target_table tg
using (
    select j.*
    from itv_dev_logs
      cross join json_table(log_clob, '$.data.routedStops[*]'
                            columns stop_id integer path '$.stopId',
                                    zone_ltf integer path '$.zoneLTF', 
                                    info_stop_nr integer path '$.routingInfo.stop',
                                    info_route_ref varchar(20) path '$.routingInfo.routeRef',
                                    info_eta varchar(20) path '$.routingInfo.eta',
                                    info_eta_dt timestamp path '$.routingInfo.etaDateTime',
                                    info_stop_time number path '$.routingInfo.stopTime'
                            ) j
    where log_id = 1435334
) t on (t.stop_id = tg.stop_id and ... more join conditions ...)
when matched then update
    set stop_nr = t.info_stop_nr, 
        eta_timestamp = t.info_eta_dt,
        ... more column assignments ...

As you provided neither the structure of the target table nor which JSON keys should be matched to which table columns, all column names above are just guesses; you need to replace them with the correct names.
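If the goal is specifically what the nested loops in the question do (update a stop number for each jobRef), the inner routedJobs array can be unnested in the same json_table call with a NESTED PATH clause, so each job row carries its stop number. This is only a sketch: your_target_table, job_ref, and stop_nr are placeholder names for your actual table and columns, and the VARCHAR2 length for jobRef is an assumption.

merge into your_target_table tg
using (
    select j.stop_nr, j.job_ref
    from itv_dev_logs
      cross join json_table(log_clob, '$.data.routedStops[*]'
                            columns stop_nr integer path '$.routingInfo.stop',
                                    nested path '$.routedJobs[*]'
                                    columns (job_ref varchar2(50) path '$.jobRef')
                            ) j
    where log_id = 1435334
) t on (tg.job_ref = t.job_ref)
when matched then update
    set tg.stop_nr = t.stop_nr;

This replaces the row-by-row updates in the PL/SQL loops with a single set-based statement, which is generally the main efficiency win here.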



Source: https://stackoverflow.com/questions/55535454/how-to-parse-json-efficiently-in-oracle-18c
