We have a requirement to load a JSON array from Databricks into Azure Storage, and then have ADF read the stored data and write it to an Azure SQL database.
Below is a sample of the output produced by Azure Databricks (ADB); note it is really a Python list of dicts rather than strict JSON (single quotes, None instead of null). How do I convert it into a DataFrame and write it back to storage?
[
  {
    'Details': {
      'Input': {'id': '1', 'name': 'asdsdasd', 'a1': None, 'a2': None, 'c': None, 's': None, 'c1': None, 'z': None},
      'Output': '{"msg":"some error"}'
    },
    'Failure': '{"msg":"error"}',
    's': 'f'
  },
  {
    'Details': {
      'Input': {'id': '2', 'name': 'sadsadsad', 'a1': 'adsadsad', 'a2': 'sssssss', 'c': 'cccc', 's': 'test', 'c1': 'ind', 'z': '22222'},
      'Output': '{"s":"2"}'
    },
    'Failure': '',
    's': 's'
  }
]
The JSON above needs to be written to storage in a proper format such as Parquet or Delta (see the sketch below).
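A minimal PySpark sketch of that step, assuming it runs in a Databricks notebook where the `spark` session is predefined; the `records` variable, the storage account name, and the abfss path are placeholders, not values from the original post:

    import json

    # `records` is the Python list shown above (abbreviated to the first entry here).
    records = [
        {'Details': {'Input': {'id': '1', 'name': 'asdsdasd', 'a1': None, 'a2': None,
                               'c': None, 's': None, 'c1': None, 'z': None},
                     'Output': '{"msg":"some error"}'},
         'Failure': '{"msg":"error"}',
         's': 'f'},
    ]

    # Serialise each record to a JSON string and let Spark infer a nested schema;
    # Python None values become proper nulls this way.
    json_rdd = spark.sparkContext.parallelize([json.dumps(r) for r in records])
    df = spark.read.json(json_rdd)
    df.printSchema()

    # Write to ADLS Gen2 as Delta (use .format("parquet") for plain Parquet instead).
    output_path = "abfss://logs@<storageaccount>.dfs.core.windows.net/adb/run_logs"
    df.write.format("delta").mode("append").save(output_path)

Going through json.dumps avoids the messy schema inference you get when passing nested dicts straight to spark.createDataFrame.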
Then ADF has to read this data and load it into the SQL DB (e.g. a Copy activity over the Parquet files, or a mapping data flow if the data is stored as Delta).
Sample structure and expected table details:
adf_log_id | adf_id | e_name   | e_desc                          | status  | failure_msg
-----------|--------|----------|---------------------------------|---------|----------------------
1          | 1      | pipename | {input and output details JSON} | success | {Failure message JSON}
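Since adf_id and e_name are not present in the JSON itself, they would have to be supplied from the ADF/notebook run context; the sketch below (continuing from the `df` above) hard-codes them as hypothetical parameters, and assumes adf_log_id is an identity column populated by SQL and that the 's' field's 's'/'f' values mean success/failure:

    from pyspark.sql import functions as F

    flat = df.select(
        F.lit(1).alias("adf_id"),                     # hypothetical: pass in from the ADF run
        F.lit("pipename").alias("e_name"),            # hypothetical: the pipeline name
        F.to_json(F.col("Details")).alias("e_desc"),  # input + output details as one JSON string
        F.when(F.col("s") == "s", "success")
         .otherwise("failure").alias("status"),       # assumption: 's'/'f' map to success/failure
        F.col("Failure").alias("failure_msg"),
    )

    # Stored pre-flattened like this, the ADF Copy activity can map
    # columns to the SQL table one-to-one with no transformation logic.
    flat.write.format("delta").mode("append").save(
        "abfss://logs@<storageaccount>.dfs.core.windows.net/adb/run_logs_flat")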