I have configured an Azure API for FHIR server and I am able to push some data into it. Using Postman, I am able to query the server and read the JSON responses.
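For reference, this is roughly the kind of request I am issuing (a minimal Python sketch of what I do in Postman; the service URL and access token below are placeholders for my actual Azure API for FHIR endpoint and Azure AD token):

```python
import requests

# Placeholders for my actual Azure API for FHIR endpoint and AAD bearer token
FHIR_BASE_URL = "https://<my-fhir-service>.azurehealthcareapis.com"
ACCESS_TOKEN = "<access-token-obtained-from-azure-ad>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Accept": "application/fhir+json",
}

# Same search I run in Postman: list Patient resources, returned as a FHIR Bundle
response = requests.get(f"{FHIR_BASE_URL}/Patient", headers=headers)
response.raise_for_status()
bundle = response.json()
print(bundle["resourceType"], bundle.get("total"))
```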
Now I would like to move on to the next step: moving the data into a data lake and then applying a machine learning model to it.
First of all, regarding accessing the FHIR data in Microsoft Azure Storage Explorer when using the managed version: as per this comment, the data cannot be viewed in Cosmos DB when the managed version is used.
Now I am really confused about where the FHIR server data is stored. I have submitted a few Patient resources and I am not really sure whether they are stored in the database or not.
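Right now, the only way I can think of to confirm the data was persisted is to query it back and look at the count, along these lines (again just a sketch with placeholder URL/token; `_summary=count` is a standard FHIR search parameter that returns only the total):

```python
import requests

FHIR_BASE_URL = "https://<my-fhir-service>.azurehealthcareapis.com"  # placeholder
ACCESS_TOKEN = "<access-token-obtained-from-azure-ad>"               # placeholder

# _summary=count asks the server to return only the number of matching resources,
# so a non-zero total would tell me the Patient resources were actually stored.
resp = requests.get(
    f"{FHIR_BASE_URL}/Patient",
    params={"_summary": "count"},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
print("Patients stored on the server:", resp.json().get("total"))
```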
I have a couple of other questions in mind. To get the data out of the FHIR server, should I first export it to Cosmos DB and then set up a Data Factory to copy the data into the data lake, or is there a way to create a pipeline that moves the data into the data lake directly from the FHIR server itself?
Can someone guide me on this?