I am planning to exchange NDB Entities between two GAE web apps using URL Fetch.
One web app would initiate an HTTP POST request with the entity model name, the starting entity index number, and the number of entities to be fetched. Each entity would have an index number, incremented sequentially for new entities.
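Something like this is what I have in mind on the initiating side. The /export endpoint, the parameter names, and the data-app URL are placeholders of my own choosing:

    import urllib
    from google.appengine.api import urlfetch

    def request_entities(model_name, start_index, count):
        # Ask the data app for `count` entities starting at `start_index`.
        payload = urllib.urlencode({
            'model': model_name,   # e.g. 'Customer'
            'start': start_index,  # first sequential index wanted
            'count': count,        # how many entities to fetch
        })
        result = urlfetch.fetch(
            url='https://data-app.appspot.com/export',  # placeholder URL
            payload=payload,
            method=urlfetch.POST,
            headers={'Content-Type': 'application/x-www-form-urlencoded'})
        return result.content  # delimited entity data from the response body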
To Send an Entity:
One delimiter could separate the entities and another could separate the properties within an entity. The HTTP response would carry the entity data (say in a variable "content").
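On the data app, a handler along these lines could do the serialization. The Customer model, the '|' property delimiter, and the newline entity delimiter are just examples; these delimiters would break if they ever appeared inside a property value:

    import webapp2
    from google.appengine.ext import ndb

    class Customer(ndb.Model):
        index = ndb.IntegerProperty()  # sequential index, per the scheme above
        name = ndb.StringProperty()
        email = ndb.StringProperty()

    PROPS = ['index', 'name', 'email']  # fixed property order shared by both apps

    class ExportHandler(webapp2.RequestHandler):
        def post(self):
            start = int(self.request.get('start'))
            count = int(self.request.get('count'))
            entities = (Customer.query(Customer.index >= start)
                        .order(Customer.index)
                        .fetch(count))
            # '|' separates properties, '\n' separates entities.
            lines = ['|'.join(unicode(getattr(e, p)) for p in PROPS)
                     for e in entities]
            self.response.write('\n'.join(lines))

    app = webapp2.WSGIApplication([('/export', ExportHandler)])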
Receiving Side Web App:
The receiver web app would parse the received data, create new entities with the parsed property values, and put() them.
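A rough sketch of that receiving side, assuming the same Customer model and PROPS ordering as on the data app (both apps share the models). Every value arrives as a string, so each property has to be coerced back to its real type before the put:

    from google.appengine.ext import ndb

    # Assumes the same Customer model and PROPS list as on the data app.

    def import_entities(content):
        entities = []
        for line in content.splitlines():  # one entity per line
            fields = dict(zip(PROPS, line.split('|')))
            entities.append(Customer(index=int(fields['index']),
                                     name=fields['name'],
                                     email=fields['email']))
        ndb.put_multi(entities)  # batch put: fewer RPCs than one put() per entity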
Both the web apps are running GAE Python and have the same models.
My Questions: Is there any disadvantage to the above method? Is there a better way to achieve this, in an automated way, in code?
I intend to implement this as an infrequent data backup design: all the data would be kept in a "data app". This insulates the "main app" from poor design issues in the future, when I might have to redesign the models (database design). That way I could just rewrite the "main app" and would have all the data ready to be fed to it. The "main app" stores some of the "data app" data in specialized models which are designed to give better query response time. Keeping the entities in the "data app", from where we can push them to the "main app", insulates me from design changes in the "main app": when I do change the design, I can create the entities as per my new model after receiving the data from the "data app". If you feel this approach is not right, please feel free to tell me.