The time it takes depends on the size of each record.
May I know what kind of data you are submitting? How are that many records being generated on the client app? Can the client app process these records in the background?
Hi astroanu. Thanks. Each record has about 10 fields. I basically have a database in SQL Server and would like to upload the data from the local database to the API.
For now, I am thinking of reading the records from the local database and then using a loop to make single requests to the server until all records are posted. This will be done in the background so the app stays responsive while the upload is running.
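Something along these lines is what I have in mind (just a rough sketch; the connection details, table name and endpoint URL are placeholders, not my actual setup):

```php
<?php
// Rough sketch of the plan above: read everything from the local
// SQL Server database, then POST the rows one at a time to the API.
// The DSN, table, and endpoint URL are placeholders.
$pdo = new PDO('sqlsrv:Server=localhost;Database=mydb', 'user', 'pass');

$rows = $pdo->query('SELECT * FROM records')->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    $ch = curl_init('https://api.example.com/records');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($row),
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch); // one HTTP request per record
    curl_close($ch);
}
```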
I want to know if a better and more efficient way exists. Making one request per record in a loop like this seems very slow, and I want to avoid it if I can.
This forum has no feature to notify me when a reply is made :(
I suggest you use a better application architecture by ditching PHP. Have you tried CouchDB + PouchDB + AngularJS together? I'd say this is the ideal set of tools for you: CouchDB and PouchDB are designed to sync with each other. If you're planning to run the client as a standalone app (exe), you can use http://electron.atom.io/
Or, if you still want to use a PHP API with some client-side jQuery/JS, you will have to come up with a small script that runs in a loop, submits any un-submitted data, and updates the local data (basically the whole syncing process).
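Roughly like this (a sketch only, adapted to the SQL Server setup you described rather than a browser client; the synced flag column, chunk size, id column and bulk endpoint are made-up examples, not a real API):

```php
<?php
// Sync loop sketch: pick up un-submitted rows in chunks, POST them,
// then flag them locally so they are not sent again.
$pdo = new PDO('sqlsrv:Server=localhost;Database=mydb', 'user', 'pass');

do {
    // Grab a chunk of records that have not been submitted yet.
    $chunk = $pdo->query('SELECT TOP 100 * FROM records WHERE synced = 0')
                 ->fetchAll(PDO::FETCH_ASSOC);
    if (!$chunk) {
        break; // nothing left to sync
    }

    // Submit the whole chunk in one request instead of one per record.
    $ch = curl_init('https://api.example.com/records/bulk');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($chunk),
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    $ok = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
    curl_close($ch);

    if ($ok) {
        // Mark the submitted records so they are not picked up again.
        $ids = implode(',', array_map('intval', array_column($chunk, 'id')));
        $pdo->exec("UPDATE records SET synced = 1 WHERE id IN ($ids)");
    }
} while ($ok);
```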