The biggest problem with this solution is scale. Under heavy load, your request is simply put on hold for a second and then handed back to you to deal with; if you issue many requests, that quickly becomes your problem. A better approach would be a queue that services, say, 100 requests concurrently: any additional request waits its turn, and proceeds only once the network and infrastructure have capacity to handle it.
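The queue described above can be sketched as a bounded-concurrency gate: at most `limit` requests run at once, and any extras block until a slot frees up. This is a minimal illustration, not an implementation from the original text; the `RequestGate` class, its `handle` method, and the simulated workload are all hypothetical names chosen for the example.

```python
import threading
import time

class RequestGate:
    """Admit at most `limit` requests at once; extras block until a slot frees."""
    def __init__(self, limit):
        self._slots = threading.Semaphore(limit)

    def handle(self, work):
        self._slots.acquire()       # wait here while all slots are busy
        try:
            return work()
        finally:
            self._slots.release()   # free the slot for the next waiting request

# Demo: 3 concurrent slots, 10 simulated requests.
gate = RequestGate(3)
completed = []

def simulated_request(i):
    def work():
        time.sleep(0.01)            # stand-in for network/infrastructure work
        completed.append(i)
    gate.handle(work)

threads = [threading.Thread(target=simulated_request, args=(i,)) for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(completed))  # → 10
```

In a real deployment the limit would be something like 100, and the waiting requests would typically also carry a timeout so callers are not held indefinitely.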