Description

Acts as the entry point for the setup of all slave clusters.

Data flow

Input

The service polls the remote API for new campaign tasks every 10 seconds. Each request carries a unique UUID identifier, which is then used to fetch new tasks (requests) from the scheduler API. When a request fails, the service puts it into Redis for later processing, as sketched below.
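
A minimal sketch of the polling loop, assuming a Python implementation with the requests and redis libraries. The SCHEDULER_API URL, the Redis key name, and the handle_new_tasks helper are hypothetical placeholders, not the service's actual interface.

```python
import json
import time
import uuid

import redis
import requests

SCHEDULER_API = "https://scheduler.example.com/tasks"  # hypothetical endpoint
POLL_INTERVAL_SECONDS = 10

redis_client = redis.Redis(host="localhost", port=6379)


def handle_new_tasks(tasks):
    # Placeholder: in the real service, tasks go to the manager queue (see Output).
    print(f"received {len(tasks)} tasks")


def poll_scheduler():
    """Poll the scheduler API for new campaign tasks every 10 seconds."""
    while True:
        request_id = str(uuid.uuid4())  # unique UUID for this request
        try:
            response = requests.get(
                SCHEDULER_API,
                params={"request_id": request_id},
                timeout=5,
            )
            response.raise_for_status()
            handle_new_tasks(response.json())
        except (requests.RequestException, ValueError):
            # On failure, store the request in Redis for later processing.
            redis_client.rpush(
                "failed_requests", json.dumps({"request_id": request_id})
            )
        time.sleep(POLL_INTERVAL_SECONDS)
```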

Output

All new tasks are then put into the manager queue. If publishing fails (e.g. RabbitMQ is down), the tasks are put into Redis for later processing. A sketch of this fallback follows.
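
A minimal sketch of the output stage, assuming RabbitMQ is reached through the pika library. The queue name "manager", the Redis key "pending_tasks", and the connection parameters are assumptions for illustration only.

```python
import json

import pika
import redis

redis_client = redis.Redis(host="localhost", port=6379)


def publish_tasks(tasks):
    """Publish new tasks to the manager queue, falling back to Redis on failure."""
    try:
        connection = pika.BlockingConnection(
            pika.ConnectionParameters(host="localhost")
        )
        channel = connection.channel()
        channel.queue_declare(queue="manager", durable=True)
        for task in tasks:
            channel.basic_publish(
                exchange="",
                routing_key="manager",
                body=json.dumps(task),
                properties=pika.BasicProperties(delivery_mode=2),  # persistent message
            )
        connection.close()
    except pika.exceptions.AMQPError:
        # RabbitMQ is unavailable: buffer the tasks in Redis for later processing.
        for task in tasks:
            redis_client.rpush("pending_tasks", json.dumps(task))
```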