A framework for describing asynchronous background jobs. It aims to replicate the functionality of Python's Celery framework. The idea is to build a tool that lets you write simple Task definitions, such as:
val simpleTask = Task.create("simple-task") { input: String ->
    println("Hello, $input")
}
And then schedule a job for execution:
val manager = TaskManager(RabbitMQBroker()) // This is one of the supported brokers
simpleTask.callLater("World", CallParams(delay = 3.seconds)) // Will execute the job 3 seconds from now
And deploy a worker process that will execute the job:
val manager = TaskManager(RabbitMQBroker())
manager.startWorkers(simpleTask)
See /example for examples.
Currently, the framework supports two types of tasks.
Normal task. This type of task gives you access to the taskManager, as well as additional task metadata, through the ExecutionContext.
val task = Task.create("task") { ctx: ExecutionContext, input: String ->
    println("Hello, $input")
}
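To show why the normal form exists, here is a minimal sketch of a task that enqueues follow-up work from inside its body. It assumes the context property is named taskManager and that follow-up tasks are scheduled with the same callLater/CallParams API used in the quickstart; the task names are illustrative.
// Follow-up task; it enqueues nothing itself, so the simplified form is enough.
val notifyTask = Task.create("notify") { name: String ->
    println("Follow-up for $name")
}

// Normal task: the ExecutionContext exposes the taskManager and task metadata.
val greetTask = Task.create("greet") { ctx: ExecutionContext, input: String ->
    println("Hello, $input")
    // Assumption: ctx.taskManager is the "context property" mentioned above, available
    // when the manager itself is needed. Follow-up work here reuses the
    // callLater/CallParams API from the quickstart.
    notifyTask.callLater(input, CallParams(delay = 5.seconds))
}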
Simplified task with input only. Should be used for tasks that don't enqueue new tasks.
val simpleTask = Task.create("simple-task") { input: String ->
    println("Hello, $input")
}
Currently, the supported brokers are (see the broker-selection sketch after this list):
- RabbitMQ (GCP, AWS)
- Azure Service Bus
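Switching brokers only changes how the TaskManager is constructed; task definitions and scheduling stay the same. In the sketch below, RabbitMQBroker comes from the quickstart, while the Azure Service Bus broker's class name and constructor argument are assumptions, so check the broker implementations in the repository for the real signatures.
// RabbitMQ, as in the quickstart.
val rabbitManager = TaskManager(RabbitMQBroker())

// Azure Service Bus. Class name and argument are assumed for illustration only.
val azureManager = TaskManager(AzureServiceBusBroker("<connection-string>"))

// Workers and scheduling work the same way regardless of the broker.
azureManager.startWorkers(simpleTask)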
The correct way to migrate a task's workload is (see the sketch after this list):
- Create a new task with the new workload
- Maintain both tasks until messages for the old one are depleted
- Delete the old task
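A minimal sketch of that migration, assuming the new workload takes a data class input and that startWorkers accepts several tasks (otherwise start one worker process per task); all names are illustrative.
// New input shape for the new workload.
data class GreetInput(val name: String, val locale: String)

// Old task, kept only to drain messages already on its queue.
val greetV1 = Task.create("greet") { input: String ->
    println("Hello, $input")
}

// New task under a new name, so old messages are never routed to it.
val greetV2 = Task.create("greet-v2") { input: GreetInput ->
    println("Hello, ${input.name} (${input.locale})")
}

// Run workers for both until the old queue is depleted, then delete greetV1.
val manager = TaskManager(RabbitMQBroker())
manager.startWorkers(greetV1, greetV2)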
Important: do not change a task's input type, as the JSON deserializer will break, causing the queue to block. To mitigate this, the default behavior is to automatically drop tasks on SerialisationError.
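As a concrete illustration of the failure mode: if "greet" originally took a String, the messages already queued for it were serialized from strings, so redeploying the same task name with a different input type (the anti-pattern below, reusing GreetInput from the migration sketch) means those messages can no longer be deserialized and, with the default behavior, are dropped on SerialisationError.
// Anti-pattern: same task name, new input type.
// Queued payloads like "World" cannot be deserialized into GreetInput, so they are
// dropped on SerialisationError. Prefer the new-task migration shown above.
val greet = Task.create("greet") { input: GreetInput ->  // was: { input: String -> ... }
    println("Hello, ${input.name}")
}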