Replies: 11 comments 10 replies
-
Bull and BullMQ both support multiple consumers: just create as many workers as you see fit and they will consume your jobs in parallel.
-
@manast
-
Ok, so you mean fanout (https://en.wikipedia.org/wiki/Fan-out_(software)), i.e. the same job delivered to many consumers. That is not supported and there are no plans for it; in that case using XTREAMS is a really good solution if you do not want to experiment with more complex solutions such as Kafka.
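The fan-out idea above can be sketched with plain redis-cli, assuming a local Redis server; the stream name (`jobs`) and the group names (`billing`, `notifications`) are invented for illustration. The key point is that each consumer group keeps its own cursor, so every group receives every entry, which is the fan-out behavior being asked about:

```shell
# Add one entry to a stream (the stream is created if it does not exist)
redis-cli XADD jobs '*' task paint color red

# Create two consumer groups starting from the beginning (id 0).
# Each group keeps its own read cursor.
redis-cli XGROUP CREATE jobs billing 0
redis-cli XGROUP CREATE jobs notifications 0

# Both groups independently read the SAME entry: fan-out across groups
redis-cli XREADGROUP GROUP billing worker-1 COUNT 10 STREAMS jobs '>'
redis-cli XREADGROUP GROUP notifications worker-1 COUNT 10 STREAMS jobs '>'
```

Within one group, by contrast, entries are load-balanced across that group's consumers, so "one group per logical subscriber" is the usual way to model fan-out on streams.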
-
@manast thanks a lot for your answers. Sorry, I didn't find anything related to XTREAMS, can you please give me a reference?
-
@bennyKY I meant streams as you wrote on the issue, sorry for the confusion: https://redis.io/commands#stream
-
@manast Came across this while searching for the same goal as the OP. It looks like I could achieve a similar result by listening to queue events. Seems like a bit of an abuse of the intention of BullMQ, but wondering if something along the following lines would get a similar result.

```js
const { Queue, Worker, QueueEvents } = require('bullmq');

const queue = new Queue('Cars');
await queue.add('paint', { color: 'red' });

// Worker and QueueEvents take the queue name ('Cars'), not the job name
const worker = new Worker('Cars', paintHandler);
// ...
const queueEvents = new QueueEvents('Cars');
queueEvents.on('completed', queueListener1);
queueEvents.on('completed', queueListener2);
queueEvents.on('completed', queueListener3);
```

The downside here (I assume) is that I can't set these up to run in multiple workers, because each listener would be called in each worker. That could lead to surprising behavior unless my listeners are idempotent. For example, if the app that's listening scales horizontally, each listener would be called on each running instance. Am I understanding how Queue listeners work correctly?
-
Had to eliminate BullMQ because it does not provide fanout; looking at RabbitMQ or LavinMQ instead.
-
Yes, BullMQ supports multiple consumers by leveraging Redis Streams. You can achieve this by creating multiple workers listening to the same queue. BullMQ ensures that each job is processed by only one worker, effectively distributing the workload among consumers.

Example:

```js
const { Worker } = require('bullmq');

const worker1 = new Worker('queueName', async job => {
  console.log(`Worker 1 processing job: ${job.id}`);
});

const worker2 = new Worker('queueName', async job => {
  console.log(`Worker 2 processing job: ${job.id}`);
});
```

This setup provides concurrent processing without RabbitMQ or Kafka. Just ensure your workers are stateless or handle shared resources correctly.
-
BullMQ with Multiple Consumers: Yes, BullMQ supports multiple consumers using Redis Streams. Create multiple Worker instances for the same queue, and jobs will be distributed among them.
-
Without Rabbit/Kafka: Use Redis Streams directly with libraries like redis in Rust or Node.js. Use XADD for adding jobs, XREADGROUP for processing with consumer groups, and XACK for acknowledgments. This achieves distributed processing without additional tools.
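A minimal redis-cli sketch of that XADD/XREADGROUP/XACK flow, assuming a local Redis server; the stream name (`jobs`), group name (`workers`), and consumer names are invented for illustration. Within a single consumer group, Redis hands each new entry to only one consumer, which gives the distributed work-queue behavior described above:

```shell
# Create the stream and a consumer group in one step
redis-cli XGROUP CREATE jobs workers 0 MKSTREAM

# Producer: add two jobs
redis-cli XADD jobs '*' task resize image 1.png
redis-cli XADD jobs '*' task resize image 2.png

# Two consumers in the SAME group: each new entry goes to only one of them
redis-cli XREADGROUP GROUP workers consumer-a COUNT 1 STREAMS jobs '>'
redis-cli XREADGROUP GROUP workers consumer-b COUNT 1 STREAMS jobs '>'

# After processing, acknowledge the entry using the ID returned by XREADGROUP
redis-cli XACK jobs workers '<entry-id>'
```

Unacknowledged entries remain in the group's pending list, so they can be inspected with XPENDING and reclaimed with XCLAIM if a consumer dies mid-job.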
-
Hi, Bull uses Redis pub/sub, so it won't allow multiple consumers.
But BullMQ uses Redis Streams, so it should be feasible.
Is there a way to do it with BullMQ? If not, maybe someone can advise how to achieve it without Rabbit or Kafka?