
text-inference-batcher-nodejs (tag 8656e098e4193c284eed951c61ffc4c7a22a4a48)

A high-performance batching router that maximizes throughput for text inference workloads.

Install from the command line
$ docker pull ghcr.io/ialacol/text-inference-batcher-nodejs:8656e098e4193c284eed951c61ffc4c7a22a4a48
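
Once pulled, the image can be run as a standalone service. The sketch below is illustrative only: the listening port (8000), the UPSTREAMS environment variable (a comma-separated list of upstream inference backends), and the OpenAI-style request path are assumptions for this example and are not documented on this page.

# Run the batcher, pointing it at two hypothetical upstream backends
# (port and UPSTREAMS variable are assumed, not confirmed by this listing).
$ docker run -d \
    -p 8000:8000 \
    -e UPSTREAMS="http://backend-1:8000,http://backend-2:8000" \
    ghcr.io/ialacol/text-inference-batcher-nodejs:8656e098e4193c284eed951c61ffc4c7a22a4a48

# Send a test request to the router; the endpoint path and model name
# are hypothetical placeholders for whatever the upstream backends serve.
$ curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "my-model", "messages": [{"role": "user", "content": "Hello"}]}'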



Last published: over 1 year ago
Issues: 2
Total downloads: 450