Releases: xorbitsai/inference

v1.1.0

13 Dec 10:29
b132fca

What's new in 1.1.0 (2024-12-13)

These are the changes in inference v1.1.0.

New features

Enhancements

  • ENH: Optimize error message when user parameters are passed incorrectly by @namecd in #2623
  • ENH: bypass the sampling parameter skip_special_tokens to vLLM backend by @zjuyzj in #2655
  • ENH: unify prompt_text as cosyvoice for fish speech by @qinxuye in #2658
  • ENH: Update glm4 chat model to new weights by @codingl2k1 in #2660
  • ENH: upgrade sglang in Docker by @amumu96 in #2668
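The `skip_special_tokens` change above means that sampling option can now travel from a client request through to the vLLM backend instead of being dropped. A minimal sketch of what such a request body could look like, assuming Xinference's OpenAI-compatible completions endpoint forwards extra vLLM sampling parameters (the endpoint shape and the `model` uid here are hypothetical illustrations, not confirmed by the release notes):

```python
import json

# Hypothetical request body for an OpenAI-compatible /v1/completions call.
# skip_special_tokens is a vLLM sampling option; per PR #2655 it is now
# bypassed to the vLLM backend rather than ignored.
payload = {
    "model": "my-llm-uid",          # hypothetical model uid
    "prompt": "List three colors:",
    "max_tokens": 32,
    "skip_special_tokens": False,   # keep tokens like <|endoftext|> in the output
}

body = json.dumps(payload)
```

Whether the parameter is accepted at the top level or under an extra-options field depends on the server version; PR #2655 has the exact wiring.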

Bug fixes

Documentation

Others

New Contributors

Full Changelog: v1.0.1...v1.1.0

v1.0.1

29 Nov 10:22
8dd5715

What's new in 1.0.1 (2024-11-29)

These are the changes in inference v1.0.1.

New features

Enhancements

Bug fixes

Documentation

New Contributors

Full Changelog: v1.0.0...v1.0.1

v1.0.0

15 Nov 10:15
4c96475

What's new in 1.0.0 (2024-11-15)

These are the changes in inference v1.0.0.

New features

Enhancements

Bug fixes

Documentation

Full Changelog: v0.16.3...v1.0.0

v0.16.3

08 Nov 05:47
85ab86b

What's new in 0.16.3 (2024-11-08)

These are the changes in inference v0.16.3.

New features

Enhancements

Bug fixes

Full Changelog: v0.16.2...v0.16.3

v0.16.2

01 Nov 10:09
67e97ab

What's new in 0.16.2 (2024-11-01)

These are the changes in inference v0.16.2.

New features

Enhancements

Bug fixes

  • BUG: fix bge-reranker-v2-minicpm-layerwise rerank issue by @hustyichi in #2495

Documentation

New Contributors

Full Changelog: v0.16.1...v0.16.2

v0.16.1

25 Oct 07:33
d4cd7b1

What's new in 0.16.1 (2024-10-25)

These are the changes in inference v0.16.1.

New features

Enhancements

Bug fixes

Documentation

New Contributors

Full Changelog: v0.16.0...v0.16.1

v0.16.0

18 Oct 11:40
5f7dea4

What's new in 0.16.0 (2024-10-18)

These are the changes in inference v0.16.0.

New features

  • FEAT: Adding support for awq/gptq vLLM inference to VisionModel such as Qwen2-VL by @cyhasuka in #2445
  • FEAT: Dynamic batching for the state-of-the-art FLUX.1 text_to_image interface by @ChengjieLi28 in #2380
  • FEAT: added MLX for qwen2.5-instruct by @qinxuye in #2444

Enhancements

Documentation

New Contributors

Full Changelog: v0.15.4...v0.16.0

v0.15.4

12 Oct 10:38
c0be115

What's new in 0.15.4 (2024-10-12)

These are the changes in inference v0.15.4.

New features

  • FEAT: Llama 3.1 Instruct support tool call by @codingl2k1 in #2388
  • FEAT: qwen2.5 instruct tool call by @codingl2k1 in #2393
  • FEAT: add whisper-large-v3-turbo audio model by @hwzhuhao in #2409
  • FEAT: Add environment variable setting to increase the retry attempts after model download failures by @hwzhuhao in #2411
  • FEAT: support getting progress for image model by @qinxuye in #2395
  • FEAT: support qwenvl2 vllm engine by @amumu96 in #2428

Enhancements

Bug fixes

Full Changelog: v0.15.3...v0.15.4

v0.15.3

30 Sep 13:42
00a9ee1

What's new in 0.15.3 (2024-09-30)

These are the changes in inference v0.15.3.

New features

Bug fixes

  • BUG: [UI] Fix 'Model Format' bug on model registration page. by @yiboyasss in #2353
  • BUG: Fix default value of max_model_len for vLLM backend. by @zjuyzj in #2385

New Contributors

Full Changelog: v0.15.2...v0.15.3

v0.15.2

20 Sep 09:05
5de46e9

What's new in 0.15.2 (2024-09-20)

These are the changes in inference v0.15.2.

New features

Bug fixes

Documentation

Full Changelog: v0.15.1...v0.15.2