
ialacol

LLMs work better together: this organization contains a collection of toolchains built around ialacol.

Popular repositories

  1. text-inference-batcher

     A high-performance batching router that optimizes for maximum throughput in text inference workloads (see the sketch after this list).

     TypeScript

  2. .github

  3. chat-ui (forked from huggingface/chat-ui)

     Open source codebase powering the HuggingChat app.

     TypeScript

  4. completion

     CLI for starting a copilot server.

     Rust

  5. exllamav2 (forked from turboderp/exllamav2)

     A fast inference library for running LLMs locally on modern consumer-class GPUs.

     Python
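
For context, here is a minimal sketch of how a client might send a request through text-inference-batcher, assuming the batcher exposes an OpenAI-compatible chat completions endpoint in front of one or more ialacol backends. The base URL, port, and model id below are placeholders chosen for illustration, not values taken from the project's documentation.

```typescript
// Hypothetical client. The endpoint path, port, and model id are assumptions,
// not values documented by text-inference-batcher itself.
const BATCHER_URL = "http://localhost:8000/v1/chat/completions";

async function complete(prompt: string): Promise<string> {
  // Send a standard OpenAI-style chat completion request; the batcher is
  // assumed to route it to one of its upstream inference backends.
  const response = await fetch(BATCHER_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama-2-7b-chat", // placeholder model id served by an upstream backend
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!response.ok) {
    throw new Error(`Batcher returned ${response.status}`);
  }
  const data = await response.json();
  // OpenAI-style response shape: choices[0].message.content
  return data.choices[0].message.content;
}

complete("Explain request batching in one sentence.").then(console.log);
```

The point of routing through a batcher like this is that the client keeps the familiar single-endpoint API while the router spreads concurrent requests across backends for throughput.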

Repositories

  • chat-ui (forked from huggingface/chat-ui)

    Open source codebase powering the HuggingChat app.

    TypeScript · 0 stars · Apache-2.0 license · 1,172 forks · 0 issues · 0 PRs · Updated Oct 26, 2023
  • completion

    CLI for starting a copilot server.

    Rust · 0 stars · MIT license · 0 forks · 0 issues · 0 PRs · Updated Oct 14, 2023
  • exllamav2 (forked from turboderp/exllamav2)

    A fast inference library for running LLMs locally on modern consumer-class GPUs.

    Python · 0 stars · MIT license · 296 forks · 0 issues · 0 PRs · Updated Sep 26, 2023
  • text-inference-batcher

    A high-performance batching router that optimizes for maximum throughput in text inference workloads.

    TypeScript · 16 stars · MIT license · 1 fork · 1 issue · 1 PR · Updated Sep 6, 2023
  • .github

    0 stars · 0 forks · 0 issues · 0 PRs · Updated Aug 21, 2023

People

This organization has no public members. You must be a member to see who’s a part of this organization.
