
Add FP8/INT8 support #88

Open
YangWang92 opened this issue Oct 26, 2024 · 3 comments
@YangWang92 (Contributor)

No description provided.

@YangWang92 YangWang92 self-assigned this Oct 26, 2024
@YangWang92 YangWang92 changed the title Add FP8 support Add FP8/INT8 support Oct 26, 2024
@Duncan1115
Hi, is there any progress on this?

@YangWang92 (Contributor, Author)

I have some early results on FP8, and I'll share them here this week. Thanks for following!

@YangWang92 (Contributor, Author)

Hi @Duncan1115, I just noticed that you are the author of LLM-CODEBOOK. I'd like to ask whether you would be interested in continuing to improve VPTQ. Thank you!
