Issues: lm-sys/FastChat
- #3372: Error in launching SGLang model worker multiple log values (opened Jun 2, 2024 by vikrantrathore)
- #3365: fastchat.serve.model_worker --port 21003 and /v1/models show nothing (opened May 28, 2024 by MarseusFu)
- #3360: Feature: Add OpenAI Usage stats when using streaming with the Chat Completions API or Completions API (opened May 23, 2024 by douxiaofeng99)
- #3358: [Bug] Single quote character ' tripping up model generation and streaming (opened May 22, 2024 by PyroGenesis)
- #3357: Unable to use streaming with the /v1/embeddings API for the CodeQwen1.5-7B model (opened May 22, 2024 by fangx1129)
- #3347: Error: Type object 'Dropdown' has no attribute 'update' in "qa_browser.py" (opened May 17, 2024 by tanliboy)
- #3316: How is the 'top_k' parameter considered when starting with openai_api_server? (opened May 7, 2024 by garyyang85)
- #3315: Merged Model from Huggingface runs fine with fastchat CLI but not when using service worker (opened May 7, 2024 by heli-sdsu)