fauxpilot/copilot_proxy/models.py
Parth Thakkar 01f1cbb629 Add python backend support
- Modify dockerfile to include bitsandbytes, transformers and latest version of pytorch
- Minor modifications in utils/codegen.py so that same client works with FT and Py-backend
- Minor modifications in launch.sh (no need to name models by GPU)
- Add installation script for adding a new python model (with super simple config_template)
- Modify setup.sh so that it works with both FT and Python backend models

Signed-off-by: Parth Thakkar <thakkarparth007@gmail.com>
2022-10-16 22:05:09 -05:00

from typing import Optional, Union

from pydantic import BaseModel


class OpenAIinput(BaseModel):
    # Request body for the OpenAI-style completion endpoint.
    # "model" picks the Triton backend: "fastertransformer" or "py-model".
    model: str = "fastertransformer|py-model"
    prompt: Optional[str] = None
    suffix: Optional[str] = None
    max_tokens: Optional[int] = 16
    temperature: Optional[float] = 0.6
    top_p: Optional[float] = 1.0
    n: Optional[int] = 1
    stream: Optional[bool] = None
    logprobs: Optional[int] = None
    echo: Optional[bool] = None
    stop: Optional[Union[str, list]] = None
    presence_penalty: Optional[float] = 0
    frequency_penalty: Optional[float] = 1
    best_of: Optional[int] = 1
    logit_bias: Optional[dict] = None
    user: Optional[str] = None
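
A minimal sketch of how this request model might be exercised, assuming pydantic v1 (the API used here); the payload values and the import path are illustrative only:

from models import OpenAIinput  # illustrative import; adjust to the actual module path

# An OpenAI-style completion request; omitted fields fall back to the defaults above.
payload = {
    "model": "py-model",
    "prompt": "def fib(n):",
    "max_tokens": 32,
    "temperature": 0.2,
}

req = OpenAIinput(**payload)
print(req.dict(exclude_none=True))  # pydantic v1 API; drops fields that are still None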