Usage
Adjust the `config.yaml` file with your settings:
```yaml
OpenAIModel:
  model: "gpt-3.5-turbo"
  api_key: "your_openai_api_key"

GLMModel:
  model_url: "your_chatglm_model_url"
  timeout: 300

common:
  book: "test/test.pdf"
  file_format: "markdown"
```
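As a rough sketch of how these settings map to code, the file can be parsed with PyYAML into a nested dictionary keyed by section name. The snippet below inlines the example config as a string for illustration; the loading logic is an assumption, not the project's actual code.

```python
import yaml  # PyYAML (pip install pyyaml)

# Inline copy of the example config above; in practice this would be
# read from config.yaml on disk.
CONFIG_TEXT = """
OpenAIModel:
  model: "gpt-3.5-turbo"
  api_key: "your_openai_api_key"
GLMModel:
  model_url: "your_chatglm_model_url"
  timeout: 300
common:
  book: "test/test.pdf"
  file_format: "markdown"
"""

config = yaml.safe_load(CONFIG_TEXT)

# Each top-level key becomes a dict of that section's settings.
model_name = config["OpenAIModel"]["model"]
file_format = config["common"]["file_format"]
```

Sections are addressed by their top-level key (`OpenAIModel`, `GLMModel`, `common`), so adding a new model only requires a new top-level block.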
Here is an example of how to use the OpenAI model:

```shell
# Set your API key as an environment variable
export OPENAI_API_KEY="sk-xxx"

python ai_translator/main.py \
  --model_type OpenAIModel \
  --openai_api_key $OPENAI_API_KEY \
  --file_format markdown \
  --book tests/test.pdf \
  --openai_model gpt-3.5-turbo
```
And here is an example of how to use the GLM model:

```shell
# Set your GLM model URL as an environment variable
export GLM_MODEL_URL="http://xxx:xx"

python ai_translator/main.py \
  --model_type GLMModel \
  --glm_model_url $GLM_MODEL_URL \
  --book tests/test.pdf
```