Tokenizer Service

POST /tokenizer
{
    "model": "gpt-3.5-turbo",
    "content": "some words"
}
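A minimal Python sketch of how a client might call this endpoint. The base URL (`http://localhost:8000`) is an assumption, not part of the spec; adjust it to your deployment.

```python
import json

# Hypothetical base URL -- replace with your actual service address.
BASE_URL = "http://localhost:8000"

def build_tokenize_payload(model: str, content: str) -> str:
    """Serialize the JSON body expected by POST /tokenizer."""
    return json.dumps({"model": model, "content": content})

payload = build_tokenize_payload("gpt-3.5-turbo", "some words")
print(payload)

# To actually send the request (requires the `requests` package):
# import requests
# resp = requests.post(f"{BASE_URL}/tokenizer", data=payload,
#                      headers={"Content-Type": "application/json"})
# print(resp.json())
```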