"glm-3-turbo"
Optional
doturn on sampling strategy when do_sample is true, do_sample is false, temperature、top_p will not take effect
Optional
maxmax value is 8192,defaults to 1024
Optional
messagesMessages to pass as a prefix to the prompt
Optional
requestUnique identifier for the request. Defaults to a random UUID.
Optional
stopOptional
streamingWhether to stream the results or not. Defaults to false.
Optional
temperatureAmount of randomness injected into the response. Ranges from 0 to 1 (0 is not included). Use temp closer to 0 for analytical / multiple choice, and temp closer to 1 for creative and generative tasks. Defaults to 0.95
Optional
topPTotal probability mass of tokens to consider at each step. Range from 0 to 1 Defaults to 0.7
Optional
zhipuAIApiAPI key to use when making requests. Defaults to the value of
ZHIPUAI_API_KEY
environment variable.
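A minimal usage sketch follows, assuming these options are passed to the ChatZhipuAI chat model exported from @langchain/community/chat_models/zhipuai; the option names and defaults mirror the listing above, and the prompt text is only illustrative.

```typescript
import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";
import { HumanMessage } from "@langchain/core/messages";

// Construct the model. Each option corresponds to a property of
// ZhipuAIChatInput; omitted options fall back to the defaults listed above.
const chat = new ChatZhipuAI({
  modelName: "glm-3-turbo", // default model
  temperature: 0.3,         // 0 < temperature <= 1; defaults to 0.95
  topP: 0.7,                // nucleus sampling mass; defaults to 0.7
  maxTokens: 1024,          // maximum value is 8192; defaults to 1024
  doSample: true,           // when false, temperature and topP are ignored
  streaming: false,         // defaults to false
  // zhipuAIApiKey falls back to the ZHIPUAI_API_KEY environment variable
});

const response = await chat.invoke([
  new HumanMessage("Summarize what nucleus sampling does in one sentence."),
]);
console.log(response.content);
```

Lower temperature and topP values make the output more deterministic, which suits the analytical use cases noted in the temperature description above.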