How to continue an incomplete response from the OpenAI API

SyntaxSurge

With the OpenAI API, how do you programmatically check whether a response came back short? If it did, you could send a follow-up command like "continue" or "expand".

Because, in my experience, this doesn't work if the response exceeds 4,000 tokens, since you also need to pass the previous response back in with the new request. If the full response would be 4,500 tokens, it returns the first 4,000, but you can't get the remaining 500 because the maximum per conversation is about 4,000 tokens. Correct me if I'm wrong.
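For reference, the API response itself flags truncation: each choice carries a `finish_reason`, which is "length" when the reply was cut off by the token limit. A minimal sketch, assuming the pre-1.0 `openai` Python package, `gpt-3.5-turbo`, and a made-up prompt:

```python
import openai

openai.api_key = "sk-..."  # your API key

# Hypothetical prompt, just for illustration
messages = [{"role": "user", "content": "Write a long article about X."}]
full_reply = ""

while True:
    resp = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    choice = resp["choices"][0]
    full_reply += choice["message"]["content"]
    # "length" means the output was truncated by the token limit
    if choice["finish_reason"] != "length":
        break  # finished naturally ("stop"), nothing was cut off
    # Feed the partial reply back and ask the model to continue it
    messages.append({"role": "assistant", "content": choice["message"]["content"]})
    messages.append({"role": "user", "content": "continue"})

print(full_reply)
```

Note that each loop iteration resends the growing history, so the whole conversation still has to fit in the model's context window.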
 
Create a function that counts the words in the response; if that count is below your required response length, append your "continue" or "expand" command to the next prompt.
This will only work if you're using the Chat Completions API:

https://platform.openai.com/docs/guides/chat
https://platform.openai.com/docs/api-reference/chat
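A minimal sketch of that word-count check, under the same assumptions (pre-1.0 `openai` package; the 800-word threshold and the prompt are arbitrary examples):

```python
import openai

openai.api_key = "sk-..."  # your API key

def needs_expansion(text, min_words=800):
    # Naive whitespace word count of the model's reply
    return len(text.split()) < min_words

# Hypothetical prompt, just for illustration
messages = [{"role": "user", "content": "Write an article about X."}]
resp = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
reply = resp["choices"][0]["message"]["content"]

if needs_expansion(reply):
    # Pass the reply back so the model knows what to expand on
    messages.append({"role": "assistant", "content": reply})
    messages.append({"role": "user", "content": "expand"})
    resp = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    reply += "\n" + resp["choices"][0]["message"]["content"]
```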
 
But the previous response is included in the token count of the next prompt, right? If the full response is 4,500 tokens, it returns 4,000, but you can't get the remaining 500 because the maximum per conversation is about 4,000 tokens.
 
I don't think it's included in the Chat Completions API.
The whole point of having a separate Chat Completions API is to serve the chat use case, where the system remembers the previous replies.
But there's certainly a limit to what it can remember; it doesn't remember everything once the chat thread gets long.
 
I tried what you said, and it seems the previous messages in the conversation are included in the prompt tokens. Have you ever tested this? Thanks.
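A minimal sketch of one way to verify this, assuming the pre-1.0 `openai` package; the prompts and the earlier assistant reply are made-up placeholders:

```python
import openai

openai.api_key = "sk-..."  # your API key

messages = [
    {"role": "user", "content": "Give me an outline for an article about X."},
    {"role": "assistant", "content": "1. Intro 2. Body 3. Conclusion"},  # earlier reply
    {"role": "user", "content": "Now write the intro."},
]
resp = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)

# prompt_tokens covers everything in `messages`, including the
# assistant's earlier reply, so history is billed on every request.
print(resp["usage"]["prompt_tokens"], resp["usage"]["completion_tokens"])
```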
 
Yes, I have tried it, as I use the API regularly.
I thought maybe those messages weren't counted there.
The thing is, I don't generate very long content in one go; I break it into sections instead, which is probably why I never ran into this.
That means the Chat Completions API and the general Completions API are more or less the same in this respect.
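A rough sketch of that section-by-section approach, again assuming the pre-1.0 `openai` package; the outline and topic are hypothetical:

```python
import openai

openai.api_key = "sk-..."  # your API key

sections = ["Introduction", "Main points", "Conclusion"]  # hypothetical outline
parts = []
for title in sections:
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user",
                   "content": f"Write the '{title}' section of an article about X."}],
    )
    parts.append(resp["choices"][0]["message"]["content"])

# Each request starts with a fresh context, so the token limit
# applies per section instead of to the whole article.
article = "\n\n".join(parts)
```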
 
I also break it into sections, but sometimes the response is still incomplete because my prompt is very long, and the prompt and the response both consume tokens. Thanks, btw.
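One way to catch this before sending is to count prompt tokens up front with the `tiktoken` library. A sketch, where the 4,096-token window matches gpt-3.5-turbo at the time of this thread and the 1,500-token reserve is an arbitrary example:

```python
import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

def count_tokens(text):
    # Ignores the few tokens of per-message formatting overhead
    return len(enc.encode(text))

CONTEXT_WINDOW = 4096     # gpt-3.5-turbo's shared prompt+completion limit
RESERVE_FOR_REPLY = 1500  # arbitrary example budget for the response

prompt = "...your long prompt..."
budget = CONTEXT_WINDOW - RESERVE_FOR_REPLY
if count_tokens(prompt) > budget:
    print(f"Prompt too long ({count_tokens(prompt)} tokens); trim it below {budget}.")
```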
 