ChatGPT (Plus) is doing everything possible to not write long-form blog posts

IM Dude

After the last update it's going nuts and cutting corners, trying to write short paragraphs. Whatever prompt I give it, it seems hard-wired to fall back to being a chat bot.
Another thing that's happened is that the quality has severely degraded.
Anyone else noticed it?
 
They've got angry users complaining to them as if the sky is about to fall, and ChatGPT is still running at a loss, from what I've heard.
I see no other way but for them to increase the $20 monthly subscription price.
 
Use the API; you can set the number of tokens, temperature, etc. If you are code-challenged, there are plenty of ready-made examples on GitHub showing how to connect to the API.
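Something like this, roughly, using the pre-1.0 openai Python package (pip install openai); the model name, prompt, and parameter values below are just placeholders, not a recommendation:

import openai

openai.api_key = "sk-..."  # your key from the OpenAI platform

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a long-form blog writer."},
        {"role": "user", "content": "Write a detailed 1,500-word article about indoor gardening."},
    ],
    max_tokens=3000,   # cap on the response length
    temperature=0.7,   # higher = more varied wording
)

print(response["choices"][0]["message"]["content"])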
 
Use the API; you can set the number of tokens, temperature, etc. If you are code-challenged, there are plenty of ready-made examples on GitHub showing how to connect to the API.
Code-challenged, hahaha. No, I don't think ChatGPT Plus gives me API access? Last time I checked.
 
The API is priced separately from ChatGPT Plus, and the pricing is ridiculously low; I don't get why people are complaining. Per 1,000 tokens (about 750 words), GPT-3.5 Turbo costs about $0.002. Both the prompt and the response are charged, so if the prompt is about 100 tokens (75 words) and the response is capped at 1,000 tokens (you can set it up to 4,000, I think), you will be charged for 1,100 tokens. Basically, $5 covers a month of normal usage.

Under your account profile you can find your API keys at https://platform.openai.com/account/api-keys; create one. The examples on GitHub are mostly for davinci-003, but it's priced higher and there's no need for it; GPT-3.5 Turbo is on par for almost everything and about 10x cheaper.
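To put rough numbers on it, here's the same math as a quick sketch (the per-token price is the one quoted above; check the current pricing page before relying on it):

# Rough cost estimate, assuming GPT-3.5 Turbo at ~$0.002 per 1,000 tokens
# and that both prompt and response tokens are billed.
PRICE_PER_1K = 0.002

prompt_tokens = 100      # ~75 words of prompt
response_tokens = 1000   # max_tokens cap on the reply

cost_per_article = (prompt_tokens + response_tokens) / 1000 * PRICE_PER_1K
print(f"~${cost_per_article:.4f} per article")           # ~$0.0022
print(f"~{5 / cost_per_article:.0f} articles for $5")    # ~2,273 articles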
 
The API is priced separately from ChatGPT Plus, and the pricing is ridiculously low; I don't get why people are complaining. Per 1,000 tokens (about 750 words), GPT-3.5 Turbo costs about $0.002. Both the prompt and the response are charged, so if the prompt is about 100 tokens (75 words) and the response is capped at 1,000 tokens (you can set it up to 4,000, I think), you will be charged for 1,100 tokens. Basically, $5 covers a month of normal usage.
After using GPT-4 a lot, GPT-3.5 feels dumb! But the GPT-4 API is costlier!
 
I still haven't got API access to v4, but I've had 24-hour demo access. To be honest, many tasks like writing and summarizing are absolutely doable even on the weakest models. The only real difference I've seen with GPT-4 vs the others is in coding and in reasoning (explanations) for some financial articles.
 
I still haven't got API access to v4, but I've had 24-hour demo access. To be honest, many tasks like writing and summarizing are absolutely doable even on the weakest models. The only real difference I've seen with GPT-4 vs the others is in coding and in reasoning (explanations) for some financial articles.
Bruh, you are thinking like a coder; been there, done that.
After using GPT-4 a lot, GPT-3.5 feels dumb! But the GPT-4 API is costlier!
Exactly!
 
Use the GPT API and the davinci model. It's more expensive but much better at writing articles and following specific instructions.
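If you go that route, davinci uses the older Completions endpoint rather than the chat one. A minimal sketch with the pre-1.0 openai package (prompt and parameter values are placeholders):

import openai

openai.api_key = "sk-..."

response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Write a detailed, well-structured blog post about home coffee roasting.",
    max_tokens=2000,
    temperature=0.7,
)

print(response["choices"][0]["text"])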
 
Use the API; you can set the number of tokens, temperature, etc. If you are code-challenged, there are plenty of ready-made examples on GitHub showing how to connect to the API.
API is expensive :(
 
There is always Bard, and there is an open-source unofficial API that you can use quite easily:
https://github.com/ra83205/google-bard-api
LocalAI is another project
https://github.com/go-skynet/LocalAI
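LocalAI's selling point is that it exposes an OpenAI-compatible endpoint, so existing client code can be pointed at a local instance instead of OpenAI. A rough sketch; the base URL, port, and model name depend entirely on your LocalAI setup, so treat them as assumptions:

import openai

openai.api_base = "http://localhost:8080/v1"  # LocalAI's default address (assumed)
openai.api_key = "not-needed-for-local"       # ignored by a local instance

response = openai.ChatCompletion.create(
    model="ggml-gpt4all-j",  # whatever model you have loaded into LocalAI
    messages=[{"role": "user", "content": "Write a long-form blog post about composting."}],
    max_tokens=2000,
)

print(response["choices"][0]["message"]["content"])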
 
After the last update it's going nuts and cutting corners, trying to write short paragraphs. Whatever prompt I give it, it seems hard-wired to fall back to being a chat bot.
Another thing that's happened is that the quality has severely degraded.
Anyone else noticed it?
GPT-4 or GPT-3.5?

The API is priced separately from ChatGPT Plus, and the pricing is ridiculously low; I don't get why people are complaining. Per 1,000 tokens (about 750 words), GPT-3.5 Turbo costs about $0.002. Both the prompt and the response are charged, so if the prompt is about 100 tokens (75 words) and the response is capped at 1,000 tokens (you can set it up to 4,000, I think), you will be charged for 1,100 tokens. Basically, $5 covers a month of normal usage.

Under your account profile you can find your API keys at https://platform.openai.com/account/api-keys; create one. The examples on GitHub are mostly for davinci-003, but it's priced higher and there's no need for it; GPT-3.5 Turbo is on par for almost everything and about 10x cheaper.
What's the estimate for GPT-4? Do you think $5 is enough for it? I have GPT-4 access but haven't computed it. Will check later.
 
I'm not going insane after all; I also noticed something weird going on with the paragraphs lol.
 