I'm sorry if I missed it, but are you also gathering entities / topical keywords on those articles? Are you in any way trying to integrate them as well?
To create categories/silos, yeah? I'm working on it; that's why the site is not up yet. I've been a bit busy with real-life stuff to put in a lot of coding this week.
get me access, fam
Here comes the next-gen NLP model with 530 billion parameters. Don't know if they will open-source this.
Source - https://analyticsindiamag.com/nvidi...h-530-billion-parameters-leaves-gpt-3-behind/
This will be very helpful if you do that.
I will set up a simple script for you guys later this week so you've got somewhere to start.
Wow, great journey. I don't understand it properly, but can you please tell me: for paraphrasing only, do you use spaCy, pegasus_paraphrase, or both?
I mean for your top-quality paraphrasing.
"Page 1 within 5-6 days for some zero competition keywords."
Hallelujah brother! Zero comp is the best; it's the vast, beautiful blue ocean.
"Non-gpt3"
GPT-2?
Haha, no. I've taken a different approach: a non-abstractive method, pure NLP/grammar-based. Instead of trying to understand the context, as GPT does, I'm only working on a per-sentence basis. This leads to mistakes, but paradoxically fewer than GPT-3.
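The poster never shares their actual code, but the per-sentence, grammar-based idea can be sketched in plain Python. Everything here — the tiny synonym table and the naive regex sentence splitter — is a toy stand-in for what would presumably be a much larger lexicon plus real grammar rules (e.g. from spaCy parses):

```python
import re

# Hypothetical synonym table; a real system would use a far larger
# lexicon plus grammatical constraints, not a flat word map.
SYNONYMS = {
    "quick": "fast",
    "purchase": "buy",
    "assist": "help",
}

def split_sentences(text):
    """Naive splitter: break on ., ! or ? followed by whitespace."""
    return [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]

def rewrite_sentence(sentence):
    """Rewrite one sentence in isolation, ignoring surrounding context."""
    tokens = re.findall(r"\w+|\W+", sentence)
    out = []
    for tok in tokens:
        repl = SYNONYMS.get(tok.lower())
        if repl:
            # Preserve the capitalisation of the original word.
            out.append(repl.capitalize() if tok[0].isupper() else repl)
        else:
            out.append(tok)
    return "".join(out)

def paraphrase(text):
    # Each sentence is processed independently -- the "per-sentence
    # basis" described above, with no cross-sentence context.
    return " ".join(rewrite_sentence(s) for s in split_sentences(text))

print(paraphrase("You can purchase it now. We assist with quick setup."))
# → "You can buy it now. We help with fast setup."
```

The design point is the trade-off the poster describes: with no context model there is nothing to hallucinate, so errors stay local to a single sentence.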
I have tried a couple of approaches, including QuillBot and Pegasus. QuillBot no longer offers an API, and for some reason the result is not as readable (it feels spun) as Pegasus. In both cases, 10 to 30% plagiarism was easily observed. The QuillBot approach is fast; Pegasus takes 10 to 15 minutes for a 1000-word article.
I think you could use something like QuillBot Premium (it has an API!) and the results will be better than GPT-3 if you put some clever code around it. spaCy is great for tokenizing. Or maybe use some low-level ML way to tokenize it (Google has some good free stuff - https://github.com/google/sentencepiece)
This is the latest version of the paraphraser. I think I'm done here, and I'm feeling proud AF:
View attachment 189442
Getting clicks after 6 days of posting the articles! Pretty happy about that too.
View attachment 189443
Now just gotta move fast and scale before this loophole gets patched somehow by big G.
What are you doing for images and text formatting like bold/italic?
For images I'm using the Unsplash API (free) + Pillow (Python image manipulation) + Nider (a Python library for writing on images).
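The Unsplash + Pillow pipeline described above can be sketched as follows. This is a minimal illustration, not the poster's code: the access key is a placeholder (Unsplash requires registering an app), the endpoint shown is Unsplash's random-photo endpoint, and plain Pillow `ImageDraw` stands in for Nider. The demo at the bottom runs offline, using a solid-colour background in place of a downloaded photo:

```python
import io
import urllib.request
from PIL import Image, ImageDraw

UNSPLASH_KEY = "YOUR_ACCESS_KEY"  # placeholder -- register at unsplash.com/developers

def image_from_url(url):
    """Download an image and open it with Pillow (needs network access).

    With the Unsplash API you would first GET
    https://api.unsplash.com/photos/random?query=...&client_id=UNSPLASH_KEY
    and read the photo URL from the JSON response before calling this.
    """
    with urllib.request.urlopen(url) as resp:
        return Image.open(io.BytesIO(resp.read()))

def caption_image(img, text):
    """Write a caption bar onto the image, Nider-style, with plain Pillow."""
    draw = ImageDraw.Draw(img)
    # Dark strip along the bottom so the text stays readable.
    draw.rectangle([0, img.height - 40, img.width, img.height], fill=(0, 0, 0))
    draw.text((10, img.height - 30), text, fill=(255, 255, 255))
    return img

# Offline demo: a solid background standing in for a fetched Unsplash photo.
demo = caption_image(Image.new("RGB", (640, 360), (70, 130, 180)),
                     "10 Best Dog Beds")
demo.save("featured.jpg")
```

Bold/italic formatting would live in the article HTML itself, not in this image step.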
"Pegasus takes 10 to 15 minutes for a 1000-word article."
And about this... get a GPU.
Hehe, not going to get patched any time soon.
This I don't understand. You could easily train a model that predicts with 99% accuracy whether a site is an autoblog/spam site.
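The claim above — that autoblog detection is an easy text-classification problem — can be illustrated with a toy Naive Bayes classifier. The training snippets below are invented for illustration only; a real detector would need thousands of labelled sites and richer features (templates, posting cadence, link patterns), not just word counts:

```python
import math
from collections import Counter

# Invented toy data: page snippets labelled "auto" (autoblog-style
# keyword stuffing) vs "human" (first-person editorial writing).
DOCS = [
    ("best cheap product review buy now click here best price", "auto"),
    ("top product list buy cheap discount review click", "auto"),
    ("i spent a weekend testing this and honestly it surprised me", "human"),
    ("my thoughts after a month of daily use and a few caveats", "human"),
]

def train(docs):
    """Count words per class and documents per class."""
    word_counts = {"auto": Counter(), "human": Counter()}
    class_counts = Counter()
    for text, label in docs:
        class_counts[label] += 1
        word_counts[label].update(text.split())
    return word_counts, class_counts

def predict(text, word_counts, class_counts):
    """Pick the class with the highest log-probability for `text`."""
    vocab = set()
    for counts in word_counts.values():
        vocab.update(counts)
    total_docs = sum(class_counts.values())
    best_label, best_lp = None, -math.inf
    for label, counts in word_counts.items():
        lp = math.log(class_counts[label] / total_docs)  # class prior
        denom = sum(counts.values()) + len(vocab)
        for word in text.split():
            # Laplace smoothing so unseen words don't zero out the class.
            lp += math.log((counts[word] + 1) / denom)
        if lp > best_lp:
            best_label, best_lp = label, lp
    return best_label

wc, cc = train(DOCS)
print(predict("buy cheap product now best review", wc, cc))  # → "auto"
```

Whether Google would actually deploy something like this at web scale — and act on it automatically — is exactly the disagreement in the posts above.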
Google keeps saying autoblogs are dead just to scare people, but the reality is that they still work and WILL continue to work UNLESS there were manual reviews of EVERY SINGLE site out there, which is impossible.