
Finetuning GPT-2 twice for particular style of writing on a particular topic


@regstuff wrote:

I’m just starting out in NLP and am working with GPT-2 for text generation.

My situation is that I have to generate text in a particular field, e.g. family businesses, which pretrained GPT-2 is unlikely to have seen much training data for. Besides the topic, I also need to generate the text in the style of one particular writer (e.g. incorporating their turns of phrase, etc.). Unfortunately, this particular writer hasn’t written much about family businesses, but has written a lot about other topics.

It occurred to me that I could take GPT-2, finetune it on a large corpus of material about family businesses, and then finetune the resulting model on the written material of this particular writer.
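To make the plan concrete, here is a minimal sketch of the two-stage finetuning I have in mind, using the Hugging Face transformers Trainer API. The corpus file names, output directories, and hyperparameters are just placeholders, not anything I’ve validated:

```python
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")

def finetune(model, train_file, output_dir, epochs=3):
    # Chop the raw text file into fixed-length token blocks for causal LM training.
    dataset = TextDataset(tokenizer=tokenizer, file_path=train_file, block_size=512)
    collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)
    args = TrainingArguments(
        output_dir=output_dir,
        num_train_epochs=epochs,
        per_device_train_batch_size=2,
        save_steps=500,
    )
    Trainer(model=model, args=args, data_collator=collator, train_dataset=dataset).train()
    model.save_pretrained(output_dir)
    tokenizer.save_pretrained(output_dir)
    return model

# Stage 1: adapt pretrained GPT-2 to the family-business domain.
model = GPT2LMHeadModel.from_pretrained("gpt2")
model = finetune(model, "family_business_corpus.txt", "gpt2-family-business")

# Stage 2: continue finetuning the domain-adapted model on the writer's own texts
# (presumably with fewer epochs or a lower learning rate so the domain knowledge
# from stage 1 isn't forgotten).
model = finetune(model, "writer_corpus.txt", "gpt2-family-business-writer", epochs=2)
```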

Would this be the right way to achieve my objective of creating content on family businesses in the style of this particular writer?

Any suggestions on what sort of stuff I should keep in mind while doing this?

Any help is much appreciated.
