Varadkadtan committed
Commit c0c73c0 · 1 Parent(s): 1b1f23d

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -15,7 +15,7 @@ tags:
 
  <!-- Provide a quick summary of what the model is/does. -->
 
- I've developed an abstractive text summarization model using the T5 transformer architecture in PyTorch. I've fine-tuned the model on my specific dataset to create concise and coherent summaries of longer texts. This model takes advantage of transformer libraries and the power of multi-head self-attention to capture context and dependencies in the input text. It's a valuable tool for generating human-like summaries, making information extraction and condensation more efficient.
+ I've developed an abstractive text summarization model using the T5 transformer architecture in PyTorch. I've fine-tuned the model on cnn_dailymail dataset to create concise and coherent summaries of longer texts. This model takes advantage of transformer libraries and the power of multi-head self-attention to capture context and dependencies in the input text. It's a valuable tool for generating human-like summaries, making information extraction and condensation more efficient.
 
  ## To use my model
 
  from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
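
The sketch below shows one way the import referenced in the README could be used to load the checkpoint and generate a summary with the Transformers `generate` API. The repository id, the `summarize:` task prefix, and the generation settings are assumptions for illustration only, not details taken from this commit; substitute the actual model id from the Hub.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Hypothetical repo id -- replace with the actual model id on the Hub.
model_name = "Varadkadtan/t5-cnn-dailymail-summarizer"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

article = "Long news article text to condense into a short summary ..."

# T5-style checkpoints are commonly prompted with a "summarize: " prefix;
# whether this fine-tune expects it is an assumption.
inputs = tokenizer(
    "summarize: " + article,
    return_tensors="pt",
    truncation=True,
    max_length=512,
)

# Beam search tends to give more coherent abstractive summaries than greedy decoding.
summary_ids = model.generate(
    **inputs,
    max_length=128,
    num_beams=4,
    early_stopping=True,
)

print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```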