How to use Startup-Exchange/tps_sentimental_analysis with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Startup-Exchange/tps_sentimental_analysis")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Startup-Exchange/tps_sentimental_analysis")
model = AutoModelForSequenceClassification.from_pretrained("Startup-Exchange/tps_sentimental_analysis")
```

This model is a fine-tuned version of bert-base-uncased on the financial_phrasebank dataset. It achieves the evaluation results shown in the training table below.
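With the tokenizer and model loaded directly, a prediction is a forward pass followed by a softmax over the logits. A minimal sketch; the sample sentence is illustrative, and the label names come from the model's id2label config, which is not documented in this card:

```python
import torch

# Tokenize one sentence and run a forward pass (no gradients needed for inference)
inputs = tokenizer("The company's quarterly profit rose sharply.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Softmax turns logits into probabilities; id2label maps the winning index to a name
probs = torch.softmax(logits, dim=-1)
predicted_id = probs.argmax(dim=-1).item()
print(model.config.id2label[predicted_id], round(probs[0, predicted_id].item(), 3))
```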
Sentiment Analysis
| Text | Sentiment |
|---|---|
| Hi, Harper. I’m really happy you came. | Positive |
| Happy Father’s Day. | Positive |
| It was Christmas. | Neutral |
| HARPER sits at a table alone in a room. | Neutral |
| I am mad at you badly. | Negative |
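The rows above can be reproduced with the pipeline. A sketch; the exact label strings returned depend on the model's config and are not documented in this card:

```python
from transformers import pipeline

pipe = pipeline("text-classification", model="Startup-Exchange/tps_sentimental_analysis")

lines = [
    "Hi, Harper. I'm really happy you came.",
    "Happy Father's Day.",
    "It was Christmas.",
    "HARPER sits at a table alone in a room.",
    "I am mad at you badly.",
]
# pipe(list) returns one {"label": ..., "score": ...} dict per input
for line, result in zip(lines, pipe(lines)):
    print(f"{line!r} -> {result['label']} ({result['score']:.3f})")
```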
The following results were recorded during training, with one evaluation per epoch ("No log" means the training loss had not yet been logged; the Trainer logs it every 500 steps by default):
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| No log | 1.0 | 114 | 0.5293 | 0.8230 |
| No log | 2.0 | 228 | 0.0804 | 0.9779 |
| No log | 3.0 | 342 | 0.0367 | 0.9867 |
| No log | 4.0 | 456 | 0.1544 | 0.9646 |
| 0.3241 | 5.0 | 570 | 0.0497 | 0.9912 |
| 0.3241 | 6.0 | 684 | 0.0520 | 0.9912 |
| 0.3241 | 7.0 | 798 | 0.0318 | 0.9912 |
| 0.3241 | 8.0 | 912 | 0.0628 | 0.9912 |
| 0.0218 | 9.0 | 1026 | 0.0777 | 0.9867 |
| 0.0218 | 10.0 | 1140 | 0.0866 | 0.9867 |
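A training run of this shape can be reproduced with the Trainer API. Only the 10 epochs and the per-epoch evaluation are visible in the table above; the dataset config, split ratio, batch size, and learning rate below are assumptions, not the card's documented values:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# financial_phrasebank ships several annotator-agreement configs;
# "sentences_allagree" is an assumption
dataset = load_dataset("takala/financial_phrasebank", "sentences_allagree")

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True)

tokenized = dataset["train"].map(tokenize, batched=True)
splits = tokenized.train_test_split(test_size=0.1, seed=42)  # split ratio is an assumption

# Three classes: negative, neutral, positive
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=3)

args = TrainingArguments(
    output_dir="tps_sentimental_analysis",
    num_train_epochs=10,             # matches the 10 epochs in the results table
    per_device_train_batch_size=16,  # assumption; not documented in the card
    learning_rate=2e-5,              # assumption; a common BERT fine-tuning value
    eval_strategy="epoch",           # evaluate once per epoch, as in the table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=splits["train"],
    eval_dataset=splits["test"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```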