---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:8522
- loss:DenoisingAutoEncoderLoss
base_model: sentence-transformers/all-roberta-large-v1
widget:
- source_sentence: This . A engineer and go a trip walking the when a The physicist
the distance of the the drop bullet his rifle fires the deer to . engineer his
. to account for he rifle licks finger the speed and of fires deer 5 right . statistician
"got!"
sentences:
- 'This is a mean joke.
A physicist, an engineer, and a statistician go on a hunting trip, they are walking
through the woods when they spot a deer in a clearing. The physicist calculates
the distance of the target, the velocity and drop of the bullet, adjusts his rifle
and fires, missing the deer 5 feet to the left. The engineer rolls his eyes. ''You
forgot to account for wind. Give it here'', he snatches the rifle, licks his finger
and estimates the speed and direction of the wind and fires, missing the deer
5 feet to the right. Suddenly, the statistician claps his hands and yells "We
got him!"'
- 'While driving to work, robbers jumped into my car and stole everything.
They were pirates of the car I be in.'
- Driving and trying to read twitter, I just ran over a poodle. Unfortunately I
drive a Yaris. My car got a dent and the poodle got annoyed.
- source_sentence: ': the love?? They.'
sentences:
- I have a super hero joke Fantastic four
- 'Monroe: What did the trailer and the truck do after they fell in love?
Amanda: What?
Monroe: They got hitched.'
- 'JOSIAH: What is a lawn mower’s favorite kind of music?
TIM: I’m not sure.
JOSIAH: Bluegrass.'
- source_sentence: 'JAYDEN What panda ’ s: JAYDEN: Bam-BOO!'
sentences:
- BlackBerry and Apple have come together to create a something for ladies who have
trouble listening. It's been called the Black-i.
- Where do you put the Duke? In the duke box!
- 'JAYDEN: What is a panda’s favorite Halloween food?
CAYDEN: What?
JAYDEN: Bam-BOO!'
- source_sentence: we should be the time expand language, not it instead of 'probababably
sentences:
- '"Don''t dip your pen in company ink." - HR training seminar explaining why I
shouldn''t sleep with the receptionist...I think.'
- we should be using all the time technology frees up to expand language, not shorten
it. instead of 'prolly' try 'probababably.'
- If you like internet jokes, you should see my online bank account.
- source_sentence: yoga What the to when she him Nahimastay
sentences:
- 'CRESENCIO: Why do turkeys eat so little?
MAX: I don’t know.
CRESENCIO: Because they are always stuffed.'
- I'm really sick of making my dog a birthday cake every 52 days.
- Redneck yoga. What did the redneck say to the yoga instructor when she asked him
to leave the class? Nahimastay
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on sentence-transformers/all-roberta-large-v1
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-roberta-large-v1](https://huggingface.co/sentence-transformers/all-roberta-large-v1). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-roberta-large-v1](https://huggingface.co/sentence-transformers/all-roberta-large-v1)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
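The same stack can be rebuilt by hand from the base checkpoint. The sketch below is an illustration of the module layout printed above, not the author's training script; note that the base checkpoint ships with mean pooling, while this model uses CLS-token pooling as shown in the printout.
```python
from sentence_transformers import SentenceTransformer, models

# RoBERTa-large encoder, truncating inputs at the 512-token limit listed above
transformer = models.Transformer(
    "sentence-transformers/all-roberta-large-v1", max_seq_length=512
)
# CLS-token pooling over the 1024-dim word embeddings
# (matches pooling_mode_cls_token=True in the architecture printout)
pooling = models.Pooling(
    transformer.get_word_embedding_dimension(), pooling_mode="cls"
)
model = SentenceTransformer(modules=[transformer, pooling])
```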
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("SeppeV/roberta_TSDAE")
# Run inference
sentences = [
    'yoga What the to when she him Nahimastay',
    'Redneck yoga. What did the redneck say to the yoga instructor when she asked him to leave the class? Nahimastay',
    "I'm really sick of making my dog a birthday cake every 52 days.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
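As a quick illustration of one downstream use named above, the snippet below runs semantic search over a small joke corpus; the corpus and query are made up for the example.
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("SeppeV/roberta_TSDAE")

# Tiny illustrative corpus; in practice this would be your full collection
corpus = [
    "Where do you put the Duke? In the duke box!",
    "If you like internet jokes, you should see my online bank account.",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

# Retrieve the corpus entry closest to the query in embedding space
query_embedding = model.encode("a pun about a jukebox", convert_to_tensor=True)
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=1)
print(hits[0][0])  # e.g. {'corpus_id': 0, 'score': ...}
```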
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 8,522 training samples
* Columns: <code>sentence_0</code> and <code>sentence_1</code>
* Approximate statistics based on the first 1000 samples:
  |      | sentence_0 | sentence_1 |
  |:-----|:-----------|:-----------|
  | type | string     | string     |
* Samples:
  | sentence_0 | sentence_1 |
  |:-----------|:-----------|
  | <code>.... recently changed sound of my clock to Justin Bieber Baby" I wake up 5 earlier do to to it.</code> | <code>Justin Bieber.... I have recently changed the sound of my alarm clock to "Justin Bieber - Baby". Now I wake up 5 minutes earlier every day, so I don't have to listen to it.</code> |
  | <code>A got yesterday . joke be funny it had a tit</code> | <code>A woman got breast implants made of wood yesterday.<br>This joke would be funny if it had a punchline<br>Wooden tit</code> |
  | <code>TIL unvaccinated children are less likely autistic Because they more</code> | <code>TIL unvaccinated children are less likely to be autistic<br>Because they are more likely to be dead</code> |
* Loss: [DenoisingAutoEncoderLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#denoisingautoencoderloss)
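In this dataset, `sentence_0` holds the noise-corrupted text and `sentence_1` the original joke, which is the (damaged, original) pair format this loss expects. Below is a minimal sketch of such a TSDAE setup, assuming the standard word-deletion noise and the `sentence-transformers` v3 trainer API; the corpus, noise ratio, and helper name are illustrative, not the author's exact pipeline.
```python
import random

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import DenoisingAutoEncoderLoss


def delete_words(text: str, ratio: float = 0.6) -> str:
    """TSDAE's usual corruption: randomly drop a fraction of the tokens."""
    words = text.split()
    kept = [w for w in words if random.random() > ratio]
    return " ".join(kept) if kept else random.choice(words)


jokes = ["Where do you put the Duke? In the duke box!"]  # placeholder corpus
train_dataset = Dataset.from_dict({
    "sentence_0": [delete_words(j) for j in jokes],  # corrupted input
    "sentence_1": jokes,                             # reconstruction target
})

model = SentenceTransformer("sentence-transformers/all-roberta-large-v1")
# A decoder tied to the encoder weights learns to reconstruct the original
# sentence from the embedding of the corrupted one; after training, only
# the encoder is kept for producing embeddings.
loss = DenoisingAutoEncoderLoss(model, tie_encoder_decoder=True)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```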
### Training Hyperparameters
#### Non-Default Hyperparameters
- `num_train_epochs`: 1
- `multi_dataset_batch_sampler`: round_robin
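For reference, these map onto the trainer's arguments roughly as follows; this is a sketch assuming the `sentence-transformers` v3 API, and `output_dir` is illustrative.
```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import MultiDatasetBatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="roberta_TSDAE",  # illustrative
    num_train_epochs=1,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)
```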
#### All Hyperparameters