--- base_model: Leo1212/longformer-base-4096-sentence-transformers-all-nli-stsb-quora-nq library_name: setfit metrics: - qwk pipeline_tag: text-classification tags: - setfit - sentence-transformers - text-classification - generated_from_setfit_trainer widget: - text: 'Trying to detect someones emotions is like playing the game "Clue", it''s a mystery and very hard to figure out. But what if there was a software that allowed a chance to put the individual infront of a camera and the thought of guessing was no longer an option, because all the answers are sitting on the screen? Professor Thomas Huang from Beckman Institute has created such a software, a software that has the ability to calculate someone''s emotions much like in math class. This new software can open doors for the future, and give us new advancements in the classroom, but could it be an end to simply asking if some is okay? Could this be the end to emotional privacy, or simply emotions? The new software, the Facial Action Coding System, has many promising attributes and the ability to open new doors for technology and learning environements in the future. It being able to calculate emotions as easily as math problems, could help us decode a lot of the mystery that is the human mind. The process starts wiht a 3-D computer model of the face, and contains information of all the 44 major muscles in a human face and with the help of psychologists, it can classify six emotions and associate them with movements of facial muscles. This type of technology has the ability to be used in a classroom to watch when a student is getting bored or tired to help the teacher better themselves when teaching. Or maybe even help in real world situations when someone goes in front of a jury, or even during interviews for a job. But how accurate is this new software, and how do we know we can rely on it? 
The Facial Action Coding System is seemly flawless with many positive outlooks on what is instore for us in the futrue, but has two big questions sitting on its shoulders. How can we rely on this system? And how do we know that this is even an accurate reading? When thinking about someones emotions, one thinks of the individuals emotions as diverse and extremely different from everyone elses. So how can this software accurately read individual emotions around the world? Dr. Huang says "even though individuals often show varying degrees of expression", he firmly believes that his new software can identify the most mixed emotions with the use of video imagery and emotion recognition. This however, still does not provide us with a good example of how we can be sure that this is a reliable way to read someones emotions and could potetially ruin human socialization and emotions toward one another. While this new software may seem like the "next big thing", there are many ways that this could potentially ruin the human race. Society right now is circling around the phone, it is how we communicate, have a social life, and figure things out. It is in our way of life and we have adapted to the idea of constantly having a phone at our beck and call. So why not add in another way of relying on technology, but this one can help us figure out someones emotions, since simply asking "Are you okay?" is too hard and time consuming. Why not just stick them in front of a camera to have it tell them how they are feeling instead of trying to listen to how they feel and talking with one another. This new Facial Action Coding System has the ability to open many new doors, but it could close the door for human socialization. The Facial Action Coding System is an advancement in technology and has the abilities to open and grow our future. It had the ability to help out in classroom environments and give more support during other real life situations. 
But with it can come the destruction of human socialization and emotions and make a total mess of how we interact with each other. While trying to figure out someones emotions is very difficult and challenging, it may be better to stick with such ways rather than completly giving up what makes us human and causing a catastrophe in the human world. ' - text: 'Mona Lisa was happy 83 percent happy, 9 percent disgust, 6 percent fearful, and 2 percent angry. this show us that somehow computer software can recognize our emotion. at the University of Illinois, working collaboration with prof andd University of Amesterdam are experts at developing better ways for humans and computer to communicate. in the article say "computer reconginze the subtle facial movements we human use to express how we feel". Dr. Huang and Dr. Paul Eckman was working on processing to begins when the computer constructs a 3-D computer modle of the face, Eckman has classified that six basic emotions are happiness, surprise, anger, disgust, fear, and sadness. this is so true because when we have a lot of homework this emotion can relate to us. according to the text " by the weight the different unite, the software can even identify mixed emotions. most of us would havetroble actually describing each facial trait that conveys happy, worried. in fact, we humans perform this ame impressive calculation every day. Dr. Huang computer software stores similar antomical imformation as electronic code, perhaps Dr. Huang''s emotion algorithms are different sort of " Da Vinci code. Imagine a computer also knows that when you''re happy or sad. According to the article " the same technology can amake computer-animated faces more expressive--for video games or video surgery". there is one question " does your expression in the mirror suggest an emotion? yes, emotion instruction for a face that can look happy, sad,,,,etc. They are even indicate the difference between a genuine smile and foreced one. 
But in a false smile, the mouth is stretched sideways using the zygomatic major and different muscle, the risorius. according to the aricle " used to spot when a smilling politician or clebrity isn''t being truthful. Facial feedback theory of emotion, moving four facian muscles not only expresses emotion, but also may even help produce them. Constantin Stanislavsky, had his actors carefully reproduce smilling and frowining as a way of creating these emotions on state. according to the article " Empathy felling may happen because we unconsiously imitate another person''s facial expressions". This is why Dr. Huang and Dr. Eckman was decovery about the emotion.' - text: 'Leonardo Da Vinci''s renaissance painting "Mona Lisa" is one of the famous painting this world has ever known. But there was one question in my mind,"WHAT IS HER EMOTION"? Is she smiling, is she angry, is she sad, what is her emotion. Now this new technology called FACS (Facial Acting Coding System) can measure human emotions. It also measures Mona Lisa''s emotion. Is it valuable in today''s world. Nowdays, unlimited machines have been built to comfort human civilization. Some of them really benefitted, some of not. Now a Human emotion telling machine is built to measure emotions. I think it is valuable because the creator might have built it for purpose. But what I personally think, "IT IS USELESS". WHY?.Let me explain you. Humans are making new machines. But who has the time to test it. Because machines are growing, but the civilization is busy. Some people can''t give their family some time because they got job to do. If they''re done with job, then they have to look for home. Stress increases these days. I think this machine is valuable same as useless. Valuable because it takes a lot of time and years to make. Useless because it has no role to deal with human stress, it reads emotions, that''s pretty cool. But what anout dealing with stress. I hope you like my thought. 
' - text: 'A Cowboy Who Rode the Waves is a program where you get to go on many adventures and visit unique places, but you also get to help those in need. Many countries were left in ruins after World War II, and to help these countries recover their food supplies, animals, and more, nations joined together to form UNRRA. You sign up and can help take care of horses, young cows, and mules. A good reason to join this program is if you like helping people in need. The countries were left in ruins and lots of their supplies and animals were gone. You would get to help recover all of these things and help take care of animals. Another reason to join is that you are allowed to experience many adventures and travel across oceans. Some of the Seagoing Cowboys had the benefit of seeing Europe and China. You would get to cross the Atlantic Ocean from the eastern coast of the United States and make your way to China. There are many other countries to visit, one including Italy. Being a Seagoing Cowboy can be more than just an adventure. Sure you get to tour many great countries, but you also get the benefit of getting to help all those that were affected by World War II.' - text: 'Usage of cars has been decreasing due to the effects it can have on the environment and the opinions of the public. Driving has had a major impact on the atmosphere due to it''s polluting effects. Smog covers highly populated cities where driving is just another way of carrying out everyday-life. Though transportation by car has been a huge help in economic progress, it still comes with a price. If we had no cars there would be less deaths, decreased enviromental instability, and shorter use for our limited amount of fuel. Texting/drinking and driving are some of the biggest causes of death in vehicles. The number of deaths caused by texting or drinking when driving has skyrocketed over the years. 
These areas where driving is prohibited are probably very safe places and the number of deaths brought about by driving are most likely little to none. But life without cars can pose for some serious problems. Yes, it may cause fewer deaths and decrease pollution. But, it will also bring about issues such as; limited transportation of goods, infestation of the homeless (not a joke), and many inexperienced drivers when they are needed. In war, mobile transportation by car or truck is often needed. If people who can''t drive are appointed to tasks such as driving, they won''t be much help and could make things worse. Yes, they could be taught but time is not everlasting. But all negatives aside, the suburban areas of the world could become much safer places without cars. No kids would get accidentily ran-over when their ball rolls into the street and the try to retrieve it. It would just be a much safer environment. Teens have no interest in learning to drive nowadays because they''re either too lazy, or they see the effects it has on the world. of course trains and emergency transportation will be needed though. But regular cars and vehicles aren''t a neccessary attribute to everyday life. In conclusion, cars that don''t serve a neccessary purpose aren''t needed. What are the cars that do? Those vehicles would be firetrucks, ambulances, and other emergency vehicles. But cars ment for our own personal transportation can be left out of the picture. Now if only we could do the same about drugs... 
inference: true
model-index:
- name: SetFit with Leo1212/longformer-base-4096-sentence-transformers-all-nli-stsb-quora-nq
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: qwk
      value: 0.7139985521243908
      name: Qwk
---

# SetFit with Leo1212/longformer-base-4096-sentence-transformers-all-nli-stsb-quora-nq

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [Leo1212/longformer-base-4096-sentence-transformers-all-nli-stsb-quora-nq](https://huggingface.co/Leo1212/longformer-base-4096-sentence-transformers-all-nli-stsb-quora-nq) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
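Conceptually, the contrastive step turns a handful of labeled examples into many training pairs: two texts with the same label form a positive pair, two with different labels a negative pair, and the Sentence Transformer body is fine-tuned to pull positives together and push negatives apart. A minimal, illustrative sketch of that pair labeling in plain Python (this mirrors the idea behind `CosineSimilarityLoss` targets, not SetFit's actual oversampling pair sampler):

```python
from itertools import combinations

def make_contrastive_pairs(texts, labels):
    """Pair every two examples: target 1.0 if their labels match, else 0.0."""
    pairs = []
    for (t1, l1), (t2, l2) in combinations(zip(texts, labels), 2):
        pairs.append((t1, t2, 1.0 if l1 == l2 else 0.0))
    return pairs

# Three toy "essays" with scores: the two score-1 essays form a positive pair,
# every pair involving the score-2 essay is negative.
pairs = make_contrastive_pairs(
    ["essay a", "essay b", "essay c"],
    [1, 1, 2],
)
```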
## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [Leo1212/longformer-base-4096-sentence-transformers-all-nli-stsb-quora-nq](https://huggingface.co/Leo1212/longformer-base-4096-sentence-transformers-all-nli-stsb-quora-nq)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 4098 tokens
- **Number of Classes:** 6 classes

### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label | Examples |
|:------|:---------|
| 1 | |
| 2 | |
| 3 | |
| 4 | |
| 5 | |
| 6 | |

## Evaluation

### Metrics
| Label   | Qwk    |
|:--------|:-------|
| **all** | 0.7140 |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.
```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("HSLU-AICOMP-LearningAgencyLab/automated-essay-scoring-setfit-finetuned")
# Run inference
preds = model("A Cowboy Who Rode the Waves is a program where you get to go on many adventures and visit unique places, but you also get to help those in need. Many countries were left in ruins after World War II, and to help these countries recover their food supplies, animals, and more, nations joined together to form UNRRA. You sign up and can help take care of horses, young cows, and mules. A good reason to join this program is if you like helping people in need. The countries were left in ruins and lots of their supplies and animals were gone. You would get to help recover all of these things and help take care of animals. Another reason to join is that you are allowed to experience many adventures and travel across oceans. Some of the Seagoing Cowboys had the benefit of seeing Europe and China. You would get to cross the Atlantic Ocean from the eastern coast of the United States and make your way to China. There are many other countries to visit, one including Italy. Being a Seagoing Cowboy can be more than just an adventure. Sure you get to tour many great countries, but you also get the benefit of getting to help all those that were affected by World War II.")
```

## Training Details

### Training Set Metrics
| Training set | Min | Median   | Max  |
|:-------------|:----|:---------|:-----|
| Word count   | 151 | 382.3744 | 2010 |

| Label | Training Sample Count |
|:------|:----------------------|
| 1     | 130                   |
| 2     | 130                   |
| 3     | 130                   |
| 4     | 130                   |
| 5     | 100                   |
| 6     | 13                    |

### Training Hyperparameters
- batch_size: (2, 2)
- num_epochs: (10, 10)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 10
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: True
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: True

### Training Results
| Epoch  | Step  | Training Loss | Validation Loss |
|:------:|:-----:|:-------------:|:---------------:|
| 0.0003 | 1     | 0.5439        | -               |
| 1.0    | 3165  | 0.1276        | 0.2650          |
| 2.0    | 6330  | 0.0206        | 0.2915          |
| 3.0    | 9495  | 0.0236        | 0.2984          |
| 4.0    | 12660 | 0.0046        | 0.3119          |
| 5.0    | 15825 | 0.0076        | 0.3003          |
| 6.0    | 18990 | 0.0027        | 0.3009          |

### Framework Versions
- Python: 3.11.9
- SetFit: 1.1.0
- Sentence Transformers: 3.1.1
- Transformers: 4.45.2
- PyTorch: 2.3.1+cu121
- Datasets: 3.0.1
- Tokenizers: 0.20.0

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```
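The Qwk score reported for this model is quadratic weighted kappa: Cohen's kappa with quadratic penalties, so predictions far from the true score are penalized more than near misses. For reference, a self-contained sketch of the computation (assuming 0-based integer labels; scikit-learn's `cohen_kappa_score(..., weights="quadratic")` computes the same quantity):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # Observed confusion matrix O: O[i, j] counts true class i predicted as j.
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic weights W: penalty grows with squared distance between classes.
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected matrix E from the marginals, scaled to the same total as O.
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()
```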