Dataset Viewer
A single-row dataset: the lm-evaluation-harness results record for a zero-shot IFEval + MMLU run. Columns, types, and values (long JSON values are truncated by the viewer):

| Field | Type | Value |
|---|---|---|
| results | dict | {"ifeval":{"alias":"ifeval","prompt_level_strict_acc,none":0.75,"prompt_level_strict_acc_stderr,none(...TRUNCATED) |
| groups | dict | {"mmlu":{"acc,none":0.71875,"acc_stderr,none":0.011540537619239823,"alias":"mmlu"},"mmlu_humanities"(...TRUNCATED) |
| group_subtasks | dict | {"ifeval":[],"mmlu_humanities":["mmlu_philosophy","mmlu_moral_scenarios","mmlu_international_law","m(...TRUNCATED) |
| configs | dict | {"ifeval":{"task":"ifeval","dataset_path":"google/IFEval","test_split":"train","doc_to_text":"prompt(...TRUNCATED) |
| versions | dict | {"ifeval":4.0,"mmlu":2,"mmlu_abstract_algebra":1.0,"mmlu_anatomy":1.0,"mmlu_astronomy":1.0,"mmlu_bus(...TRUNCATED) |
| n-shot | dict | {"ifeval":0,"mmlu_abstract_algebra":0,"mmlu_anatomy":0,"mmlu_astronomy":0,"mmlu_business_ethics":0,"(...TRUNCATED) |
| higher_is_better | dict | {"ifeval":{"prompt_level_strict_acc":true,"inst_level_strict_acc":true,"prompt_level_loose_acc":true(...TRUNCATED) |
| n-samples | dict | {"mmlu_high_school_chemistry":{"original":203,"effective":16},"mmlu_college_computer_science":{"orig(...TRUNCATED) |
| config | dict | {"model":"vllm","model_args":"pretrained=/home/quixi/Mango/models/PocketDoc_Dans-PersonalityEngine-V(...TRUNCATED) |
| git_hash | string | 3f792954 |
| date | float64 | 1749007490.84404 |
| pretty_env_info | string | "PyTorch version: 2.7.0+cu126\nIs debug build: False\nCUDA used to build PyTorch: 12.6\nROCM used to(...TRUNCATED) |
| transformers_version | string | 4.52.4 |
| lm_eval_version | string | 0.4.8 |
| upper_git_hash | null | null |
| tokenizer_pad_token | sequence | ["<pad>", "11"] |
| tokenizer_eos_token | sequence | ["<\|im_end\|>", "2"] |
| tokenizer_bos_token | sequence | ["<s>", "1"] |
| eot_token_id | int64 | 2 |
| max_length | int64 | 8192 |
| task_hashes | dict | {"mmlu_high_school_chemistry":"04b8485697f03f4b11ff38151e36940bd9dbf5ca082ac4064e997761b889578d","mm(...TRUNCATED) |
| model_source | string | vllm |
| model_name | string | /home/quixi/Mango/models/PocketDoc_Dans-PersonalityEngine-V1.2.0-24b |
| model_name_sanitized | string | __home__quixi__Mango__models__PocketDoc_Dans-PersonalityEngine-V1.2.0-24b |
| system_instruction | null | null |
| system_instruction_sha | null | null |
| fewshot_as_multiturn | bool | true |
| chat_template | string | "{{ bos_token }}{%- set loop_messages = messages %}\n{%- for message in loop_messages %}\n    {%- se(...TRUNCATED) |
| chat_template_sha | string | fb4aa2c12fdd53c3a2bb6a85ecdfc0082988bc68212e14f3a7c2f8055b0d84ac |
| start_time | float64 | 87247.277149 |
| end_time | float64 | 87894.242174 |
| total_evaluation_time_seconds | string | 646.965024264995 |
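The headline scores live in the `results` and `groups` columns, keyed by `"<metric>,<filter>"` pairs such as `acc,none`. A minimal sketch of pulling them out, assuming the row above was exported from a standard lm-evaluation-harness results file; `results.json` is a hypothetical local path:

```python
import json

# Load an lm-evaluation-harness results file (hypothetical local path).
with open("results.json") as f:
    run = json.load(f)

# Group-level MMLU accuracy and task-level IFEval strict accuracy,
# matching the values shown in the table above.
mmlu_acc = run["groups"]["mmlu"]["acc,none"]                              # 0.71875
ifeval_strict = run["results"]["ifeval"]["prompt_level_strict_acc,none"]  # 0.75

print(f"MMLU acc: {mmlu_acc:.4f}")
print(f"IFEval prompt-level strict acc: {ifeval_strict:.4f}")
```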
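The `config` column records how the run was produced: the `vllm` backend, the local model path from `model_name`, and zero-shot prompting per the `n-shot` column. A minimal sketch of re-running it with the lm-evaluation-harness (v0.4.8) Python API; `limit=16` is an assumption inferred from the `effective` counts in `n-samples`, since the full config JSON is truncated:

```python
import lm_eval

# Sketch of reproducing this run. The exact original arguments are
# truncated in the config column, so treat limit=16 as an inference
# from the "effective" sample counts rather than a confirmed setting.
results = lm_eval.simple_evaluate(
    model="vllm",
    model_args="pretrained=/home/quixi/Mango/models/PocketDoc_Dans-PersonalityEngine-V1.2.0-24b",
    tasks=["ifeval", "mmlu"],
    num_fewshot=0,
    limit=16,
)
print(results["results"]["ifeval"])
```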