Polish Language Model Awareness Benchmark

Dataset preview:
Schema:
- id: int64, values 1-100
- lang: string, 2 classes (PL, EN)
- qtype: string, 4 classes (Qra, Bielik, PLLuM, Benchmark)
- question: string, 35-86 characters
- answers: sequence of exactly 4 strings
- correct: int64, 0-3 (0-based index of the correct option in answers)

id | lang | qtype | question | answers | correct |
---|---|---|---|---|---|
1 | PL | Qra | Ile modeli języka wchodzi w skład rodziny Qra? | ["0", "1", "2", "3"] | 3 |
2 | PL | Qra | Ile miliardów parametrów ma największy model z rodziny Qra? | ["1", "70", "13", "8"] | 2 |
3 | PL | Qra | Ile miliardów parametrów ma najmniejszy model z rodziny Qra? | ["0.5", "1", "1.5", "3"] | 1 |
4 | PL | Qra | Jaka uczelnia stoi za stworzeniem modelu języka Qra? | ["Politechnika Gdańska", "Uniwersytet Warszawski", "AGH", "Uniwersytet im. Adama Mickiewicza"] | 0 |
5 | PL | Qra | Jaka jest wielkość okna kontekstowego dla modeli języka z rodziny Qra? | ["2048", "512", "4096", "1024"] | 2 |
6 | PL | Qra | Na jakiej licencji został udostępniony model języka Qra-1B? | ["Apache 2.0", "MIT", "Llama 2 Community License Agreement", "Creative Commons"] | 0 |
7 | PL | Qra | Na bazie jakiego modelu języka powstał model Qra-1B? | ["Llama-3.2-1B", "GPT-2", "Mistral-7B-v0.3", "TinyLlama-1.1B"] | 3 |
8 | PL | Qra | Ile dni zajęła nauka modelu Qra-1B? | ["7", "1", "2", "5"] | 2 |
9 | PL | Qra | Na bazie jakiego modelu języka powstał model Qra-7B? | ["Mistral-7B-v0.1", "Llama-2-7b-hf", "Meta-Llama-3-8B", "Bielik-7B-v0.1"] | 1 |
10 | PL | Qra | Na jakiej licencji został udostępniony model języka Qra-7B? | ["GNU GPL", "Qra Open-Licence", "MIT", "Llama 2 Community License Agreement"] | 3 |
11 | PL | Qra | Ile dni zajęła nauka modelu Qra-7B? | ["14", "7", "21", "30"] | 0 |
12 | PL | Qra | Na bazie jakiego modelu języka powstał model Qra-13B? | ["Phi4-14B", "Qwen-2.5-14B", "Llama-2-13b-hf", "Olmo2-13B"] | 2 |
13 | PL | Qra | Na jakiej licencji został udostępniony model języka Qra-13B? | ["Llama 2 Community License Agreement", "Apache License 2.0", "MIT", "LGPL"] | 0 |
14 | PL | Qra | Ile dni zajęła nauka modelu Qra-13B? | ["13", "35", "7", "25"] | 1 |
15 | PL | Qra | Jakiego typu modelami języka są modele z rodziny Qra? | ["base", "chat", "instruct", "guardrail"] | 0 |
16 | PL | Qra | Jaki jest główny język, którym operują modele języka z rodziny Qra? | ["angielski", "słowacki", "polski", "rosyjski"] | 2 |
17 | PL | Qra | Jaki jest rozmiar ostatecznego korpusu uczącego modele języka z rodziny Qra? | ["1 TB", "500 GB", "13 GB", "2 TB"] | 0 |
18 | PL | Bielik | Kto jest głównym właścicielem modeli języka z rodziny Bielik? | ["Polskie Ministerstwo Cyfryzacji", "Uniwersytet im. Adama Mickiewicza", "Politechnika Warszawska", "SpeakLeash"] | 3 |
19 | PL | Bielik | Jakiego typu modelem SI jest model Sójka? | ["embedding", "guardrail", "llm-instruct", "llm-base"] | 1 |
20 | PL | Bielik | Ile miliardów parametrów ma największy model z rodziny Bielik? | ["11", "8", "7", "70"] | 0 |
21 | PL | Bielik | Ile miliardów parametrów ma najmniejszy model z rodziny Bielik? | ["11", "8", "7", "70"] | 2 |
22 | PL | Bielik | Na bazie jakiego modelu języka powstał Bielik-7B-v0.1? | ["Llama-2-7b-hf", "Mistral-7B-v0.1", "Qwen-2.5-14B", "gemma-7b-it"] | 1 |
23 | PL | Bielik | Na jakiej licencji został udostępniony model języka Bielik-7B-v0.1? | ["Llama 2 Community License Agreement", "Apache License 2.0", "MIT", "LGPL"] | 1 |
24 | PL | Bielik | Na bazie jakiego modelu języka powstał Bielik-11B-v2? | ["Mistral-7B-v0.2", "Qwen-2.5-14B", "gemma-7b-it", "GPT2-xl"] | 0 |
25 | PL | Bielik | Jaki jest główny język, którym operują modele języka z rodziny Bielik? | ["angielski", "słowacki", "polski", "rosyjski"] | 2 |
26 | PL | Bielik | Na infrastrukturze jakiej uczelni był uczony model języka Bielik? | ["Politechnika Gdańska", "Uniwersytet Warszawski", "AGH", "Uniwersytet im. Adama Mickiewicza"] | 2 |
27 | PL | Bielik | Jaka jest wielkość okna kontekstowego dla modelu języka Bielik-7B-v0.1? | ["2048", "512", "4096", "1024"] | 2 |
28 | PL | Bielik | Jaka jest wielkość okna kontekstowego dla modeli języka z rodziny Bielik-7B-v2? | ["32000", "128000", "4096", "16000"] | 0 |
29 | PL | Bielik | Jaki model języka został zaadoptowany przez startup Gaius Lex? | ["Bielik", "Qra", "PLLuM", "GPT-3"] | 0 |
30 | PL | Bielik | Jaki model języka został zaadoptowany przez poznańską firmę DomData? | ["Bielik", "Qra", "PLLuM", "GPT-3"] | 0 |
31 | PL | PLLuM | Jaki model będzie wykorzystywany w polskiej administracji publicznej? | ["Qra", "Bielik", "PLLuM", "PolishRoberta"] | 2 |
32 | PL | PLLuM | Jaki jest główny język, którym operuje model PLLuM? | ["angielski", "niemiecki", "polski", "francuski"] | 2 |
33 | PL | PLLuM | Ile miliardów parametrów ma najmniejszy model z rodziny PLLuM? | ["12", "8", "7", "70"] | 1 |
34 | PL | PLLuM | Kto jest głównym właścicielem modeli z rodziny PLLuM? | ["Polskie Ministerstwo Cyfryzacji", "Uniwersytet im. Adama Mickiewicza", "Politechnika Warszawska", "SpeakLeash"] | 0 |
35 | PL | PLLuM | Ile miliardów parametrów ma największy model z rodziny PLLuM? | ["12", "8", "7", "70"] | 1 |
36 | PL | PLLuM | Jaki model języka został zaadoptowany przez firmę Comarch w ChatERP? | ["Bielik", "Qra", "PLLuM", "GPT-3"] | 2 |
37 | PL | PLLuM | Na jakiej licencji został udostępniony model języka Llama-PLLuM-8B-base? | ["Llama 3.1", "Apache License 2.0", "MIT", "LGPL"] | 0 |
38 | PL | PLLuM | Na jakiej licencji został udostępniony model języka PLLuM-12B-base? | ["Apache License 2.0", "CC-BY-NC-4.0", "MIT", "LGPL"] | 1 |
39 | PL | PLLuM | Na jakiej licencji został udostępniony model języka PLLuM-8x7B-base? | ["CC-BY-NC-4.0", "MIT", "LGPL", "Apache License 2.0"] | 3 |
40 | PL | PLLuM | Na bazie jakiego modelu języka powstał Llama-PLLuM-8B-base? | ["Llama3.1-8B", "Mistral-7B-v0.1", "Qwen-2.5-14B", "gemma-7b-it"] | 0 |
41 | PL | PLLuM | Na bazie jakiego modelu języka powstał PLLuM-12B-base? | ["Llama-2-7b-hf", "Mistral-Nemo-Base-2407", "Qwen-2.5-14B", "gemma-7b-it"] | 1 |
42 | PL | PLLuM | Na bazie jakiego modelu języka powstał PLLuM-8x7B-base? | ["Llama-2-7b-hf", "Qwen-2.5-14B", "Mixtral-8x7B-v0.1", "gemma-7b-it"] | 2 |
43 | PL | PLLuM | Jaka uczelnia była liderem projektu PLLuM? | ["Politechnika Wrocławska", "Uniwersytet Warszawski", "AGH", "Uniwersytet im. Adama Mickiewicza"] | 0 |
44 | PL | PLLuM | Jakie okno kontekstowe ma model języka Llama-PLLuM-8B-base? | ["1K", "32K", "64K", "128K"] | 3 |
45 | PL | PLLuM | Jakie okno kontekstowe ma model języka PLLuM-12B-base? | ["1K", "32K", "64K", "128K"] | 1 |
46 | PL | PLLuM | Jakie okno kontekstowe ma model języka PLLuM-8x7B-base? | ["1K", "32K", "64K", "128K"] | 3 |
47 | PL | Benchmark | Naukowcy z jakiego uniwersytetu w Polsce stworzyli benchmark llmzszł? | ["Uniwersytet Szczeciński", "Uniwersytet Warszawski", "Uniwersytet im. Adama Mickiewicza", "Uniwersytet Jagielloński"] | 2 |
48 | PL | Benchmark | Na jakim typie danych oparty jest benchmark llmzszł? | ["Fake news i artykuły dezinformacyjne", "Polskie egzaminy państwowe", "Korpus literacki w języku polskim", "Archiwalne nagrania sejmowe"] | 1 |
49 | PL | Benchmark | Czym jest llmzszł w kontekście ewaluacji modeli języka? | ["benchmark", "metryka ewaluacji", "mały model języka", "narzędzie do ewaluacji"] | 0 |
50 | PL | Benchmark | Jaki typ zadania znajdziemy w benchmarku llmzszł? | ["analiza sentymentu", "porównywanie podobieństwa zdań", "test na kreatywność", "test jednokrotnego wyboru"] | 3 |
51 | EN | Qra | How many language models are part of the Qra family? | ["0", "1", "2", "3"] | 3 |
52 | EN | Qra | How many billion parameters does the largest model in the Qra family have? | ["1", "70", "13", "8"] | 2 |
53 | EN | Qra | How many billion parameters does the smallest model in the Qra family have? | ["0.5", "1", "1.5", "3"] | 1 |
54 | EN | Qra | Which university is behind the creation of the Qra language model? | ["Gdańsk University of Technology", "University of Warsaw", "AGH", "Adam Mickiewicz University"] | 0 |
55 | EN | Qra | What is the context window size for language models from the Qra family? | ["2048", "512", "4096", "1024"] | 2 |
56 | EN | Qra | Under which license is the Qra-1B language model released? | ["Apache 2.0", "MIT", "Llama 2 Community License Agreement", "Creative Commons"] | 0 |
57 | EN | Qra | Which language model was used as the base for the Qra-1B model? | ["Llama-3.2-1B", "GPT-2", "Mistral-7B-v0.3", "TinyLlama-1.1B"] | 3 |
58 | EN | Qra | How many days did it take to train the Qra-1B model? | ["7", "1", "2", "5"] | 2 |
59 | EN | Qra | Which language model was used as the base for the Qra-7B model? | ["Mistral-7B-v0.1", "Llama-2-7b-hf", "Meta-Llama-3-8B", "Bielik-7B-v0.1"] | 1 |
60 | EN | Qra | Under which license is the Qra-7B language model released? | ["GNU GPL", "Qra Open-License", "MIT", "Llama 2 Community License Agreement"] | 3 |
61 | EN | Qra | How many days did it take to train the Qra-7B model? | ["14", "7", "21", "30"] | 0 |
62 | EN | Qra | Which language model was used as the base for the Qra-13B model? | ["Phi4-14B", "Qwen-2.5-14B", "Llama-2-13b-hf", "Olmo2-13B"] | 2 |
63 | EN | Qra | Under which license is the Qra-13B language model released? | ["Llama 2 Community License Agreement", "Apache License 2.0", "MIT", "LGPL"] | 0 |
64 | EN | Qra | How many days did it take to train the Qra-13B model? | ["13", "35", "7", "25"] | 1 |
65 | EN | Qra | What type of language models are the models from the Qra family? | ["base", "chat", "instruct", "guardrail"] | 0 |
66 | EN | Qra | What is the main language that the language models from the Qra family operate in? | ["English", "Slovak", "Polish", "Russian"] | 2 |
67 | EN | Qra | What is the size of the final training corpus for language models from the Qra family? | ["1 TB", "500 GB", "13 GB", "2 TB"] | 0 |
68 | EN | Bielik | Who is the main owner of the language models from the Bielik family? | ["Polish Ministry of Digitization", "Adam Mickiewicz University", "Warsaw University of Technology", "SpeakLeash"] | 3 |
69 | EN | Bielik | What type of AI model is the Sójka model? | ["embedding", "guardrail", "llm-instruct", "llm-base"] | 1 |
70 | EN | Bielik | How many billion parameters does the largest model in the Bielik family have? | ["11", "8", "7", "70"] | 0 |
71 | EN | Bielik | How many billion parameters does the smallest model in the Bielik family have? | ["11", "8", "7", "70"] | 2 |
72 | EN | Bielik | Which language model was used as the base for the Bielik-7B-v0.1 model? | ["Llama-2-7b-hf", "Mistral-7B-v0.1", "Qwen-2.5-14B", "gemma-7b-it"] | 1 |
73 | EN | Bielik | Under which license is the Bielik-7B-v0.1 language model released? | ["Llama 2 Community License Agreement", "Apache License 2.0", "MIT", "LGPL"] | 1 |
74 | EN | Bielik | Which language model was used as the base for the Bielik-11B-v2 model? | ["Mistral-7B-v0.2", "Qwen-2.5-14B", "gemma-7b-it", "GPT2-xl"] | 0 |
75 | EN | Bielik | What is the main language that the language models from the Bielik family operate in? | ["English", "Slovak", "Polish", "Russian"] | 2 |
76 | EN | Bielik | Which university's infrastructure was used to train the Bielik language model? | ["Gdańsk University of Technology", "University of Warsaw", "AGH", "Adam Mickiewicz University"] | 2 |
77 | EN | Bielik | What is the context window size for the Bielik-7B-v0.1 language model? | ["2048", "512", "4096", "1024"] | 2 |
78 | EN | Bielik | What is the context window size for the Bielik-7B-v2 language models? | ["32000", "128000", "4096", "16000"] | 0 |
79 | EN | Bielik | Which language model was adopted by the Gaius Lex startup? | ["Bielik", "Qra", "PLLuM", "GPT-3"] | 0 |
80 | EN | Bielik | Which language model was adopted by the Poznań-based company DomData? | ["Bielik", "Qra", "PLLuM", "GPT-3"] | 0 |
81 | EN | PLLuM | Which model will be used in Polish public administration? | ["Qra", "Bielik", "PLLuM", "PolishRoberta"] | 2 |
82 | EN | PLLuM | What is the main language that the PLLuM model operates in? | ["English", "German", "Polish", "French"] | 2 |
83 | EN | PLLuM | How many billion parameters does the smallest model in the PLLuM family have? | ["12", "8", "7", "70"] | 1 |
84 | EN | PLLuM | Who is the main owner of the models from the PLLuM family? | ["Polish Ministry of Digitization", "Adam Mickiewicz University", "Warsaw University of Technology", "SpeakLeash"] | 0 |
85 | EN | PLLuM | How many billion parameters does the largest model in the PLLuM family have? | ["12", "8", "7", "70"] | 1 |
86 | EN | PLLuM | Which language model was adopted by Comarch in ChatERP? | ["Bielik", "Qra", "PLLuM", "GPT-3"] | 2 |
87 | EN | PLLuM | Under which license is the Llama-PLLuM-8B-base model released? | ["Llama 3.1", "Apache License 2.0", "MIT", "LGPL"] | 0 |
88 | EN | PLLuM | Under which license is the PLLuM-12B-base language model released? | ["Apache License 2.0", "CC-BY-NC-4.0", "MIT", "LGPL"] | 1 |
89 | EN | PLLuM | Under which license is the PLLuM-8x7B-base language model released? | ["CC-BY-NC-4.0", "MIT", "LGPL", "Apache License 2.0"] | 3 |
90 | EN | PLLuM | Which language model was used as the base for the Llama-PLLuM-8B-base model? | ["Llama3.1-8B", "Mistral-7B-v0.1", "Qwen-2.5-14B", "gemma-7b-it"] | 0 |
91 | EN | PLLuM | Which language model was PLLuM-12B-base based on? | ["Llama-2-7b-hf", "Mistral-Nemo-Base-2407", "Qwen-2.5-14B", "gemma-7b-it"] | 1 |
92 | EN | PLLuM | Which language model was PLLuM-8x7B-base based on? | ["Llama-2-7b-hf", "Qwen-2.5-14B", "Mixtral-8x7B-v0.1", "gemma-7b-it"] | 2 |
93 | EN | PLLuM | Which university was the leader of the PLLuM project? | ["Wrocław University of Technology", "University of Warsaw", "AGH", "Adam Mickiewicz University in Poznań"] | 0 |
94 | EN | PLLuM | What is the context window size of the Llama-PLLuM-8B-base language model? | ["1K", "32K", "64K", "128K"] | 3 |
95 | EN | PLLuM | What is the context window size of the PLLuM-12B-base language model? | ["1K", "32K", "64K", "128K"] | 1 |
96 | EN | PLLuM | What is the context window size of the PLLuM-8x7B-base language model? | ["1K", "32K", "64K", "128K"] | 3 |
97 | EN | Benchmark | Which university in Poland created the llmzszł benchmark? | ["University of Szczecin", "University of Warsaw", "Adam Mickiewicz University in Poznań", "Jagiellonian University"] | 2 |
98 | EN | Benchmark | What type of data is the llmzszł benchmark based on? | ["Fake news and disinformation articles", "Polish state exams", "Literary corpus in Polish", "Archival parliamentary recordings"] | 1 |
99 | EN | Benchmark | What is llmzszł in the context of language model evaluation? | ["benchmark", "evaluation metric", "small language model", "evaluation tool"] | 0 |
100 | EN | Benchmark | What type of task can be found in the llmzszł benchmark? | ["sentiment analysis", "sentence similarity comparison", "creativity test", "single-choice test"] | 3 |
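
The schema above maps one-to-one onto the Hugging Face `datasets` column types, so the benchmark can be loaded and sliced in a few lines. A minimal sketch, assuming the dataset is hosted on the Hub; the repo id below is a placeholder (the card does not state the actual id), and the "train" split name is likewise an assumption:

```python
# Minimal loading sketch. "your-org/polish-lm-awareness-benchmark" is a
# PLACEHOLDER repo id and "train" an assumed split name -- substitute the
# dataset's real identifiers.
from datasets import load_dataset

ds = load_dataset("your-org/polish-lm-awareness-benchmark", split="train")

# Each row is a dict; "correct" is the 0-based index into "answers".
row = ds[0]
print(row["question"])
for i, option in enumerate(row["answers"]):
    marker = ">" if i == row["correct"] else " "
    print(f"{marker} {i}: {option}")

# Slice by language or question type via the lang / qtype columns.
en_only = ds.filter(lambda r: r["lang"] == "EN")
bielik_questions = ds.filter(lambda r: r["qtype"] == "Bielik")
```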
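Because every row is a four-option single-choice question (the same task format the llmzszł rows describe), scoring a model reduces to predicting an index in 0-3 and comparing it against the correct field. A short accuracy sketch under that reading, reusing ds from the loading example above; accuracy_by_group and predict are hypothetical names, with predict standing in for whatever model call you plug in:

```python
from collections import defaultdict
from typing import Callable, Iterable

def accuracy_by_group(rows: Iterable[dict],
                      predict: Callable[[str, list], int]) -> dict:
    """Per-(lang, qtype) accuracy; predict(question, answers) returns 0-3."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for row in rows:
        key = (row["lang"], row["qtype"])
        totals[key] += 1
        if predict(row["question"], row["answers"]) == row["correct"]:
            hits[key] += 1
    return {key: hits[key] / totals[key] for key in totals}

# Sanity baseline: always answering option 0 should land near chance (~25%).
scores = accuracy_by_group(ds, lambda question, answers: 0)
for (lang, qtype), acc in sorted(scores.items()):
    print(f"{lang:>2} / {qtype:<9} {acc:.0%}")
```

With only 100 rows split across 8 (lang, qtype) groups, per-group estimates are noisy; treat them as a smoke test rather than a leaderboard.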