Intro

GLiNER is a Named Entity Recognition (NER) model capable of identifying any entity type using bidirectional transformer encoders (BERT-like). It provides a practical alternative to traditional NER models, which are limited to predefined entity types, and to Large Language Models (LLMs), which, despite their flexibility, are costly and too large for resource-constrained scenarios.

This version uses a bi-encoder architecture: the textual encoder is team-lucid/DeBERTa v3 small, and the entity label encoder is a sentence transformer, BAAI/bge-m3.

Such an architecture offers several advantages over the uni-encoder GLiNER:

  • An unlimited number of entity types can be recognized at once;
  • Faster inference when entity embeddings are precomputed;
  • Better generalization to unseen entities.

However, it also has drawbacks, such as the lack of inter-label interaction, which makes it harder for the model to disambiguate semantically similar but contextually different entities.
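The bi-encoder idea can be shown with a toy numeric sketch: label embeddings are computed once and then reused to score every text span against all labels. All embeddings, labels, and the dot-product scoring below are made up for illustration; GLiNER's actual span scoring is learned, not a plain dot product.

```python
# Toy sketch of the bi-encoder idea (illustrative only, not GLiNER's real scoring).
label_names = ["person name", "country"]
label_embs = [[1.0, 0.0], [0.0, 1.0]]   # pretend label-encoder outputs, computed once

span_embs = {                            # pretend text-encoder outputs
    "Christopher Nolan": [0.9, 0.1],
    "England": [0.2, 0.8],
}

def dot(a, b):
    """Similarity between a span embedding and a label embedding."""
    return sum(x * y for x, y in zip(a, b))

predictions = {}
for span, emb in span_embs.items():
    scores = [dot(label_emb, emb) for label_emb in label_embs]
    predictions[span] = label_names[scores.index(max(scores))]

print(predictions)  # {'Christopher Nolan': 'person name', 'England': 'country'}
```

Because the label side is independent of the text side, adding a new label only requires one extra embedding, which is why the label set is effectively unlimited.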


Installation & Usage

Install or update the gliner package:

pip install "gliner>=0.2.16"
pip install python-mecab-ko

Once you've installed the gliner package, you can import the GLiNER class, load this model with GLiNER.from_pretrained, and predict entities with predict_entities.

from gliner import GLiNER

model = GLiNER.from_pretrained("lots-o/gliner-bi-ko-small-v1")

text = """ํฌ๋ฆฌ์Šคํ† ํผ ๋†€๋ž€(Christopher Nolan) ์€ ์˜๊ตญ์˜ ์˜ํ™” ๊ฐ๋…, ๊ฐ๋ณธ๊ฐ€, ์˜ํ™” ํ”„๋กœ๋“€์„œ์ด๋‹ค. ๊ทธ์˜ ๋Œ€ํ‘œ์ž‘์œผ๋กœ๋Š” 2008๋…„ ๊ฐœ๋ด‰ํ•œ ใ€Š๋‹คํฌ ๋‚˜์ดํŠธใ€‹ ์‹œ๋ฆฌ์ฆˆ๊ฐ€ ์žˆ์œผ๋ฉฐ, ํŠนํžˆ ใ€Š๋‹คํฌ ๋‚˜์ดํŠธใ€‹(2008)์˜ ๊ฐ๋…์œผ๋กœ ๊ฐ€์žฅ ์œ ๋ช…ํ•˜๋‹ค. ์ด ์˜ํ™”๋Š” ๋ฐฐํŠธ๋งจ ์บ๋ฆญํ„ฐ๋ฅผ ์ค‘์‹ฌ์œผ๋กœ ํ•œ ์Šˆํผํžˆ์–ด๋กœ ์˜ํ™”๋กœ, ํžˆ์Šค ๋ ˆ์ €์˜ ์กฐ์ปค ์—ญํ• ์ด ํฐ ์ธ๊ธฐ๋ฅผ ๋Œ์—ˆ๋‹ค. ๋˜ํ•œ, 2010๋…„์— ๊ฐœ๋ด‰ํ•œ ใ€Š์ธ์…‰์…˜ใ€‹(2010)์€ ๋ณต์žกํ•œ ์‹œ๊ฐ„๊ณผ ๊ฟˆ์˜ ๊ฐœ๋…์„ ๋‹ค๋ฃฌ SF ์˜ํ™”๋กœ, ์˜ํ™” ์ œ์ž‘ ๋ฐฉ์‹๊ณผ ์Šคํ† ๋ฆฌ ์ „๊ฐœ์—์„œ ํ˜์‹ ์ ์ธ ์ ‘๊ทผ์„ ์„ ๋ณด์˜€๋‹ค. ํฌ๋ฆฌ์Šคํ† ํผ ๋†€๋ž€์€ ์‹œ๊ฐ„ ์—ฌํ–‰๊ณผ ๋‹ค์ฐจ์›์  ์ด์•ผ๊ธฐ๋ฅผ ํƒ๊ตฌํ•˜๋Š” ์˜ํ™”๋“ค์„ ํ†ตํ•ด ํ˜„๋Œ€ ์˜ํ™”๊ณ„์—์„œ ์ค‘์š”ํ•œ ๊ฐ๋…์œผ๋กœ ์ž๋ฆฌ๋งค๊น€ํ–ˆ๋‹ค.
"""

labels = [
    "์˜ํ™”/์†Œ์„ค ์ž‘ํ’ˆ๋ช…",
    "์‚ฌ๋žŒ ์ด๋ฆ„",
    "์บ๋ฆญํ„ฐ ์ด๋ฆ„",
    "์ง์—…๋ช…",
    "๋‚ ์งœ_์—ฐ(๋…„)",
    "๋‚ ์งœ_์ผ",
    "๋‚ ์งœ_๋‹ฌ(์›”)",
    "๊ตญ๊ฐ€"
]

entities = model.predict_entities(text, labels, threshold=0.2)

for entity in entities:
    print(entity["text"], "=>", entity["label"])
ํฌ๋ฆฌ์Šคํ† ํผ ๋†€๋ž€ => ์‚ฌ๋žŒ ์ด๋ฆ„
Christopher Nolan => ์‚ฌ๋žŒ ์ด๋ฆ„
์˜๊ตญ => ๊ตญ๊ฐ€
์˜ํ™” ๊ฐ๋… => ์ง์—…๋ช…
๊ฐ๋ณธ๊ฐ€ => ์ง์—…๋ช…
์˜ํ™” ํ”„๋กœ๋“€์„œ => ์ง์—…๋ช…
2008๋…„ => ๋‚ ์งœ_์—ฐ(๋…„)
๋‹คํฌ ๋‚˜์ดํŠธ => ์˜ํ™”/์†Œ์„ค ์ž‘ํ’ˆ๋ช…
๋‹คํฌ ๋‚˜์ดํŠธ => ์˜ํ™”/์†Œ์„ค ์ž‘ํ’ˆ๋ช…
2008 => ๋‚ ์งœ_์—ฐ(๋…„)
๊ฐ๋… => ์ง์—…๋ช…
๋ฐฐํŠธ๋งจ => ์บ๋ฆญํ„ฐ ์ด๋ฆ„
ํžˆ์Šค ๋ ˆ์ € => ์‚ฌ๋žŒ ์ด๋ฆ„
์กฐ์ปค => ์บ๋ฆญํ„ฐ ์ด๋ฆ„
2010๋…„ => ๋‚ ์งœ_์—ฐ(๋…„)
์ธ์…‰์…˜ => ์˜ํ™”/์†Œ์„ค ์ž‘ํ’ˆ๋ช…
2010 => ๋‚ ์งœ_์—ฐ(๋…„)
ํฌ๋ฆฌ์Šคํ† ํผ ๋†€๋ž€ => ์‚ฌ๋žŒ ์ด๋ฆ„
๊ฐ๋… => ์ง์—…๋ช…

If you have a large number of entities and want to pre-embed them, refer to the following code snippet:

labels = ["your entities"]
texts = ["your texts"]

# Embed the labels once
entity_embeddings = model.encode_labels(labels, batch_size=8)

# Reuse the precomputed label embeddings across all texts
outputs = model.batch_predict_with_embeds(texts, entity_embeddings, labels)

Dataset

TTA 150

The TTA 150 tag set is mapped to natural-language Korean labels as follows:
entity_type_mapping = {
    "PS": {
        "PS_NAME": "์ธ๋ฌผ_์‚ฌ๋žŒ",
        "PS_CHARACTER": "์ธ๋ฌผ_๊ฐ€์ƒ ์บ๋ฆญํ„ฐ",
        "PS_PET": "์ธ๋ฌผ_๋ฐ˜๋ ค๋™๋ฌผ",
    },
    "FD": {
        "FD_SCIENCE": "ํ•™๋ฌธ ๋ถ„์•ผ_๊ณผํ•™",
        "FD_SOCIAL_SCIENCE": "ํ•™๋ฌธ ๋ถ„์•ผ_์‚ฌํšŒ๊ณผํ•™",
        "FD_MEDICINE": "ํ•™๋ฌธ ๋ถ„์•ผ_์˜ํ•™",
        "FD_ART": "ํ•™๋ฌธ ๋ถ„์•ผ_์˜ˆ์ˆ ",
        "FD_HUMANITIES": "ํ•™๋ฌธ ๋ถ„์•ผ_์ธ๋ฌธํ•™",
        "FD_OTHERS": "ํ•™๋ฌธ ๋ถ„์•ผ_๊ธฐํƒ€",
    },
    "TR": {
        "TR_SCIENCE": "์ด๋ก _๊ณผํ•™",
        "TR_SOCIAL_SCIENCE": "์ด๋ก _์‚ฌํšŒ๊ณผํ•™",
        "TR_MEDICINE": "์ด๋ก _์˜ํ•™",
        "TR_ART": "์ด๋ก _์˜ˆ์ˆ ",
        "TR_HUMANITIES": "์ด๋ก _์ฒ ํ•™/์–ธ์–ด/์—ญ์‚ฌ",
        "TR_OTHERS": "์ด๋ก _๊ธฐํƒ€",
    },
    "AF": {
        "AF_BUILDING": "์ธ๊ณต๋ฌผ_๊ฑด์ถ•๋ฌผ/ํ† ๋ชฉ๊ฑด์„ค๋ฌผ",
        "AF_CULTURAL_ASSET": "์ธ๊ณต๋ฌผ_๋ฌธํ™”์žฌ",
        "AF_ROAD": "์ธ๊ณต๋ฌผ_๋„๋กœ/์ฒ ๋กœ",
        "AF_TRANSPORT": "์ธ๊ณต๋ฌผ_๊ตํ†ต์ˆ˜๋‹จ/์šด์†ก์ˆ˜๋‹จ",
        "AF_MUSICAL_INSTRUMENT": "์ธ๊ณต๋ฌผ_์•…๊ธฐ",
        "AF_WEAPON": "์ธ๊ณต๋ฌผ_๋ฌด๊ธฐ",
        "AFA_DOCUMENT": "์ธ๊ณต๋ฌผ_๋„์„œ/์„œ์  ์ž‘ํ’ˆ๋ช…",
        "AFA_PERFORMANCE": "์ธ๊ณต๋ฌผ_์ถค/๊ณต์—ฐ/์—ฐ๊ทน ์ž‘ํ’ˆ๋ช…",
        "AFA_VIDEO": "์ธ๊ณต๋ฌผ_์˜ํ™”/TV ํ”„๋กœ๊ทธ๋žจ",
        "AFA_ART_CRAFT": "์ธ๊ณต๋ฌผ_๋ฏธ์ˆ /์กฐํ˜• ์ž‘ํ’ˆ๋ช…",
        "AFA_MUSIC": "์ธ๊ณต๋ฌผ_์Œ์•… ์ž‘ํ’ˆ๋ช…",
        "AFW_SERVICE_PRODUCTS": "์ธ๊ณต๋ฌผ_์„œ๋น„์Šค ์ƒํ’ˆ",
        "AFW_OTHER_PRODUCTS": "์ธ๊ณต๋ฌผ_๊ธฐํƒ€ ์ƒํ’ˆ",
    },
    "OG": {
        "OGG_ECONOMY": "๊ธฐ๊ด€_๊ฒฝ์ œ",
        "OGG_EDUCATION": "๊ธฐ๊ด€_๊ต์œก",
        "OGG_MILITARY": "๊ธฐ๊ด€_๊ตฐ์‚ฌ",
        "OGG_MEDIA": "๊ธฐ๊ด€_๋ฏธ๋””์–ด",
        "OGG_SPORTS": "๊ธฐ๊ด€_์Šคํฌ์ธ ",
        "OGG_ART": "๊ธฐ๊ด€_์˜ˆ์ˆ ",
        "OGG_MEDICINE": "๊ธฐ๊ด€_์˜๋ฃŒ",
        "OGG_RELIGION": "๊ธฐ๊ด€_์ข…๊ต",
        "OGG_SCIENCE": "๊ธฐ๊ด€_๊ณผํ•™",
        "OGG_LIBRARY": "๊ธฐ๊ด€_๋„์„œ๊ด€",
        "OGG_LAW": "๊ธฐ๊ด€_๋ฒ•๋ฅ ",
        "OGG_POLITICS": "๊ธฐ๊ด€_์ •๋ถ€/๊ณต๊ณต",
        "OGG_FOOD": "๊ธฐ๊ด€_์Œ์‹ ์—…์ฒด",
        "OGG_HOTEL": "๊ธฐ๊ด€_์ˆ™๋ฐ• ์—…์ฒด",
        "OGG_OTHERS": "๊ธฐ๊ด€_๊ธฐํƒ€",
    },
    "LC": {
        "LCP_COUNTRY": "์žฅ์†Œ_๊ตญ๊ฐ€",
        "LCP_PROVINCE": "์žฅ์†Œ_๋„/์ฃผ ์ง€์—ญ",
        "LCP_COUNTY": "์žฅ์†Œ_์„ธ๋ถ€ ํ–‰์ •๊ตฌ์—ญ",
        "LCP_CITY": "์žฅ์†Œ_๋„์‹œ",
        "LCP_CAPITALCITY": "์žฅ์†Œ_์ˆ˜๋„",
        "LCG_RIVER": "์žฅ์†Œ_๊ฐ•/ํ˜ธ์ˆ˜",
        "LCG_OCEAN": "์žฅ์†Œ_๋ฐ”๋‹ค",
        "LCG_BAY": "์žฅ์†Œ_๋ฐ˜๋„/๋งŒ",
        "LCG_MOUNTAIN": "์žฅ์†Œ_์‚ฐ/์‚ฐ๋งฅ",
        "LCG_ISLAND": "์žฅ์†Œ_์„ฌ",
        "LCG_CONTINENT": "์žฅ์†Œ_๋Œ€๋ฅ™",
        "LC_SPACE": "์žฅ์†Œ_์ฒœ์ฒด",
        "LC_OTHERS": "์žฅ์†Œ_๊ธฐํƒ€",
    },
    "CV": {
        "CV_CULTURE": "๋ฌธ๋ช…_๋ฌธ๋ช…/๋ฌธํ™”",
        "CV_TRIBE": "๋ฌธ๋ช…_๋ฏผ์กฑ/์ข…์กฑ",
        "CV_LANGUAGE": "๋ฌธ๋ช…_์–ธ์–ด",
        "CV_POLICY": "๋ฌธ๋ช…_์ œ๋„/์ •์ฑ…",
        "CV_LAW": "๋ฌธ๋ช…_๋ฒ•/๋ฒ•๋ฅ ",
        "CV_CURRENCY": "๋ฌธ๋ช…_ํ†ตํ™”",
        "CV_TAX": "๋ฌธ๋ช…_์กฐ์„ธ",
        "CV_FUNDS": "๋ฌธ๋ช…_์—ฐ๊ธˆ/๊ธฐ๊ธˆ",
        "CV_ART": "๋ฌธ๋ช…_์˜ˆ์ˆ ",
        "CV_SPORTS": "๋ฌธ๋ช…_์Šคํฌ์ธ ",
        "CV_SPORTS_POSITION": "๋ฌธ๋ช…_์Šคํฌ์ธ  ํฌ์ง€์…˜",
        "CV_SPORTS_INST": "๋ฌธ๋ช…_์Šคํฌ์ธ  ์šฉํ’ˆ/๋„๊ตฌ",
        "CV_PRIZE": "๋ฌธ๋ช…_์ƒ/ํ›ˆ์žฅ",
        "CV_RELATION": "๋ฌธ๋ช…_๊ฐ€์กฑ/์นœ์กฑ ๊ด€๊ณ„",
        "CV_OCCUPATION": "๋ฌธ๋ช…_์ง์—…",
        "CV_POSITION": "๋ฌธ๋ช…_์ง์œ„/์ง์ฑ…",
        "CV_FOOD": "๋ฌธ๋ช…_์Œ์‹",
        "CV_DRINK": "๋ฌธ๋ช…_์Œ๋ฃŒ/์ˆ ",
        "CV_FOOD_STYLE": "๋ฌธ๋ช…_์Œ์‹ ์œ ํ˜•",
        "CV_CLOTHING": "๋ฌธ๋ช…_์˜๋ณต/์„ฌ์œ ",
        "CV_BUILDING_TYPE": "๋ฌธ๋ช…_๊ฑด์ถ• ์–‘์‹",
    },
    "DT": {
        "DT_DURATION": "๋‚ ์งœ_๊ธฐ๊ฐ„",
        "DT_DAY": "๋‚ ์งœ_์ผ",
        "DT_WEEK": "๋‚ ์งœ_์ฃผ(์ฃผ์ฐจ)",
        "DT_MONTH": "๋‚ ์งœ_๋‹ฌ(์›”)",
        "DT_YEAR": "๋‚ ์งœ_์—ฐ(๋…„)",
        "DT_SEASON": "๋‚ ์งœ_๊ณ„์ ˆ",
        "DT_GEOAGE": "๋‚ ์งœ_์ง€์งˆ์‹œ๋Œ€",
        "DT_DYNASTY": "๋‚ ์งœ_์™•์กฐ์‹œ๋Œ€",
        "DT_OTHERS": "๋‚ ์งœ_๊ธฐํƒ€",
    },
    "TI": {
        "TI_DURATION": "์‹œ๊ฐ„_๊ธฐ๊ฐ„",
        "TI_HOUR": "์‹œ๊ฐ„_์‹œ๊ฐ(์‹œ)",
        "TI_MINUTE": "์‹œ๊ฐ„_๋ถ„",
        "TI_SECOND": "์‹œ๊ฐ„_์ดˆ",
        "TI_OTHERS": "์‹œ๊ฐ„_๊ธฐํƒ€",
    },
    "QT": {
        "QT_AGE": "์ˆ˜๋Ÿ‰_๋‚˜์ด",
        "QT_SIZE": "์ˆ˜๋Ÿ‰_๋„“์ด/๋ฉด์ ",
        "QT_LENGTH": "์ˆ˜๋Ÿ‰_๊ธธ์ด/๊ฑฐ๋ฆฌ",
        "QT_COUNT": "์ˆ˜๋Ÿ‰_์ˆ˜๋Ÿ‰/๋นˆ๋„",
        "QT_MAN_COUNT": "์ˆ˜๋Ÿ‰_์ธ์›์ˆ˜",
        "QT_WEIGHT": "์ˆ˜๋Ÿ‰_๋ฌด๊ฒŒ",
        "QT_PERCENTAGE": "์ˆ˜๋Ÿ‰_๋ฐฑ๋ถ„์œจ",
        "QT_SPEED": "์ˆ˜๋Ÿ‰_์†๋„",
        "QT_TEMPERATURE": "์ˆ˜๋Ÿ‰_์˜จ๋„",
        "QT_VOLUME": "์ˆ˜๋Ÿ‰_๋ถ€ํ”ผ",
        "QT_ORDER": "์ˆ˜๋Ÿ‰_์ˆœ์„œ",
        "QT_PRICE": "์ˆ˜๋Ÿ‰_๊ธˆ์•ก",
        "QT_PHONE": "์ˆ˜๋Ÿ‰_์ „ํ™”๋ฒˆํ˜ธ",
        "QT_SPORTS": "์ˆ˜๋Ÿ‰_์Šคํฌ์ธ  ์ˆ˜๋Ÿ‰",
        "QT_CHANNEL": "์ˆ˜๋Ÿ‰_์ฑ„๋„ ๋ฒˆํ˜ธ",
        "QT_ALBUM": "์ˆ˜๋Ÿ‰_์•จ๋ฒ” ์ˆ˜๋Ÿ‰",
        "QT_ADDRESS": "์ˆ˜๋Ÿ‰_์ฃผ์†Œ ๊ด€๋ จ ์ˆซ์ž",
        "QT_OTHERS": "์ˆ˜๋Ÿ‰_๊ธฐํƒ€ ์ˆ˜๋Ÿ‰",
    },
    "EV": {
        "EV_ACTIVITY": "์‚ฌ๊ฑด_์‚ฌํšŒ์šด๋™/์„ ์–ธ",
        "EV_WAR_REVOLUTION": "์‚ฌ๊ฑด_์ „์Ÿ/ํ˜๋ช…",
        "EV_SPORTS": "์‚ฌ๊ฑด_์Šคํฌ์ธ  ํ–‰์‚ฌ",
        "EV_FESTIVAL": "์‚ฌ๊ฑด_์ถ•์ œ/์˜ํ™”์ œ",
        "EV_OTHERS": "์‚ฌ๊ฑด_๊ธฐํƒ€",
    },
    "AM": {
        "AM_INSECT": "๋™๋ฌผ_๊ณค์ถฉ",
        "AM_BIRD": "๋™๋ฌผ_์กฐ๋ฅ˜",
        "AM_FISH": "๋™๋ฌผ_์–ด๋ฅ˜",
        "AM_MAMMALIA": "๋™๋ฌผ_ํฌ์œ ๋ฅ˜",
        "AM_AMPHIBIA": "๋™๋ฌผ_์–‘์„œ๋ฅ˜",
        "AM_REPTILIA": "๋™๋ฌผ_ํŒŒ์ถฉ๋ฅ˜",
        "AM_TYPE": "๋™๋ฌผ_๋ถ„๋ฅ˜๋ช…",
        "AM_PART": "๋™๋ฌผ_๋ถ€์œ„๋ช…",
        "AM_OTHERS": "๋™๋ฌผ_๊ธฐํƒ€",
    },
    "PT": {
        "PT_FRUIT": "์‹๋ฌผ_๊ณผ์ผ/์—ด๋งค",
        "PT_FLOWER": "์‹๋ฌผ_๊ฝƒ",
        "PT_TREE": "์‹๋ฌผ_๋‚˜๋ฌด",
        "PT_GRASS": "์‹๋ฌผ_ํ’€",
        "PT_TYPE": "์‹๋ฌผ_๋ถ„๋ฅ˜๋ช…",
        "PT_PART": "์‹๋ฌผ_๋ถ€์œ„๋ช…",
        "PT_OTHERS": "์‹๋ฌผ_๊ธฐํƒ€",
    },
    "MT": {
        "MT_ELEMENT": "๋ฌผ์งˆ_์›์†Œ",
        "MT_METAL": "๋ฌผ์งˆ_๊ธˆ์†",
        "MT_ROCK": "๋ฌผ์งˆ_์•”์„",
        "MT_CHEMICAL": "๋ฌผ์งˆ_ํ™”ํ•™",
    },
    "TM": {
        "TM_COLOR": "์šฉ์–ด_์ƒ‰๊น”",
        "TM_DIRECTION": "์šฉ์–ด_๋ฐฉํ–ฅ",
        "TM_CLIMATE": "์šฉ์–ด_๊ธฐํ›„ ์ง€์—ญ",
        "TM_SHAPE": "์šฉ์–ด_๋ชจ์–‘/ํ˜•ํƒœ",
        "TM_CELL_TISSUE_ORGAN": "์šฉ์–ด_์„ธํฌ/์กฐ์ง/๊ธฐ๊ด€",
        "TMM_DISEASE": "์šฉ์–ด_์ฆ์ƒ/์งˆ๋ณ‘",
        "TMM_DRUG": "์šฉ์–ด_์•ฝํ’ˆ",
        "TMI_HW": "์šฉ์–ด_IT ํ•˜๋“œ์›จ์–ด",
        "TMI_SW": "์šฉ์–ด_IT ์†Œํ”„ํŠธ์›จ์–ด",
        "TMI_SITE": "์šฉ์–ด_URL ์ฃผ์†Œ",
        "TMI_EMAIL": "์šฉ์–ด_์ด๋ฉ”์ผ ์ฃผ์†Œ",
        "TMI_MODEL": "์šฉ์–ด_์ œํ’ˆ ๋ชจ๋ธ๋ช…",
        "TMI_SERVICE": "์šฉ์–ด_IT ์„œ๋น„์Šค",
        "TMI_PROJECT": "์šฉ์–ด_ํ”„๋กœ์ ํŠธ",
        "TMIG_GENRE": "์šฉ์–ด_๊ฒŒ์ž„ ์žฅ๋ฅด",
        "TM_SPORTS": "์šฉ์–ด_์Šคํฌ์ธ ",
    },
}
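To use this tag set with the model, the nested mapping can be flattened into the plain label list that predict_entities expects. The sketch below uses a two-category excerpt of entity_type_mapping so it stays self-contained; the full dict flattens the same way.

```python
# Excerpt of entity_type_mapping above; the full dict flattens identically.
entity_type_mapping = {
    "PS": {"PS_NAME": "인물_사람", "PS_CHARACTER": "인물_가상 캐릭터"},
    "LC": {"LCP_COUNTRY": "장소_국가", "LCP_CITY": "장소_도시"},
}

# Collect every fine-grained Korean label across the coarse categories.
labels = [
    korean_label
    for fine_types in entity_type_mapping.values()
    for korean_label in fine_types.values()
]
print(labels)  # ['인물_사람', '인물_가상 캐릭터', '장소_국가', '장소_도시']
```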

Evaluation

Evaluated on the konne dev set.
All results in the table below, except the values I provide, are taken from the following source: taeminlee/gliner_ko.

Model                            Precision (P)  Recall (R)  F1
gliner-bi-ko-small-v1 (t=0.5)    81.53%         74.16%      77.67%
gliner-bi-ko-xlarge-v1 (t=0.5)   84.73%         77.71%      81.07%
Gliner-ko (t=0.5)                72.51%         79.82%      75.99%
Gliner Large-v2 (t=0.5)          34.33%         19.50%      24.87%
Gliner Multi (t=0.5)             40.94%         34.18%      37.26%
Pororo                           70.25%         57.94%      63.50%
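F1 in the table is the harmonic mean of precision and recall; a quick check reproduces the reported value for gliner-bi-ko-small-v1:

```python
def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# gliner-bi-ko-small-v1: P = 81.53%, R = 74.16%
print(round(f1(0.8153, 0.7416) * 100, 2))  # 77.67
```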

Citation

@misc{gliner_bi_ko_small_v1,
  title={gliner-bi-ko-small-v1},
  author={Gihwan Kim},
  year={2025},
  publisher={Hugging Face},
  url={https://huggingface.co/lots-o/gliner-bi-ko-small-v1}
}
@misc{zaratiana2023gliner,
      title={GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer}, 
      author={Urchade Zaratiana and Nadi Tomeh and Pierre Holat and Thierry Charnois},
      year={2023},
      eprint={2311.08526},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}