ankur310794 committed
Commit d8c96e3 · 1 Parent(s): fb2b376

readme updated

Files changed (1):
  1. README.md +13 -13
README.md CHANGED
@@ -11,23 +11,27 @@ probably proofread and complete it, then remove this comment. -->
 
 # dpr-ctx_encoder_bert_uncased_L-12_H-128_A-2
 
-This model was trained from scratch on an unknown dataset.
+This model (google/bert_uncased_L-12_H-128_A-2) was trained from scratch on the data.retriever.nq-adv-hn-train data from facebookresearch/DPR.
 It achieves the following results on the evaluation set:
 
 
-## Model description
+## Evaluation data
 
-More information needed
+Evaluation dataset: facebook-dpr-dev-dataset from the official DPR GitHub repository.
 
-## Intended uses & limitations
+|model_name|data_name|num of queries|num of passages|R@10|R@20|R@50|R@100|R@100|
+|---|---|---|---|---|---|---|---|---|
+|nlpconnect/dpr-ctx_encoder_bert_uncased_L-12_H-128_A-2 (ours)|nq-dev dataset|6445|199795|60.53%|68.28%|76.07%|80.98%|91.45%|
+|*facebook/dpr-ctx_encoder-single-nq-base (hf/fb)|nq-dev dataset|6445|199795|40.94%|49.27%|59.05%|66.00%|82.00%|
 
-More information needed
+Evaluation dataset: UKPLab/beir test data, restricted to the first 200,000 passages.
 
-## Training and evaluation data
+|model_name|data_name|num of queries|num of passages|R@10|R@20|R@50|R@100|R@100|
+|---|---|---|---|---|---|---|---|---|
+|nlpconnect/dpr-ctx_encoder_bert_uncased_L-12_H-128_A-2 (ours)|nq-test dataset|3452|200001|49.68%|59.06%|69.40%|75.75%|89.28%|
+|*facebook/dpr-ctx_encoder-single-nq-base (hf/fb)|nq-test dataset|3452|200001|32.93%|43.74%|56.95%|66.30%|83.92%|
 
-More information needed
-
-## Training procedure
+Note: * indicates that we ran the evaluation ourselves on the same evaluation dataset.
 
 ### Training hyperparameters
 
@@ -35,10 +39,6 @@ The following hyperparameters were used during training:
 - optimizer: None
 - training_precision: float32
 
-### Training results
-
-
-
 ### Framework versions
 
 - Transformers 4.15.0
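
For readers of the updated card, here is a minimal usage sketch (not part of the commit), assuming the checkpoint is compatible with the standard DPR classes in Transformers; the model name is taken from the evaluation tables above and the example passages are invented.

```python
# Illustrative sketch only: embed passages with the context encoder and show how
# they would be scored against a query embedding in a DPR-style retriever.
import torch
from transformers import AutoTokenizer, DPRContextEncoder

model_name = "nlpconnect/dpr-ctx_encoder_bert_uncased_L-12_H-128_A-2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
ctx_encoder = DPRContextEncoder.from_pretrained(model_name)

passages = [
    "Paris is the capital and most populous city of France.",
    "The Eiffel Tower is a wrought-iron lattice tower in Paris.",
]
inputs = tokenizer(passages, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    passage_embeddings = ctx_encoder(**inputs).pooler_output  # (num_passages, hidden_size)

# In a full retrieval setup, queries are embedded with the matching DPR question
# encoder (trained jointly with this context encoder) and passages are ranked by
# inner product: scores = query_embedding @ passage_embeddings.T
print(passage_embeddings.shape)
```

The pooled output is the fixed-size passage embedding that a DPR-style retriever indexes for nearest-neighbour search.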
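
The R@k columns in the evaluation tables report top-k retrieval recall. The card does not include the evaluation script, so the following is only a sketch of the usual definition: a query counts as covered at k if at least one gold passage appears among its top-k retrieved passages.

```python
# Sketch of top-k retrieval recall (R@k) as usually defined for DPR-style
# evaluation; the actual script behind the numbers above is not shown in the card.
def recall_at_k(ranked_ids_per_query, gold_ids_per_query, k):
    """ranked_ids_per_query: list of ranked passage-id lists, one per query.
    gold_ids_per_query: list of sets of relevant passage ids, one per query."""
    hits = sum(
        1
        for ranked, gold in zip(ranked_ids_per_query, gold_ids_per_query)
        if gold & set(ranked[:k])
    )
    return hits / len(ranked_ids_per_query)

# Toy example: only the first query has a gold passage in its top 2, so R@2 = 0.5.
print(recall_at_k([[3, 7, 1], [9, 4, 2]], [{7}, {2}], k=2))
```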