ezequiel committed
Commit f861129 · verified · 1 Parent(s): 44f2a29

sub-version a: Hyper Tuning for "full stack", "back end" and "front end"

Files changed (3)
  1. README.md +164 -164
  2. model card.txt +7 -5
  3. model.safetensors +1 -1
README.md CHANGED
@@ -4,35 +4,35 @@ tags:
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
- - dataset_size:909469
  - loss:CosineSimilarityLoss
  base_model: intfloat/multilingual-e5-small
  widget:
- - source_sentence: esperto in formazione e motivazione del personale
  sentences:
- - linguistic informatics
- - multi state oversight
- - esperto in formazione e motivazione del personale
- - source_sentence: drilling horizontal
  sentences:
- - gcp, bigquery, service accounts, cloud foundation
- - visite guidate di libri
- - introduction to rooms division management
- - source_sentence: deploy de sistemas (centosos, cpanel)
  sentences:
- - nvq 2 automotive technician
- - التحاور الشفهي بالبنجابية
- - reader oriented-mind
- - source_sentence: json処理
  sentences:
- - उच्च सटीकता वाले हाथ के औजारों का उपयोग
- - owasp zapプロキシ
- - manipulação de json
- - source_sentence: digas audioproduktion
  sentences:
- - commitment to patient safety
- - soziale arbeit und beratung
- - formation strategies
  pipeline_tag: sentence-similarity
  library_name: sentence-transformers
  metrics:
@@ -49,10 +49,10 @@ model-index:
  type: sts-dev
  metrics:
  - type: pearson_cosine
-   value: 0.9577111210471887
    name: Pearson Cosine
  - type: spearman_cosine
-   value: 0.8786537702630683
    name: Spearman Cosine
  - task:
    type: semantic-similarity
@@ -62,10 +62,10 @@ model-index:
  type: sts-test
  metrics:
  - type: pearson_cosine
-   value: 0.9570133219075257
    name: Pearson Cosine
  - type: spearman_cosine
-   value: 0.8800070052053953
    name: Spearman Cosine
  ---
 
@@ -119,9 +119,9 @@ from sentence_transformers import SentenceTransformer
  model = SentenceTransformer("sentence_transformers_model_id")
  # Run inference
  sentences = [
-     'digas audioproduktion',
-     'formation strategies',
-     'commitment to patient safety',
  ]
  embeddings = model.encode(sentences)
  print(embeddings.shape)
@@ -166,10 +166,10 @@ You can finetune this model on your own dataset.
  * Datasets: `sts-dev` and `sts-test`
  * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

- | Metric              | sts-dev    | sts-test |
- |:--------------------|:-----------|:---------|
- | pearson_cosine      | 0.9577     | 0.957    |
- | **spearman_cosine** | **0.8787** | **0.88** |

  <!--
  ## Bias, Risks and Limitations
@@ -189,19 +189,19 @@ You can finetune this model on your own dataset.

  #### Unnamed Dataset

- * Size: 909,469 training samples
  * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
  * Approximate statistics based on the first 1000 samples:
    |         | sentence1 | sentence2 | score |
    |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------|
    | type    | string | string | float |
-   | details | <ul><li>min: 3 tokens</li><li>mean: 9.05 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.62 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.54</li><li>max: 1.0</li></ul> |
  * Samples:
-   | sentence1 | sentence2 | score |
-   |:------------------------------------------------------------------------------------|:---------------------------------------------------------------------------|:-----------------|
-   | <code>デジタルデータプロセッシング</code> | <code>数字数据处理</code> | <code>1.0</code> |
-   | <code>maintenance des équipements électriques, électroniques et de précision</code> | <code>maintenance of electronic, electrical and precision equipment</code> | <code>1.0</code> |
-   | <code>application frameworks</code> | <code>ソフトウェアフレームワーク</code> | <code>1.0</code> |
  * Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
    ```json
    {
@@ -213,19 +213,19 @@ You can finetune this model on your own dataset.

  #### Unnamed Dataset

- * Size: 113,683 evaluation samples
  * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
  * Approximate statistics based on the first 1000 samples:
    |         | sentence1 | sentence2 | score |
    |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------|
    | type    | string | string | float |
-   | details | <ul><li>min: 3 tokens</li><li>mean: 8.97 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.85 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.56</li><li>max: 1.0</li></ul> |
  * Samples:
-   | sentence1 | sentence2 | score |
-   |:-------------------------------------------------------|:-----------------------------------|:------------------|
-   | <code>verstehen von geschriebenem portugiesisch</code> | <code>लिखित पुर्तगाली समझें</code> | <code>1.0</code> |
-   | <code>backbox pen test</code> | <code>backbox ペネトレーションテスト</code> | <code>1.0</code> |
-   | <code>ing. industrial en produccion y calidad</code> | <code>national speaking</code> | <code>0.19</code> |
  * Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
    ```json
    {
@@ -369,124 +369,124 @@ You can finetune this model on your own dataset.

  | Epoch  | Step  | Training Loss | Validation Loss | sts-dev_spearman_cosine | sts-test_spearman_cosine |
  |:------:|:-----:|:-------------:|:---------------:|:-----------------------:|:------------------------:|
- | 0.0352 | 500 | 0.2006 | - | - | - |
- | 0.0704 | 1000 | 0.0519 | - | - | - |
- | 0.1056 | 1500 | 0.0361 | - | - | - |
- | 0.1407 | 2000 | 0.0325 | - | - | - |
- | 0.1759 | 2500 | 0.0299 | - | - | - |
- | 0.2111 | 3000 | 0.0289 | - | - | - |
- | 0.2463 | 3500 | 0.0274 | - | - | - |
- | 0.2815 | 4000 | 0.0261 | - | - | - |
- | 0.3167 | 4500 | 0.0254 | - | - | - |
- | 0.3518 | 5000 | 0.0249 | - | - | - |
- | 0.3870 | 5500 | 0.0243 | - | - | - |
- | 0.4222 | 6000 | 0.0236 | - | - | - |
- | 0.4574 | 6500 | 0.0232 | - | - | - |
- | 0.4926 | 7000 | 0.0227 | - | - | - |
- | 0.5278 | 7500 | 0.0219 | - | - | - |
- | 0.5629 | 8000 | 0.0222 | - | - | - |
- | 0.5981 | 8500 | 0.0217 | - | - | - |
- | 0.6333 | 9000 | 0.0212 | - | - | - |
- | 0.6685 | 9500 | 0.0205 | - | - | - |
- | 0.7037 | 10000 | 0.0206 | - | - | - |
- | 0.7389 | 10500 | 0.0207 | - | - | - |
- | 0.7740 | 11000 | 0.02 | - | - | - |
- | 0.8092 | 11500 | 0.0198 | - | - | - |
- | 0.8444 | 12000 | 0.0199 | - | - | - |
- | 0.8796 | 12500 | 0.0196 | - | - | - |
- | 0.9148 | 13000 | 0.0192 | - | - | - |
- | 0.9500 | 13500 | 0.019 | - | - | - |
- | 0.9852 | 14000 | 0.0191 | - | - | - |
- | 1.0 | 14211 | - | 0.0169 | 0.8695 | - |
- | 1.0203 | 14500 | 0.0187 | - | - | - |
- | 1.0555 | 15000 | 0.0179 | - | - | - |
- | 1.0907 | 15500 | 0.0178 | - | - | - |
- | 1.1259 | 16000 | 0.0173 | - | - | - |
- | 1.1611 | 16500 | 0.018 | - | - | - |
- | 1.1963 | 17000 | 0.0176 | - | - | - |
- | 1.2314 | 17500 | 0.0173 | - | - | - |
- | 1.2666 | 18000 | 0.0174 | - | - | - |
- | 1.3018 | 18500 | 0.0173 | - | - | - |
- | 1.3370 | 19000 | 0.0174 | - | - | - |
- | 1.3722 | 19500 | 0.0173 | - | - | - |
- | 1.4074 | 20000 | 0.0175 | - | - | - |
- | 1.4425 | 20500 | 0.0179 | - | - | - |
- | 1.4777 | 21000 | 0.017 | - | - | - |
- | 1.5129 | 21500 | 0.0167 | - | - | - |
- | 1.5481 | 22000 | 0.0174 | - | - | - |
- | 1.5833 | 22500 | 0.017 | - | - | - |
- | 1.6185 | 23000 | 0.0166 | - | - | - |
- | 1.6536 | 23500 | 0.0165 | - | - | - |
- | 1.6888 | 24000 | 0.0166 | - | - | - |
- | 1.7240 | 24500 | 0.017 | - | - | - |
- | 1.7592 | 25000 | 0.0167 | - | - | - |
- | 1.7944 | 25500 | 0.0163 | - | - | - |
- | 1.8296 | 26000 | 0.0167 | - | - | - |
- | 1.8648 | 26500 | 0.0165 | - | - | - |
- | 1.8999 | 27000 | 0.0165 | - | - | - |
- | 1.9351 | 27500 | 0.0167 | - | - | - |
- | 1.9703 | 28000 | 0.0162 | - | - | - |
- | 2.0 | 28422 | - | 0.0155 | 0.8751 | - |
- | 2.0055 | 28500 | 0.0161 | - | - | - |
- | 2.0407 | 29000 | 0.0157 | - | - | - |
- | 2.0759 | 29500 | 0.0156 | - | - | - |
- | 2.1110 | 30000 | 0.015 | - | - | - |
- | 2.1462 | 30500 | 0.0155 | - | - | - |
- | 2.1814 | 31000 | 0.0152 | - | - | - |
- | 2.2166 | 31500 | 0.0153 | - | - | - |
- | 2.2518 | 32000 | 0.0154 | - | - | - |
- | 2.2870 | 32500 | 0.0154 | - | - | - |
- | 2.3221 | 33000 | 0.0152 | - | - | - |
- | 2.3573 | 33500 | 0.0154 | - | - | - |
- | 2.3925 | 34000 | 0.015 | - | - | - |
- | 2.4277 | 34500 | 0.015 | - | - | - |
- | 2.4629 | 35000 | 0.0148 | - | - | - |
- | 2.4981 | 35500 | 0.0148 | - | - | - |
- | 2.5332 | 36000 | 0.0151 | - | - | - |
- | 2.5684 | 36500 | 0.0155 | - | - | - |
- | 2.6036 | 37000 | 0.0152 | - | - | - |
- | 2.6388 | 37500 | 0.0155 | - | - | - |
- | 2.6740 | 38000 | 0.0149 | - | - | - |
- | 2.7092 | 38500 | 0.0148 | - | - | - |
- | 2.7444 | 39000 | 0.0151 | - | - | - |
- | 2.7795 | 39500 | 0.0147 | - | - | - |
- | 2.8147 | 40000 | 0.015 | - | - | - |
- | 2.8499 | 40500 | 0.0147 | - | - | - |
- | 2.8851 | 41000 | 0.0147 | - | - | - |
- | 2.9203 | 41500 | 0.0151 | - | - | - |
- | 2.9555 | 42000 | 0.0144 | - | - | - |
- | 2.9906 | 42500 | 0.0146 | - | - | - |
- | 3.0 | 42633 | - | 0.0148 | 0.8771 | - |
- | 3.0258 | 43000 | 0.0144 | - | - | - |
- | 3.0610 | 43500 | 0.0141 | - | - | - |
- | 3.0962 | 44000 | 0.0146 | - | - | - |
- | 3.1314 | 44500 | 0.0136 | - | - | - |
- | 3.1666 | 45000 | 0.0143 | - | - | - |
- | 3.2017 | 45500 | 0.0142 | - | - | - |
- | 3.2369 | 46000 | 0.0146 | - | - | - |
- | 3.2721 | 46500 | 0.0144 | - | - | - |
- | 3.3073 | 47000 | 0.0142 | - | - | - |
- | 3.3425 | 47500 | 0.0142 | - | - | - |
- | 3.3777 | 48000 | 0.0138 | - | - | - |
- | 3.4128 | 48500 | 0.0141 | - | - | - |
- | 3.4480 | 49000 | 0.0141 | - | - | - |
- | 3.4832 | 49500 | 0.0142 | - | - | - |
- | 3.5184 | 50000 | 0.0142 | - | - | - |
- | 3.5536 | 50500 | 0.0141 | - | - | - |
- | 3.5888 | 51000 | 0.0141 | - | - | - |
- | 3.6240 | 51500 | 0.014 | - | - | - |
- | 3.6591 | 52000 | 0.0143 | - | - | - |
- | 3.6943 | 52500 | 0.0136 | - | - | - |
- | 3.7295 | 53000 | 0.0139 | - | - | - |
- | 3.7647 | 53500 | 0.0141 | - | - | - |
- | 3.7999 | 54000 | 0.0139 | - | - | - |
- | 3.8351 | 54500 | 0.0141 | - | - | - |
- | 3.8702 | 55000 | 0.014 | - | - | - |
- | 3.9054 | 55500 | 0.0142 | - | - | - |
- | 3.9406 | 56000 | 0.0142 | - | - | - |
- | 3.9758 | 56500 | 0.014 | - | - | - |
- | 4.0 | 56844 | - | 0.0147 | 0.8787 | - |
- | -1 | -1 | - | - | - | 0.8800 |

  </details>
 
 
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
+ - dataset_size:910013
  - loss:CosineSimilarityLoss
  base_model: intfloat/multilingual-e5-small
  widget:
+ - source_sentence: business healing
  sentences:
+ - modify ict system capacity
+ - objetividade, inovadora,estudiosa,pesquisadora e organizada
+ - business consulting
+ - source_sentence: architecture acoustics
  sentences:
+ - disicpline leader
+ - 生产工艺开发及优化
+ - data analysis
+ - source_sentence: arbitru natatie
  sentences:
+ - criação cinematográfica
+ - quarterly distribution
+ - улучшение путешествий клиентов с помощью дополненной реальности
+ - source_sentence: configuración de software antivirus
  sentences:
+ - protocol & coordination
+ - laurea magistrale biologia
+ - deploy anti-virus software
+ - source_sentence: child maltreatment counselling
  sentences:
+ - book covers, flyers, posters, banners
+ - tool and die making
+ - cmc
  pipeline_tag: sentence-similarity
  library_name: sentence-transformers
  metrics:
 
  type: sts-dev
  metrics:
  - type: pearson_cosine
+   value: 0.9579653395486292
    name: Pearson Cosine
  - type: spearman_cosine
+   value: 0.8788941637037295
    name: Spearman Cosine
  - task:
    type: semantic-similarity

  type: sts-test
  metrics:
  - type: pearson_cosine
+   value: 0.9579215714676803
    name: Pearson Cosine
  - type: spearman_cosine
+   value: 0.8795799743051839
    name: Spearman Cosine
  ---
 
 
  model = SentenceTransformer("sentence_transformers_model_id")
  # Run inference
  sentences = [
+     'child maltreatment counselling',
+     'cmc',
+     'book covers, flyers, posters, banners',
  ]
  embeddings = model.encode(sentences)
  print(embeddings.shape)
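Expanded into a runnable form, the updated snippet reads as below; a minimal sketch, assuming `sentence_transformers_model_id` is replaced with the published repo id and sentence-transformers v3+ is installed (`model.similarity` is a v3 API):

```python
from sentence_transformers import SentenceTransformer

# Placeholder id from the README; swap in the actual repo id.
model = SentenceTransformer("sentence_transformers_model_id")

sentences = [
    "child maltreatment counselling",
    "cmc",
    "book covers, flyers, posters, banners",
]
embeddings = model.encode(sentences)
print(embeddings.shape)  # (3, 384): multilingual-e5-small produces 384-dim vectors

# Pairwise cosine similarities (sentence-transformers v3+).
similarities = model.similarity(embeddings, embeddings)
print(similarities)
```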
 
  * Datasets: `sts-dev` and `sts-test`
  * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

+ | Metric              | sts-dev    | sts-test   |
+ |:--------------------|:-----------|:-----------|
+ | pearson_cosine      | 0.958      | 0.9579     |
+ | **spearman_cosine** | **0.8789** | **0.8796** |
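The sts-dev/sts-test rows above are produced by `EmbeddingSimilarityEvaluator`. A hedged sketch of how such a score is computed, using two made-up pairs in place of the real 113,751-row eval split (the model id is again the README placeholder):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator, SimilarityFunction

model = SentenceTransformer("sentence_transformers_model_id")  # README placeholder

# Illustrative stand-ins for the held-out (sentence1, sentence2, score) triples.
sentences1 = ["a2 dutch", "design of mine dumps"]
sentences2 = ["a2 dutch", "设计矿山废料堆"]
gold_scores = [0.98, 1.0]  # gold similarity scores in [0, 1]

evaluator = EmbeddingSimilarityEvaluator(
    sentences1,
    sentences2,
    gold_scores,
    main_similarity=SimilarityFunction.COSINE,
    name="sts-dev",
)
print(evaluator(model))  # dict of pearson/spearman cosine metrics in v3+
```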
 
  <!--
  ## Bias, Risks and Limitations
 

  #### Unnamed Dataset

+ * Size: 910,013 training samples
  * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
  * Approximate statistics based on the first 1000 samples:
    |         | sentence1 | sentence2 | score |
    |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------|
    | type    | string | string | float |
+   | details | <ul><li>min: 3 tokens</li><li>mean: 8.91 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.83 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.52</li><li>max: 1.0</li></ul> |
  * Samples:
+   | sentence1 | sentence2 | score |
+   |:----------------------------------------------------------------------------------------------|:--------------------------------------------------|:------------------|
+   | <code>edición de fotografias, fondos</code> | <code>material selection and cognition</code> | <code>0.0</code> |
+   | <code>professional alarm installer,service tech.,customer service relations,sales,cctv</code> | <code>quantity surveying & reading charts</code> | <code>0.1</code> |
+   | <code>diagnostico ecografico</code> | <code>waste identification system downtime</code> | <code>0.19</code> |
  * Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
    ```json
    {
 

  #### Unnamed Dataset

+ * Size: 113,751 evaluation samples
  * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
  * Approximate statistics based on the first 1000 samples:
    |         | sentence1 | sentence2 | score |
    |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------|
    | type    | string | string | float |
+   | details | <ul><li>min: 4 tokens</li><li>mean: 8.89 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.96 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.54</li><li>max: 1.0</li></ul> |
  * Samples:
+   | sentence1 | sentence2 | score |
+   |:----------------------------------------------------------|:-------------------------|:------------------|
+   | <code>a2 dutch</code> | <code>a2 dutch</code> | <code>0.98</code> |
+   | <code>design of mine dumps</code> | <code>设计矿山废料堆</code> | <code>1.0</code> |
+   | <code>create soil and plant improvement programmes</code> | <code>创建土壤和植物改良计划</code> | <code>1.0</code> |
  * Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
    ```json
    {
 
369
 
370
  | Epoch | Step | Training Loss | Validation Loss | sts-dev_spearman_cosine | sts-test_spearman_cosine |
371
  |:------:|:-----:|:-------------:|:---------------:|:-----------------------:|:------------------------:|
372
+ | 0.0352 | 500 | 0.1991 | - | - | - |
373
+ | 0.0703 | 1000 | 0.0513 | - | - | - |
374
+ | 0.1055 | 1500 | 0.0362 | - | - | - |
375
+ | 0.1407 | 2000 | 0.0331 | - | - | - |
376
+ | 0.1758 | 2500 | 0.0305 | - | - | - |
377
+ | 0.2110 | 3000 | 0.029 | - | - | - |
378
+ | 0.2461 | 3500 | 0.0273 | - | - | - |
379
+ | 0.2813 | 4000 | 0.0268 | - | - | - |
380
+ | 0.3165 | 4500 | 0.0255 | - | - | - |
381
+ | 0.3516 | 5000 | 0.0245 | - | - | - |
382
+ | 0.3868 | 5500 | 0.0238 | - | - | - |
383
+ | 0.4220 | 6000 | 0.0236 | - | - | - |
384
+ | 0.4571 | 6500 | 0.0233 | - | - | - |
385
+ | 0.4923 | 7000 | 0.0222 | - | - | - |
386
+ | 0.5275 | 7500 | 0.0225 | - | - | - |
387
+ | 0.5626 | 8000 | 0.0219 | - | - | - |
388
+ | 0.5978 | 8500 | 0.0212 | - | - | - |
389
+ | 0.6330 | 9000 | 0.0215 | - | - | - |
390
+ | 0.6681 | 9500 | 0.0207 | - | - | - |
391
+ | 0.7033 | 10000 | 0.0204 | - | - | - |
392
+ | 0.7384 | 10500 | 0.0203 | - | - | - |
393
+ | 0.7736 | 11000 | 0.0203 | - | - | - |
394
+ | 0.8088 | 11500 | 0.0202 | - | - | - |
395
+ | 0.8439 | 12000 | 0.0202 | - | - | - |
396
+ | 0.8791 | 12500 | 0.0196 | - | - | - |
397
+ | 0.9143 | 13000 | 0.0193 | - | - | - |
398
+ | 0.9494 | 13500 | 0.0193 | - | - | - |
399
+ | 0.9846 | 14000 | 0.0193 | - | - | - |
400
+ | 1.0 | 14219 | - | 0.0170 | 0.8694 | - |
401
+ | 1.0198 | 14500 | 0.0188 | - | - | - |
402
+ | 1.0549 | 15000 | 0.0178 | - | - | - |
403
+ | 1.0901 | 15500 | 0.0179 | - | - | - |
404
+ | 1.1253 | 16000 | 0.0178 | - | - | - |
405
+ | 1.1604 | 16500 | 0.0178 | - | - | - |
406
+ | 1.1956 | 17000 | 0.0172 | - | - | - |
407
+ | 1.2307 | 17500 | 0.0172 | - | - | - |
408
+ | 1.2659 | 18000 | 0.0175 | - | - | - |
409
+ | 1.3011 | 18500 | 0.0178 | - | - | - |
410
+ | 1.3362 | 19000 | 0.0174 | - | - | - |
411
+ | 1.3714 | 19500 | 0.0175 | - | - | - |
412
+ | 1.4066 | 20000 | 0.0171 | - | - | - |
413
+ | 1.4417 | 20500 | 0.0175 | - | - | - |
414
+ | 1.4769 | 21000 | 0.0173 | - | - | - |
415
+ | 1.5121 | 21500 | 0.0171 | - | - | - |
416
+ | 1.5472 | 22000 | 0.0174 | - | - | - |
417
+ | 1.5824 | 22500 | 0.0172 | - | - | - |
418
+ | 1.6176 | 23000 | 0.0168 | - | - | - |
419
+ | 1.6527 | 23500 | 0.0165 | - | - | - |
420
+ | 1.6879 | 24000 | 0.0169 | - | - | - |
421
+ | 1.7230 | 24500 | 0.0169 | - | - | - |
422
+ | 1.7582 | 25000 | 0.0171 | - | - | - |
423
+ | 1.7934 | 25500 | 0.0165 | - | - | - |
424
+ | 1.8285 | 26000 | 0.0165 | - | - | - |
425
+ | 1.8637 | 26500 | 0.0165 | - | - | - |
426
+ | 1.8989 | 27000 | 0.0165 | - | - | - |
427
+ | 1.9340 | 27500 | 0.0164 | - | - | - |
428
+ | 1.9692 | 28000 | 0.0164 | - | - | - |
429
+ | 2.0 | 28438 | - | 0.0153 | 0.8751 | - |
430
+ | 2.0044 | 28500 | 0.0162 | - | - | - |
431
+ | 2.0395 | 29000 | 0.0156 | - | - | - |
432
+ | 2.0747 | 29500 | 0.0154 | - | - | - |
433
+ | 2.1099 | 30000 | 0.0157 | - | - | - |
434
+ | 2.1450 | 30500 | 0.016 | - | - | - |
435
+ | 2.1802 | 31000 | 0.015 | - | - | - |
436
+ | 2.2153 | 31500 | 0.0155 | - | - | - |
437
+ | 2.2505 | 32000 | 0.0154 | - | - | - |
438
+ | 2.2857 | 32500 | 0.0152 | - | - | - |
439
+ | 2.3208 | 33000 | 0.0152 | - | - | - |
440
+ | 2.3560 | 33500 | 0.0152 | - | - | - |
441
+ | 2.3912 | 34000 | 0.0154 | - | - | - |
442
+ | 2.4263 | 34500 | 0.0153 | - | - | - |
443
+ | 2.4615 | 35000 | 0.0154 | - | - | - |
444
+ | 2.4967 | 35500 | 0.015 | - | - | - |
445
+ | 2.5318 | 36000 | 0.0153 | - | - | - |
446
+ | 2.5670 | 36500 | 0.0149 | - | - | - |
447
+ | 2.6022 | 37000 | 0.015 | - | - | - |
448
+ | 2.6373 | 37500 | 0.0152 | - | - | - |
449
+ | 2.6725 | 38000 | 0.0152 | - | - | - |
450
+ | 2.7076 | 38500 | 0.015 | - | - | - |
451
+ | 2.7428 | 39000 | 0.0151 | - | - | - |
452
+ | 2.7780 | 39500 | 0.0155 | - | - | - |
453
+ | 2.8131 | 40000 | 0.0148 | - | - | - |
454
+ | 2.8483 | 40500 | 0.0149 | - | - | - |
455
+ | 2.8835 | 41000 | 0.0147 | - | - | - |
456
+ | 2.9186 | 41500 | 0.015 | - | - | - |
457
+ | 2.9538 | 42000 | 0.0148 | - | - | - |
458
+ | 2.9890 | 42500 | 0.0146 | - | - | - |
459
+ | 3.0 | 42657 | - | 0.0146 | 0.8775 | - |
460
+ | 3.0241 | 43000 | 0.0142 | - | - | - |
461
+ | 3.0593 | 43500 | 0.0144 | - | - | - |
462
+ | 3.0945 | 44000 | 0.0146 | - | - | - |
463
+ | 3.1296 | 44500 | 0.0142 | - | - | - |
464
+ | 3.1648 | 45000 | 0.0144 | - | - | - |
465
+ | 3.1999 | 45500 | 0.0141 | - | - | - |
466
+ | 3.2351 | 46000 | 0.0142 | - | - | - |
467
+ | 3.2703 | 46500 | 0.0142 | - | - | - |
468
+ | 3.3054 | 47000 | 0.0142 | - | - | - |
469
+ | 3.3406 | 47500 | 0.0145 | - | - | - |
470
+ | 3.3758 | 48000 | 0.0142 | - | - | - |
471
+ | 3.4109 | 48500 | 0.0143 | - | - | - |
472
+ | 3.4461 | 49000 | 0.0145 | - | - | - |
473
+ | 3.4813 | 49500 | 0.0142 | - | - | - |
474
+ | 3.5164 | 50000 | 0.014 | - | - | - |
475
+ | 3.5516 | 50500 | 0.0141 | - | - | - |
476
+ | 3.5868 | 51000 | 0.0144 | - | - | - |
477
+ | 3.6219 | 51500 | 0.0143 | - | - | - |
478
+ | 3.6571 | 52000 | 0.0143 | - | - | - |
479
+ | 3.6922 | 52500 | 0.0142 | - | - | - |
480
+ | 3.7274 | 53000 | 0.014 | - | - | - |
481
+ | 3.7626 | 53500 | 0.0142 | - | - | - |
482
+ | 3.7977 | 54000 | 0.0141 | - | - | - |
483
+ | 3.8329 | 54500 | 0.0141 | - | - | - |
484
+ | 3.8681 | 55000 | 0.014 | - | - | - |
485
+ | 3.9032 | 55500 | 0.0143 | - | - | - |
486
+ | 3.9384 | 56000 | 0.0142 | - | - | - |
487
+ | 3.9736 | 56500 | 0.0141 | - | - | - |
488
+ | 4.0 | 56876 | - | 0.0146 | 0.8789 | - |
489
+ | -1 | -1 | - | - | - | 0.8796 |
490
 
491
  </details>
492
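For orientation, a minimal fine-tuning sketch matching the recipe this log describes: `CosineSimilarityLoss` over (sentence1, sentence2, score) triples on top of intfloat/multilingual-e5-small, using the sentence-transformers v3 trainer. The two-row dataset and output path are illustrative only, not this commit's actual 910,013-row configuration:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import CosineSimilarityLoss

model = SentenceTransformer("intfloat/multilingual-e5-small")

# Toy stand-in for the real training set: columns must be sentence1, sentence2, score.
train_dataset = Dataset.from_dict({
    "sentence1": ["configuración de software antivirus", "business healing"],
    "sentence2": ["deploy anti-virus software", "quarterly distribution"],
    "score": [1.0, 0.0],  # gold cosine-similarity targets in [0, 1]
})

# CosineSimilarityLoss regresses cosine(embed(s1), embed(s2)) onto the gold score.
loss = CosineSimilarityLoss(model)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
model.save_pretrained("output/illustrative-run")  # illustrative output path
```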
 
model card.txt CHANGED
@@ -1,5 +1,5 @@
 Training dataset:
-  File: skills_matching_training_v1
 Origin: gpt_dataset_acronyms, gpt_dataset_related, gpt_dataset_translations, gpt_dataset_generator, gpt_dataset_variant_generator
 Characteristics: LOWER CASE
 Length: 1.136.837 rows
@@ -16,7 +16,9 @@ Training Data:

 Training Results:
 Epoch  Training Loss  Validation Loss  Sts-dev Pearson Cosine  Sts-dev Spearman Cosine
- 1  0.019100  0.016896  0.951701  0.869526
- 2  0.016200  0.015457  0.955264  0.875107
- 3  0.014600  0.014808  0.957285  0.877118
- 4  0.014000  0.014653  0.957711  0.878654
 
 
 
 Training dataset:
+  File: skills_matching_training_v1a
 Origin: gpt_dataset_acronyms, gpt_dataset_related, gpt_dataset_translations, gpt_dataset_generator, gpt_dataset_variant_generator
 Characteristics: LOWER CASE
 Length: 1.136.837 rows

 Training Results:
 Epoch  Training Loss  Validation Loss  Sts-dev Pearson Cosine  Sts-dev Spearman Cosine
+ 1  0.019300  0.016977  0.950964  0.869359
+ 2  0.016400  0.015263  0.956199  0.875054
+ 3  0.014600  0.014600  0.957986  0.877506
+ 4  0.014100  0.014555  0.957965  0.878894
+
+ sub-version a: Hyper Tuning for "full stack", "back end" and "front end"
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:18aa4869ded36bc812734d4067cffa320cb0251f1ac7762be48c8bc527401ca2
+oid sha256:24e15e17ede2ec440a7e13f9c5776b7196cd822cc120edf133e0a70afbd60a38
 size 470637416
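model.safetensors is stored via Git LFS, so the diff touches only the pointer file: the spec line and size are unchanged, and only the `oid` (the SHA-256 of the actual weights) changes. A quick integrity check of a downloaded copy against the new pointer; the local path is an assumption:

```python
import hashlib
import os

# Expected values from the updated LFS pointer above.
EXPECTED_OID = "24e15e17ede2ec440a7e13f9c5776b7196cd822cc120edf133e0a70afbd60a38"
EXPECTED_SIZE = 470637416

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so the ~470 MB blob is never fully in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

path = "model.safetensors"  # assumed local download location
assert os.path.getsize(path) == EXPECTED_SIZE, "size mismatch with LFS pointer"
assert sha256_of(path) == EXPECTED_OID, "sha256 mismatch with LFS pointer"
print("model.safetensors matches the LFS pointer")
```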