| sha | text | id | tags | created_at | metadata | last_modified | arxiv | languages | tags_str | text_str | text_lists | processed_texts |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
7a49a3c70f613f07a41127b680a21478f5ac0871 |
# Dataset of rita_rossweisse (Houkai 3rd)
This is the dataset of rita_rossweisse (Houkai 3rd), containing 500 images and their tags.
The core tags of this character are `hair_over_one_eye, brown_hair, bangs, mole_under_eye, breasts, mole, purple_eyes, short_hair, large_breasts, hair_ornament, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 903.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_rossweisse_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 431.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_rossweisse_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1273 | 934.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_rossweisse_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 755.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_rossweisse_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1273 | 1.42 GiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_rossweisse_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
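For reference, here is a minimal sketch of downloading one of the packed archives directly (the 800px IMG+TXT variant) and walking its tag files. It assumes the usual IMG+TXT layout in which every image ships with a same-named `.txt` tag file; only `huggingface_hub` and the standard library are used.
```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/rita_rossweisse_honkai3',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it into a local folder
pack_dir = 'dataset_800'
os.makedirs(pack_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pack_dir)

# walk the tag files; each one is assumed to sit next to its image
for txt_path in sorted(glob(os.path.join(pack_dir, '**', '*.txt'), recursive=True)):
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(os.path.splitext(txt_path)[0], '->', tags[:80])
```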
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/rita_rossweisse_honkai3',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
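As a quick sanity check on the loaded items, the `item.meta['tags']` field printed above can be aggregated into a frequency table. This small sketch continues from the snippet (so `dataset_dir` must already exist); whether the tags field is a mapping or a plain list is treated as an assumption and handled either way.
```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag appears across the extracted raw dataset
tag_counter = Counter()
for item in LocalSource(dataset_dir):
    tags = item.meta['tags']
    # assumed to be either a tag -> score mapping or a plain list of tags
    tag_counter.update(tags.keys() if isinstance(tags, dict) else tags)

for tag, count in tag_counter.most_common(20):
    print(f'{count:4d}  {tag}')
```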
## List of Clusters
List of tag clustering results; some outfits may be mined here (see the co-occurrence sketch after the raw-text table below).
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, hair_flower, looking_at_viewer, smile, solo, blue_rose, closed_mouth, night_sky, black_dress, cleavage, looking_back, outdoors, purple_dress, purple_rose, star_(sky) |
| 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | bare_shoulders, looking_at_viewer, 1girl, solo, closed_mouth, smile, cleavage, white_dress, jewelry, white_background, white_gloves |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, china_dress, hair_flower, smile, solo, twintails, white_flower, black_thighhighs, cleavage, closed_mouth, white_dress, looking_at_viewer, navel_cutout, earrings, chinese_new_year, petals, wrist_cuffs |
| 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, closed_mouth, hair_flower, looking_at_viewer, simple_background, solo, white_background, smile, blue_rose, black_dress, bridal_veil, wedding_dress, white_dress |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, cleavage, frills, horns, maid_headdress, smile, solo, black_gloves, looking_at_viewer, simple_background, white_background, blush, puffy_short_sleeves, red_rose, closed_mouth, red_eyes, hair_flower |
| 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_gloves, horns, looking_at_viewer, maid_headdress, rose, solo, frills, holding_weapon, maid_apron, cleavage, holding_scythe, pantyhose, grin, hair_flower, puffy_short_sleeves, red_eyes |
| 6 | 18 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, long_sleeves, solo, white_shirt, pantyhose, looking_at_viewer, smile, black_skirt, closed_mouth, black_gloves, rose, frilled_shirt, simple_background, single_glove, white_background |
| 7 | 16 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, solo, bare_shoulders, wedding_dress, white_dress, white_gloves, bride, smile, bridal_veil, earrings, looking_at_viewer, cleavage, hair_flower, blue_rose, closed_mouth, holding, petals, white_thighhighs |
| 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, looking_at_viewer, maid_headdress, simple_background, solo, cleavage, frilled_swimsuit, one-piece_swimsuit, smile, twintails, white_background, nail_polish, bare_shoulders, blue_nails, closed_mouth, full_body, sandals |
| 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, looking_at_viewer, solo, blush, smile, black_bra, black_panties, black_thighhighs, on_side, ass |
| 10 | 20 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1boy, hetero, 1girl, blush, smile, solo_focus, looking_at_viewer, horns, maid_headdress, nipples, penis, black_gloves, red_eyes, sex, cowgirl_position, girl_on_top, heart, nude, puffy_short_sleeves, vaginal, cum_in_pussy, mosaic_censoring, closed_mouth, hair_flower, rose, open_mouth |
| 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, bare_shoulders, blush, cleavage, detached_collar, looking_at_viewer, playboy_bunny, solo, strapless_leotard, wrist_cuffs, black_leotard, fake_animal_ears, rabbit_ears, black_pantyhose, fake_tail, rabbit_tail, red_bowtie, sitting, smile, very_long_hair, eyes_visible_through_hair, feet_out_of_frame, parted_lips, red_eyes, simple_background, white_background |
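Building on the raw-text cluster table above, co-occurring tags around a single seed tag give a rough way to reconstruct an outfit. The sketch below reuses `dataset_dir` from the loading snippet; the seed tag (`wedding_dress`, taken from clusters 3 and 7) is just an example, and the type of `item.meta['tags']` is again an assumption.
```python
from collections import Counter

from waifuc.source import LocalSource

# rough outfit mining: which tags co-occur most often with a seed tag?
seed_tag = 'wedding_dress'  # example seed taken from the cluster table above

co_counter = Counter()
for item in LocalSource(dataset_dir):
    tags = item.meta['tags']
    tag_set = set(tags.keys() if isinstance(tags, dict) else tags)
    if seed_tag in tag_set:
        co_counter.update(tag_set - {seed_tag})

for tag, count in co_counter.most_common(15):
    print(f'{count:3d}  {tag}')
```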
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | hair_flower | looking_at_viewer | smile | solo | blue_rose | closed_mouth | night_sky | black_dress | cleavage | looking_back | outdoors | purple_dress | purple_rose | star_(sky) | white_dress | jewelry | white_background | white_gloves | china_dress | twintails | white_flower | black_thighhighs | navel_cutout | earrings | chinese_new_year | petals | wrist_cuffs | simple_background | bridal_veil | wedding_dress | frills | horns | maid_headdress | black_gloves | blush | puffy_short_sleeves | red_rose | red_eyes | rose | holding_weapon | maid_apron | holding_scythe | pantyhose | grin | long_sleeves | white_shirt | black_skirt | frilled_shirt | single_glove | bride | holding | white_thighhighs | frilled_swimsuit | one-piece_swimsuit | nail_polish | blue_nails | full_body | sandals | black_bra | black_panties | on_side | ass | 1boy | hetero | solo_focus | nipples | penis | sex | cowgirl_position | girl_on_top | heart | nude | vaginal | cum_in_pussy | mosaic_censoring | open_mouth | detached_collar | playboy_bunny | strapless_leotard | black_leotard | fake_animal_ears | rabbit_ears | black_pantyhose | fake_tail | rabbit_tail | red_bowtie | sitting | very_long_hair | eyes_visible_through_hair | feet_out_of_frame | parted_lips |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:--------------|:--------------------|:--------|:-------|:------------|:---------------|:------------|:--------------|:-----------|:---------------|:-----------|:---------------|:--------------|:-------------|:--------------|:----------|:-------------------|:---------------|:--------------|:------------|:---------------|:-------------------|:---------------|:-----------|:-------------------|:---------|:--------------|:--------------------|:--------------|:----------------|:---------|:--------|:-----------------|:---------------|:--------|:----------------------|:-----------|:-----------|:-------|:-----------------|:-------------|:-----------------|:------------|:-------|:---------------|:--------------|:--------------|:----------------|:---------------|:--------|:----------|:-------------------|:-------------------|:---------------------|:--------------|:-------------|:------------|:----------|:------------|:----------------|:----------|:------|:-------|:---------|:-------------|:----------|:--------|:------|:-------------------|:--------------|:--------|:-------|:----------|:---------------|:-------------------|:-------------|:------------------|:----------------|:--------------------|:----------------|:-------------------|:--------------|:------------------|:------------|:--------------|:-------------|:----------|:-----------------|:----------------------------|:--------------------|:--------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | | X | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | | X | | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | X | X | | X | | | | | | | X | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | X | X | X | | X | | | X | | | | | | | | X | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 18 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | X | X | | X | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | X | | | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 16 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | X | X | X | X | X | | | X | | | | | | X | | | X | | | | | | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | X | X | X | | X | | | X | | | | | | | | X | | | X | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | X | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 20 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | X | | X | X | X | | | | | X | | | | | | | | X | | | | | | | | | | X | X | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/rita_rossweisse_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T07:36:14+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T10:07:25+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of rita\_rossweisse (Houkai 3rd)
========================================
This is the dataset of rita\_rossweisse (Houkai 3rd), containing 500 images and their tags.
The core tags of this character are 'hair\_over\_one\_eye, brown\_hair, bangs, mole\_under\_eye, breasts, mole, purple\_eyes, short\_hair, large\_breasts, hair\_ornament, long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
6590b8618cd2dc9cd3757efce0a7288d45b72cae |
# Dataset of prometheus (Houkai 3rd)
This is the dataset of prometheus (Houkai 3rd), containing 44 images and their tags.
The core tags of this character are `grey_hair, bangs, red_eyes, earrings, drill_hair, twin_drills, hair_between_eyes, long_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 68.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prometheus_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 44 | 35.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prometheus_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 100 | 76.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prometheus_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 44 | 59.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prometheus_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 100 | 113.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prometheus_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/prometheus_honkai3',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, jewelry, looking_at_viewer, closed_mouth, grey_skirt, pleated_skirt, small_breasts, bare_shoulders, doll_joints, full_body, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | jewelry | looking_at_viewer | closed_mouth | grey_skirt | pleated_skirt | small_breasts | bare_shoulders | doll_joints | full_body | navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:--------------------|:---------------|:-------------|:----------------|:----------------|:-----------------|:--------------|:------------|:--------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/prometheus_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T07:37:03+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T07:47:04+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of prometheus (Houkai 3rd)
==================================
This is the dataset of prometheus (Houkai 3rd), containing 44 images and their tags.
The core tags of this character are 'grey\_hair, bangs, red\_eyes, earrings, drill\_hair, twin\_drills, hair\_between\_eyes, long\_hair, hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
6b6286d36639299889cc019f83d4d5ba8bf7a582 |
# Dataset of fuxi (Houkai 3rd)
This is the dataset of fuxi (Houkai 3rd), containing 12 images and their tags.
The core tags of this character are `bangs, blue_eyes, long_hair, hair_ornament, black_hair, hair_bun, very_long_hair, blunt_bangs, braid, brown_hair, breasts, double_bun, multicolored_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 13.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 6.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 13.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 11.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 20.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuxi_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/fuxi_honkai3',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, bare_shoulders, looking_at_viewer, collarbone, detached_sleeves, long_sleeves, snake, white_dress, barefoot, parted_lips, sitting, sleeves_past_wrists, strapless, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | looking_at_viewer | collarbone | detached_sleeves | long_sleeves | snake | white_dress | barefoot | parted_lips | sitting | sleeves_past_wrists | strapless | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------------------|:-------------|:-------------------|:---------------|:--------|:--------------|:-----------|:--------------|:----------|:----------------------|:------------|:-------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/fuxi_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T07:48:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T07:51:38+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fuxi (Houkai 3rd)
============================
This is the dataset of fuxi (Houkai 3rd), containing 12 images and their tags.
The core tags of this character are 'bangs, blue\_eyes, long\_hair, hair\_ornament, black\_hair, hair\_bun, very\_long\_hair, blunt\_bangs, braid, brown\_hair, breasts, double\_bun, multicolored\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
de35cbce33676a59e86abe330981097cf30b7964 | # Dataset Card for "19100_chat_05x_slot_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | FanChen0116/19100_chat_05x_slot_pvi | [
"region:us"
] | 2024-01-17T07:57:35+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "O", "1": "I-time", "2": "B-date", "3": "B-last_name", "4": "B-people", "5": "I-date", "6": "I-people", "7": "I-last_name", "8": "I-first_name", "9": "B-first_name", "10": "B-time"}}}}, {"name": "request_slot", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 5796, "num_examples": 32}, {"name": "validation", "num_bytes": 5405, "num_examples": 32}, {"name": "test", "num_bytes": 646729, "num_examples": 3731}], "download_size": 0, "dataset_size": 657930}} | 2024-01-17T08:12:32+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "19100_chat_05x_slot_pvi"
More Information needed | [
"# Dataset Card for \"19100_chat_05x_slot_pvi\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"19100_chat_05x_slot_pvi\"\n\nMore Information needed"
] |
2bcee28902a8390fd968f3520cc38c565e163bb3 |
# Dataset Card for Evaluation run of alnrg2arg/test2_4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/test2_4](https://huggingface.co/alnrg2arg/test2_4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test2_4",
"harness_winogrande_5",
split="train")
```
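To see which of the 63 per-task configurations exist, or to pull the aggregated "results" configuration mentioned above, a hedged sketch follows (it assumes the aggregated configuration is literally named `results`, as described):
```python
from datasets import get_dataset_config_names, load_dataset

# list all available configurations (per-task details plus the aggregated one)
configs = get_dataset_config_names("open-llm-leaderboard/details_alnrg2arg__test2_4")
print(len(configs), configs[:5])

# load the aggregated results configuration; "train" points to the latest run
results = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__test2_4",
    "results",
    split="train",
)
print(results[0])
```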
## Latest results
These are the [latest results from run 2024-01-17T07:57:35.598249](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test2_4/blob/main/results_2024-01-17T07-57-35.598249.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.652927958678689,
"acc_stderr": 0.0321169960910649,
"acc_norm": 0.6519652759500019,
"acc_norm_stderr": 0.03279242565970157,
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6976711663625277,
"mc2_stderr": 0.015093001598591628
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.7229635530770763,
"acc_stderr": 0.004466200055292544,
"acc_norm": 0.8886675960963951,
"acc_norm_stderr": 0.0031390048159258633
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944423,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944423
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6976711663625277,
"mc2_stderr": 0.015093001598591628
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898772
}
}
```
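To work with these numbers programmatically rather than copy them from the card, the per-run JSON file linked above can be fetched directly. The exact nesting of that file is an assumption (the card only shows the `all` and per-task blocks), so the sketch handles both a flat layout and one nested under a `results` key:
```python
import json

from huggingface_hub import hf_hub_download

# fetch the per-run results file linked in the "Latest results" section
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_alnrg2arg__test2_4",
    repo_type="dataset",
    filename="results_2024-01-17T07-57-35.598249.json",
)
with open(results_path, "r", encoding="utf-8") as f:
    data = json.load(f)

# the headline metrics may sit at the top level or under a "results" key (assumption)
metrics = data.get("results", data).get("all", {})
for name, value in metrics.items():
    print(f"{name:>16s}: {value:.4f}")
```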
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alnrg2arg__test2_4 | [
"region:us"
] | 2024-01-17T07:59:54+00:00 | {"pretty_name": "Evaluation run of alnrg2arg/test2_4", "dataset_summary": "Dataset automatically created during the evaluation run of model [alnrg2arg/test2_4](https://huggingface.co/alnrg2arg/test2_4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test2_4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T07:57:35.598249](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test2_4/blob/main/results_2024-01-17T07-57-35.598249.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.652927958678689,\n \"acc_stderr\": 0.0321169960910649,\n \"acc_norm\": 0.6519652759500019,\n \"acc_norm_stderr\": 0.03279242565970157,\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6976711663625277,\n \"mc2_stderr\": 0.015093001598591628\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7229635530770763,\n \"acc_stderr\": 0.004466200055292544,\n \"acc_norm\": 0.8886675960963951,\n \"acc_norm_stderr\": 0.0031390048159258633\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n 
\"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944423,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944423\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n 
\"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 
0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6976711663625277,\n \"mc2_stderr\": 0.015093001598591628\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \"acc_stderr\": 0.012570068947898772\n }\n}\n```", "repo_url": "https://huggingface.co/alnrg2arg/test2_4", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|arc:challenge|25_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|gsm8k|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hellaswag|10_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T07-57-35.598249.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T07-57-35.598249.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T07-57-35.598249.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T07-57-35.598249.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T07-57-35.598249.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T07-57-35.598249.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["**/details_harness|winogrande|5_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T07-57-35.598249.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_17T07_57_35.598249", "path": ["results_2024-01-17T07-57-35.598249.parquet"]}, {"split": "latest", "path": 
["results_2024-01-17T07-57-35.598249.parquet"]}]}]} | 2024-01-17T08:00:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alnrg2arg/test2_4
Dataset automatically created during the evaluation run of model alnrg2arg/test2_4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
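For example, to read the harness_winogrande_5 configuration (any configuration listed for this dataset can be substituted):

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__test2_4",
    "harness_winogrande_5",
    split="train",
)
```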
## Latest results
These are the latest results from run 2024-01-17T07:57:35.598249 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
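The aggregate block of that results file reads as follows (excerpt; the per-task scores are kept in the full file):

```python
{
    "all": {
        "acc": 0.652927958678689,
        "acc_stderr": 0.0321169960910649,
        "acc_norm": 0.6519652759500019,
        "acc_norm_stderr": 0.03279242565970157,
        "mc1": 0.576499388004896,
        "mc1_stderr": 0.01729742144853475,
        "mc2": 0.6976711663625277,
        "mc2_stderr": 0.015093001598591628
    }
}
```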
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alnrg2arg/test2_4\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test2_4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T07:57:35.598249(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alnrg2arg/test2_4\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test2_4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T07:57:35.598249(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e6c019fd7073ccfc3280e9505e5be4a514884232 |
# Dataset of fu_hua (Houkai 3rd)
This is the dataset of fu_hua (Houkai 3rd), containing 500 images and their tags.
The core tags of this character are `long_hair, bangs, black_hair, blue_eyes, hair_ornament, hair_between_eyes, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 826.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_hua_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 420.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_hua_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1227 | 904.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_hua_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 706.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_hua_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1227 | 1.31 GiB | [Download](https://huggingface.co/datasets/CyberHarem/fu_hua_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fu_hua_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, china_dress, closed_mouth, solo, hair_over_one_eye, looking_at_viewer, white_dress, simple_background, single_earring, white_background, long_sleeves, yin_yang, smile |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, china_dress, closed_mouth, hair_over_one_eye, long_sleeves, solo, white_dress, yin_yang, looking_at_viewer |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, china_dress, hair_over_one_eye, holding_umbrella, solo, white_dress, closed_mouth, looking_at_viewer, oil-paper_umbrella, long_sleeves, single_earring, yin_yang, smile, white_background |
| 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, hair_over_one_eye, looking_at_viewer, sleeveless_dress, solo, white_dress, earrings, smile, open_mouth, bird |
| 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, chinese_clothes, solo, long_sleeves, looking_at_viewer, closed_mouth, simple_background, black_gloves, white_background, white_shirt |
| 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, monocle, solo, white_shirt, detective, glasses, looking_at_viewer, long_sleeves, brown_gloves, brown_jacket, closed_mouth, polo_shirt, black_gloves, black_shorts |
| 6 | 10 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, bare_shoulders, china_dress, sleeveless_dress, solo, white_dress, white_hair, elbow_gloves, red_eyes, red_gloves, small_breasts, black_gloves, mismatched_gloves, streaked_hair, closed_mouth, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | china_dress | closed_mouth | solo | hair_over_one_eye | looking_at_viewer | white_dress | simple_background | single_earring | white_background | long_sleeves | yin_yang | smile | holding_umbrella | oil-paper_umbrella | sleeveless_dress | earrings | open_mouth | bird | chinese_clothes | black_gloves | white_shirt | monocle | detective | glasses | brown_gloves | brown_jacket | polo_shirt | black_shorts | white_hair | elbow_gloves | red_eyes | red_gloves | small_breasts | mismatched_gloves | streaked_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:---------------|:-------|:--------------------|:--------------------|:--------------|:--------------------|:-----------------|:-------------------|:---------------|:-----------|:--------|:-------------------|:---------------------|:-------------------|:-----------|:-------------|:-------|:------------------|:---------------|:--------------|:----------|:------------|:----------|:---------------|:---------------|:-------------|:---------------|:-------------|:---------------|:-----------|:-------------|:----------------|:--------------------|:----------------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | X | X | X | X | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 10 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | X | | X | | X | | X | X | | | | | | | | | X | X | X | | | | | | | | | | | | | | |
| 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | X | | X | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | |
| 6 | 10 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | X | | X | X | | | | | | | | | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X |
| CyberHarem/fu_hua_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T08:03:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T10:10:01+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fu\_hua (Houkai 3rd)
===============================
This is the dataset of fu\_hua (Houkai 3rd), containing 500 images and their tags.
The core tags of this character are 'long\_hair, bangs, black\_hair, blue\_eyes, hair\_ornament, hair\_between\_eyes, ponytail', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
691feeea68c7b532a8b312c2152b84e64acbf7b4 |
# Portuguese-Corpus Instruct
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://nkluge-correa.github.io/TeenyTinyLlama/
- **Repository:** https://github.com/Nkluge-correa/TeenyTinyLlama
- **Paper:** [TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese](https://arxiv.org/abs/2401.16640)
- **Point of Contact:** [AIRES at PUCRS](mailto:[email protected])
### Dataset Summary
Portuguese-Corpus Instruct is a concatenation of several portions of Brazilian Portuguese datasets found in the [Hub](https://huggingface.co/datasets?task_categories=task_categories:text-generation&language=language:pt&sort=trending).
In a tokenized format, the dataset (uncompressed) weighs 80 GB and has approximately 6.2B tokens. This version of the corpus (Pt-Corpus-Instruct) includes several instances of conversational and general instructional data, allowing trained models to go through preference pre-training during their initial pre-training stage.
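As a rough, hedged illustration of how a figure like this can be cross-checked, the sketch below streams a sample of the corpus and counts tokens. The tokenizer is a placeholder choice, since the card does not state which tokenizer yielded the ~6.2B estimate, so any count you obtain will depend on the tokenizer you pick.

```python
# Hedged sketch: estimating the token count by streaming a sample of the corpus.
# The tokenizer below is a placeholder; this card does not state which tokenizer
# produced the ~6.2B figure, so results will differ depending on your choice.
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased") # placeholder choice
stream = load_dataset("nicholasKluge/Pt-Corpus-Instruct", split="train", streaming=True)

sampled_rows, sampled_tokens = 0, 0
for sample in stream.take(10_000): # tokenize a sample of rows and extrapolate from it
 sampled_tokens += len(tokenizer(sample["text"])["input_ids"])
 sampled_rows += 1

print(f"~{sampled_tokens / sampled_rows:.0f} tokens per row over {sampled_rows} sampled rows")
```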
### Supported Tasks and Leaderboards
This dataset can be utilized for tasks involving language modeling.
### Languages
Portuguese.
## Dataset Structure
### Data Instances
The dataset consists of the following features:
- **text:** a string of text in Portuguese.
- **metadata:** the source where that string originated.
### Data Fields
```python
{
"text": "A inteligência artificial (de sigla: IA; do inglês: artificial intelligence, de sigla: AI) é um campo de estudo multidisciplinar que abrange varias áreas do conhecimento.",
"metadata": "source: https://huggingface.co/datasets/graelo/wikipedia"
}
```
### Data Splits
Available splits are `train`.
```python
from datasets import load_dataset
dataset = load_dataset("nicholasKluge/Pt-Corpus-Instruct", split='train')
# If you don't want to download the entire dataset, set streaming to `True`
dataset = load_dataset("nicholasKluge/Pt-Corpus-Instruct", split='train', streaming=True)
```
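Because every row also carries its originating source in the `metadata` field (see Data Fields above), one convenient pattern is filtering the streamed corpus down to a single source. The sketch below is a hedged example: the substring match assumes the `metadata` string always follows the `source: <url>` format shown earlier.

```python
# Hedged sketch: keeping only Wikipedia-derived rows, assuming the "metadata"
# field follows the "source: <url>" format shown in the Data Fields example.
from datasets import load_dataset

stream = load_dataset("nicholasKluge/Pt-Corpus-Instruct", split="train", streaming=True)
wikipedia_only = stream.filter(lambda sample: "graelo/wikipedia" in sample["metadata"])

for sample in wikipedia_only.take(3):
 print(sample["text"][:80])
```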
## Dataset Creation
### Curation Rationale
This dataset was developed as part of the [TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese](https://arxiv.org/abs/2401.16640) paper. In this study, we document the development of open-foundation models tailored for use in low-resource settings, their limitations, and their benefits.
### Source Data
#### Initial Data Collection and Normalization
We utilized some of the filters used in Rae et al. ([2021](https://arxiv.org/abs/2112.11446)), besides using a [fine-tuned BERTimbau](https://huggingface.co/nicholasKluge/ToxicityModelPT) to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator (`\n\n`).
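A minimal, hedged sketch of the kind of toxicity filtering described above is shown below; it is not the authors' exact pipeline. The model name comes from this card, but the single-score output, its sign convention, and the threshold value are assumptions.

```python
# Hedged sketch of BERTimbau-based toxicity filtering, not the authors' exact pipeline.
# The model name comes from this card; the single-score output, its sign convention
# (higher score = more acceptable), and the threshold value are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

toxicity_tokenizer = AutoTokenizer.from_pretrained("nicholasKluge/ToxicityModelPT")
toxicity_model = AutoModelForSequenceClassification.from_pretrained("nicholasKluge/ToxicityModelPT")
toxicity_model.eval()

TOXICITY_THRESHOLD = 0.0 # assumed cut-off; pick a value suited to your own use case

def keep_sample(text: str) -> bool:
 """Keep a sample only if its score stays on the acceptable side of the assumed threshold."""
 inputs = toxicity_tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
 with torch.no_grad():
 score = toxicity_model(**inputs).logits.squeeze().item()
 return score >= TOXICITY_THRESHOLD

# Conversational samples are joined with the double new line separator mentioned above.
turns = ["Olá, tudo bem?", "Tudo ótimo, e você?"]
formatted = "\n\n".join(turns)
print(keep_sample(formatted))
```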
#### Who are the source language producers?
All text samples are native to Portuguese or translated from other languages to Portuguese (slight contamination of other languages should also be expected).
### Annotations
#### Annotation process
Portuguese-Corpus is a concatenation of several portions of Brazilian Portuguese datasets found in the [Hub](https://huggingface.co/datasets?task_categories=task_categories:text-generation&language=language:pt&sort=trending). We utilized some of the filters used in Rae et al. ([2021](https://arxiv.org/abs/2112.11446)), besides using a [fine-tuned BERTimbau](https://huggingface.co/nicholasKluge/ToxicityModelPT) to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator (`\n\n`).
#### Who are the annotators?
[Nicholas Kluge Corrêa](mailto:[email protected]).
### Personal and Sensitive Information
This dataset, sourced from web scraping, may potentially contain personal and sensitive information, alongside offensive, toxic, and disturbing language.
## Considerations for Using the Data
### Social Impact of Dataset
The presence of personal and sensitive information within the dataset raises concerns about privacy and data protection, potentially leading to breaches of individuals' confidentiality and security. Furthermore, the inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity. Therefore, careful handling and ethical considerations are essential to mitigate these potential social impacts and promote responsible dataset use.
### Discussion of Biases
The inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity.
### Other Known Limitations
A significant portion of the data within the dataset has been translated using translation engines, potentially resulting in corrupted samples of both language and code. While useful for quickly converting text between languages, translation engines often struggle with accurately preserving the syntax, semantics, and context of programming languages. As a result, the translated code may contain errors, syntax inconsistencies, or even introduce vulnerabilities, rendering it unreliable or unusable for its intended purpose.
## Additional Information
### Dataset Curators
[Nicholas Kluge Corrêa](mailto:[email protected]).
### Licensing Information
The following datasets (_only training splits are a part of the corpus_) and respective licenses form the Portuguese-Corpus:
- [Wikipedia](https://huggingface.co/datasets/graelo/wikipedia) (License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/))
- [Instruct-PTBR](https://huggingface.co/datasets/cnmoro/Instruct-PTBR-ENUS-11M) (License: [LLAMA 2 Community License](https://ai.meta.com/llama/license/))
- [CulturaX](https://huggingface.co/datasets/uonlp/CulturaX) (License: [ODC-By](https://opendatacommons.org/licenses/by/1-0/), [cc0-1.0](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301#licensing-information))
- [Gpt4all](https://huggingface.co/datasets/pablo-moreira/gpt4all-j-prompt-generations-pt) (License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.html))
- [OSCAR](https://huggingface.co/datasets/eduagarcia/OSCAR-2301-pt_dedup) (License: [cc0-1.0](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301#licensing-information))
- [CCc100](https://huggingface.co/datasets/eduagarcia/cc100-pt) (License: [Common Crawl terms of use](https://commoncrawl.org/terms-of-use/))
- [Bactrian-X](https://huggingface.co/datasets/MBZUAI/Bactrian-X) (License: [CC BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/deed.de))
- [Dolly-15k](https://huggingface.co/datasets/Gustrd/dolly-15k-libretranslate-pt) (License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/))
- [CosmosQA](https://huggingface.co/datasets/heloisy/cosmos_qa_ptbr) (License: [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/deed.de))
- [Roots Wikiquote](https://huggingface.co/datasets/bigscience-data/roots_pt_wikiquote) (License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/))
- [Roots Ted Talks](https://huggingface.co/datasets/bigscience-data/roots_pt_ted_talks_iwslt) (License: [CC BY-NC-ND 4.0](https://creativecommons.org/licenses/by-nc-nd/4.0/deed.en))
### Citation Information
```latex
@misc{correa24ttllama,
title = {TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese},
author = {Corr{\^e}a, Nicholas Kluge and Falk, Sophia and Fatimah, Shiza and Sen, Aniket and De Oliveira, Nythamar},
journal={arXiv preprint arXiv:2401.16640},
year={2024}
}
```
### Contributions
If you would like to contribute, contact me at [[email protected]](mailto:[email protected])!
| nicholasKluge/Pt-Corpus-Instruct | [
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:pt",
"license:other",
"portuguese",
"language-modeling",
"arxiv:2401.16640",
"arxiv:2112.11446",
"region:us"
] | 2024-01-17T08:09:20+00:00 | {"language": ["pt"], "license": "other", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation"], "pretty_name": "Pt-Corpus Instruct", "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "metadata", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 29708613896, "num_examples": 10564643}], "download_size": 17036520990, "dataset_size": 29708613896}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["portuguese", "language-modeling"]} | 2024-02-15T18:10:17+00:00 | [
"2401.16640",
"2112.11446"
] | [
"pt"
] | TAGS
#task_categories-text-generation #size_categories-1M<n<10M #language-Portuguese #license-other #portuguese #language-modeling #arxiv-2401.16640 #arxiv-2112.11446 #region-us
|
# Portuguese-Corpus Instruct
## Table of Contents
- Table of Contents
- Dataset Description
- Dataset Summary
- Supported Tasks and Leaderboards
- Languages
- Dataset Structure
- Data Instances
- Data Fields
- Data Splits
- Dataset Creation
- Curation Rationale
- Source Data
- Annotations
- Personal and Sensitive Information
- Considerations for Using the Data
- Social Impact of Dataset
- Discussion of Biases
- Other Known Limitations
- Additional Information
- Dataset Curators
- Licensing Information
- Citation Information
- Contributions
## Dataset Description
- Homepage: URL
- Repository: URL
- Paper: TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese
- Point of Contact: AIRES at PUCRS
### Dataset Summary
Portuguese-Corpus Instruct is a concatenation of several portions of Brazilian Portuguese datasets found in the Hub.
In a tokenized format, the dataset (uncompressed) weighs 80 GB and has approximately 6.2B tokens. This version of the corpus (Pt-Corpus-Instruct) includes several instances of conversational and general instructional data, allowing trained models to go through preference pre-training during their initial pre-training stage.
### Supported Tasks and Leaderboards
This dataset can be utilized for tasks involving language modeling.
### Languages
Portuguese.
## Dataset Structure
### Data Instances
The dataset consists of the following features:
- text: a string of text in Portuguese.
- metadata: the source where that string originated.
### Data Fields
### Data Splits
Available splits are 'train'.
## Dataset Creation
### Curation Rationale
This dataset was developed as part of the TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese paper. In this study, we document the development of open-foundation models tailored for use in low-resource settings, their limitations, and their benefits.
### Source Data
#### Initial Data Collection and Normalization
We utilized some of the filters used in Rae et al. (2021), besides using a fine-tuned BERTimbau to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator ('\n\n').
#### Who are the source language producers?
All text samples are native to Portuguese or translated from other languages to Portuguese (slight contamination of other languages should also be expected).
### Annotations
#### Annotation process
Portuguese-Corpus is a concatenation of several portions of Brazilian Portuguese datasets found in the Hub. We utilized some of the filters used in Rae et al. (2021), besides using a fine-tuned BERTimbau to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator ('\n\n').
#### Who are the annotators?
Nicholas Kluge Corrêa.
### Personal and Sensitive Information
This dataset, sourced from web scraping, may potentially contain personal and sensitive information, alongside offensive, toxic, and disturbing language.
## Considerations for Using the Data
### Social Impact of Dataset
The presence of personal and sensitive information within the dataset raises concerns about privacy and data protection, potentially leading to breaches of individuals' confidentiality and security. Furthermore, the inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity. Therefore, careful handling and ethical considerations are essential to mitigate these potential social impacts and promote responsible dataset use.
### Discussion of Biases
The inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity.
### Other Known Limitations
A significant portion of the data within the dataset has been translated using translation engines, potentially resulting in corrupted samples of both language and code. While useful for quickly converting text between languages, translation engines often struggle with accurately preserving the syntax, semantics, and context of programming languages. As a result, the translated code may contain errors, syntax inconsistencies, or even introduce vulnerabilities, rendering it unreliable or unusable for its intended purpose.
## Additional Information
### Dataset Curators
Nicholas Kluge Corrêa.
### Licensing Information
The following datasets (_only training splits are a part of the corpus_) and respective licenses form the Portuguese-Corpus:
- Wikipedia (License: CC BY-SA 3.0)
- Instruct-PTBR (License: LLAMA 2 Community License)
- CulturaX (License: ODC-By, cc0-1.0)
- Gpt4all (License: Apache 2.0)
- OSCAR (License: cc0-1.0)
- CCc100 (License: Common Crawl terms of use)
- Bactrian-X (License: CC BY-NC 4.0)
- Dolly-15k (License: CC BY-SA 3.0)
- CosmosQA (License: CC BY 4.0)
- Roots Wikiquote (License: CC BY-SA 3.0)
- Roots Ted Talks (License: CC BY-NC-ND 4.0)
### Contributions
If you would like to contribute, contact me at nicholas@URL!
| [
"# Portuguese-Corpus Instruct",
"## Table of Contents\n\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese\n- Point of Contact: AIRES at PUCRS",
"### Dataset Summary\n\nPortuguese-Corpus Instruct is a concatenation of several portions of Brazilian Portuguese datasets found in the Hub.\n\nIn a tokenized format, the dataset (uncompressed) weighs 80 GB and has approximately 6.2B tokens. This version of the corpus (Pt-Corpus-Instruct) includes several instances of conversational and general instructional data, allowing trained models to go through preference pre-training during their initial pre-training stage.",
"### Supported Tasks and Leaderboards\n\nThis dataset can be utilized for tasks involving language modeling.",
"### Languages\n\nPortuguese.",
"## Dataset Structure",
"### Data Instances\n\nThe dataset consists of the following features:\n\n- text: a string of text in Portuguese.\n- metadata: the source where that string originated.",
"### Data Fields",
"### Data Splits\n\nAvailable splits are 'train'.",
"## Dataset Creation",
"### Curation Rationale\n\nThis dataset was developed are part of the TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese paper. In this study, we document the development of open-foundation models tailored for use in low-resource settings, their limitations, and their benefits.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nWe utilized some of the filters used in Rae et al. (2021), besides using a fine-tuned BERTimbau to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator ('\\n\\n').",
"#### Who are the source language producers?\n\nAll text samples are native to Portuguese or translated from other languages to Portuguese (slight contamination of other languages should also be expected).",
"### Annotations",
"#### Annotation process\n\nPortuguese-Corpus is a concatenation of several portions of Brazilian Portuguese datasets found in the Hub. We utilized some of the filters used in Rae et al. (2021), besides using a fine-tuned BERTimbau to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator ('\\n\\n').",
"#### Who are the annotators?\n\nNicholas Kluge Corrêa.",
"### Personal and Sensitive Information\n\nThis dataset, sourced from web scraping, may potentially contain personal and sensitive information, alongside offensive, toxic, and disturbing language.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThe presence of personal and sensitive information within the dataset raises concerns about privacy and data protection, potentially leading to breaches of individuals' confidentiality and security. Furthermore, the inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity. Therefore, careful handling and ethical considerations are essential to mitigate these potential social impacts and promote responsible dataset use.",
"### Discussion of Biases\n\nThe inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity.",
"### Other Known Limitations\n\nA significant portion of the data within the dataset has been translated using translation engines, potentially resulting in corrupted samples of both language and code. While useful for quickly converting text between languages, translation engines often struggle with accurately preserving the syntax, semantics, and context of programming languages. As a result, the translated code may contain errors, syntax inconsistencies, or even introduce vulnerabilities, rendering it unreliable or unusable for its intended purpose.",
"## Additional Information",
"### Dataset Curators\n\nNicholas Kluge Corrêa.",
"### Licensing Information\n\nThe following datasets (_only training splits are a part of the corpus_) and respective licenses form the Portuguese-Corpus:\n\n- Wikipedia (License: CC BY-SA 3.0)\n\n- Instruct-PTBR (License: LLAMA 2 Community License)\n\n- CulturaX (License: ODC-By, cc0-1.0)\n\n- Gpt4all (License: Apache 2.0)\n\n- OSCAR (License: cc0-1.0)\n\n- CCc100 (License: Common Crawl terms of use)\n\n- Bactrian-X (License: CC BY-NC 4.0)\n\n- Dolly-15k (License: CC BY-SA 3.0)\n\n- CosmosQA (License: CC BY 4.0)\n\n- Roots Wikiquote (License: CC BY-SA 3.0)\n\n- Roots Ted Talks (License: CC BY-NC-ND 4.0)",
"### Contributions\n\nIf you would like to contribute, contact me at nicholas@URL!"
] | [
"TAGS\n#task_categories-text-generation #size_categories-1M<n<10M #language-Portuguese #license-other #portuguese #language-modeling #arxiv-2401.16640 #arxiv-2112.11446 #region-us \n",
"# Portuguese-Corpus Instruct",
"## Table of Contents\n\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese\n- Point of Contact: AIRES at PUCRS",
"### Dataset Summary\n\nPortuguese-Corpus Instruct is a concatenation of several portions of Brazilian Portuguese datasets found in the Hub.\n\nIn a tokenized format, the dataset (uncompressed) weighs 80 GB and has approximately 6.2B tokens. This version of the corpus (Pt-Corpus-Instruct) includes several instances of conversational and general instructional data, allowing trained models to go through preference pre-training during their initial pre-training stage.",
"### Supported Tasks and Leaderboards\n\nThis dataset can be utilized for tasks involving language modeling.",
"### Languages\n\nPortuguese.",
"## Dataset Structure",
"### Data Instances\n\nThe dataset consists of the following features:\n\n- text: a string of text in Portuguese.\n- metadata: the source where that string originated.",
"### Data Fields",
"### Data Splits\n\nAvailable splits are 'train'.",
"## Dataset Creation",
"### Curation Rationale\n\nThis dataset was developed are part of the TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese paper. In this study, we document the development of open-foundation models tailored for use in low-resource settings, their limitations, and their benefits.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nWe utilized some of the filters used in Rae et al. (2021), besides using a fine-tuned BERTimbau to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator ('\\n\\n').",
"#### Who are the source language producers?\n\nAll text samples are native to Portuguese or translated from other languages to Portuguese (slight contamination of other languages should also be expected).",
"### Annotations",
"#### Annotation process\n\nPortuguese-Corpus is a concatenation of several portions of Brazilian Portuguese datasets found in the Hub. We utilized some of the filters used in Rae et al. (2021), besides using a fine-tuned BERTimbau to exclude samples classified above a pre-defined toxicity threshold. Conversational samples were formatted using a double new line separator ('\\n\\n').",
"#### Who are the annotators?\n\nNicholas Kluge Corrêa.",
"### Personal and Sensitive Information\n\nThis dataset, sourced from web scraping, may potentially contain personal and sensitive information, alongside offensive, toxic, and disturbing language.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThe presence of personal and sensitive information within the dataset raises concerns about privacy and data protection, potentially leading to breaches of individuals' confidentiality and security. Furthermore, the inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity. Therefore, careful handling and ethical considerations are essential to mitigate these potential social impacts and promote responsible dataset use.",
"### Discussion of Biases\n\nThe inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity.",
"### Other Known Limitations\n\nA significant portion of the data within the dataset has been translated using translation engines, potentially resulting in corrupted samples of both language and code. While useful for quickly converting text between languages, translation engines often struggle with accurately preserving the syntax, semantics, and context of programming languages. As a result, the translated code may contain errors, syntax inconsistencies, or even introduce vulnerabilities, rendering it unreliable or unusable for its intended purpose.",
"## Additional Information",
"### Dataset Curators\n\nNicholas Kluge Corrêa.",
"### Licensing Information\n\nThe following datasets (_only training splits are a part of the corpus_) and respective licenses form the Portuguese-Corpus:\n\n- Wikipedia (License: CC BY-SA 3.0)\n\n- Instruct-PTBR (License: LLAMA 2 Community License)\n\n- CulturaX (License: ODC-By, cc0-1.0)\n\n- Gpt4all (License: Apache 2.0)\n\n- OSCAR (License: cc0-1.0)\n\n- CCc100 (License: Common Crawl terms of use)\n\n- Bactrian-X (License: CC BY-NC 4.0)\n\n- Dolly-15k (License: CC BY-SA 3.0)\n\n- CosmosQA (License: CC BY 4.0)\n\n- Roots Wikiquote (License: CC BY-SA 3.0)\n\n- Roots Ted Talks (License: CC BY-NC-ND 4.0)",
"### Contributions\n\nIf you would like to contribute, contact me at nicholas@URL!"
] |
8ea12e26a195d7d7865b1d9ad21b709d6691fb54 | # Dataset Card for "19100_chat_Self1x_slot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | FanChen0116/19100_chat_Self1x_slot | [
"region:us"
] | 2024-01-17T08:14:15+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "O", "1": "I-time", "2": "B-date", "3": "B-last_name", "4": "B-people", "5": "I-date", "6": "I-people", "7": "I-last_name", "8": "I-first_name", "9": "B-first_name", "10": "B-time"}}}}, {"name": "request_slot", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 9973, "num_examples": 64}, {"name": "validation", "num_bytes": 4887, "num_examples": 32}, {"name": "test", "num_bytes": 570513, "num_examples": 3731}], "download_size": 124685, "dataset_size": 585373}} | 2024-01-31T12:20:11+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "19100_chat_Self1x_slot"
More Information needed | [
"# Dataset Card for \"19100_chat_Self1x_slot\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"19100_chat_Self1x_slot\"\n\nMore Information needed"
] |
5026756321b5f2a8122eec866b5159b9aff393e8 | # Dataset Card for "19100_chat_Self1x_slot_empty"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | FanChen0116/19100_chat_Self1x_slot_empty | [
"region:us"
] | 2024-01-17T08:14:36+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "O", "1": "I-time", "2": "B-date", "3": "B-last_name", "4": "B-people", "5": "I-date", "6": "I-people", "7": "I-last_name", "8": "I-first_name", "9": "B-first_name", "10": "B-time"}}}}, {"name": "request_slot", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 8877, "num_examples": 64}, {"name": "validation", "num_bytes": 4354, "num_examples": 32}, {"name": "test", "num_bytes": 570513, "num_examples": 3731}], "download_size": 121228, "dataset_size": 583744}} | 2024-01-31T12:20:29+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "19100_chat_Self1x_slot_empty"
More Information needed | [
"# Dataset Card for \"19100_chat_Self1x_slot_empty\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"19100_chat_Self1x_slot_empty\"\n\nMore Information needed"
] |
8e6c6e1c48f6caf1dce68f27cab7cc0ab77290c1 |
# Dataset Card for Evaluation run of Kquant03/Eukaryote-8x7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Eukaryote-8x7B-bf16](https://huggingface.co/Kquant03/Eukaryote-8x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16",
"harness_winogrande_5",
split="train")
```
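Beyond per-sample details, the card above mentions an aggregated "results" configuration; a hedged example of loading it is below. The config name string and the use of the "train" split for the latest aggregates follow the description above but have not been verified against the repository contents.

```python
# Hedged sketch: loading the aggregated "results" configuration described above.
# The config name and split follow the card text; adjust them if the repo differs.
from datasets import load_dataset

results = load_dataset(
 "open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16",
 "results",
 split="train",
)
print(results[0])
```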
## Latest results
These are the [latest results from run 2024-01-17T08:12:21.184681](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16/blob/main/results_2024-01-17T08-12-21.184681.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6563228779594618,
"acc_stderr": 0.03193320528683609,
"acc_norm": 0.6559570918809283,
"acc_norm_stderr": 0.032596696779968556,
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.6316657951045663,
"mc2_stderr": 0.01524980646948028
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.013847460518892976,
"acc_norm": 0.6945392491467577,
"acc_norm_stderr": 0.01346008047800251
},
"harness|hellaswag|10": {
"acc": 0.6986656044612627,
"acc_stderr": 0.004578999029127976,
"acc_norm": 0.8729336785500896,
"acc_norm_stderr": 0.0033236659644122007
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02959732973097809,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02959732973097809
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45139664804469276,
"acc_stderr": 0.01664330737231588,
"acc_norm": 0.45139664804469276,
"acc_norm_stderr": 0.01664330737231588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.01274724896707906,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.01274724896707906
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827058,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827058
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.6316657951045663,
"mc2_stderr": 0.01524980646948028
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.7194844579226687,
"acc_stderr": 0.012374608490929556
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16 | [
"region:us"
] | 2024-01-17T08:14:48+00:00 | {"pretty_name": "Evaluation run of Kquant03/Eukaryote-8x7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Eukaryote-8x7B-bf16](https://huggingface.co/Kquant03/Eukaryote-8x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T08:12:21.184681](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16/blob/main/results_2024-01-17T08-12-21.184681.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563228779594618,\n \"acc_stderr\": 0.03193320528683609,\n \"acc_norm\": 0.6559570918809283,\n \"acc_norm_stderr\": 0.032596696779968556,\n \"mc1\": 0.4638922888616891,\n \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.6316657951045663,\n \"mc2_stderr\": 0.01524980646948028\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892976,\n \"acc_norm\": 0.6945392491467577,\n \"acc_norm_stderr\": 0.01346008047800251\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6986656044612627,\n \"acc_stderr\": 0.004578999029127976,\n \"acc_norm\": 0.8729336785500896,\n \"acc_norm_stderr\": 0.0033236659644122007\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02959732973097809,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02959732973097809\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n 
\"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n \"acc_stderr\": 0.01664330737231588,\n \"acc_norm\": 0.45139664804469276,\n \"acc_norm_stderr\": 0.01664330737231588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.01274724896707906,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.01274724896707906\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827058,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827058\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4638922888616891,\n \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.6316657951045663,\n \"mc2_stderr\": 0.01524980646948028\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7194844579226687,\n \"acc_stderr\": 0.012374608490929556\n }\n}\n```", "repo_url": 
"https://huggingface.co/Kquant03/Eukaryote-8x7B-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|arc:challenge|25_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|gsm8k|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hellaswag|10_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-12-21.184681.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-12-21.184681.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-12-21.184681.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T08-12-21.184681.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-12-21.184681.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-12-21.184681.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["**/details_harness|winogrande|5_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T08-12-21.184681.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_17T08_12_21.184681", "path": ["results_2024-01-17T08-12-21.184681.parquet"]}, {"split": "latest", "path": 
["results_2024-01-17T08-12-21.184681.parquet"]}]}]} | 2024-01-17T08:15:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Eukaryote-8x7B-bf16
Dataset automatically created during the evaluation run of model Kquant03/Eukaryote-8x7B-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
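A minimal sketch of that loading call for this run is shown below. The dataset repository name is assumed here to follow the leaderboard's usual `details_<org>__<model>` naming pattern, and `harness_winogrande_5` is just one of the 63 configurations listed in this card.

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this run.
# Any other config name from this card (e.g. "harness_gsm8k_5") works the same way.
data = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16",
    "harness_winogrande_5",
    split="train",
)
```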
## Latest results
These are the latest results from run 2024-01-17T08:12:21.184681 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and "latest" splits for each eval).
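To inspect the aggregated numbers for this run directly, the "results" configuration can be loaded with the "latest" split; the repository name below is the same assumed `details_<org>__<model>` pattern as above.

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated scores of the run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Eukaryote-8x7B-bf16",
    "results",
    split="latest",
)
```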
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Eukaryote-8x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Eukaryote-8x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T08:12:21.184681(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Eukaryote-8x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Eukaryote-8x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T08:12:21.184681(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5cb65aa4f5143919746c697f3e63b67b72db15cb | # Dataset Card for "19100_chat_Self1x_slot_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | FanChen0116/19100_chat_Self1x_slot_pvi | [
"region:us"
] | 2024-01-17T08:15:07+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "O", "1": "I-time", "2": "B-date", "3": "B-last_name", "4": "B-people", "5": "I-date", "6": "I-people", "7": "I-last_name", "8": "I-first_name", "9": "B-first_name", "10": "B-time"}}}}, {"name": "request_slot", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 9973, "num_examples": 64}, {"name": "validation", "num_bytes": 4887, "num_examples": 32}, {"name": "test", "num_bytes": 570513, "num_examples": 3731}], "download_size": 124672, "dataset_size": 585373}} | 2024-01-31T12:21:20+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "19100_chat_Self1x_slot_pvi"
More Information needed | [
"# Dataset Card for \"19100_chat_Self1x_slot_pvi\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"19100_chat_Self1x_slot_pvi\"\n\nMore Information needed"
] |
3f2e7e6cb00e737a8de89e9a30045dce4b295543 |
# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE_DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_7Bx2_MoE_DPO](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-17T08:19:28.413679](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_DPO/blob/main/results_2024-01-17T08-19-28.413679.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results file and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543504017413734,
"acc_stderr": 0.031976772487807385,
"acc_norm": 0.6548321667502475,
"acc_norm_stderr": 0.03262960667391453,
"mc1": 0.6829865361077111,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.81497814660951,
"mc2_stderr": 0.01276053270116617
},
"harness|arc:challenge|25": {
"acc": 0.7209897610921502,
"acc_stderr": 0.013106784883601338,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869148
},
"harness|hellaswag|10": {
"acc": 0.7066321449910377,
"acc_stderr": 0.004543750480065777,
"acc_norm": 0.887572196773551,
"acc_norm_stderr": 0.003152464637757642
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05000000000000001,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05000000000000001
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394849,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394849
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741626,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741626
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47150837988826816,
"acc_stderr": 0.016695329746015796,
"acc_norm": 0.47150837988826816,
"acc_norm_stderr": 0.016695329746015796
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523369,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523369
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6829865361077111,
"mc1_stderr": 0.016289203374403385,
"mc2": 0.81497814660951,
"mc2_stderr": 0.01276053270116617
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.010759352014855924
},
"harness|gsm8k|5": {
"acc": 0.6489764973464746,
"acc_stderr": 0.013146945941397222
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_DPO | [
"region:us"
] | 2024-01-17T08:21:44+00:00 | {"pretty_name": "Evaluation run of cloudyu/Mixtral_7Bx2_MoE_DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_7Bx2_MoE_DPO](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T08:19:28.413679](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_DPO/blob/main/results_2024-01-17T08-19-28.413679.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543504017413734,\n \"acc_stderr\": 0.031976772487807385,\n \"acc_norm\": 0.6548321667502475,\n \"acc_norm_stderr\": 0.03262960667391453,\n \"mc1\": 0.6829865361077111,\n \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.81497814660951,\n \"mc2_stderr\": 0.01276053270116617\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7209897610921502,\n \"acc_stderr\": 0.013106784883601338,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869148\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7066321449910377,\n \"acc_stderr\": 0.004543750480065777,\n \"acc_norm\": 0.887572196773551,\n \"acc_norm_stderr\": 0.003152464637757642\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05000000000000001,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05000000000000001\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635474,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635474\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394849,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394849\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848043,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848043\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8365261813537676,\n \"acc_stderr\": 0.013223928616741626,\n \"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741626\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47150837988826816,\n \"acc_stderr\": 0.016695329746015796,\n \"acc_norm\": 0.47150837988826816,\n \"acc_norm_stderr\": 0.016695329746015796\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n \"acc_stderr\": 0.012756161942523369,\n \"acc_norm\": 0.4765319426336376,\n \"acc_norm_stderr\": 0.012756161942523369\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6829865361077111,\n \"mc1_stderr\": 0.016289203374403385,\n \"mc2\": 0.81497814660951,\n \"mc2_stderr\": 0.01276053270116617\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855924\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6489764973464746,\n \"acc_stderr\": 0.013146945941397222\n 
}\n}\n```", "repo_url": "https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE_DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|arc:challenge|25_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|gsm8k|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hellaswag|10_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-19-28.413679.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-19-28.413679.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-19-28.413679.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T08-19-28.413679.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-19-28.413679.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T08_19_28.413679", "path": ["**/details_harness|winogrande|5_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T08-19-28.413679.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_17T08_19_28.413679", "path": ["results_2024-01-17T08-19-28.413679.parquet"]}, {"split": "latest", "path": ["results_2024-01-17T08-19-28.413679.parquet"]}]}]} | 2024-01-17T08:22:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE_DPO
Dataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE_DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
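The snippet below uses the "harness_winogrande_5" configuration; any of the other per-task configurations in this dataset can be substituted for it.

```python
from datasets import load_dataset

# "harness_winogrande_5" is just one example configuration name
data = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE_DPO",
                    "harness_winogrande_5",
                    split="train")
```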
## Latest results
These are the latest results from run 2024-01-17T08:19:28.413679 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
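A condensed excerpt of the aggregated metrics recorded for this run (the full per-task breakdown is stored in the "results" configuration):

```python
{
    "all": {
        "acc": 0.6543504017413734,
        "acc_norm": 0.6548321667502475,
        "mc2": 0.81497814660951
    },
    "harness|arc:challenge|25": {"acc_norm": 0.7303754266211604},
    "harness|hellaswag|10": {"acc_norm": 0.887572196773551},
    "harness|winogrande|5": {"acc": 0.8216258879242304},
    "harness|gsm8k|5": {"acc": 0.6489764973464746}
}
```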
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE_DPO\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T08:19:28.413679(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE_DPO\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T08:19:28.413679(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9a131c25eb7429d423c9092a119344c6b4efed62 | # Dataset Card for "no_robots_test400"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
This is a subset of "no_robots", selecting 400 questions from the test set.
| category | messages |
|:-----------|-----------:|
| Brainstorm | 36 |
| Chat | 101 |
| Classify | 16 |
| Closed QA | 15 |
| Coding | 16 |
| Extract | 7 |
| Generation | 129 |
| Open QA | 34 |
| Rewrite | 21 |
| Summarize | 25 |
Code:
```python
import pandas as pd
import numpy as np
import numpy.random
from datasets import load_dataset, Dataset
from copy import deepcopy
def get_norobot_dataset():
    # Expand each conversation into one example per user turn, keeping the
    # messages up to (and including) that user message.
    ds = load_dataset('HuggingFaceH4/no_robots')
    all_test_data = []
    for sample in ds['test_sft']:
        sample: dict
        for i, message in enumerate(sample['messages']):
            if message['role'] == 'user':
                item = dict(
                    messages=deepcopy(sample['messages'][:i + 1]),
                    category=sample['category'],
                    prompt_id=sample['prompt_id'],
                )
                all_test_data.append(item)
    return Dataset.from_list(all_test_data)

dataset = get_norobot_dataset().to_pandas()
dataset.groupby('category').count()  # interactive inspection of per-category counts

# Sort rows deterministically so the seeded sampling below is reproducible.
dataset['_sort_key'] = dataset['messages'].map(str)
dataset = dataset.sort_values(['_sort_key'])

subset = []
for category, group_df in sorted(dataset.groupby('category')):
    # Keep ~60% of each category; if that sample would be 20 or fewer, keep the whole category.
    n = int(len(group_df) * 0.603)
    if n <= 20:
        n = len(group_df)
    indices = np.random.default_rng(seed=42).choice(len(group_df), size=n, replace=False)
    subset.append(group_df.iloc[indices])

df = pd.concat(subset)
df = df.drop(columns=['_sort_key'])
df = df.reset_index(drop=True)
print(len(df))
print(df.groupby('category').count().to_string())
Dataset.from_pandas(df).push_to_hub('yujiepan/no_robots_test400')
```
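A minimal sketch for loading the resulting subset back from the Hub (assuming the default "train" split produced by `push_to_hub`):

```python
from datasets import load_dataset

ds = load_dataset('yujiepan/no_robots_test400', split='train')
print(ds)  # 400 examples with 'messages', 'category' and 'prompt_id' columns
```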
| yujiepan/no_robots_test400 | [
"region:us"
] | 2024-01-17T08:23:59+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "category", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 222353, "num_examples": 400}], "download_size": 139530, "dataset_size": 222353}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T08:31:32+00:00 | [] | [] | TAGS
#region-us
| Dataset Card for "no\_robots\_test400"
======================================
More Information needed
This is a subset of "no\_robots", selecting 400 questions from the test set.
Code:
| [] | [
"TAGS\n#region-us \n"
] |
86685b7702e57dd155805fcf2556f8e8370a83d4 | .gitattributes
2.31 kB
initial commit
10 minutes ago
Oh gee, no party.mp3
31.7 kB
LFS
Upload Oh gee, no party.mp3 | Voice-man-76/Molly | [
"license:apache-2.0",
"region:us"
] | 2024-01-17T08:29:28+00:00 | {"license": "apache-2.0"} | 2024-01-17T08:34:16+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| .gitattributes
2.31 kB
initial commit
10 minutes ago
Oh gee, no party.mp3
31.7 kB
LFS
Upload Oh gee, no party.mp3 | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
63a7da6a128bac128d3b9d403dec5a02dc9629f0 | # Dataset Card for "Vietnamese-News-dedup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tmnam20/Vietnamese-News-dedup | [
"region:us"
] | 2024-01-17T08:33:39+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6950792197, "num_examples": 20933169}], "download_size": 3419345304, "dataset_size": 6950792197}} | 2024-01-17T08:39:06+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Vietnamese-News-dedup"
More Information needed | [
"# Dataset Card for \"Vietnamese-News-dedup\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Vietnamese-News-dedup\"\n\nMore Information needed"
] |
6a6266f7ae28c9853c750666aa30e252f3463100 |
# Dataset of shigure_kira (Houkai 3rd)
This is the dataset of shigure_kira (Houkai 3rd), containing 85 images and their tags.
The core tags of this character are `long_hair, blue_eyes, bangs, blue_hair, breasts, hair_ornament, ponytail, ahoge, hair_between_eyes, very_long_hair, braid, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 85 | 145.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 85 | 71.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 199 | 153.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 85 | 124.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 199 | 231.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/shigure_kira_honkai3',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blue_nails, looking_at_viewer, solo, smile, hairclip, nail_polish, one_eye_closed, detached_sleeves, gloves, open_mouth, blue_dress, cleavage, holding_gun, bare_shoulders, single_thighhigh |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, full_body, solo, white_dress, looking_at_viewer, simple_background, white_background, bare_shoulders, black_thighhighs, bow, cleavage, detached_sleeves, hair_flower, holding_sword, white_hair, high_heels, single_glove, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_nails | looking_at_viewer | solo | smile | hairclip | nail_polish | one_eye_closed | detached_sleeves | gloves | open_mouth | blue_dress | cleavage | holding_gun | bare_shoulders | single_thighhigh | full_body | white_dress | simple_background | white_background | black_thighhighs | bow | hair_flower | holding_sword | white_hair | high_heels | single_glove | white_thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:-------|:--------|:-----------|:--------------|:-----------------|:-------------------|:---------|:-------------|:-------------|:-----------|:--------------|:-----------------|:-------------------|:------------|:--------------|:--------------------|:-------------------|:-------------------|:------|:--------------|:----------------|:-------------|:-------------|:---------------|:-------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | | | | | X | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/shigure_kira_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T08:33:54+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T08:55:11+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of shigure\_kira (Houkai 3rd)
=====================================
This is the dataset of shigure\_kira (Houkai 3rd), containing 85 images and their tags.
The core tags of this character are 'long\_hair, blue\_eyes, bangs, blue\_hair, breasts, hair\_ornament, ponytail, ahoge, hair\_between\_eyes, very\_long\_hair, braid, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
cbe238a08577436ce74e86ed509015faf232f4e7 |
# Dataset of sin_mal (Houkai 3rd)
This is the dataset of sin_mal (Houkai 3rd), containing 186 images and their tags.
The core tags of this character are `ahoge, yellow_eyes, purple_hair, pink_hair, short_hair, heterochromia, bangs, hair_over_one_eye, hair_ornament, multicolored_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 186 | 357.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 186 | 165.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 468 | 369.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 186 | 295.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 468 | 583.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sin_mal_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sin_mal_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
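
Because `LocalSource` yields items carrying an image object plus tag metadata (as printed in the loop above), the same iterator can be used to pre-filter a character dataset before training. A small sketch follows; it assumes `item.meta['tags']` is keyed by tag name and that `item.image` supports a PIL-style `save()`, and the output layout is our own choice rather than part of the tooling.

```python
# a small filtering sketch built on the LocalSource loop above
import os
from waifuc.source import LocalSource

dataset_dir = 'dataset_dir'  # extracted as in the snippet above
output_dir = 'solo_only'     # output layout is our own choice (assumption)
os.makedirs(output_dir, exist_ok=True)

for index, item in enumerate(LocalSource(dataset_dir)):
    # assumes item.meta['tags'] is keyed by tag name, so `in` works
    if 'solo' in item.meta['tags']:
        # assumes item.image exposes a PIL-style save()
        item.image.save(os.path.join(output_dir, f'{index}.png'))
```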
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | looking_at_viewer, sharp_teeth, solo, 1girl, grin, fingerless_gloves, thighhighs, top_hat, bare_shoulders, dress, microphone |
| 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, long_hair, looking_at_viewer, smile, solo, white_dress, white_gloves, bare_shoulders, open_mouth, hair_flower, holding_bouquet, bow, eyes_visible_through_hair, sleeveless_dress, virtual_youtuber, wedding_dress |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, hair_flower, long_hair, looking_at_viewer, smile, solo, bare_shoulders, holding, purple_rose, stuffed_animal, stuffed_bunny, closed_mouth, two_side_up, white_background, blue_eyes, blush, crown, object_hug, purple_dress, simple_background |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, smile, solo, long_sleeves, long_hair, two_side_up, virtual_youtuber, ghost, puffy_sleeves, ribbon, two-tone_hair, apron, blush, choker, closed_mouth, frilled_dress, mismatched_legwear, pantyhose, split-color_hair, white_background, white_dress |
| 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, solo, looking_at_viewer, choker, earrings, fox_mask, food, purple_kimono, sharp_teeth, split-color_hair, fireworks, grin, holding, obi, sitting, two-tone_hair, water |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | sharp_teeth | solo | 1girl | grin | fingerless_gloves | thighhighs | top_hat | bare_shoulders | dress | microphone | long_hair | smile | white_dress | white_gloves | open_mouth | hair_flower | holding_bouquet | bow | eyes_visible_through_hair | sleeveless_dress | virtual_youtuber | wedding_dress | holding | purple_rose | stuffed_animal | stuffed_bunny | closed_mouth | two_side_up | white_background | blue_eyes | blush | crown | object_hug | purple_dress | simple_background | long_sleeves | ghost | puffy_sleeves | ribbon | two-tone_hair | apron | choker | frilled_dress | mismatched_legwear | pantyhose | split-color_hair | earrings | fox_mask | food | purple_kimono | fireworks | obi | sitting | water |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------------|:-------|:--------|:-------|:--------------------|:-------------|:----------|:-----------------|:--------|:-------------|:------------|:--------|:--------------|:---------------|:-------------|:--------------|:------------------|:------|:----------------------------|:-------------------|:-------------------|:----------------|:----------|:--------------|:-----------------|:----------------|:---------------|:--------------|:-------------------|:------------|:--------|:--------|:-------------|:---------------|:--------------------|:---------------|:--------|:----------------|:---------|:----------------|:--------|:---------|:----------------|:---------------------|:------------|:-------------------|:-----------|:-----------|:-------|:----------------|:------------|:------|:----------|:--------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | | | | | X | | | X | X | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | | | | | | | | X | X | X | | | | | | | | X | | | | | | X | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 4 | 9 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/sin_mal_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T08:34:04+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:32:31+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of sin\_mal (Houkai 3rd)
================================
This is the dataset of sin\_mal (Houkai 3rd), containing 186 images and their tags.
The core tags of this character are 'ahoge, yellow\_eyes, purple\_hair, pink\_hair, short\_hair, heterochromia, bangs, hair\_over\_one\_eye, hair\_ornament, multicolored\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
ae72538044d6053e1deffe465480632d3b508884 |
# Dataset of carole_pepper (Houkai 3rd)
This is the dataset of carole_pepper (Houkai 3rd), containing 73 images and their tags.
The core tags of this character are `dark_skin, bangs, dark-skinned_female, white_hair, yellow_eyes, short_hair, earrings, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 73 | 108.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carole_pepper_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 73 | 56.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carole_pepper_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 177 | 121.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carole_pepper_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 73 | 92.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carole_pepper_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 177 | 176.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carole_pepper_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/carole_pepper_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 73 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, white_shirt, jacket_around_waist, bare_shoulders, black_gloves, fingerless_gloves, jewelry, blue_jacket, open_mouth, long_sleeves, shorts, :d |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | white_shirt | jacket_around_waist | bare_shoulders | black_gloves | fingerless_gloves | jewelry | blue_jacket | open_mouth | long_sleeves | shorts | :d |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------|:----------------------|:-----------------|:---------------|:--------------------|:----------|:--------------|:-------------|:---------------|:---------|:-----|
| 0 | 73 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/carole_pepper_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T08:48:31+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:06:34+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of carole\_pepper (Houkai 3rd)
======================================
This is the dataset of carole\_pepper (Houkai 3rd), containing 73 images and their tags.
The core tags of this character are 'dark\_skin, bangs, dark-skinned\_female, white\_hair, yellow\_eyes, short\_hair, earrings, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
105d1b8743b4f64bfd74560dd7fa67c357372bbc |
# Dataset of murata_himeko (Houkai 3rd)
This is the dataset of murata_himeko (Houkai 3rd), containing 500 images and their tags.
The core tags of this character are `red_hair, bangs, yellow_eyes, breasts, large_breasts, long_hair, mole, mole_on_breast`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 719.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 381.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1171 | 803.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 619.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1171 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/murata_himeko_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/murata_himeko_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, solo, wedding_dress, white_dress, bridal_veil, bride, red_rose, smile, cleavage, hair_flower, looking_at_viewer, white_gloves, closed_mouth, petals, holding, simple_background, elbow_gloves, sleeveless, white_background, white_thighhighs |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, closed_mouth, looking_at_viewer, solo, cleavage, hair_ornament, smile, earrings, holding_sword, red_gloves |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, black_gloves, boots, black_shorts, cleavage, red_jacket, thighhighs, belt, closed_mouth, holding_sword, sleeves_rolled_up, looking_at_viewer, smile, fire, aiguillette, cropped_jacket, full_body, short_shorts |
| 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, cleavage, closed_mouth, looking_at_viewer, simple_background, smile, solo, white_background, black_gloves, forehead, red_jacket |
| 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, solo, cleavage, looking_at_viewer, smile, bare_shoulders, closed_mouth, lipstick, forehead, simple_background, white_background, hair_ornament, china_dress, red_dress |
| 5 | 16 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | black_bikini, cleavage, looking_at_viewer, smile, 1girl, solo, closed_mouth, sleeves_rolled_up, white_shirt, black_choker, navel, one_eye_closed, simple_background, alcohol, see-through, side-tie_bikini_bottom, sitting |
| 6 | 17 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, hetero, penis, 1girl, open_mouth, blush, nipples, looking_at_viewer, dark-skinned_male, solo_focus, mosaic_censoring, navel, pussy, sweat, completely_nude, spread_legs, tongue_out, ass, cum, indoors, parted_bangs, sex_from_behind, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | wedding_dress | white_dress | bridal_veil | bride | red_rose | smile | cleavage | hair_flower | looking_at_viewer | white_gloves | closed_mouth | petals | holding | simple_background | elbow_gloves | sleeveless | white_background | white_thighhighs | hair_ornament | earrings | holding_sword | red_gloves | black_gloves | boots | black_shorts | red_jacket | thighhighs | belt | sleeves_rolled_up | fire | aiguillette | cropped_jacket | full_body | short_shorts | forehead | lipstick | china_dress | red_dress | black_bikini | white_shirt | black_choker | navel | one_eye_closed | alcohol | see-through | side-tie_bikini_bottom | sitting | 1boy | hetero | penis | open_mouth | blush | nipples | dark-skinned_male | solo_focus | mosaic_censoring | pussy | sweat | completely_nude | spread_legs | tongue_out | ass | cum | indoors | parted_bangs | sex_from_behind | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:----------------|:--------------|:--------------|:--------|:-----------|:--------|:-----------|:--------------|:--------------------|:---------------|:---------------|:---------|:----------|:--------------------|:---------------|:-------------|:-------------------|:-------------------|:----------------|:-----------|:----------------|:-------------|:---------------|:--------|:---------------|:-------------|:-------------|:-------|:--------------------|:-------|:--------------|:-----------------|:------------|:---------------|:-----------|:-----------|:--------------|:------------|:---------------|:--------------|:---------------|:--------|:-----------------|:----------|:--------------|:-------------------------|:----------|:-------|:---------|:--------|:-------------|:--------|:----------|:--------------------|:-------------|:-------------------|:--------|:--------|:------------------|:--------------|:-------------|:------|:------|:----------|:---------------|:------------------|:----------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | | | | | X | X | | X | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | | | | | X | X | | X | | X | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | | | | | X | X | | X | | X | | | X | | | X | | | | | | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | | | | | | X | X | | X | | X | | | X | | | X | | X | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 16 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | | | | | X | X | | X | | X | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 6 | 17 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/murata_himeko_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T08:49:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T10:41:53+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of murata\_himeko (Houkai 3rd)
======================================
This is the dataset of murata\_himeko (Houkai 3rd), containing 500 images and their tags.
The core tags of this character are 'red\_hair, bangs, yellow\_eyes, breasts, large\_breasts, long\_hair, mole, mole\_on\_breast', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
07fcdcd4e89a57c5ceb60394e99a578ce36c2794 | # Dataset Card for "WMT22-Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | haoranxu/WMT22-Test | [
"region:us"
] | 2024-01-17T09:00:57+00:00 | {"dataset_info": [{"config_name": "cs-en", "features": [{"name": "cs-en", "struct": [{"name": "cs", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 325040, "num_examples": 1448}], "download_size": 224193, "dataset_size": 325040}, {"config_name": "de-en", "features": [{"name": "de-en", "struct": [{"name": "de", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 403424, "num_examples": 1984}], "download_size": 267107, "dataset_size": 403424}, {"config_name": "en-cs", "features": [{"name": "en-cs", "struct": [{"name": "cs", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 422875, "num_examples": 2037}], "download_size": 281086, "dataset_size": 422875}, {"config_name": "en-de", "features": [{"name": "en-de", "struct": [{"name": "de", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 442576, "num_examples": 2037}], "download_size": 280415, "dataset_size": 442576}, {"config_name": "en-is", "features": [{"name": "en-is", "struct": [{"name": "en", "dtype": "string"}, {"name": "is", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 310807, "num_examples": 1000}], "download_size": 197437, "dataset_size": 310807}, {"config_name": "en-ru", "features": [{"name": "en-ru", "struct": [{"name": "en", "dtype": "string"}, {"name": "ru", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 598414, "num_examples": 2037}], "download_size": 333784, "dataset_size": 598414}, {"config_name": "en-zh", "features": [{"name": "en-zh", "struct": [{"name": "en", "dtype": "string"}, {"name": "zh", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 383751, "num_examples": 2037}], "download_size": 257805, "dataset_size": 383751}, {"config_name": "is-en", "features": [{"name": "is-en", "struct": [{"name": "en", "dtype": "string"}, {"name": "is", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 248029, "num_examples": 1000}], "download_size": 152885, "dataset_size": 248029}, {"config_name": "ru-en", "features": [{"name": "ru-en", "struct": [{"name": "en", "dtype": "string"}, {"name": "ru", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 579656, "num_examples": 2016}], "download_size": 340830, "dataset_size": 579656}, {"config_name": "zh-en", "features": [{"name": "zh-en", "struct": [{"name": "en", "dtype": "string"}, {"name": "zh", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 526074, "num_examples": 1875}], "download_size": 333078, "dataset_size": 526074}], "configs": [{"config_name": "cs-en", "data_files": [{"split": "test", "path": "cs-en/test-*"}]}, {"config_name": "de-en", "data_files": [{"split": "test", "path": "de-en/test-*"}]}, {"config_name": "en-cs", "data_files": [{"split": "test", "path": "en-cs/test-*"}]}, {"config_name": "en-de", "data_files": [{"split": "test", "path": "en-de/test-*"}]}, {"config_name": "en-is", "data_files": [{"split": "test", "path": "en-is/test-*"}]}, {"config_name": "en-ru", "data_files": [{"split": "test", "path": "en-ru/test-*"}]}, {"config_name": "en-zh", "data_files": [{"split": "test", "path": "en-zh/test-*"}]}, {"config_name": "is-en", "data_files": [{"split": "test", "path": "is-en/test-*"}]}, {"config_name": "ru-en", "data_files": [{"split": "test", "path": "ru-en/test-*"}]}, {"config_name": "zh-en", "data_files": [{"split": "test", "path": "zh-en/test-*"}]}]} | 
2024-01-17T09:01:17+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "WMT22-Test"
More Information needed | [
"# Dataset Card for \"WMT22-Test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"WMT22-Test\"\n\nMore Information needed"
] |
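
The WMT22-Test card above defines one configuration per language pair (for example `zh-en`, `en-de`) with a single `test` split, as listed in its metadata. Below is a minimal loading sketch with the `datasets` library, assuming those config names; the WMT23-Test card that follows uses the same layout.

```python
# a minimal sketch for the per-language-pair configs listed above
from datasets import load_dataset

# "zh-en" is one of the configs from the card metadata; each row holds a
# struct column named after the pair, with one string field per language
wmt22 = load_dataset("haoranxu/WMT22-Test", "zh-en", split="test")
pair = wmt22[0]["zh-en"]
print(pair["zh"], "->", pair["en"])
```

Other pairs from the card (e.g. `cs-en`, `en-is`, `ru-en`) load the same way by changing the config name.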
2893ade5ac184a7cb1aed51e83c6348f52d8d71a | # Dataset Card for "WMT23-Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | haoranxu/WMT23-Test | [
"region:us"
] | 2024-01-17T09:02:51+00:00 | {"dataset_info": [{"config_name": "de-en", "features": [{"name": "de-en", "struct": [{"name": "de", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 320901, "num_examples": 549}], "download_size": 206261, "dataset_size": 320901}, {"config_name": "en-cs", "features": [{"name": "en-cs", "struct": [{"name": "en", "dtype": "string"}, {"name": "cs", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 431734, "num_examples": 2074}], "download_size": 288875, "dataset_size": 431734}, {"config_name": "en-de", "features": [{"name": "en-de", "struct": [{"name": "en", "dtype": "string"}, {"name": "de", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 435393, "num_examples": 557}], "download_size": 267671, "dataset_size": 435393}, {"config_name": "en-ru", "features": [{"name": "en-ru", "struct": [{"name": "en", "dtype": "string"}, {"name": "ru", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 595222, "num_examples": 2074}], "download_size": 336184, "dataset_size": 595222}, {"config_name": "en-zh", "features": [{"name": "en-zh", "struct": [{"name": "en", "dtype": "string"}, {"name": "zh", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 396956, "num_examples": 2074}], "download_size": 267187, "dataset_size": 396956}, {"config_name": "ru-en", "features": [{"name": "ru-en", "struct": [{"name": "ru", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 496691, "num_examples": 1723}], "download_size": 287075, "dataset_size": 496691}, {"config_name": "zh-en", "features": [{"name": "zh-en", "struct": [{"name": "zh", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "test", "num_bytes": 527889, "num_examples": 1976}], "download_size": 333360, "dataset_size": 527889}], "configs": [{"config_name": "de-en", "data_files": [{"split": "test", "path": "de-en/test-*"}]}, {"config_name": "en-cs", "data_files": [{"split": "test", "path": "en-cs/test-*"}]}, {"config_name": "en-de", "data_files": [{"split": "test", "path": "en-de/test-*"}]}, {"config_name": "en-ru", "data_files": [{"split": "test", "path": "en-ru/test-*"}]}, {"config_name": "en-zh", "data_files": [{"split": "test", "path": "en-zh/test-*"}]}, {"config_name": "ru-en", "data_files": [{"split": "test", "path": "ru-en/test-*"}]}, {"config_name": "zh-en", "data_files": [{"split": "test", "path": "zh-en/test-*"}]}]} | 2024-01-17T09:03:05+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "WMT23-Test"
More Information needed | [
"# Dataset Card for \"WMT23-Test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"WMT23-Test\"\n\nMore Information needed"
] |
1ca91452521816eefa63640904d75371075e185e |
# Dataset of herrscher_of_origin (Houkai 3rd)
This is the dataset of herrscher_of_origin (Houkai 3rd), containing 427 images and their tags.
The core tags of this character are `bangs, long_hair, purple_hair, purple_eyes, horns, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 427 | 763.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/herrscher_of_origin_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 427 | 374.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/herrscher_of_origin_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1046 | 789.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/herrscher_of_origin_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 427 | 642.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/herrscher_of_origin_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1046 | 1.18 GiB | [Download](https://huggingface.co/datasets/CyberHarem/herrscher_of_origin_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/herrscher_of_origin_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, closed_mouth, holding_sword, japanese_armor, katana, solo, cleavage, disembodied_limb, hair_between_eyes, single_gauntlet, bare_shoulders, black_gloves, looking_at_viewer, white_thighhighs, electricity, boots, full_body, black_footwear, sheath |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, black_gloves, japanese_armor, looking_at_viewer, solo, cleavage, closed_mouth, hair_between_eyes, electricity, antenna_hair, red_background, simple_background, single_gauntlet |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, closed_mouth, looking_at_viewer, simple_background, solo, white_background, cleavage, armor, hair_between_eyes, smile |
| 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, china_dress, cleavage, hair_ornament, looking_at_viewer, purple_dress, simple_background, solo, white_background, purple_sleeves, smile, closed_mouth, holding, purple_gloves, glasses, official_alternate_costume, sleeveless_dress |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, china_dress, hair_ornament, looking_at_viewer, official_alternate_costume, solo, cleavage, pantyhose, sitting, earrings, large_breasts, purple_dress, smile, black_gloves, glasses, purple_sleeves, ahoge, closed_mouth, no_shoes, teeth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | holding_sword | japanese_armor | katana | solo | cleavage | disembodied_limb | hair_between_eyes | single_gauntlet | bare_shoulders | black_gloves | looking_at_viewer | white_thighhighs | electricity | boots | full_body | black_footwear | sheath | antenna_hair | red_background | simple_background | white_background | armor | smile | china_dress | hair_ornament | purple_dress | purple_sleeves | holding | purple_gloves | glasses | official_alternate_costume | sleeveless_dress | pantyhose | sitting | earrings | large_breasts | ahoge | no_shoes | teeth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------------|:-----------------|:---------|:-------|:-----------|:-------------------|:--------------------|:------------------|:-----------------|:---------------|:--------------------|:-------------------|:--------------|:--------|:------------|:-----------------|:---------|:---------------|:-----------------|:--------------------|:-------------------|:--------|:--------|:--------------|:----------------|:---------------|:-----------------|:----------|:----------------|:----------|:-----------------------------|:-------------------|:------------|:----------|:-----------|:----------------|:--------|:-----------|:--------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | X | X | | X | X | X | X | X | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | X | X | | X | | X | | X | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | |
| 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | | X | X | | | | X | | X | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | X | X | | | | X | X | X | | | | | | | | | | | | X | X | X | X | X | | | X | X | | X | X | X | X | X | X | X |
| CyberHarem/herrscher_of_origin_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:04:30+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T10:53:17+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of herrscher\_of\_origin (Houkai 3rd)
=============================================
This is the dataset of herrscher\_of\_origin (Houkai 3rd), containing 427 images and their tags.
The core tags of this character are 'bangs, long\_hair, purple\_hair, purple\_eyes, horns, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2d2f14a83d92a9d63e30597510115d922600302d |
# Dataset of natasha_cioara (Houkai 3rd)
This is the dataset of natasha_cioara (Houkai 3rd), containing 115 images and their tags.
The core tags of this character are `bangs, mole, mole_under_mouth, breasts, short_hair, purple_eyes, red_eyes, black_hair, hair_between_eyes, grey_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 115 | 208.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/natasha_cioara_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 115 | 99.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/natasha_cioara_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 277 | 214.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/natasha_cioara_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 115 | 174.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/natasha_cioara_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 277 | 335.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/natasha_cioara_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/natasha_cioara_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 30 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, black_bodysuit, looking_at_viewer, black_cape, hood, smile, closed_mouth, hair_over_one_eye, claws, simple_background |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, closed_mouth, long_sleeves, solo, black_necktie, long_hair, looking_at_viewer, black_gloves, pantyhose, polo_shirt, smile, bartender, holding_weapon, ponytail, simple_background, single_glove, bird, black_footwear, green_shirt, holding_knife, thigh_boots, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_bodysuit | looking_at_viewer | black_cape | hood | smile | closed_mouth | hair_over_one_eye | claws | simple_background | long_sleeves | black_necktie | long_hair | black_gloves | pantyhose | polo_shirt | bartender | holding_weapon | ponytail | single_glove | bird | black_footwear | green_shirt | holding_knife | thigh_boots | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------------------|:-------------|:-------|:--------|:---------------|:--------------------|:--------|:--------------------|:---------------|:----------------|:------------|:---------------|:------------|:-------------|:------------|:-----------------|:-----------|:---------------|:-------|:-----------------|:--------------|:----------------|:--------------|:-------------|
| 0 | 30 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/natasha_cioara_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:04:35+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:34:13+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of natasha\_cioara (Houkai 3rd)
=======================================
This is the dataset of natasha\_cioara (Houkai 3rd), containing 115 images and their tags.
The core tags of this character are 'bangs, mole, mole\_under\_mouth, breasts, short\_hair, purple\_eyes, red\_eyes, black\_hair, hair\_between\_eyes, grey\_hair, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2ea30888bb6ecfa272605520beba4e673743ba96 |
# Dataset of cecilia_shania (Houkai 3rd)
This is the dataset of cecilia_shania (Houkai 3rd), containing 88 images and their tags.
The core tags of this character are `long_hair, bangs, breasts, blue_eyes, hair_between_eyes, white_hair, hair_ornament, very_long_hair, earrings, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 88 | 121.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cecilia_shania_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 88 | 65.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cecilia_shania_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 217 | 138.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cecilia_shania_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 88 | 106.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cecilia_shania_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 217 | 200.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cecilia_shania_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
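The `IMG+TXT` packages listed above are plain archives of images together with same-named `.txt` tag files. As a rough sketch (the flat image/tag layout inside the archive is an assumption and may differ from the actual structure), they can be read directly without waifuc:

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download one of the pre-processed packages (here: the 800px variant)
zip_file = hf_hub_download(
    repo_id='CyberHarem/cecilia_shania_honkai3',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract to a local directory
pkg_dir = 'dataset_800'
os.makedirs(pkg_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pkg_dir)

# pair each image with its same-named .txt tag file
for name in sorted(os.listdir(pkg_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    tag_file = os.path.join(pkg_dir, stem + '.txt')
    if os.path.isfile(tag_file):
        with open(tag_file, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```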
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cecilia_shania_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, simple_background, smile, closed_mouth, jewelry, white_dress, bare_shoulders, hair_flower, white_background, cleavage, medium_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | simple_background | smile | closed_mouth | jewelry | white_dress | bare_shoulders | hair_flower | white_background | cleavage | medium_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------------|:--------|:---------------|:----------|:--------------|:-----------------|:--------------|:-------------------|:-----------|:-----------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/cecilia_shania_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:04:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:30:17+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of cecilia\_shania (Houkai 3rd)
=======================================
This is the dataset of cecilia\_shania (Houkai 3rd), containing 88 images and their tags.
The core tags of this character are 'long\_hair, bangs, breasts, blue\_eyes, hair\_between\_eyes, white\_hair, hair\_ornament, very\_long\_hair, earrings, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
5a2e26566e298b23fec9c026b26ea15dcc50ace6 |
# Dataset of li_sushang (Houkai 3rd)
This is the dataset of li_sushang (Houkai 3rd), containing 190 images and their tags.
The core tags of this character are `brown_hair, long_hair, bangs, breasts, hair_ornament, brown_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 190 | 359.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 190 | 172.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 456 | 361.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 190 | 301.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 456 | 552.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/li_sushang_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/li_sushang_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, china_dress, closed_mouth, fingerless_gloves, looking_at_viewer, smile, solo, white_dress, white_gloves, elbow_gloves |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, china_dress, closed_mouth, elbow_gloves, fingerless_gloves, holding_sword, looking_at_viewer, smile, solo, white_dress, white_gloves |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, nipples, cum_in_pussy, hetero, solo_focus, multiple_penises, open_mouth, vaginal, 3boys, cum_on_breasts, double_handjob, ejaculation, gangbang, navel, nude, piercing, pubic_hair, pubic_tattoo, spread_legs, thighhighs, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | china_dress | closed_mouth | fingerless_gloves | looking_at_viewer | smile | solo | white_dress | white_gloves | elbow_gloves | holding_sword | blush | nipples | cum_in_pussy | hetero | solo_focus | multiple_penises | open_mouth | vaginal | 3boys | cum_on_breasts | double_handjob | ejaculation | gangbang | navel | nude | piercing | pubic_hair | pubic_tattoo | spread_legs | thighhighs | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:---------------|:--------------------|:--------------------|:--------|:-------|:--------------|:---------------|:---------------|:----------------|:--------|:----------|:---------------|:---------|:-------------|:-------------------|:-------------|:----------|:--------|:-----------------|:-----------------|:--------------|:-----------|:--------|:-------|:-----------|:-------------|:---------------|:--------------|:-------------|:-------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/li_sushang_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:04:54+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:55:59+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of li\_sushang (Houkai 3rd)
===================================
This is the dataset of li\_sushang (Houkai 3rd), containing 190 images and their tags.
The core tags of this character are 'brown\_hair, long\_hair, bangs, breasts, hair\_ornament, brown\_eyes, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
32b368e97d7230344e3e667ab17cbddf27bb9c13 |
# Dataset of Caenis/カイニス/凯妮斯 (Fate/Grand Order)
This is the dataset of Caenis/カイニス/凯妮斯 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `white_hair, animal_ears, blue_eyes, dark_skin, breasts, dark-skinned_female, large_breasts, long_hair, hair_intakes, bangs, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 644.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 377.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1236 | 807.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 577.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1236 | 1.09 GiB | [Download](https://huggingface.co/datasets/CyberHarem/caenis_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/caenis_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, body_markings, navel, solo, tattoo, black_gloves, elbow_gloves, looking_at_viewer, muscular_female, ponytail, thighhighs, abs, black_bikini, cleavage, sitting |
| 1 | 43 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, body_markings, solo, headpiece, tattoo, elbow_gloves, pauldrons, navel, spear, shield, faulds, black_gloves, looking_at_viewer, gauntlets, highleg_bikini, black_thighhighs, black_bikini, cleavage, waist_cape, red_cape, ponytail, thighs, grin, abs |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, headpiece, looking_at_viewer, solo, pauldrons, tattoo, body_markings, shield, spear, white_background, bikini, cleavage, grin, open_mouth |
| 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, solo, black_bikini, looking_at_viewer, tattoo, body_markings, navel, cleavage, white_background, simple_background, abs, smile, bare_shoulders, dog_tags, highleg_bikini |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, black_bikini, body_markings, cleavage, eyewear_on_head, grin, looking_at_viewer, solo, sunglasses, collarbone, very_long_hair, black_hairband, thighs, white_nails, wristband |
| 5 | 22 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, black_bikini, black_hairband, cleavage, collarbone, eyewear_on_head, solo, sunglasses, blue_sky, body_markings, cloud, looking_at_viewer, navel, very_long_hair, day, thighs, smile, white_nails, wristband, beach, ocean, bracelet, covered_nipples, open_mouth, outdoors, thigh_strap |
| 6 | 36 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, black_bikini, body_markings, tattoo, denim_shorts, navel, solo, looking_at_viewer, cleavage, highleg_bikini, collarbone, dog_tags, cutoffs, short_shorts, white_jacket, belt, open_jacket, jewelry, long_sleeves, single_thighhigh, smile, off_shoulder, white_nails |
| 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, navel, nipples, solo, tattoo, body_markings, completely_nude, collarbone, looking_at_viewer, simple_background, smile |
| 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | bar_censor, elbow_gloves, sweat, 2girls, black_gloves, blonde_hair, interracial, navel, testicles, thighhighs, blush, bottomless, erection, futa_with_futa, large_penis, multiple_penises, tattoo, white_background, cum, futa_with_female, kneeling, muscular_female, smile, standing_sex, stomach_bulge, vaginal |
| 9 | 19 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | rabbit_ears, fake_animal_ears, playboy_bunny, 1girl, bowtie, cleavage, detached_collar, looking_at_viewer, white_leotard, wrist_cuffs, red_pantyhose, solo, fishnet_pantyhose, highleg_leotard, black_gloves, very_long_hair, tail, thighs, strapless_leotard |
| 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | pleated_skirt, school_uniform, 1girl, bag, cellphone, necktie, solo, collared_shirt, looking_at_viewer, single_thighhigh, thighs, white_nails, white_shirt, black_skirt, blue_skirt, cardigan, guitar_case, long_sleeves, underwear, very_long_hair, wristband, yellow_sweater |
| 11 | 7 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, juliet_sleeves, maid_headdress, solo, enmaided, maid_apron, black_dress, boots, braid, full_body, looking_at_viewer, clenched_teeth, smile, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | body_markings | navel | solo | tattoo | black_gloves | elbow_gloves | looking_at_viewer | muscular_female | ponytail | thighhighs | abs | black_bikini | cleavage | sitting | headpiece | pauldrons | spear | shield | faulds | gauntlets | highleg_bikini | black_thighhighs | waist_cape | red_cape | thighs | grin | white_background | bikini | open_mouth | simple_background | smile | bare_shoulders | dog_tags | eyewear_on_head | sunglasses | collarbone | very_long_hair | black_hairband | white_nails | wristband | blue_sky | cloud | day | beach | ocean | bracelet | covered_nipples | outdoors | thigh_strap | denim_shorts | cutoffs | short_shorts | white_jacket | belt | open_jacket | jewelry | long_sleeves | single_thighhigh | off_shoulder | nipples | completely_nude | bar_censor | sweat | 2girls | blonde_hair | interracial | testicles | blush | bottomless | erection | futa_with_futa | large_penis | multiple_penises | cum | futa_with_female | kneeling | standing_sex | stomach_bulge | vaginal | rabbit_ears | fake_animal_ears | playboy_bunny | bowtie | detached_collar | white_leotard | wrist_cuffs | red_pantyhose | fishnet_pantyhose | highleg_leotard | tail | strapless_leotard | pleated_skirt | school_uniform | bag | cellphone | necktie | collared_shirt | white_shirt | black_skirt | blue_skirt | cardigan | guitar_case | underwear | yellow_sweater | juliet_sleeves | maid_headdress | enmaided | maid_apron | black_dress | boots | braid | full_body | clenched_teeth |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:----------------|:--------|:-------|:---------|:---------------|:---------------|:--------------------|:------------------|:-----------|:-------------|:------|:---------------|:-----------|:----------|:------------|:------------|:--------|:---------|:---------|:------------|:-----------------|:-------------------|:-------------|:-----------|:---------|:-------|:-------------------|:---------|:-------------|:--------------------|:--------|:-----------------|:-----------|:------------------|:-------------|:-------------|:-----------------|:-----------------|:--------------|:------------|:-----------|:--------|:------|:--------|:--------|:-----------|:------------------|:-----------|:--------------|:---------------|:----------|:---------------|:---------------|:-------|:--------------|:----------|:---------------|:-------------------|:---------------|:----------|:------------------|:-------------|:--------|:---------|:--------------|:--------------|:------------|:--------|:-------------|:-----------|:-----------------|:--------------|:-------------------|:------|:-------------------|:-----------|:---------------|:----------------|:----------|:--------------|:-------------------|:----------------|:---------|:------------------|:----------------|:--------------|:----------------|:--------------------|:------------------|:-------|:--------------------|:----------------|:-----------------|:------|:------------|:----------|:-----------------|:--------------|:--------------|:-------------|:-----------|:--------------|:------------|:-----------------|:-----------------|:-----------------|:-----------|:-------------|:--------------|:--------|:--------|:------------|:-----------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 43 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | | X | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | X | | | X | | | | | | X | | X | X | X | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | | | X | | | | X | X | X | | | | | | | | X | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | | | | X | | | | | X | X | | | | | | | | | | | | X | X | | | | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 22 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | | | | X | | | | | X | X | | | | | | | | | | | | X | | | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 36 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | X | | | X | | | | | X | X | | | | | | | | X | | | | | | | | | | X | | X | | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | | | X | | X | X | X | | X | | X | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 19 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | X | | X | | X | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | X | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | X | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 11 | 7 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/caenis_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:11:35+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T10:53:28+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Caenis/カイニス/凯妮斯 (Fate/Grand Order)
=============================================
This is the dataset of Caenis/カイニス/凯妮斯 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are 'white\_hair, animal\_ears, blue\_eyes, dark\_skin, breasts, dark-skinned\_female, large\_breasts, long\_hair, hair\_intakes, bangs, hairband', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
4cee1a0e3e1daac4b50a759e8d8659ee2e6595bc | # Dataset Card for "ALMA-Human-Parallel"
This is the human-written parallel dataset used by the [ALMA](https://arxiv.org/abs/2309.11674) translation models.
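A minimal loading sketch with the `datasets` library; the `de-en` configuration name and the `translation` field are taken from the dataset metadata below, and the other language pairs follow the same pattern:

```python
from datasets import load_dataset

# load the German-English human-written parallel data
alma = load_dataset('haoranxu/ALMA-Human-Parallel', 'de-en')

# each row holds one translation pair keyed by language code
pair = alma['train'][0]['translation']
print(pair['de'])
print(pair['en'])
```

The BibTeX entries below are for the papers associated with this dataset.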
```
@misc{xu2023paradigm,
title={A Paradigm Shift in Machine Translation: Boosting Translation Performance of Large Language Models},
author={Haoran Xu and Young Jin Kim and Amr Sharaf and Hany Hassan Awadalla},
year={2023},
eprint={2309.11674},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@misc{xu2024contrastive,
title={Contrastive Preference Optimization: Pushing the Boundaries of LLM Performance in Machine Translation},
author={Haoran Xu and Amr Sharaf and Yunmo Chen and Weiting Tan and Lingfeng Shen and Benjamin Van Durme and Kenton Murray and Young Jin Kim},
year={2024},
eprint={2401.08417},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | haoranxu/ALMA-Human-Parallel | [
"arxiv:2309.11674",
"arxiv:2401.08417",
"region:us"
] | 2024-01-17T09:12:01+00:00 | {"dataset_info": [{"config_name": "cs-en", "features": [{"name": "translation", "struct": [{"name": "cs", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 3432181, "num_examples": 12076}, {"name": "validation", "num_bytes": 318813, "num_examples": 1002}], "download_size": 0, "dataset_size": 3750994}, {"config_name": "de-en", "features": [{"name": "translation", "struct": [{"name": "de", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 4108729, "num_examples": 14211}, {"name": "validation", "num_bytes": 329855, "num_examples": 1002}], "download_size": 0, "dataset_size": 4438584}, {"config_name": "is-en", "features": [{"name": "translation", "struct": [{"name": "is", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 554190, "num_examples": 2009}], "download_size": 0, "dataset_size": 554190}, {"config_name": "ru-en", "features": [{"name": "translation", "struct": [{"name": "ru", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 5427552, "num_examples": 15000}, {"name": "validation", "num_bytes": 442271, "num_examples": 1002}], "download_size": 0, "dataset_size": 5869823}, {"config_name": "zh-en", "features": [{"name": "translation", "struct": [{"name": "zh", "dtype": "string"}, {"name": "en", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 4700299, "num_examples": 15406}, {"name": "validation", "num_bytes": 285969, "num_examples": 1002}], "download_size": 0, "dataset_size": 4986268}], "configs": [{"config_name": "cs-en", "data_files": [{"split": "train", "path": "cs-en/train-*"}, {"split": "validation", "path": "cs-en/validation-*"}]}, {"config_name": "de-en", "data_files": [{"split": "train", "path": "de-en/train-*"}, {"split": "validation", "path": "de-en/validation-*"}]}, {"config_name": "is-en", "data_files": [{"split": "train", "path": "is-en/train-*"}]}, {"config_name": "ru-en", "data_files": [{"split": "train", "path": "ru-en/train-*"}, {"split": "validation", "path": "ru-en/validation-*"}]}, {"config_name": "zh-en", "data_files": [{"split": "train", "path": "zh-en/train-*"}, {"split": "validation", "path": "zh-en/validation-*"}]}]} | 2024-01-24T07:35:56+00:00 | [
"2309.11674",
"2401.08417"
] | [] | TAGS
#arxiv-2309.11674 #arxiv-2401.08417 #region-us
| # Dataset Card for "ALMA-Human-Parallel"
This is human-written parallel dataset used by ALMA translation models.
| [
"# Dataset Card for \"ALMA-Human-Parallel\"\n\nThis is human-written parallel dataset used by ALMA translation models."
] | [
"TAGS\n#arxiv-2309.11674 #arxiv-2401.08417 #region-us \n",
"# Dataset Card for \"ALMA-Human-Parallel\"\n\nThis is human-written parallel dataset used by ALMA translation models."
] |
0959e59dc24ac24d2a679f2366ee0b49a0b0dd91 |
# Dataset of Ophelia Phamrsolone (Fate/Grand Order)
This is the dataset of Ophelia Phamrsolone (Fate/Grand Order), containing 180 images and their tags.
The core tags of this character are `long_hair, brown_hair, eyepatch, blue_eyes, hair_over_one_eye, ribbon, black_ribbon, bangs, neck_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 180 | 190.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_phamrsolone_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 180 | 120.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_phamrsolone_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 402 | 247.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_phamrsolone_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 180 | 176.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_phamrsolone_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 402 | 335.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_phamrsolone_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ophelia_phamrsolone_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, upper_body, long_sleeves, looking_at_viewer, collared_shirt, white_shirt, black_jacket, closed_mouth, simple_background, white_background |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_pantyhose, closed_mouth, long_sleeves, looking_at_viewer, simple_background, solo, white_background, blue_skirt, collared_shirt, white_shirt, black_jacket, breasts, cowboy_shot |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, looking_at_viewer, solo, blush, navel, collarbone, medium_breasts, cleavage, black_bikini, open_clothes, simple_background, jacket, long_sleeves, open_mouth |
| 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1boy, 1girl, black_jacket, blush, hetero, long_sleeves, solo_focus, clothed_sex, condom_wrapper, looking_at_viewer, medium_breasts, open_mouth, pussy, thighs, vaginal, mosaic_censoring, open_jacket, open_shirt, pillow, white_shirt, cleavage, collared_shirt, condom_on_penis, missionary, navel, nipples, on_back, on_side, panties_aside, panty_pull, pantyhose_pull, pink_bra, sex_from_behind, used_condom |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | upper_body | long_sleeves | looking_at_viewer | collared_shirt | white_shirt | black_jacket | closed_mouth | simple_background | white_background | black_pantyhose | blue_skirt | breasts | cowboy_shot | blush | navel | collarbone | medium_breasts | cleavage | black_bikini | open_clothes | jacket | open_mouth | 1boy | hetero | solo_focus | clothed_sex | condom_wrapper | pussy | thighs | vaginal | mosaic_censoring | open_jacket | open_shirt | pillow | condom_on_penis | missionary | nipples | on_back | on_side | panties_aside | panty_pull | pantyhose_pull | pink_bra | sex_from_behind | used_condom |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:---------------|:--------------------|:-----------------|:--------------|:---------------|:---------------|:--------------------|:-------------------|:------------------|:-------------|:----------|:--------------|:--------|:--------|:-------------|:-----------------|:-----------|:---------------|:---------------|:---------|:-------------|:-------|:---------|:-------------|:--------------|:-----------------|:--------|:---------|:----------|:-------------------|:--------------|:-------------|:---------|:------------------|:-------------|:----------|:----------|:----------|:----------------|:-------------|:-----------------|:-----------|:------------------|:--------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | X | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | X | X | X | X | | | | | | | | X | X | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ophelia_phamrsolone_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:12:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:51:01+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Ophelia Phamrsolone (Fate/Grand Order)
=================================================
This is the dataset of Ophelia Phamrsolone (Fate/Grand Order), containing 180 images and their tags.
The core tags of this character are 'long\_hair, brown\_hair, eyepatch, blue\_eyes, hair\_over\_one\_eye, ribbon, black\_ribbon, bangs, neck\_ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
5269e17a521e5059d8cb465a516817296d074888 |
# e-Callisto Solar Flare Detection Dataset
![](https://www.fhnw.ch/en/++theme++web16theme/assets/media/img/university-applied-sciences-arts-northwestern-switzerland-fhnw-logo.svg)
[Institute of Data Science i4Ds, FHNW](https://i4ds.ch)<br>
Compiled by [Gabriel Torres Gamez | StellarMilk](https://huggingface.co/StellarMilk)
## Overview
This dataset comprises radio spectra from the [e-Callisto solar spectrometer network](https://www.e-callisto.org/index.html), annotated based on [labels from the e-Callisto database](http://soleil.i4ds.ch/solarradio/data/BurstLists/2010-yyyy_Monstein/).
The data was downloaded using the [ecallisto_ng package](https://github.com/i4Ds/ecallisto_ng). The dataset is designed for training machine learning models to automatically detect and classify solar flares.
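A hedged loading sketch with the `datasets` library; the split and column names are not documented in this card, so the code below only inspects what is actually there rather than assuming a schema:

```python
from datasets import load_dataset

# load the e-Callisto burst dataset from the Hub
bursts = load_dataset('i4ds/ecallisto-bursts')

# no fixed schema is promised here, so inspect splits and features first
print(bursts)
for split_name, split in bursts.items():
    print(split_name, split.features)
```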
## Data Collection
Data has been collected from various stations, with the following date ranges:
| Station | Date Range |
|-------------------|--------------------------|
| Australia-ASSA_01 | 2021-02-13 to 2021-12-11 |
| Australia-ASSA_02 | 2021-02-13 to 2021-12-09 |
| Australia-ASSA_62 | 2021-12-10 to 2023-12-12 |
| Australia-ASSA_63 | 2021-12-10 to 2023-12-12 |
## Data Augmentation
Due to the rarity of solar flares, we've augmented the dataset by padding the time series data around each flare event.
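The exact augmentation code is not published with this card; purely as an illustration of padding around a labelled event, a sketch might look like the following (the window sizes, array layout, and function name are assumptions, not the actual pipeline):

```python
import numpy as np

def pad_around_burst(spectrogram, burst_start, burst_end, max_pad=60, n_views=5, seed=0):
    """Illustrative only: cut several randomly padded windows around one labelled burst.

    spectrogram      : 2D array (frequency x time), assumed layout
    burst_start/_end : time indices of the labelled flare
    max_pad          : maximum number of extra time steps kept on each side
    """
    rng = np.random.default_rng(seed)
    n_time = spectrogram.shape[1]
    views = []
    for _ in range(n_views):
        left = max(0, burst_start - int(rng.integers(0, max_pad + 1)))
        right = min(n_time, burst_end + int(rng.integers(0, max_pad + 1)))
        views.append(spectrogram[:, left:right])
    return views
```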
## Caution
The dataset underwent preprocessing, and certain assumptions were made during label cleanup; be aware of potential inaccuracies in the labels.
## Split Recommendations
The dataset doesn't include predefined train-validation-test splits. When creating splits, ensure augmented data does not overlap between training and validation/test sets to avoid data leakage. | i4ds/ecallisto-bursts | [
"task_categories:image-classification",
"size_categories:100K<n<1M",
"astrophysics",
"flares",
"solar flares",
"sun",
"region:us"
] | 2024-01-17T09:16:56+00:00 | {"size_categories": ["100K<n<1M"], "task_categories": ["image-classification"], "pretty_name": "e-Callisto Solar Flare Detection", "tags": ["astrophysics", "flares", "solar flares", "sun"]} | 2024-01-17T12:10:16+00:00 | [] | [] | TAGS
#task_categories-image-classification #size_categories-100K<n<1M #astrophysics #flares #solar flares #sun #region-us
| e-Callisto Solar Flare Detection Dataset
========================================
![](URL
Institute of Data Science i4Ds, FHNW
Compiled by Gabriel Torres Gamez | StellarMilk
Overview
--------
This dataset comprises radio spectra from the e-Callisto solar spectrometer network, annotated based on labels from the e-Callisto database.
The data was downloaded using the ecallisto\_ng Package. It's designed for training machine learning models to automatically detect and classify solar flares.
Data Collection
---------------
Data has been collected from various stations, with the following date ranges:
Data Augmentation
-----------------
Due to the rarity of solar flares, we've augmented the dataset by padding the time series data around each flare event.
Caution
-------
The dataset underwent preprocessing and certain assumptions were made for label cleanup. Be aware of potential inaccuracies in the labels.
Split Recommendations
---------------------
The dataset doesn't include predefined train-validation-test splits. When creating splits, ensure augmented data does not overlap between training and validation/test sets to avoid data leakage.
| [] | [
"TAGS\n#task_categories-image-classification #size_categories-100K<n<1M #astrophysics #flares #solar flares #sun #region-us \n"
] |
967f396771506c4a3228c5985f15c8e0083c25e1 | ## Vi-Ner
### Dataset Description
ner_tags: a list of classification labels (int). Full tagset with indices:
```python
{'B-DATETIME': 0,
'B-LOCATION': 1,
'B-ORGANIZATION': 2,
'B-PERSON': 3,
'I-DATETIME': 4,
'I-LOCATION': 5,
'I-ORGANIZATION': 6,
'I-PERSON': 7,
'O': 8}
```
### Data Splits
| name |train|validation|test|
|---------|----:|---------:|---:|
| Vi-Ner | 19255 | 2407 | 2407 |
### Example
An example of 'train' looks as follows.
```
{'tokens': ['NSƯT', 'Hồng', 'Liên', '(trái)', 'đến', 'chúc', 'mừng', 'Thu', 'Trang..'],
 'ner_tags': ['B-PERSON', 'I-PERSON', 'I-PERSON', 'O', 'O', 'O', 'O', 'B-PERSON', 'I-PERSON'],
 'ner_idx': [3, 7, 7, 8, 8, 8, 8, 3, 7]}
```
### Usage
```python
import datasets

# load the train/validation/test splits of Vi-Ner from the Hub
vi_ner = datasets.load_dataset('Minggz/Vi-Ner')
vi_ner
``` | Minggz/Vi-Ner | [
"task_categories:token-classification",
"size_categories:10K<n<100K",
"language:vi",
"legal",
"region:us"
] | 2024-01-17T09:19:04+00:00 | {"language": ["vi"], "size_categories": ["10K<n<100K"], "task_categories": ["token-classification"], "tags": ["legal"]} | 2024-01-18T08:27:35+00:00 | [] | [
"vi"
] | TAGS
#task_categories-token-classification #size_categories-10K<n<100K #language-Vietnamese #legal #region-us
| Vi-Ner
------
### Dataset Description
ner\_tags: a list of classification labels (int). Full tagset with indices:
### Data Splits
### Example
An example of 'train' looks as follows.
### Usage
| [
"### Dataset Description\n\n\nner\\_tags: a list of classification labels (int). Full tagset with indices:",
"### Data Splits",
"### Example\n\n\nAn example of 'train' looks as follows.",
"### Usage"
] | [
"TAGS\n#task_categories-token-classification #size_categories-10K<n<100K #language-Vietnamese #legal #region-us \n",
"### Dataset Description\n\n\nner\\_tags: a list of classification labels (int). Full tagset with indices:",
"### Data Splits",
"### Example\n\n\nAn example of 'train' looks as follows.",
"### Usage"
] |
d1498e2664a265b264b95c99fd573cc37d1654be |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | Yura32000/eurosat | [
"region:us"
] | 2024-01-17T09:25:38+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "AnnualCrop", "1": "Forest", "2": "HerbaceousVegetation", "3": "Highway", "4": "Industrial", "5": "Pasture", "6": "PermanentCrop", "7": "Residential", "8": "River", "9": "SeaLake"}}}}, {"name": "choices", "dtype": "int64"}, {"name": "prices", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 73997723.2, "num_examples": 21600}, {"name": "test", "num_bytes": 9241099.7, "num_examples": 2700}, {"name": "valid", "num_bytes": 9232043.9, "num_examples": 2700}], "download_size": 91992228, "dataset_size": 92470866.80000001}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}]} | 2024-01-17T13:45:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7cb296a12c32884bc0b11e497bdee7a0db06dbe1 |
# Dataset of nikola_tesla (Houkai 3rd)
This is the dataset of nikola_tesla (Houkai 3rd), containing 50 images and their tags.
The core tags of this character are `red_hair, twintails, long_hair, red_eyes, bangs, glasses, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 50 | 56.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nikola_tesla_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 50 | 30.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nikola_tesla_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 111 | 67.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nikola_tesla_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 50 | 48.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nikola_tesla_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 111 | 94.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nikola_tesla_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nikola_tesla_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------|
| 0 | 50 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, white_shirt, open_mouth, pantyhose, black_skirt, holding, red_necktie, short_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_shirt | open_mouth | pantyhose | black_skirt | holding | red_necktie | short_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------|:-------------|:------------|:--------------|:----------|:--------------|:----------------|
| 0 | 50 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/nikola_tesla_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:27:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:38:52+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of nikola\_tesla (Houkai 3rd)
=====================================
This is the dataset of nikola\_tesla (Houkai 3rd), containing 50 images and their tags.
The core tags of this character are 'red\_hair, twintails, long\_hair, red\_eyes, bangs, glasses, ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
e71f84ac8abfaadbca0fe4fba6e9664f82a3b480 |
# Dataset of ai_chan (Houkai 3rd)
This is the dataset of ai_chan (Houkai 3rd), containing 106 images and their tags.
The core tags of this character are `green_hair, bangs, hair_bun, double_bun, orange_eyes, long_hair, hair_ornament, breasts, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 106 | 158.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ai_chan_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 106 | 81.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ai_chan_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 251 | 177.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ai_chan_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 106 | 135.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ai_chan_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 251 | 258.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ai_chan_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ai_chan_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, barcode_tattoo, bare_shoulders, black_dress, black_gloves, cleavage, fingerless_gloves, solo, smile, looking_at_viewer, open_mouth, headband, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | barcode_tattoo | bare_shoulders | black_dress | black_gloves | cleavage | fingerless_gloves | solo | smile | looking_at_viewer | open_mouth | headband | white_thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------------|:--------------|:---------------|:-----------|:--------------------|:-------|:--------|:--------------------|:-------------|:-----------|:-------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ai_chan_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:27:31+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:54:10+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ai\_chan (Houkai 3rd)
================================
This is the dataset of ai\_chan (Houkai 3rd), containing 106 images and their tags.
The core tags of this character are 'green\_hair, bangs, hair\_bun, double\_bun, orange\_eyes, long\_hair, hair\_ornament, breasts, twintails', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
0dd6194cf23bd3ec4b9503ef58963e251ce3b66f |
# Dataset of yae_rin (Houkai 3rd)
This is the dataset of yae_rin (Houkai 3rd), containing 18 images and their tags.
The core tags of this character are `long_hair, pink_hair, bangs, blue_eyes, hair_between_eyes, two_side_up, animal_ears, bow, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 18 | 23.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yae_rin_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 18 | 13.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yae_rin_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 26.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yae_rin_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 18 | 20.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yae_rin_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 37.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yae_rin_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yae_rin_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | long_sleeves, 1girl, open_mouth, solo, looking_at_viewer, dress, blush, holding, :d, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | long_sleeves | 1girl | open_mouth | solo | looking_at_viewer | dress | blush | holding | :d | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:--------|:-------------|:-------|:--------------------|:--------|:--------|:----------|:-----|:-------------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/yae_rin_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:27:38+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:31:44+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of yae\_rin (Houkai 3rd)
================================
This is the dataset of yae\_rin (Houkai 3rd), containing 18 images and their tags.
The core tags of this character are 'long\_hair, pink\_hair, bangs, blue\_eyes, hair\_between\_eyes, two\_side\_up, animal\_ears, bow, ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
54749e39246f31471c3cbc479394cc857b8d5620 |
# Dataset of taira_no_kagekiyo/平景清/平景清 (Fate/Grand Order)
This is the dataset of taira_no_kagekiyo/平景清/平景清 (Fate/Grand Order), containing 119 images and their tags.
The core tags of this character are `long_hair, black_hair, side_ponytail, breasts, bangs, hat, very_long_hair, parted_bangs, multicolored_eyes, purple_lips`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 119 | 181.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/taira_no_kagekiyo_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 119 | 96.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/taira_no_kagekiyo_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 270 | 197.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/taira_no_kagekiyo_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 119 | 157.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/taira_no_kagekiyo_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 270 | 293.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/taira_no_kagekiyo_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/taira_no_kagekiyo_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 37 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, katana, solo, holding_sword, tate_eboshi, gloves, japanese_armor, makeup, looking_at_viewer, smile, shoulder_armor, black_headwear, dual_wielding, purple_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | katana | solo | holding_sword | tate_eboshi | gloves | japanese_armor | makeup | looking_at_viewer | smile | shoulder_armor | black_headwear | dual_wielding | purple_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:----------------|:--------------|:---------|:-----------------|:---------|:--------------------|:--------|:-----------------|:-----------------|:----------------|:--------------|
| 0 | 37 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/taira_no_kagekiyo_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:32:40+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T10:09:19+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of taira\_no\_kagekiyo/平景清/平景清 (Fate/Grand Order)
=========================================================
This is the dataset of taira\_no\_kagekiyo/平景清/平景清 (Fate/Grand Order), containing 119 images and their tags.
The core tags of this character are 'long\_hair, black\_hair, side\_ponytail, breasts, bangs, hat, very\_long\_hair, parted\_bangs, multicolored\_eyes, purple\_lips', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
79a184b82f999d3c9db9a4266fa9d5fe873206d5 |
# Dataset of okita_souji_alter/沖田総司〔オルタ〕/冲田总司〔Alter〕 (Fate/Grand Order)
This is the dataset of okita_souji_alter/沖田総司〔オルタ〕/冲田总司〔Alter〕 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `dark_skin, dark-skinned_female, white_hair, ahoge, bow, breasts, hair_bow, hair_between_eyes, black_bow, hair_ornament, tassel, large_breasts, long_hair, bangs, very_long_hair, yellow_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 818.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okita_souji_alter_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 440.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okita_souji_alter_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1271 | 935.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okita_souji_alter_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 714.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okita_souji_alter_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1271 | 1.32 GiB | [Download](https://huggingface.co/datasets/CyberHarem/okita_souji_alter_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/okita_souji_alter_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, red_scarf, solo, arm_guards, cleavage_cutout, short_hair, bare_shoulders, upper_body, blush, closed_mouth, simple_background, white_background |
| 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, katana, solo, cleavage_cutout, black_coat, holding_sword, looking_at_viewer, thigh_strap, black_thighhighs, dress, arm_guards, high_collar, white_eyes |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, cleavage_cutout, floating_hair, looking_at_viewer, solo, standing, black_coat, black_thighhighs, closed_mouth, holding_sword, short_dress, thigh_strap, underboob, arm_guards, katana |
| 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, cleavage_cutout, full_moon, holding_sword, katana, night_sky, solo, black_thighhighs, cherry_blossoms, looking_at_viewer, petals, thigh_strap, outdoors, parted_lips, red_dress, high_collar, underboob |
| 4 | 13 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, high_collar, solo, cleavage_cutout, looking_at_viewer, upper_body, simple_background, closed_mouth, white_background, underboob, black_coat, white_eyes |
| 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, black_bikini, blue_sky, blush, cloud, day, looking_at_viewer, navel, ocean, outdoors, bare_shoulders, cleavage, short_hair, beach, closed_mouth, collarbone, cowboy_shot, food, red_scarf, side-tie_bikini_bottom, solo_focus, thigh_strap |
| 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | cropped_jacket, white_bikini, 1girl, black_jacket, orange_scarf, shrug_(clothing), long_sleeves, looking_at_viewer, navel, thighs, solo, medium_breasts, blush, grey_eyes, smile, katana, open_mouth |
| 7 | 12 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1boy, 1girl, looking_at_viewer, solo_focus, penis, blush, nipples, censored, breasts_squeezed_together, cum_on_body, open_mouth, paizuri_under_clothes, sweat |
| 8 | 9 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, censored, hetero, navel, nipples, solo_focus, blush, nude, penis, sex, 1boy, collarbone, vaginal, looking_at_viewer, sweat, cum_in_pussy, girl_on_top, open_mouth, cowgirl_position, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | red_scarf | solo | arm_guards | cleavage_cutout | short_hair | bare_shoulders | upper_body | blush | closed_mouth | simple_background | white_background | katana | black_coat | holding_sword | thigh_strap | black_thighhighs | dress | high_collar | white_eyes | floating_hair | standing | short_dress | underboob | full_moon | night_sky | cherry_blossoms | petals | outdoors | parted_lips | red_dress | black_bikini | blue_sky | cloud | day | navel | ocean | cleavage | beach | collarbone | cowboy_shot | food | side-tie_bikini_bottom | solo_focus | cropped_jacket | white_bikini | black_jacket | orange_scarf | shrug_(clothing) | long_sleeves | thighs | medium_breasts | grey_eyes | smile | open_mouth | 1boy | penis | nipples | censored | breasts_squeezed_together | cum_on_body | paizuri_under_clothes | sweat | hetero | nude | sex | vaginal | cum_in_pussy | girl_on_top | cowgirl_position | spread_legs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:------------|:-------|:-------------|:------------------|:-------------|:-----------------|:-------------|:--------|:---------------|:--------------------|:-------------------|:---------|:-------------|:----------------|:--------------|:-------------------|:--------|:--------------|:-------------|:----------------|:-----------|:--------------|:------------|:------------|:------------|:------------------|:---------|:-----------|:--------------|:------------|:---------------|:-----------|:--------|:------|:--------|:--------|:-----------|:--------|:-------------|:--------------|:-------|:-------------------------|:-------------|:-----------------|:---------------|:---------------|:---------------|:-------------------|:---------------|:---------|:-----------------|:------------|:--------|:-------------|:-------|:--------|:----------|:-----------|:----------------------------|:--------------|:------------------------|:--------|:---------|:-------|:------|:----------|:---------------|:--------------|:-------------------|:--------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | X | X | | | | | X | | | X | X | X | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | | X | | | | | | | | X | | X | X | X | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | | X | | | X | | X | X | X | | X | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | | | | X | X | | X | X | | | | | | X | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | X | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 7 | 12 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 8 | 9 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | X | | | | | | | | | | | X | X | X | X | X | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/okita_souji_alter_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:33:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T11:43:05+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of okita\_souji\_alter/沖田総司〔オルタ〕/冲田总司〔Alter〕 (Fate/Grand Order)
=======================================================================
This is the dataset of okita\_souji\_alter/沖田総司〔オルタ〕/冲田总司〔Alter〕 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are 'dark\_skin, dark-skinned\_female, white\_hair, ahoge, bow, breasts, hair\_bow, hair\_between\_eyes, black\_bow, hair\_ornament, tassel, large\_breasts, long\_hair, bangs, very\_long\_hair, yellow\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
d56fd48fcd5b6dfb4568eb9fd1b1035b6f15ec57 |
# Dataset of Dioscuri Pollux (Fate/Grand Order)
This is the dataset of Dioscuri Pollux (Fate/Grand Order), containing 131 images and their tags.
The core tags of this character are `blonde_hair, bangs, breasts, medium_hair, blue_eyes, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 131 | 152.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 131 | 98.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 311 | 202.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 131 | 136.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 311 | 266.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dioscuri_pollux_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, armlet, bare_shoulders, diadem, looking_at_viewer, metal_collar, pauldrons, solo, white_robe, halterneck, thighs, black_shirt, bracer, sword, closed_mouth, faulds, simple_background |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, armlet, bare_shoulders, blush, bracer, covered_navel, diadem, halterneck, looking_at_viewer, medium_breasts, simple_background, solo, thighs, white_background, metal_collar, purple_eyes, smile, closed_mouth, white_robe, faulds |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, armlet, black_shirt, diadem, looking_at_viewer, metal_collar, short_hair, twins, white_robe, 1boy, bare_shoulders, brother_and_sister, simple_background, white_background, pauldrons, halterneck, smile |
| 3 | 17 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1boy, 1girl, blush, hetero, nipples, thighs, diadem, large_breasts, open_mouth, collarbone, penis, armlet, bar_censor, pussy, vaginal, bare_shoulders, girl_on_top, nude, purple_eyes, sex_from_behind, smile, speech_bubble, straddling |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armlet | bare_shoulders | diadem | looking_at_viewer | metal_collar | pauldrons | solo | white_robe | halterneck | thighs | black_shirt | bracer | sword | closed_mouth | faulds | simple_background | blush | covered_navel | medium_breasts | white_background | purple_eyes | smile | short_hair | twins | 1boy | brother_and_sister | hetero | nipples | large_breasts | open_mouth | collarbone | penis | bar_censor | pussy | vaginal | girl_on_top | nude | sex_from_behind | speech_bubble | straddling |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-----------------|:---------|:--------------------|:---------------|:------------|:-------|:-------------|:-------------|:---------|:--------------|:---------|:--------|:---------------|:---------|:--------------------|:--------|:----------------|:-----------------|:-------------------|:--------------|:--------|:-------------|:--------|:-------|:---------------------|:---------|:----------|:----------------|:-------------|:-------------|:--------|:-------------|:--------|:----------|:--------------|:-------|:------------------|:----------------|:-------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | | X | X | | X | | | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 17 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | | | | | | | X | | | | | | | X | | | | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/dioscuri_pollux_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:37:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T10:04:08+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of Dioscuri Pollux (Fate/Grand Order)
=============================================
This is the dataset of Dioscuri Pollux (Fate/Grand Order), containing 131 images and their tags.
The core tags of this character are 'blonde\_hair, bangs, breasts, medium\_hair, blue\_eyes, small\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
e23192be558ba638e23fef80b9fead9ab57f87b0 |
# Dataset of blood_embrace (Houkai 3rd)
This is the dataset of blood_embrace (Houkai 3rd), containing 14 images and their tags.
The core tags of this character are `hair_between_eyes, long_hair, bangs, blue_eyes, ahoge, horns, very_long_hair, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 8.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blood_embrace_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 6.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blood_embrace_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 9.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blood_embrace_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 7.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blood_embrace_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 11.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blood_embrace_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/blood_embrace_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
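If you only need one of the pre-processed packages from the List of Packages table above rather than the raw data, the same `hf_hub_download` call works; below is a minimal sketch for the 800 package (the repo id and filename come from the table, while the `package_dir` name is just a placeholder):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download one of the pre-processed packages (shorter side not exceeding 800 pixels)
zip_file = hf_hub_download(
    repo_id='CyberHarem/blood_embrace_honkai3',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the IMG+TXT pairs to a local directory
package_dir = 'dataset_800'
os.makedirs(package_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(package_dir)
```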
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, bare_shoulders, looking_at_viewer, white_background, black_gloves, open_mouth, simple_background, black_dress, smile, striped_thighhighs, full_body, black_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | looking_at_viewer | white_background | black_gloves | open_mouth | simple_background | black_dress | smile | striped_thighhighs | full_body | black_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------------------|:-------------------|:---------------|:-------------|:--------------------|:--------------|:--------|:---------------------|:------------|:-----------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/blood_embrace_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:38:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:41:34+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of blood\_embrace (Houkai 3rd)
======================================
This is the dataset of blood\_embrace (Houkai 3rd), containing 14 images and their tags.
The core tags of this character are 'hair\_between\_eyes, long\_hair, bangs, blue\_eyes, ahoge, horns, very\_long\_hair, white\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1144f7e0cc7b54bb376332ce835a50ba8c7c2f58 |
# Dataset of dr_mei (Houkai 3rd)
This is the dataset of dr_mei (Houkai 3rd), containing 27 images and their tags.
The core tags of this character are `long_hair, bangs, purple_hair, purple_eyes, glasses, ponytail, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 27 | 30.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dr_mei_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 27 | 21.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dr_mei_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 55 | 36.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dr_mei_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 27 | 28.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dr_mei_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 55 | 46.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dr_mei_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dr_mei_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, jacket, long_sleeves, shirt, solo, closed_mouth, looking_at_viewer, smile, white_coat, labcoat, necktie |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | closed_mouth, holding, short_sleeves, 1girl, hair_bow, looking_at_viewer, sailor_collar, sitting, solo, black_footwear, black_shirt, black_skirt, book, serafuku, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jacket | long_sleeves | shirt | solo | closed_mouth | looking_at_viewer | smile | white_coat | labcoat | necktie | holding | short_sleeves | hair_bow | sailor_collar | sitting | black_footwear | black_shirt | black_skirt | book | serafuku | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------------|:--------|:-------|:---------------|:--------------------|:--------|:-------------|:----------|:----------|:----------|:----------------|:-----------|:----------------|:----------|:-----------------|:--------------|:--------------|:-------|:-----------|:-------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X |
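To mine such outfit clusters programmatically, one simple approach is to count tag frequencies over the loaded items; this is a rough sketch that only relies on the `LocalSource` interface shown above (`dataset_dir` is the extraction directory from the loading snippet, and the number of tags printed is arbitrary):
```python
from collections import Counter
from waifuc.source import LocalSource

# assumes the raw package was already extracted to `dataset_dir` as shown above
source = LocalSource('dataset_dir')

tag_counts = Counter()
for item in source:
    # iterating the tags works whether they are stored as a list or a dict of tag names
    for tag in item.meta['tags']:
        tag_counts[tag] += 1

# frequently co-occurring tags are good candidates for outfit-specific prompts
print(tag_counts.most_common(20))
```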
| CyberHarem/dr_mei_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:38:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T09:44:29+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of dr\_mei (Houkai 3rd)
===============================
This is the dataset of dr\_mei (Houkai 3rd), containing 27 images and their tags.
The core tags of this character are 'long\_hair, bangs, purple\_hair, purple\_eyes, glasses, ponytail, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
822267d392571e5afb6bd229db52b044f91e8a6d |
# Dataset of miracle_magic_girl (Houkai 3rd)
This is the dataset of miracle_magic_girl (Houkai 3rd), containing 52 images and their tags.
The core tags of this character are `long_hair, purple_hair, hair_between_eyes, bangs, yellow_eyes, hair_ornament, very_long_hair, breasts, symbol-shaped_pupils, diamond-shaped_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 52 | 125.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miracle_magic_girl_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 52 | 54.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miracle_magic_girl_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 138 | 117.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miracle_magic_girl_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 52 | 101.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miracle_magic_girl_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 138 | 189.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miracle_magic_girl_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/miracle_magic_girl_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
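Beyond printing metadata, the loaded items can be post-processed directly; the sketch below resizes every image to fit within 512×512 and saves it, assuming `item.image` behaves like a PIL image (as the print call above suggests) and using an arbitrary output directory name:
```python
import os
from waifuc.source import LocalSource

# assumes the raw package was already extracted to `dataset_dir` as shown above
source = LocalSource('dataset_dir')

out_dir = 'resized_512'
os.makedirs(out_dir, exist_ok=True)

for index, item in enumerate(source):
    # thumbnail() keeps the aspect ratio while fitting the image into 512x512
    image = item.image.convert('RGB')
    image.thumbnail((512, 512))
    image.save(os.path.join(out_dir, f'{index:05d}.png'))
```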
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, solo, cleavage, closed_mouth, dress, looking_at_viewer, purple_gloves, blush, hairband, simple_background, diamond_(shape), navel, upper_body, white_background |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, :d, bare_shoulders, looking_at_viewer, open_mouth, purple_gloves, solo, cleavage, fang, fingerless_gloves, hairband, purple_dress, black_gloves, diamond_(shape), sitting, white_dress |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, long_sleeves, white_shirt, black_skirt, looking_at_viewer, solo, closed_mouth, high-waist_skirt, purple_bowtie, thigh_strap |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | cleavage | closed_mouth | dress | looking_at_viewer | purple_gloves | blush | hairband | simple_background | diamond_(shape) | navel | upper_body | white_background | :d | open_mouth | fang | fingerless_gloves | purple_dress | black_gloves | sitting | white_dress | long_sleeves | white_shirt | black_skirt | high-waist_skirt | purple_bowtie | thigh_strap |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:-----------|:---------------|:--------|:--------------------|:----------------|:--------|:-----------|:--------------------|:------------------|:--------|:-------------|:-------------------|:-----|:-------------|:-------|:--------------------|:---------------|:---------------|:----------|:--------------|:---------------|:--------------|:--------------|:-------------------|:----------------|:--------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | | X | X | | X | | X | | | | X | X | X | X | X | X | X | X | | | | | | |
| 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
| CyberHarem/miracle_magic_girl_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T09:48:46+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T10:02:52+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of miracle\_magic\_girl (Houkai 3rd)
============================================
This is the dataset of miracle\_magic\_girl (Houkai 3rd), containing 52 images and their tags.
The core tags of this character are 'long\_hair, purple\_hair, hair\_between\_eyes, bangs, yellow\_eyes, hair\_ornament, very\_long\_hair, breasts, symbol-shaped\_pupils, diamond-shaped\_pupils', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
d5f5304cc29bdb0c12cbd171a64d22139f4b6a94 |
# Dataset Card for Evaluation run of abacusai/MetaMath-bagel-34b-v0.2-c1500
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/MetaMath-bagel-34b-v0.2-c1500](https://huggingface.co/abacusai/MetaMath-bagel-34b-v0.2-c1500) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-17T09:50:20.465897](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500/blob/main/results_2024-01-17T09-50-20.465897.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7413320969592924,
"acc_stderr": 0.029043054551903404,
"acc_norm": 0.7446051241876451,
"acc_norm_stderr": 0.029606969755429664,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5370395824057138,
"mc2_stderr": 0.015318939057636297
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670731,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175458
},
"harness|hellaswag|10": {
"acc": 0.6275642302330213,
"acc_stderr": 0.004824655406075562,
"acc_norm": 0.8243377813184625,
"acc_norm_stderr": 0.003797548252851623
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.029674167520101456,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.029674167520101456
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.024618298195866514,
"acc_norm": 0.8,
"acc_norm_stderr": 0.024618298195866514
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7361702127659574,
"acc_stderr": 0.028809989854102956,
"acc_norm": 0.7361702127659574,
"acc_norm_stderr": 0.028809989854102956
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.037245636197746304,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.037245636197746304
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432302,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432302
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424218,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424218
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.015216761819262585,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.015216761819262585
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8025641025641026,
"acc_stderr": 0.020182646968674826,
"acc_norm": 0.8025641025641026,
"acc_norm_stderr": 0.020182646968674826
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.02967090612463088,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.02967090612463088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02300545944667395,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02300545944667395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.012809780081878929,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.012809780081878929
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.02034340073486885,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.02034340073486885
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744631,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744631
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.033432700628696216,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.033432700628696216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8895705521472392,
"acc_stderr": 0.024624937788941318,
"acc_norm": 0.8895705521472392,
"acc_norm_stderr": 0.024624937788941318
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253864,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253864
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8978288633461047,
"acc_stderr": 0.010830724713134182,
"acc_norm": 0.8978288633461047,
"acc_norm_stderr": 0.010830724713134182
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.02115267696657528,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.02115267696657528
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7865921787709497,
"acc_stderr": 0.01370285993219609,
"acc_norm": 0.7865921787709497,
"acc_norm_stderr": 0.01370285993219609
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.021339479988816027,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.021339479988816027
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.023222756797435105,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.023222756797435105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.020581466138257114,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.020581466138257114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.028947338851614095,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.028947338851614095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5625814863102999,
"acc_stderr": 0.012669813464935719,
"acc_norm": 0.5625814863102999,
"acc_norm_stderr": 0.012669813464935719
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.02334516361654484,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.02334516361654484
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.016358044297478506,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.016358044297478506
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659407,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659407
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072878,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072878
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5370395824057138,
"mc2_stderr": 0.015318939057636297
},
"harness|winogrande|5": {
"acc": 0.8097868981846882,
"acc_stderr": 0.011030335798617443
},
"harness|gsm8k|5": {
"acc": 0.7081122062168309,
"acc_stderr": 0.012522795894420869
}
}
```
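The same numbers can also be pulled programmatically from the repository; the sketch below fetches the results file linked above with `huggingface_hub` and only inspects its top-level keys, since the exact wrapping of the per-task metrics inside the file is not shown here:
```python
import json
from huggingface_hub import hf_hub_download

# fetch the aggregated results file referenced in the link above
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500",
    repo_type="dataset",
    filename="results_2024-01-17T09-50-20.465897.json",
)

with open(results_path) as f:
    results = json.load(f)

# inspect the structure first; the per-task metrics shown above live under one of these keys
print(list(results.keys()))
```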
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500 | [
"region:us"
] | 2024-01-17T09:49:53+00:00 | {"pretty_name": "Evaluation run of abacusai/MetaMath-bagel-34b-v0.2-c1500", "dataset_summary": "Dataset automatically created during the evaluation run of model [abacusai/MetaMath-bagel-34b-v0.2-c1500](https://huggingface.co/abacusai/MetaMath-bagel-34b-v0.2-c1500) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T09:50:20.465897](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500/blob/main/results_2024-01-17T09-50-20.465897.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7413320969592924,\n \"acc_stderr\": 0.029043054551903404,\n \"acc_norm\": 0.7446051241876451,\n \"acc_norm_stderr\": 0.029606969755429664,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5370395824057138,\n \"mc2_stderr\": 0.015318939057636297\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670731,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175458\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6275642302330213,\n \"acc_stderr\": 0.004824655406075562,\n \"acc_norm\": 0.8243377813184625,\n \"acc_norm_stderr\": 0.003797548252851623\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.029674167520101456,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.029674167520101456\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.024618298195866514,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.024618298195866514\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7361702127659574,\n \"acc_stderr\": 0.028809989854102956,\n \"acc_norm\": 0.7361702127659574,\n \"acc_norm_stderr\": 0.028809989854102956\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.037245636197746304,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.037245636197746304\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432302,\n \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432302\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.015216761819262585,\n \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.015216761819262585\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8025641025641026,\n \"acc_stderr\": 0.020182646968674826,\n \"acc_norm\": 0.8025641025641026,\n \"acc_norm_stderr\": 0.020182646968674826\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.02967090612463088,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.02967090612463088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02300545944667395,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02300545944667395\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9009174311926605,\n \"acc_stderr\": 0.012809780081878929,\n \"acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.012809780081878929\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.02034340073486885,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.02034340073486885\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.033432700628696216,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.033432700628696216\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253864,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253864\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8978288633461047,\n 
\"acc_stderr\": 0.010830724713134182,\n \"acc_norm\": 0.8978288633461047,\n \"acc_norm_stderr\": 0.010830724713134182\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.02115267696657528,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.02115267696657528\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7865921787709497,\n \"acc_stderr\": 0.01370285993219609,\n \"acc_norm\": 0.7865921787709497,\n \"acc_norm_stderr\": 0.01370285993219609\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.021339479988816027,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.021339479988816027\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n \"acc_stderr\": 0.023222756797435105,\n \"acc_norm\": 0.7877813504823151,\n \"acc_norm_stderr\": 0.023222756797435105\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.020581466138257114,\n \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.020581466138257114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614095,\n \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614095\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5625814863102999,\n \"acc_stderr\": 0.012669813464935719,\n \"acc_norm\": 0.5625814863102999,\n \"acc_norm_stderr\": 0.012669813464935719\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654484,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654484\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.016358044297478506,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.016358044297478506\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.025206963154225395,\n \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.025206963154225395\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659407,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659407\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072878,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072878\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5370395824057138,\n \"mc2_stderr\": 0.015318939057636297\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.011030335798617443\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7081122062168309,\n \"acc_stderr\": 0.012522795894420869\n }\n}\n```", "repo_url": 
"https://huggingface.co/abacusai/MetaMath-bagel-34b-v0.2-c1500", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|arc:challenge|25_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|arc:challenge|25_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|gsm8k|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|gsm8k|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hellaswag|10_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hellaswag|10_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-47-33.246115.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T09-47-33.246115.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-50-20.465897.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-50-20.465897.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-50-20.465897.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T09-50-20.465897.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-47-33.246115.parquet"]}, 
{"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["**/details_harness|winogrande|5_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": ["**/details_harness|winogrande|5_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T09-50-20.465897.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_17T09_47_33.246115", "path": ["results_2024-01-17T09-47-33.246115.parquet"]}, {"split": "2024_01_17T09_50_20.465897", "path": 
["results_2024-01-17T09-50-20.465897.parquet"]}, {"split": "latest", "path": ["results_2024-01-17T09-50-20.465897.parquet"]}]}]} | 2024-01-17T09:52:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abacusai/MetaMath-bagel-34b-v0.2-c1500
Dataset automatically created during the evaluation run of model abacusai/MetaMath-bagel-34b-v0.2-c1500 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can, for instance, do the following:
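A minimal sketch (the repository name below is assumed from the leaderboard's `details_<org>__<model>` naming convention; `harness_winogrande_5` is one of the configurations listed in the metadata above):

```python
from datasets import load_dataset

# Load the per-sample details of one task configuration; the "train" split
# always points to the latest evaluation run. The repository name is an
# assumption based on the leaderboard naming convention, not taken from this card.
data = load_dataset(
    "open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500",
    "harness_winogrande_5",
    split="train",
)
```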
## Latest results
These are the latest results from run 2024-01-17T09:50:20.465897 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
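The full per-task numbers are stored in the "results" configuration; a minimal sketch for reading them (the repository name is again an assumption based on the leaderboard naming convention):

```python
from datasets import load_dataset

# Read the aggregated metrics; the "latest" split always points to the most
# recent run. The repository name is an assumption, not taken from this card.
results = load_dataset(
    "open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated metric values for this run
```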
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
39efc8961a1a3d3a6044bb84b97dcaea7850c1b5 |
# Dataset Card for Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-33B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [WhiteRabbitNeo/WhiteRabbitNeo-33B-v1](https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-33B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-33B-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-17T09:51:00.139544](https://huggingface.co/datasets/open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-33B-v1/blob/main/results_2024-01-17T09-51-00.139544.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4084433830187643,
"acc_stderr": 0.03461298543112304,
"acc_norm": 0.40954626519765713,
"acc_norm_stderr": 0.03533514396550226,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834555,
"mc2": 0.416805939433293,
"mc2_stderr": 0.014767283735086846
},
"harness|arc:challenge|25": {
"acc": 0.40187713310580203,
"acc_stderr": 0.014327268614578276,
"acc_norm": 0.44368600682593856,
"acc_norm_stderr": 0.014518421825670452
},
"harness|hellaswag|10": {
"acc": 0.44831706831308504,
"acc_stderr": 0.004963053161193613,
"acc_norm": 0.6021708822943637,
"acc_norm_stderr": 0.004884495069459711
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793254,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793254
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3574468085106383,
"acc_stderr": 0.03132941789476425,
"acc_norm": 0.3574468085106383,
"acc_norm_stderr": 0.03132941789476425
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596437,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.432258064516129,
"acc_stderr": 0.02818173972001941,
"acc_norm": 0.432258064516129,
"acc_norm_stderr": 0.02818173972001941
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.03898531605579419,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.03898531605579419
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.035265527246011986,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.035265527246011986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.43523316062176165,
"acc_stderr": 0.03578038165008586,
"acc_norm": 0.43523316062176165,
"acc_norm_stderr": 0.03578038165008586
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.024433016466052452,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.024433016466052452
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.37815126050420167,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.37815126050420167,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.47339449541284406,
"acc_stderr": 0.021406952688151584,
"acc_norm": 0.47339449541284406,
"acc_norm_stderr": 0.021406952688151584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997867,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997867
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.03384132045674119,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.03384132045674119
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3881856540084388,
"acc_stderr": 0.03172295004332333,
"acc_norm": 0.3881856540084388,
"acc_norm_stderr": 0.03172295004332333
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.03219079200419996,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.03219079200419996
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.47107438016528924,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.47107438016528924,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.04732332615978815,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.04732332615978815
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4171779141104294,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.4171779141104294,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6495726495726496,
"acc_stderr": 0.0312561082442188,
"acc_norm": 0.6495726495726496,
"acc_norm_stderr": 0.0312561082442188
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4508301404853129,
"acc_stderr": 0.01779329757269904,
"acc_norm": 0.4508301404853129,
"acc_norm_stderr": 0.01779329757269904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4479768786127168,
"acc_stderr": 0.02677299065336182,
"acc_norm": 0.4479768786127168,
"acc_norm_stderr": 0.02677299065336182
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.01475690648326066,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.01475690648326066
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.40522875816993464,
"acc_stderr": 0.028110928492809075,
"acc_norm": 0.40522875816993464,
"acc_norm_stderr": 0.028110928492809075
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.36419753086419754,
"acc_stderr": 0.02677492989972233,
"acc_norm": 0.36419753086419754,
"acc_norm_stderr": 0.02677492989972233
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3116036505867014,
"acc_stderr": 0.011829039182849648,
"acc_norm": 0.3116036505867014,
"acc_norm_stderr": 0.011829039182849648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3300653594771242,
"acc_stderr": 0.019023726160724556,
"acc_norm": 0.3300653594771242,
"acc_norm_stderr": 0.019023726160724556
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4775510204081633,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.4775510204081633,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.527363184079602,
"acc_stderr": 0.035302355173346824,
"acc_norm": 0.527363184079602,
"acc_norm_stderr": 0.035302355173346824
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683228,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683228
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3742690058479532,
"acc_stderr": 0.03711601185389482,
"acc_norm": 0.3742690058479532,
"acc_norm_stderr": 0.03711601185389482
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834555,
"mc2": 0.416805939433293,
"mc2_stderr": 0.014767283735086846
},
"harness|winogrande|5": {
"acc": 0.6101026045777427,
"acc_stderr": 0.013707547317008463
},
"harness|gsm8k|5": {
"acc": 0.3373768006065201,
"acc_stderr": 0.013023665136222091
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
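A minimal sketch for enumerating what the repository exposes, using the standard `datasets` helpers (the example config name `harness_gsm8k_5` is one of the per-task configurations defined in this repository):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-33B-v1"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs))

# Each configuration exposes a timestamped split per run and a "latest" split.
print(get_dataset_split_names(repo, "harness_gsm8k_5"))
```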
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-33B-v1 | [
"region:us"
] | 2024-01-17T09:53:16+00:00 | {"pretty_name": "Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-33B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [WhiteRabbitNeo/WhiteRabbitNeo-33B-v1](https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-33B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-33B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T09:51:00.139544](https://huggingface.co/datasets/open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-33B-v1/blob/main/results_2024-01-17T09-51-00.139544.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4084433830187643,\n \"acc_stderr\": 0.03461298543112304,\n \"acc_norm\": 0.40954626519765713,\n \"acc_norm_stderr\": 0.03533514396550226,\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834555,\n \"mc2\": 0.416805939433293,\n \"mc2_stderr\": 0.014767283735086846\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.40187713310580203,\n \"acc_stderr\": 0.014327268614578276,\n \"acc_norm\": 0.44368600682593856,\n \"acc_norm_stderr\": 0.014518421825670452\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44831706831308504,\n \"acc_stderr\": 0.004963053161193613,\n \"acc_norm\": 0.6021708822943637,\n \"acc_norm_stderr\": 0.004884495069459711\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.03036505082911521,\n \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.03036505082911521\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793254,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793254\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3574468085106383,\n \"acc_stderr\": 0.03132941789476425,\n \"acc_norm\": 0.3574468085106383,\n \"acc_norm_stderr\": 0.03132941789476425\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596437,\n \"acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596437\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.432258064516129,\n \"acc_stderr\": 0.02818173972001941,\n \"acc_norm\": 0.432258064516129,\n \"acc_norm_stderr\": 0.02818173972001941\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.03898531605579419,\n \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.03898531605579419\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4292929292929293,\n \"acc_stderr\": 0.035265527246011986,\n \"acc_norm\": 0.4292929292929293,\n \"acc_norm_stderr\": 0.035265527246011986\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.43523316062176165,\n \"acc_stderr\": 0.03578038165008586,\n \"acc_norm\": 0.43523316062176165,\n 
\"acc_norm_stderr\": 0.03578038165008586\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.024433016466052452,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.024433016466052452\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.37815126050420167,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.37815126050420167,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.47339449541284406,\n \"acc_stderr\": 0.021406952688151584,\n \"acc_norm\": 0.47339449541284406,\n \"acc_norm_stderr\": 0.021406952688151584\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997867,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997867\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.03384132045674119,\n \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.03384132045674119\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3881856540084388,\n \"acc_stderr\": 0.03172295004332333,\n \"acc_norm\": 0.3881856540084388,\n \"acc_norm_stderr\": 0.03172295004332333\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n \"acc_stderr\": 0.03219079200419996,\n \"acc_norm\": 0.35874439461883406,\n \"acc_norm_stderr\": 0.03219079200419996\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870254,\n \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870254\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.47107438016528924,\n \"acc_stderr\": 0.04556710331269498,\n \"acc_norm\": 0.47107438016528924,\n \"acc_norm_stderr\": 0.04556710331269498\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.04732332615978815,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.04732332615978815\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4171779141104294,\n \"acc_stderr\": 0.038741028598180814,\n \"acc_norm\": 0.4171779141104294,\n \"acc_norm_stderr\": 0.038741028598180814\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6495726495726496,\n \"acc_stderr\": 0.0312561082442188,\n \"acc_norm\": 0.6495726495726496,\n \"acc_norm_stderr\": 0.0312561082442188\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4508301404853129,\n \"acc_stderr\": 0.01779329757269904,\n \"acc_norm\": 0.4508301404853129,\n \"acc_norm_stderr\": 0.01779329757269904\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4479768786127168,\n \"acc_stderr\": 0.02677299065336182,\n \"acc_norm\": 0.4479768786127168,\n \"acc_norm_stderr\": 0.02677299065336182\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.01475690648326066,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.01475690648326066\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.40522875816993464,\n \"acc_stderr\": 0.028110928492809075,\n \"acc_norm\": 0.40522875816993464,\n \"acc_norm_stderr\": 0.028110928492809075\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.4180064308681672,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.36419753086419754,\n \"acc_stderr\": 0.02677492989972233,\n \"acc_norm\": 0.36419753086419754,\n \"acc_norm_stderr\": 0.02677492989972233\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3116036505867014,\n \"acc_stderr\": 0.011829039182849648,\n \"acc_norm\": 0.3116036505867014,\n \"acc_norm_stderr\": 0.011829039182849648\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3300653594771242,\n \"acc_stderr\": 0.019023726160724556,\n \"acc_norm\": 0.3300653594771242,\n \"acc_norm_stderr\": 0.019023726160724556\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.031976941187136725,\n \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.031976941187136725\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.527363184079602,\n \"acc_stderr\": 0.035302355173346824,\n \"acc_norm\": 0.527363184079602,\n \"acc_norm_stderr\": 0.035302355173346824\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.03647168523683228,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.03647168523683228\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3742690058479532,\n \"acc_stderr\": 0.03711601185389482,\n \"acc_norm\": 0.3742690058479532,\n \"acc_norm_stderr\": 0.03711601185389482\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n \"mc1_stderr\": 0.015506204722834555,\n \"mc2\": 0.416805939433293,\n \"mc2_stderr\": 0.014767283735086846\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6101026045777427,\n \"acc_stderr\": 0.013707547317008463\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3373768006065201,\n \"acc_stderr\": 0.013023665136222091\n }\n}\n```", "repo_url": "https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-33B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|arc:challenge|25_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|gsm8k|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hellaswag|10_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-51-00.139544.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-51-00.139544.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-51-00.139544.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T09-51-00.139544.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-51-00.139544.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T09_51_00.139544", "path": ["**/details_harness|winogrande|5_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T09-51-00.139544.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_17T09_51_00.139544", "path": ["results_2024-01-17T09-51-00.139544.parquet"]}, {"split": "latest", "path": ["results_2024-01-17T09-51-00.139544.parquet"]}]}]} | 2024-01-17T09:53:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-33B-v1
Dataset automatically created during the evaluation run of model WhiteRabbitNeo/WhiteRabbitNeo-33B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
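Here `harness_winogrande_5` is one of the per-task detail configurations in this repository; any other `harness_...` config name works the same way.
```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-33B-v1",
    "harness_winogrande_5",
    split="train",
)
```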
## Latest results
These are the latest results from run 2024-01-17T09:51:00.139544 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
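The same numbers can also be pulled programmatically from the aggregated "results" configuration; a minimal sketch (the config and split names come from this repository's configuration list):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the timestamped split
# (e.g. "2024_01_17T09_51_00.139544") holds the same data per run.
results = load_dataset(
    "open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-33B-v1",
    "results",
    split="latest",
)
print(results.column_names)
```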
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-33B-v1\n\n\n\nDataset automatically created during the evaluation run of model WhiteRabbitNeo/WhiteRabbitNeo-33B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T09:51:00.139544(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-33B-v1\n\n\n\nDataset automatically created during the evaluation run of model WhiteRabbitNeo/WhiteRabbitNeo-33B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T09:51:00.139544(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6f6d7db1b2eeb18d2879c9247273a4ca274e0a98 | # Dataset Card for "debug_drugprot2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jakelever/debug_drugprot2 | [
"region:us"
] | 2024-01-17T10:03:32+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "e1_indices", "sequence": "int64"}, {"name": "e2_indices", "sequence": "int64"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "token_type_ids", "sequence": "int8"}, {"name": "label", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 26601010, "num_examples": 48689}, {"name": "val", "num_bytes": 6317892, "num_examples": 12135}, {"name": "test", "num_bytes": 6595188, "num_examples": 12621}], "download_size": 4223780, "dataset_size": 39514090}} | 2024-01-17T10:03:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "debug_drugprot2"
More Information needed | [
"# Dataset Card for \"debug_drugprot2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"debug_drugprot2\"\n\nMore Information needed"
] |
5546aa563d6c05f58699a5ce1244f862a58b2992 |
# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Estopia
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KoboldAI/LLaMA2-13B-Estopia](https://huggingface.co/KoboldAI/LLaMA2-13B-Estopia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Estopia",
"harness_winogrande_5",
split="train")
```
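The same call pattern can be used for the aggregated scores; for instance, a small sketch loading the "results" configuration at its "latest" split (both names are taken from this card's configuration metadata):

```python
from datasets import load_dataset

# The "results" configuration aggregates all task scores for the run;
# the "latest" split points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Estopia",
    "results",
    split="latest",
)
```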
## Latest results
These are the [latest results from run 2024-01-17T10:20:16.827865](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Estopia/blob/main/results_2024-01-17T10-20-16.827865.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5495662014719073,
"acc_stderr": 0.03376695103950105,
"acc_norm": 0.5570408829233439,
"acc_norm_stderr": 0.034526962041142965,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.5413705500332432,
"mc2_stderr": 0.015402241345158227
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221009,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.0141633668961926
},
"harness|hellaswag|10": {
"acc": 0.6280621390161323,
"acc_stderr": 0.004823341569605421,
"acc_norm": 0.8251344353714399,
"acc_norm_stderr": 0.0037907576465758984
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087764,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087764
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983067,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.027430866579973467,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.027430866579973467
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836557,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836557
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7229357798165138,
"acc_stderr": 0.01918848259016953,
"acc_norm": 0.7229357798165138,
"acc_norm_stderr": 0.01918848259016953
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040342,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040342
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7394636015325671,
"acc_stderr": 0.01569600856380707,
"acc_norm": 0.7394636015325671,
"acc_norm_stderr": 0.01569600856380707
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124658,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124658
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631462,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631462
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581986,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581986
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.02918980567358709,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.02918980567358709
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4106910039113429,
"acc_stderr": 0.012564871542534353,
"acc_norm": 0.4106910039113429,
"acc_norm_stderr": 0.012564871542534353
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.030290619180485697,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.030290619180485697
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02017548876548404,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02017548876548404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.03038726291954772,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.03038726291954772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355554,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355554
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.5413705500332432,
"mc2_stderr": 0.015402241345158227
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.01204235252617479
},
"harness|gsm8k|5": {
"acc": 0.13419257012888552,
"acc_stderr": 0.009388953419897745
}
}
```
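The same figures can also be read from the raw results file linked above; a minimal sketch using `huggingface_hub` (assuming the JSON file sits at the repository root, as the card's link suggests):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run from the dataset repository
# (filename copied from the link above; assumed to sit at the repo root).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Estopia",
    repo_type="dataset",
    filename="results_2024-01-17T10-20-16.827865.json",
)

with open(path) as f:
    run_results = json.load(f)

# The aggregated metrics printed above are contained in this file.
print(list(run_results.keys()))
```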
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Estopia | [
"region:us"
] | 2024-01-17T10:22:36+00:00 | {"pretty_name": "Evaluation run of KoboldAI/LLaMA2-13B-Estopia", "dataset_summary": "Dataset automatically created during the evaluation run of model [KoboldAI/LLaMA2-13B-Estopia](https://huggingface.co/KoboldAI/LLaMA2-13B-Estopia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Estopia\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T10:20:16.827865](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Estopia/blob/main/results_2024-01-17T10-20-16.827865.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5495662014719073,\n \"acc_stderr\": 0.03376695103950105,\n \"acc_norm\": 0.5570408829233439,\n \"acc_norm_stderr\": 0.034526962041142965,\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.5413705500332432,\n \"mc2_stderr\": 0.015402241345158227\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221009,\n \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.0141633668961926\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6280621390161323,\n \"acc_stderr\": 0.004823341569605421,\n \"acc_norm\": 0.8251344353714399,\n \"acc_norm_stderr\": 0.0037907576465758984\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 
0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983067,\n \"acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983067\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n \"acc_stderr\": 0.027430866579973467,\n \"acc_norm\": 0.632258064516129,\n \"acc_norm_stderr\": 0.027430866579973467\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.702020202020202,\n \"acc_stderr\": 0.03258630383836557,\n \"acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836557\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412195,\n \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412195\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7229357798165138,\n \"acc_stderr\": 0.01918848259016953,\n \"acc_norm\": 0.7229357798165138,\n \"acc_norm_stderr\": 0.01918848259016953\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n \"acc_stderr\": 0.026453508054040342,\n \"acc_norm\": 0.7948717948717948,\n \"acc_norm_stderr\": 0.026453508054040342\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7394636015325671,\n 
\"acc_stderr\": 0.01569600856380707,\n \"acc_norm\": 0.7394636015325671,\n \"acc_norm_stderr\": 0.01569600856380707\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124658,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124658\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631462,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631462\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n \"acc_stderr\": 0.027604689028581986,\n \"acc_norm\": 0.617363344051447,\n \"acc_norm_stderr\": 0.027604689028581986\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3971631205673759,\n \"acc_stderr\": 0.02918980567358709,\n \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.02918980567358709\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n \"acc_stderr\": 0.012564871542534353,\n \"acc_norm\": 0.4106910039113429,\n \"acc_norm_stderr\": 0.012564871542534353\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.030290619180485697,\n \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.030290619180485697\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02017548876548404,\n \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02017548876548404\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.03038726291954772,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.03038726291954772\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.031157150869355554,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.031157150869355554\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.5413705500332432,\n \"mc2_stderr\": 0.015402241345158227\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.01204235252617479\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13419257012888552,\n \"acc_stderr\": 0.009388953419897745\n }\n}\n```", 
"repo_url": "https://huggingface.co/KoboldAI/LLaMA2-13B-Estopia", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|arc:challenge|25_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|gsm8k|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hellaswag|10_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T10-20-16.827865.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T10-20-16.827865.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T10-20-16.827865.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T10-20-16.827865.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T10-20-16.827865.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T10_20_16.827865", "path": ["**/details_harness|winogrande|5_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T10-20-16.827865.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_17T10_20_16.827865", "path": ["results_2024-01-17T10-20-16.827865.parquet"]}, {"split": "latest", "path": ["results_2024-01-17T10-20-16.827865.parquet"]}]}]} | 2024-01-17T10:23:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Estopia
Dataset automatically created during the evaluation run of model KoboldAI/LLaMA2-13B-Estopia on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
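The snippet itself is not reproduced in this copy of the card; below is a minimal sketch, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming (the config name is one of those listed in this card's metadata):
```python
from datasets import load_dataset

# config names come from this card's metadata, e.g. "harness_winogrande_5";
# the repository id below is an assumption based on the leaderboard's naming scheme
data = load_dataset(
    "open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Estopia",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```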
## Latest results
These are the latest results from run 2024-01-17T10:20:16.827865 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
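To fetch the aggregated numbers rather than per-task details, the `results` config (declared in the metadata above) exposes the same `latest` split; this sketch relies on the same assumed repository id as the previous one:
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Estopia",  # assumed repo id
    "results",
    split="latest",
)
print(results[0])
```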
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Estopia\n\n\n\nDataset automatically created during the evaluation run of model KoboldAI/LLaMA2-13B-Estopia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T10:20:16.827865(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Estopia\n\n\n\nDataset automatically created during the evaluation run of model KoboldAI/LLaMA2-13B-Estopia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-17T10:20:16.827865(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6ddd93825820a92b06215c56b12860f3caa0b86b |
# Dataset of charlotte (Houkai 3rd)
This is the dataset of charlotte (Houkai 3rd), containing 158 images and their tags.
The core tags of this character are `pink_hair, hat, breasts, bangs, red_headwear, short_hair, beret, green_eyes, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 158 | 321.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 158 | 150.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 428 | 358.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 158 | 270.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 428 | 571.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
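The waifuc example below covers only the raw package; for the IMG+TXT packages, here is a minimal sketch, assuming the usual layout of one same-named `.txt` tag file per image (which is how `IMG+TXT` packages are typically organized):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch and unpack the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/charlotte_honkai3',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# print the tag file paired with each image (assumed to share the image's stem)
for root, _, files in os.walk(dataset_dir):
    for name in files:
        if name.endswith('.txt'):
            with open(os.path.join(root, name), 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```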
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/charlotte_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, white_gloves, bare_shoulders, long_sleeves, white_shirt, white_background, monocle, sleeveless_shirt, :d, open_mouth, peaked_cap, simple_background, blush, detached_sleeves, holding, upper_body, bow, hair_between_eyes, hat_feather, off_shoulder |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1boy, 1girl, hetero, mosaic_censoring, solo_focus, bare_shoulders, blush, looking_at_viewer, penis, blue_eyes, pov, shirt, white_gloves, hat_feather, monocle, outdoors, belt, fellatio, jewelry, long_hair, off_shoulder, open_mouth, smile |
| 2 | 13 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1boy, 1girl, hetero, nipples, nude, sex, blush, navel, open_mouth, solo_focus, collarbone, sweat, penis, pussy, looking_at_viewer, spread_legs, vaginal, medium_hair, mosaic_censoring |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, black_pantyhose, blush, detached_collar, fake_animal_ears, fake_tail, from_behind, looking_at_viewer, looking_back, playboy_bunny, rabbit_ears, rabbit_tail, solo, black_leotard, strapless_leotard, thighband_pantyhose, closed_mouth, indoors, on_bed, black_footwear, cameltoe, hairband, high_heels, large_breasts, pink_leotard, shoulder_blades, sideboob, bare_back, black_bowtie, glasses, pillow, squatting, wrist_cuffs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_gloves | bare_shoulders | long_sleeves | white_shirt | white_background | monocle | sleeveless_shirt | :d | open_mouth | peaked_cap | simple_background | blush | detached_sleeves | holding | upper_body | bow | hair_between_eyes | hat_feather | off_shoulder | 1boy | hetero | mosaic_censoring | solo_focus | penis | blue_eyes | pov | shirt | outdoors | belt | fellatio | jewelry | long_hair | smile | nipples | nude | sex | navel | collarbone | sweat | pussy | spread_legs | vaginal | medium_hair | black_pantyhose | detached_collar | fake_animal_ears | fake_tail | from_behind | looking_back | playboy_bunny | rabbit_ears | rabbit_tail | black_leotard | strapless_leotard | thighband_pantyhose | closed_mouth | indoors | on_bed | black_footwear | cameltoe | hairband | high_heels | large_breasts | pink_leotard | shoulder_blades | sideboob | bare_back | black_bowtie | glasses | pillow | squatting | wrist_cuffs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:-----------------|:---------------|:--------------|:-------------------|:----------|:-------------------|:-----|:-------------|:-------------|:--------------------|:--------|:-------------------|:----------|:-------------|:------|:--------------------|:--------------|:---------------|:-------|:---------|:-------------------|:-------------|:--------|:------------|:------|:--------|:-----------|:-------|:-----------|:----------|:------------|:--------|:----------|:-------|:------|:--------|:-------------|:--------|:--------|:--------------|:----------|:--------------|:------------------|:------------------|:-------------------|:------------|:--------------|:---------------|:----------------|:--------------|:--------------|:----------------|:--------------------|:----------------------|:---------------|:----------|:---------|:-----------------|:-----------|:-----------|:-------------|:----------------|:---------------|:------------------|:-----------|:------------|:---------------|:----------|:---------|:------------|:--------------|
| 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | | | | X | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | | | | | | | X | | | X | | | | | | | | X | X | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/charlotte_honkai3 | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T10:32:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T11:21:44+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of charlotte (Houkai 3rd)
=================================
This is the dataset of charlotte (Houkai 3rd), containing 158 images and their tags.
The core tags of this character are 'pink\_hair, hat, breasts, bangs, red\_headwear, short\_hair, beret, green\_eyes, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7ba3eef5c5d099e8a22f49480822d19c61fdb75d |
# Repository Level Code Completion Dataset for Evaluation
This is a dataset of repository snapshots taken before a commit in which a Python file was added. The task is to complete the added file, given the content of the repository composed in different ways.
## How to load the data
1. via [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.3/en/package_reference/loading_methods#datasets.load_dataset):
```python
from datasets import load_dataset
data_files = "data/path_distance_composer/py_context/test-*"  # choose any path from the table below
dataset = load_dataset("jenyag/repo-code-completion", data_files=data_files, split="train")
```
#### Options for `data_files`:
| | **all_context** | **non_py_context** | **py_context** |
|----|----|----|----|
| **function class mask half composer** | data/function_class_mask_half_composer/all_context/test-* | data/function_class_mask_half_composer/non_py_context/test-* | data/function_class_mask_half_composer/py_context/test-* |
| **imports first composer** | data/imports_first_composer/all_context/test-* | data/imports_first_composer/non_py_context/test-* | data/imports_first_composer/py_context/test-* |
| **alphabetical composer** | data/alphabetical_composer/all_context/test-* | data/alphabetical_composer/non_py_context/test-* | data/alphabetical_composer/py_context/test-* |
| **naive composer** | data/naive_composer/all_context/test-* | data/naive_composer/non_py_context/test-* | data/naive_composer/py_context/test-* |
| **path distance composer** | data/path_distance_composer/all_context/test-* | data/path_distance_composer/non_py_context/test-* | data/path_distance_composer/py_context/test-* |
| **file length composer** | data/file_length_composer/all_context/test-* | data/file_length_composer/non_py_context/test-* | data/file_length_composer/py_context/test-* |
| **half memory composer** | data/half_memory_composer/all_context/test-* | data/half_memory_composer/non_py_context/test-* | data/half_memory_composer/py_context/test-* |
| **function class mask one composer** | data/function_class_mask_one_composer/all_context/test-* | data/function_class_mask_one_composer/non_py_context/test-* | data/function_class_mask_one_composer/py_context/test-* |
## How to get the full context for the specific line
```python
for datapoint in dataset:
    project_context = datapoint['project_context']  # the composed repository context; may be quite long
    # each datapoint holds several completion points: in-file contexts paired with ground-truth completions
    for file_context_dict, ground_truth in zip(datapoint['file_context'], datapoint['gt']):
        file_context = file_context_dict['content']
        full_context = project_context + file_context  # model input for the completion whose reference is ground_truth
```
| jenyag/repo-code-completion | [
"license:apache-2.0",
"region:us"
] | 2024-01-17T10:35:20+00:00 | {"license": "apache-2.0", "dataset_info": [{"config_name": "alphabetical_composer_all_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 590554966, "num_examples": 224}], "download_size": 236538429, "dataset_size": 590554966}, {"config_name": "alphabetical_composer_non_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 560157388, "num_examples": 224}], "download_size": 226511858, "dataset_size": 560157388}, {"config_name": "alphabetical_composer_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 114370147, "num_examples": 224}], "download_size": 22096586, "dataset_size": 114370147}, {"config_name": "file_length_composer_all_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 590554966, "num_examples": 224}], "download_size": 239093262, "dataset_size": 590554966}, {"config_name": "file_length_composer_non_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 560157388, "num_examples": 224}], "download_size": 228632512, "dataset_size": 560157388}, {"config_name": "file_length_composer_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 114370147, "num_examples": 224}], "download_size": 22181715, "dataset_size": 114370147}, {"config_name": "function_class_mask_half_composer_all_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": 
"string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 316335006, "num_examples": 224}], "download_size": 0, "dataset_size": 316335006}, {"config_name": "function_class_mask_half_composer_non_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 315664977, "num_examples": 224}], "download_size": 127938122, "dataset_size": 315664977}, {"config_name": "function_class_mask_half_composer_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 101260211, "num_examples": 224}], "download_size": 17862587, "dataset_size": 101260211}, {"config_name": "function_class_mask_one_composer_all_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 90116249, "num_examples": 224}], "download_size": 13554986, "dataset_size": 90116249}, {"config_name": "function_class_mask_one_composer_non_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 105054619, "num_examples": 224}], "download_size": 15624970, "dataset_size": 105054619}, {"config_name": "function_class_mask_one_composer_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 87046937, "num_examples": 224}], "download_size": 12999652, "dataset_size": 87046937}, {"config_name": "half_memory_composer_all_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 334960024, "num_examples": 224}], "download_size": 123799195, "dataset_size": 334960024}, {"config_name": "half_memory_composer_non_py_context", "features": [{"name": "repo_id", "dtype": 
"int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 311325289, "num_examples": 224}], "download_size": 115444406, "dataset_size": 311325289}, {"config_name": "half_memory_composer_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 99351776, "num_examples": 224}], "download_size": 18008844, "dataset_size": 99351776}, {"config_name": "imports_first_composer_all_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 590554966, "num_examples": 224}], "download_size": 236389259, "dataset_size": 590554966}, {"config_name": "imports_first_composer_non_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 560157388, "num_examples": 224}], "download_size": 226465503, "dataset_size": 560157388}, {"config_name": "imports_first_composer_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 114370147, "num_examples": 224}], "download_size": 22077336, "dataset_size": 114370147}, {"config_name": "naive_composer_all_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 590554966, "num_examples": 224}], "download_size": 236382094, "dataset_size": 590554966}, {"config_name": "naive_composer_non_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 560157388, "num_examples": 224}], 
"download_size": 226480268, "dataset_size": 560157388}, {"config_name": "naive_composer_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 114370147, "num_examples": 224}], "download_size": 22084803, "dataset_size": 114370147}, {"config_name": "path_distance_composer_all_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 590554966, "num_examples": 224}], "download_size": 236585246, "dataset_size": 590554966}, {"config_name": "path_distance_composer_non_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 560157388, "num_examples": 224}], "download_size": 226460548, "dataset_size": 560157388}, {"config_name": "path_distance_composer_py_context", "features": [{"name": "repo_id", "dtype": "int64"}, {"name": "repo_name", "dtype": "string"}, {"name": "project_context", "dtype": "string"}, {"name": "file_context", "list": [{"name": "content", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "gt", "sequence": "string"}, {"name": "metainfo_separator", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 114370147, "num_examples": 224}], "download_size": 22014753, "dataset_size": 114370147}, {"config_name": "function_class_mask_half_composer_all_context", "data_files": [{"split": "test", "path": "data/function_class_mask_half_composer/all_context/test-*"}]}, {"config_name": "function_class_mask_half_composer_non_py_context", "data_files": [{"split": "test", "path": "data/function_class_mask_half_composer/non_py_context/test-*"}]}, {"config_name": "function_class_mask_half_composer_py_context", "data_files": [{"split": "test", "path": "data/function_class_mask_half_composer/py_context/test-*"}]}, {"config_name": "imports_first_composer_all_context", "data_files": [{"split": "test", "path": "data/imports_first_composer/all_context/test-*"}]}, {"config_name": "imports_first_composer_non_py_context", "data_files": [{"split": "test", "path": "data/imports_first_composer/non_py_context/test-*"}]}, {"config_name": "imports_first_composer_py_context", "data_files": [{"split": "test", "path": "data/imports_first_composer/py_context/test-*"}]}, {"config_name": "alphabetical_composer_all_context", "data_files": [{"split": "test", "path": "data/alphabetical_composer/all_context/test-*"}]}, {"config_name": "alphabetical_composer_non_py_context", "data_files": [{"split": "test", "path": "data/alphabetical_composer/non_py_context/test-*"}]}, {"config_name": "alphabetical_composer_py_context", "data_files": [{"split": "test", "path": 
"data/alphabetical_composer/py_context/test-*"}]}, {"config_name": "naive_composer_all_context", "data_files": [{"split": "test", "path": "data/naive_composer/all_context/test-*"}]}, {"config_name": "naive_composer_non_py_context", "data_files": [{"split": "test", "path": "data/naive_composer/non_py_context/test-*"}]}, {"config_name": "naive_composer_py_context", "data_files": [{"split": "test", "path": "data/naive_composer/py_context/test-*"}]}, {"config_name": "path_distance_composer_all_context", "data_files": [{"split": "test", "path": "data/path_distance_composer/all_context/test-*"}]}, {"config_name": "path_distance_composer_non_py_context", "data_files": [{"split": "test", "path": "data/path_distance_composer/non_py_context/test-*"}]}, {"config_name": "path_distance_composer_py_context", "data_files": [{"split": "test", "path": "data/path_distance_composer/py_context/test-*"}], "default": true}, {"config_name": "file_length_composer_all_context", "data_files": [{"split": "test", "path": "data/file_length_composer/all_context/test-*"}]}, {"config_name": "file_length_composer_non_py_context", "data_files": [{"split": "test", "path": "data/file_length_composer/non_py_context/test-*"}]}, {"config_name": "file_length_composer_py_context", "data_files": [{"split": "test", "path": "data/file_length_composer/py_context/test-*"}]}, {"config_name": "half_memory_composer_all_context", "data_files": [{"split": "test", "path": "data/half_memory_composer/all_context/test-*"}]}, {"config_name": "half_memory_composer_non_py_context", "data_files": [{"split": "test", "path": "data/half_memory_composer/non_py_context/test-*"}]}, {"config_name": "half_memory_composer_py_context", "data_files": [{"split": "test", "path": "data/half_memory_composer/py_context/test-*"}]}, {"config_name": "function_class_mask_one_composer_all_context", "data_files": [{"split": "test", "path": "data/function_class_mask_one_composer/all_context/test-*"}]}, {"config_name": "function_class_mask_one_composer_non_py_context", "data_files": [{"split": "test", "path": "data/function_class_mask_one_composer/non_py_context/test-*"}]}, {"config_name": "function_class_mask_one_composer_py_context", "data_files": [{"split": "test", "path": "data/function_class_mask_one_composer/py_context/test-*"}]}]} | 2024-01-18T09:56:33+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| Repository Level Code Completion Dataset for Evaluation
=======================================================
This is a dataset of repository snapshots before a commit where a python file has been added. One needs to complete added file with given content of repository composed in different ways.
How to load the data
--------------------
1. via 'load\_dataset':
#### Options for 'data\_files':
How to get the full context for the specific line
-------------------------------------------------
| [
"#### Options for 'data\\_files':\n\n\n\nHow to get the full context for the specific line\n-------------------------------------------------"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"#### Options for 'data\\_files':\n\n\n\nHow to get the full context for the specific line\n-------------------------------------------------"
] |
960185628859f72dd357108b394a2343ffd70e83 | # The list of all subsets in the dataset
Each subset is generated by splitting videos from a particular Ukrainian YouTube channel
All subsets are in the test split
- "opodcast" subset is from channel "О! ПОДКАСТ"
- "rozdympodcast" subset is from channel "Роздум | Подкаст"
- "test" subset is just a small subset of samples
# Loading a particular subset
```
>>> from datasets import load_dataset
>>> data_files = {"train": "data/<your_subset>.parquet"}
>>> data = load_dataset("Zarakun/youtube_ua_subtitles_test", data_files=data_files, split="train")
>>> data
DatasetDict({
train: Dataset({
features: ['audio', 'rate', 'duration', 'sentence'],
num_rows: <some_number>
})
})
``` | Zarakun/youtube_ua_noisy_subtitles_test | [
"task_categories:automatic-speech-recognition",
"region:us"
] | 2024-01-17T10:43:20+00:00 | {"task_categories": ["automatic-speech-recognition"], "pretty_name": "MangoSpeech", "configs": [{"config_name": "opodcast", "data_files": "data/opodcast.parquet"}, {"config_name": "rozdympodcast", "data_files": "data/rozdympodcast.parquet"}, {"config_name": "test", "data_files": "data/test.parquet"}]} | 2024-01-17T13:06:28+00:00 | [] | [] | TAGS
#task_categories-automatic-speech-recognition #region-us
| # The list of all subsets in the dataset
Each subset is generated splitting videos from given particular ukrainiam YouTube channel
All subsets are in test split
- "opodcast" subset is from channel "О! ПОДКАСТ"
- "rozdympodcast" subset is from channel "Роздум | Подкаст"
- "test" subset is just a small subset of samples
# Loading a particular subset
| [
"# The list of all subsets in the dataset\nEach subset is generated splitting videos from given particular ukrainiam YouTube channel\nAll subsets are in test split\n\n- \"opodcast\" subset is from channel \"О! ПОДКАСТ\"\n- \"rozdympodcast\" subset is from channel \"Роздум | Подкаст\" \n- \"test\" subset is just a small subset of samples",
"# Loading a particular subset"
] | [
"TAGS\n#task_categories-automatic-speech-recognition #region-us \n",
"# The list of all subsets in the dataset\nEach subset is generated splitting videos from given particular ukrainiam YouTube channel\nAll subsets are in test split\n\n- \"opodcast\" subset is from channel \"О! ПОДКАСТ\"\n- \"rozdympodcast\" subset is from channel \"Роздум | Подкаст\" \n- \"test\" subset is just a small subset of samples",
"# Loading a particular subset"
] |
f7915dce4e4e962909789818a75a1438c3d8b444 |
Why always Python?
![Flow](https://raw.githubusercontent.com/LeVuMinhHuy/brocode/master/.pics/20k_flow.png)
I took 20,000 TypeScript code snippets from [The Stack](https://huggingface.co/datasets/bigcode/the-stack-smol-xl) and generated {"instruction", "output"} pairs (based on gpt-3.5-turbo)
Use this dataset to fine-tune a code generation model just for TypeScript
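A minimal sketch for peeking at the pairs (the split name `train` is an assumption; check the returned `DatasetDict` first):
```python
from datasets import load_dataset

dataset = load_dataset("mhhmm/typescript-instruct-20k-v2c")
print(dataset)  # confirm the split name before indexing

# each record is an {"instruction", "output"} pair, as described above
sample = dataset["train"][0]
print(sample["instruction"])
print(sample["output"])
```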
Make web developers great again ! | mhhmm/typescript-instruct-20k-v2c | [
"task_categories:text-generation",
"language:en",
"license:cc",
"typescript",
"code-generation",
"instruct-tuning",
"region:us"
] | 2024-01-17T10:55:48+00:00 | {"language": ["en"], "license": "cc", "task_categories": ["text-generation"], "tags": ["typescript", "code-generation", "instruct-tuning"]} | 2024-01-17T10:56:59+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #language-English #license-cc #typescript #code-generation #instruct-tuning #region-us
|
Why always Python?
!Flow
I get 20,000 TypeScript code from The Stack and generate {"instruction", "output"} pairs (based on gpt-3.5-turbo)
Using this dataset for finetune code generation model just for TypeScript
Make web developers great again ! | [] | [
"TAGS\n#task_categories-text-generation #language-English #license-cc #typescript #code-generation #instruct-tuning #region-us \n"
] |
5733ac177952a502f328601c41be494ec4838e16 | # GEST Dataset
This is a repository for the GEST dataset used to measure gender-stereotypical reasoning in language models and machine translation systems.
- Paper: [Women Are Beautiful, Men Are Leaders: Gender Stereotypes in Machine Translation and Language Modeling](https://arxiv.org/abs/2311.18711)
- Code and additional data (annotation details, translations) are available in [our repository](https://github.com/kinit-sk/gest)
## Stereotypes
The stereotype ids in the dataset represent the following stereotypes (the full definition of each stereotype can be found [here](https://github.com/kinit-sk/gest/blob/main/data/data_guidelines.pdf)):
1. Women are emotional and irrational
2. Women are gentle, kind, and submissive
3. Women are empathetic and caring
4. Women are neat and diligent
5. Women are social
6. Women are weak
7. Women are beautiful
8. Men are tough and rough
9. Men are self-confident
10. Men are professional
11. Men are rational
12. Men are providers
13. Men are leaders
14. Men are childish
15. Men are sexual
16. Men are strong | kinit/gest | [
"size_categories:1K<n<10K",
"language:en",
"language:sl",
"language:sk",
"language:cs",
"language:pl",
"language:sr",
"language:hr",
"language:be",
"language:ru",
"language:uk",
"license:apache-2.0",
"gender",
"gender bias",
"gender stereotypes",
"stereotypes",
"machine translation",
"language models",
"arxiv:2311.18711",
"region:us"
] | 2024-01-17T10:57:29+00:00 | {"language": ["en", "sl", "sk", "cs", "pl", "sr", "hr", "be", "ru", "uk"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "tags": ["gender", "gender bias", "gender stereotypes", "stereotypes", "machine translation", "language models"]} | 2024-01-17T14:08:11+00:00 | [
"2311.18711"
] | [
"en",
"sl",
"sk",
"cs",
"pl",
"sr",
"hr",
"be",
"ru",
"uk"
] | TAGS
#size_categories-1K<n<10K #language-English #language-Slovenian #language-Slovak #language-Czech #language-Polish #language-Serbian #language-Croatian #language-Belarusian #language-Russian #language-Ukrainian #license-apache-2.0 #gender #gender bias #gender stereotypes #stereotypes #machine translation #language models #arxiv-2311.18711 #region-us
| # GEST Dataset
This is a repository for the GEST dataset used to measure gender-stereotypical reasoning in language models and machine translation systems.
- Paper: Women Are Beautiful, Men Are Leaders: Gender Stereotypes in Machine Translation and Language Modeling
- Code and additional data (annotation details, translations) are avialable in our repository
## Stereotypes
The stereotype ids in the dataset represent following stereotypes (the full definition of each stereotype can be found here):
1. Women are emotional and irrational
2. Women are gentle, kind, and submissive
3. Women are empathetic and caring
4. Women are neat and diligent
5. Women are social
6. Women are weak
7. Women are beautiful
8. Men are tough and rough
9. Men are self-confident
10. Men are professional
11. Men are rational
12. Men are providers
13. Men are leaders
14. Men are childish
15. Men are sexual
16. Men are strong | [
"# GEST Dataset\n\nThis is a repository for the GEST dataset used to measure gender-stereotypical reasoning in language models and machine translation systems.\n\n- Paper: Women Are Beautiful, Men Are Leaders: Gender Stereotypes in Machine Translation and Language Modeling\n- Code and additional data (annotation details, translations) are avialable in our repository",
"## Stereotypes\n\nThe stereotype ids in the dataset represent following stereotypes (the full definition of each stereotype can be found here):\n\n1. Women are emotional and irrational\n2. Women are gentle, kind, and submissive\n3. Women are empathetic and caring\n4. Women are neat and diligent\n5. Women are social\n6. Women are weak\n7. Women are beautiful\n8. Men are tough and rough\n9. Men are self-confident\n10. Men are professional\n11. Men are rational\n12. Men are providers\n13. Men are leaders\n14. Men are childish\n15. Men are sexual\n16. Men are strong"
] | [
"TAGS\n#size_categories-1K<n<10K #language-English #language-Slovenian #language-Slovak #language-Czech #language-Polish #language-Serbian #language-Croatian #language-Belarusian #language-Russian #language-Ukrainian #license-apache-2.0 #gender #gender bias #gender stereotypes #stereotypes #machine translation #language models #arxiv-2311.18711 #region-us \n",
"# GEST Dataset\n\nThis is a repository for the GEST dataset used to measure gender-stereotypical reasoning in language models and machine translation systems.\n\n- Paper: Women Are Beautiful, Men Are Leaders: Gender Stereotypes in Machine Translation and Language Modeling\n- Code and additional data (annotation details, translations) are avialable in our repository",
"## Stereotypes\n\nThe stereotype ids in the dataset represent following stereotypes (the full definition of each stereotype can be found here):\n\n1. Women are emotional and irrational\n2. Women are gentle, kind, and submissive\n3. Women are empathetic and caring\n4. Women are neat and diligent\n5. Women are social\n6. Women are weak\n7. Women are beautiful\n8. Men are tough and rough\n9. Men are self-confident\n10. Men are professional\n11. Men are rational\n12. Men are providers\n13. Men are leaders\n14. Men are childish\n15. Men are sexual\n16. Men are strong"
] |
7a111e8221099aadd8426b7f2518891f89905378 |
# Dataset Card for ChartGPT-Dataset
## Dataset Details
### Dataset Description
This dataset is used to train the model [ChartGPT](https://huggingface.co/yuan-tian/chartgpt). For more information, please refer to the paper.
* **Language(s) (NLP)**: English
* **License**: Apache 2.0
* **Research paper**: [ChartGPT: Leveraging LLMs to Generate Charts from Abstract Natural Language](https://arxiv.org/abs/2311.01920)
## Citation
**BibTeX:**
```
@article{tian2023chartgpt,
title={{ChartGPT}: Leveraging LLMs to Generate Charts from Abstract Natural Language},
author={Tian, Yuan and Cui, Weiwei and Deng, Dazhen and Yi, Xinjing and Yang, Yurun and Zhang, Haidong and Wu, Yingcai},
journal={arXiv preprint arXiv:2311.01920},
year={2023}
}
``` | yuan-tian/chartgpt-dataset | [
"license:apache-2.0",
"arxiv:2311.01920",
"region:us"
] | 2024-01-17T11:01:21+00:00 | {"license": "apache-2.0"} | 2024-01-17T12:57:35+00:00 | [
"2311.01920"
] | [] | TAGS
#license-apache-2.0 #arxiv-2311.01920 #region-us
|
# Dataset Card for ChartGPT-Dataset
## Dataset Details
### Dataset Description
This dataset is used to train the model ChartGPT. For more information, please refer to the paper.
* Language(s) (NLP): English
* License: Apache 2.0
* Research paper: ChartGPT: Leveraging LLMs to Generate Charts from Abstract Natural Language
BibTeX:
| [
"# Dataset Card for ChartGPT-Dataset",
"## Dataset Details",
"### Dataset Description\n\nThis dataset is used to train the model ChartGPT. For more information, please refer to the paper. \n\n* Language(s) (NLP): English\n* License: Apache 2.0\n* Research paper: ChartGPT: Leveraging LLMs to Generate Charts from Abstract Natural Language\n\nBibTeX:"
] | [
"TAGS\n#license-apache-2.0 #arxiv-2311.01920 #region-us \n",
"# Dataset Card for ChartGPT-Dataset",
"## Dataset Details",
"### Dataset Description\n\nThis dataset is used to train the model ChartGPT. For more information, please refer to the paper. \n\n* Language(s) (NLP): English\n* License: Apache 2.0\n* Research paper: ChartGPT: Leveraging LLMs to Generate Charts from Abstract Natural Language\n\nBibTeX:"
] |
7bacd80928d2f799c469f3d6693993bfeed62165 |
# Dataset of kiichi_hogen/鬼一法眼/鬼一法眼 (Fate/Grand Order)
This is the dataset of kiichi_hogen/鬼一法眼/鬼一法眼 (Fate/Grand Order), containing 35 images and their tags.
The core tags of this character are `long_hair, white_hair, breasts, very_long_hair, horns, pointy_ears, bangs, orange_eyes, yellow_eyes, large_breasts, tassel, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 35 | 50.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 35 | 32.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 78 | 61.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 35 | 46.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 78 | 80.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kiichi_hogen_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, black_gloves, cleavage_cutout, looking_at_viewer, smile, solo, red_armor, white_dress, armored_dress, feathers, navel_cutout, blush, thighs, spear |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_gloves, looking_at_viewer, smile, solo, feathers, navel_cutout, red_armor, spear, cleavage |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_gloves | cleavage_cutout | looking_at_viewer | smile | solo | red_armor | white_dress | armored_dress | feathers | navel_cutout | blush | thighs | spear | cleavage |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:------------------|:--------------------|:--------|:-------|:------------|:--------------|:----------------|:-----------|:---------------|:--------|:---------|:--------|:-----------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | X | X | X | X | | | X | X | | | X | X |
| CyberHarem/kiichi_hogen_fgo | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T11:06:18+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T11:13:49+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of kiichi\_hogen/鬼一法眼/鬼一法眼 (Fate/Grand Order)
=====================================================
This is the dataset of kiichi\_hogen/鬼一法眼/鬼一法眼 (Fate/Grand Order), containing 35 images and their tags.
The core tags of this character are 'long\_hair, white\_hair, breasts, very\_long\_hair, horns, pointy\_ears, bangs, orange\_eyes, yellow\_eyes, large\_breasts, tassel, sidelocks', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1322bff7e52b73db9df669afa4431dc59d803ccc |
Tickers
=======
| shortbread/tickers | [
"size_categories:1K<n<10K",
"language:en",
"finance",
"region:us"
] | 2023-07-22T00:11:35+00:00 | {} | 2023-11-02T14:58:21+00:00 | [] | [
"en"
] | TAGS
#size_categories-1K<n<10K #language-English #finance #region-us
|
Tickers
=======
| [] | [
"TAGS\n#size_categories-1K<n<10K #language-English #finance #region-us \n"
] |
e4753c71aa44fdd0303a44673a9f72a9fb2a64c7 | # Dataset Card for "dataset_20231007_024059"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tr416/dataset_20231007_024059 | [
"region:us"
] | 2023-10-07T01:40:59+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 762696.0, "num_examples": 297}, {"name": "test", "num_bytes": 7704.0, "num_examples": 3}], "download_size": 73943, "dataset_size": 770400.0}} | 2023-10-07T01:41:00+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dataset_20231007_024059"
More Information needed | [
"# Dataset Card for \"dataset_20231007_024059\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dataset_20231007_024059\"\n\nMore Information needed"
] |
4def4661f8501e1da652cd9b5e185dbabb93d7fb | https://huggingface.co/datasets/mhenrichsen/context-aware-splits-english | PocketDoc/text-splitter-alpaca | [
"task_categories:text-generation",
"language:en",
"region:us"
] | 2024-02-15T19:59:32+00:00 | {"language": ["en"], "task_categories": ["text-generation"]} | 2024-02-16T22:27:17+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #language-English #region-us
| URL | [] | [
"TAGS\n#task_categories-text-generation #language-English #region-us \n"
] |
33a1e962efc1668084958d77d16354acef1d7746 |
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-34B-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-34B-ties](https://huggingface.co/louisbrulenaudet/Pearl-34B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-ties",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T20:29:21.982361](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-ties/blob/main/results_2024-02-15T20-29-21.982361.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7624896367346236,
"acc_stderr": 0.02823253317418589,
"acc_norm": 0.7667330036075873,
"acc_norm_stderr": 0.028764116967369732,
"mc1": 0.5336597307221542,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.7032022498819784,
"mc2_stderr": 0.014189265275795037
},
"harness|arc:challenge|25": {
"acc": 0.6791808873720137,
"acc_stderr": 0.01364094309194653,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520767
},
"harness|hellaswag|10": {
"acc": 0.6525592511451902,
"acc_stderr": 0.004751840646730855,
"acc_norm": 0.8483369846644094,
"acc_norm_stderr": 0.0035796087435066093
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.024618298195866518,
"acc_norm": 0.8,
"acc_norm_stderr": 0.024618298195866518
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7787234042553192,
"acc_stderr": 0.027136349602424056,
"acc_norm": 0.7787234042553192,
"acc_norm_stderr": 0.027136349602424056
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7379310344827587,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.7379310344827587,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7248677248677249,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.7248677248677249,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6403940886699507,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.6403940886699507,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706467,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706467
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199488,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.011464523356953162,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.011464523356953162
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.01967163241310029,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.01967163241310029
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45555555555555555,
"acc_stderr": 0.03036486250482443,
"acc_norm": 0.45555555555555555,
"acc_norm_stderr": 0.03036486250482443
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.024044054940440488,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.024044054940440488
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5231788079470199,
"acc_stderr": 0.04078093859163085,
"acc_norm": 0.5231788079470199,
"acc_norm_stderr": 0.04078093859163085
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769584,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342323,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342323
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640406,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.909323116219668,
"acc_stderr": 0.010268429662528548,
"acc_norm": 0.909323116219668,
"acc_norm_stderr": 0.010268429662528548
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8055865921787709,
"acc_stderr": 0.013235808096742286,
"acc_norm": 0.8055865921787709,
"acc_norm_stderr": 0.013235808096742286
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8398692810457516,
"acc_stderr": 0.020998740930362303,
"acc_norm": 0.8398692810457516,
"acc_norm_stderr": 0.020998740930362303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.797427652733119,
"acc_stderr": 0.02282731749105969,
"acc_norm": 0.797427652733119,
"acc_norm_stderr": 0.02282731749105969
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062075,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062075
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6276595744680851,
"acc_stderr": 0.02883892147125145,
"acc_norm": 0.6276595744680851,
"acc_norm_stderr": 0.02883892147125145
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5808344198174706,
"acc_stderr": 0.012602244505788228,
"acc_norm": 0.5808344198174706,
"acc_norm_stderr": 0.012602244505788228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559342,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559342
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.022923004094736854,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.022923004094736854
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5336597307221542,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.7032022498819784,
"mc2_stderr": 0.014189265275795037
},
"harness|winogrande|5": {
"acc": 0.8263614838200474,
"acc_stderr": 0.010646116480330996
},
"harness|gsm8k|5": {
"acc": 0.6747536012130402,
"acc_stderr": 0.012903904752543913
}
}
```
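A minimal sketch for fetching this results file directly, assuming the `huggingface_hub` download API and the file name linked above (the stored file may nest the metrics under a `results` key rather than at the top level, as shown in the excerpt):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON from this dataset repository.
# The filename is taken from the link above; newer runs will use a different timestamp.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-ties",
    repo_type="dataset",
    filename="results_2024-02-15T20-29-21.982361.json",
)

with open(results_path, "r") as f:
    results = json.load(f)

# The aggregated metrics may sit at the top level (as in the excerpt above)
# or under a "results" key, depending on how the harness stored them.
aggregated = results.get("results", results)
print(aggregated.get("all"))
```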
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-ties | [
"region:us"
] | 2024-02-15T20:31:45+00:00 | {"pretty_name": "Evaluation run of louisbrulenaudet/Pearl-34B-ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-34B-ties](https://huggingface.co/louisbrulenaudet/Pearl-34B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T20:29:21.982361](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-ties/blob/main/results_2024-02-15T20-29-21.982361.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7624896367346236,\n \"acc_stderr\": 0.02823253317418589,\n \"acc_norm\": 0.7667330036075873,\n \"acc_norm_stderr\": 0.028764116967369732,\n \"mc1\": 0.5336597307221542,\n \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.7032022498819784,\n \"mc2_stderr\": 0.014189265275795037\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6791808873720137,\n \"acc_stderr\": 0.01364094309194653,\n \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520767\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6525592511451902,\n \"acc_stderr\": 0.004751840646730855,\n \"acc_norm\": 0.8483369846644094,\n \"acc_norm_stderr\": 0.0035796087435066093\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.024618298195866518,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.024618298195866518\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7787234042553192,\n \"acc_stderr\": 0.027136349602424056,\n \"acc_norm\": 0.7787234042553192,\n \"acc_norm_stderr\": 0.027136349602424056\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7379310344827587,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.7379310344827587,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7248677248677249,\n \"acc_stderr\": 0.023000086859068642,\n \"acc_norm\": 0.7248677248677249,\n \"acc_norm_stderr\": 0.023000086859068642\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706467,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706467\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199488,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199488\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.011464523356953162,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.011464523356953162\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.8153846153846154,\n \"acc_stderr\": 0.01967163241310029,\n \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.01967163241310029\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45555555555555555,\n \"acc_stderr\": 0.03036486250482443,\n \"acc_norm\": 0.45555555555555555,\n \"acc_norm_stderr\": 0.03036486250482443\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.024044054940440488,\n \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.024044054940440488\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5231788079470199,\n \"acc_stderr\": 0.04078093859163085,\n \"acc_norm\": 0.5231788079470199,\n \"acc_norm_stderr\": 0.04078093859163085\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769584,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769584\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342323,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342323\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640406,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640406\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.909323116219668,\n \"acc_stderr\": 0.010268429662528548,\n \"acc_norm\": 
0.909323116219668,\n \"acc_norm_stderr\": 0.010268429662528548\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8055865921787709,\n \"acc_stderr\": 0.013235808096742286,\n \"acc_norm\": 0.8055865921787709,\n \"acc_norm_stderr\": 0.013235808096742286\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8398692810457516,\n \"acc_stderr\": 0.020998740930362303,\n \"acc_norm\": 0.8398692810457516,\n \"acc_norm_stderr\": 0.020998740930362303\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n \"acc_stderr\": 0.02282731749105969,\n \"acc_norm\": 0.797427652733119,\n \"acc_norm_stderr\": 0.02282731749105969\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062075,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062075\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6276595744680851,\n \"acc_stderr\": 0.02883892147125145,\n \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.02883892147125145\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5808344198174706,\n \"acc_stderr\": 0.012602244505788228,\n \"acc_norm\": 0.5808344198174706,\n \"acc_norm_stderr\": 0.012602244505788228\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559342,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559342\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736854,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736854\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5336597307221542,\n \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.7032022498819784,\n \"mc2_stderr\": 0.014189265275795037\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480330996\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6747536012130402,\n \"acc_stderr\": 0.012903904752543913\n }\n}\n```", "repo_url": 
"https://huggingface.co/louisbrulenaudet/Pearl-34B-ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|arc:challenge|25_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|gsm8k|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hellaswag|10_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T20-29-21.982361.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T20-29-21.982361.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T20-29-21.982361.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T20-29-21.982361.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T20-29-21.982361.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T20-29-21.982361.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["**/details_harness|winogrande|5_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T20-29-21.982361.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T20_29_21.982361", "path": ["results_2024-02-15T20-29-21.982361.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T20-29-21.982361.parquet"]}]}]} | 2024-02-15T20:32:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-34B-ties
Dataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-34B-ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-15T20:29:21.982361 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-34B-ties\n\n\n\nDataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-34B-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T20:29:21.982361(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-34B-ties\n\n\n\nDataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-34B-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T20:29:21.982361(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5fb3b6308132804ce31daa3cc5629e43837c40a7 |
# Dataset Card for Evaluation run of BarraHome/Wistral-7B-Instruct-v0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarraHome/Wistral-7B-Instruct-v0.4](https://huggingface.co/BarraHome/Wistral-7B-Instruct-v0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.4",
    "harness_winogrande_5",
    split="latest")  # splits are named by run timestamp; "latest" points to the newest run
```
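Once loaded, the split behaves like any other Hugging Face `datasets` split. As a quick, minimal sketch (the per-example column names are not documented in this card and depend on the task and harness version, so inspect them rather than assuming them):

```python
# Minimal sketch: inspect the loaded per-example details.
# Column names are not documented here and vary by task/config, so check them before use.
print(data)               # row count and column names of the loaded split
print(data.column_names)  # list the available columns explicitly
print(data[0])            # first evaluated example from the harness_winogrande_5 run
```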
## Latest results
These are the [latest results from run 2024-02-15T20:35:44.878136](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.4/blob/main/results_2024-02-15T20-35-44.878136.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6032184784518743,
"acc_stderr": 0.03333730204729809,
"acc_norm": 0.607891645213564,
"acc_norm_stderr": 0.03401402537730786,
"mc1": 0.5226438188494492,
"mc1_stderr": 0.01748554225848964,
"mc2": 0.6766513448639357,
"mc2_stderr": 0.015264009667659464
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464392,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.0141696645203031
},
"harness|hellaswag|10": {
"acc": 0.6612228639713205,
"acc_stderr": 0.004723266971563391,
"acc_norm": 0.8481378211511651,
"acc_norm_stderr": 0.0035815378475817935
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137602,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572277,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.0251891498947642,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.0251891498947642
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217905,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217905
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335842,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335842
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.01594930879023364,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.01594930879023364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534427,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534427
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5226438188494492,
"mc1_stderr": 0.01748554225848964,
"mc2": 0.6766513448639357,
"mc2_stderr": 0.015264009667659464
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827936
},
"harness|gsm8k|5": {
"acc": 0.3957543593631539,
"acc_stderr": 0.013469823701048815
}
}
```
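To read these aggregated numbers programmatically instead of copying them from the JSON above, a minimal sketch is to load the "results" configuration of this dataset; its "latest" split points to the newest run. The exact column layout of that configuration is not documented in this card, so the sketch only inspects it rather than assuming specific field names:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration of this dataset.
# The "latest" split always points to the newest evaluation run.
results = load_dataset("open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.4",
    "results",
    split="latest")

print(results)     # inspect the available columns before relying on any field name
print(results[0])  # the aggregated metrics row for the latest run
```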
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.4 | [
"region:us"
] | 2024-02-15T20:38:05+00:00 | {"pretty_name": "Evaluation run of BarraHome/Wistral-7B-Instruct-v0.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarraHome/Wistral-7B-Instruct-v0.4](https://huggingface.co/BarraHome/Wistral-7B-Instruct-v0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T20:35:44.878136](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.4/blob/main/results_2024-02-15T20-35-44.878136.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6032184784518743,\n \"acc_stderr\": 0.03333730204729809,\n \"acc_norm\": 0.607891645213564,\n \"acc_norm_stderr\": 0.03401402537730786,\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6766513448639357,\n \"mc2_stderr\": 0.015264009667659464\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464392,\n \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.0141696645203031\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6612228639713205,\n \"acc_stderr\": 0.004723266971563391,\n \"acc_norm\": 0.8481378211511651,\n \"acc_norm_stderr\": 0.0035815378475817935\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n 
\"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n \"acc_stderr\": 0.026593084516572277,\n \"acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.026593084516572277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.0251891498947642,\n \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.0251891498947642\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7739463601532567,\n \"acc_stderr\": 0.014957458504335842,\n \"acc_norm\": 0.7739463601532567,\n \"acc_norm_stderr\": 0.014957458504335842\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n \"acc_stderr\": 0.01594930879023364,\n \"acc_norm\": 0.34972067039106147,\n \"acc_norm_stderr\": 0.01594930879023364\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n \"acc_stderr\": 0.012633353557534427,\n \"acc_norm\": 0.42698826597131684,\n \"acc_norm_stderr\": 0.012633353557534427\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6766513448639357,\n \"mc2_stderr\": 0.015264009667659464\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827936\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3957543593631539,\n \"acc_stderr\": 0.013469823701048815\n }\n}\n```", "repo_url": 
"https://huggingface.co/BarraHome/Wistral-7B-Instruct-v0.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|arc:challenge|25_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|gsm8k|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hellaswag|10_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T20-35-44.878136.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T20-35-44.878136.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T20-35-44.878136.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T20-35-44.878136.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T20-35-44.878136.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T20_35_44.878136", "path": ["**/details_harness|winogrande|5_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T20-35-44.878136.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_15T20_35_44.878136", "path": ["results_2024-02-15T20-35-44.878136.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T20-35-44.878136.parquet"]}]}]} | 2024-02-15T20:38:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BarraHome/Wistral-7B-Instruct-v0.4
Dataset automatically created during the evaluation run of model BarraHome/Wistral-7B-Instruct-v0.4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
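For example, a minimal loading sketch (the details repository id below is an assumption based on the usual Open LLM Leaderboard naming; the configuration and split names come from the list in this card):

```python
from datasets import load_dataset

# Hypothetical repo id -- details repositories typically follow this naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.4",
    "harness_winogrande_5",  # any configuration listed in this card works here
    split="latest",          # "latest" always points to the most recent run
)
print(data)
```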
## Latest results
These are the latest results from run 2024-02-15T20:35:44.878136 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BarraHome/Wistral-7B-Instruct-v0.4\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/Wistral-7B-Instruct-v0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T20:35:44.878136(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BarraHome/Wistral-7B-Instruct-v0.4\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/Wistral-7B-Instruct-v0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T20:35:44.878136(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
418dfbba1351ac18742a1bc8f7428d5fbc0150c8 |
Dataset for HumanEval infilling in Java, based on https://arxiv.org/pdf/2207.14255
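A minimal loading sketch (split and column names are not documented here, so inspect the loaded object for the actual layout):

```python
from datasets import load_dataset

ds = load_dataset("njkumarr/humanevalinfilljava")  # repo id from this card
print(ds)  # shows the available splits and columns
```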
| njkumarr/humanevalinfilljava | [
"language:en",
"arxiv:2207.14255",
"region:us"
] | 2024-02-15T21:25:22+00:00 | {"language": ["en"], "pretty_name": "HumanEval-Infilling Java"} | 2024-02-16T07:43:34+00:00 | [
"2207.14255"
] | [
"en"
] | TAGS
#language-English #arxiv-2207.14255 #region-us
|
Dataset for human eval infill for java, based on URL
| [] | [
"TAGS\n#language-English #arxiv-2207.14255 #region-us \n"
] |
b12fc65cb8d2fd75335ea8ce2fa64a4be9f8fa7c |
## Dataset Description
- **Repository:** [https://github.com/nlp-uoregon/CulturaX](https://github.com/nlp-uoregon/CulturaX)
- **Papers:** [CulturaX: A Cleaned, Enormous, and Multilingual Dataset for Large Language Models in 167 Languages](https://arxiv.org/abs/2309.09400)
## Dataset Summary
We present CulturaX, a substantial multilingual dataset with 6.3 trillion tokens in 167 languages, tailored for large language model (LLM) development. Our dataset undergoes meticulous cleaning and deduplication through a rigorous pipeline of multiple stages to accomplish the best quality for model training, including language identification, URL-based filtering, metric-based cleaning, document refinement, and data deduplication. We employ MinHash at document level to achieve fuzzy deduplication for the datasets in different languages. Our data cleaning framework includes diverse criteria and threshold selections, guided by extensive data samples, ensuring comprehensive noise filtering in various aspects. CulturaX is fully released to the public in HuggingFace to facilitate research and advancements in multilingual LLMs.
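Purely as an illustration (this is not the authors' released pipeline), document-level fuzzy deduplication with MinHash can be sketched with the `datasketch` library as follows; the shingling scheme and similarity threshold are assumptions:

```python
from datasketch import MinHash, MinHashLSH

def minhash_signature(doc: str, num_perm: int = 128) -> MinHash:
    # Build a MinHash signature from whitespace tokens of the document.
    sig = MinHash(num_perm=num_perm)
    for token in doc.lower().split():
        sig.update(token.encode("utf8"))
    return sig

# Documents exceeding the Jaccard-similarity threshold are treated as near-duplicates.
lsh = MinHashLSH(threshold=0.8, num_perm=128)
deduplicated = []
for i, doc in enumerate(["a web document ...", "another web document ..."]):
    sig = minhash_signature(doc)
    if not lsh.query(sig):  # no sufficiently similar document kept so far
        lsh.insert(f"doc-{i}", sig)
        deduplicated.append(doc)
```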
Our dataset combines the most recent iteration of mC4 (version 3.1.0) [1] with all accessible OSCAR corpora up to the present year, including 20.19, 21.09, 22.01, and 23.01 [2]. After deep cleaning and deduplication, CulturaX comprises 16TB of data in parquet format (expanding to 27TB when unpacked). More than half of our dataset is dedicated to non-English languages to significantly boost the data size and enhance the feasibility of training models in multilingual scenarios.
To obtain perplexity scores for data cleaning, we train a SentencePiece tokenizer and 5-gram Kneser-Ney language models as provided in the KenLM library [3] using the 20230501 dumps of Wikipedia. Our KenLM models are also released in HuggingFace: https://huggingface.co/uonlp/kenlm.
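As a rough illustration of the perplexity-based filtering step, scoring a document with a SentencePiece tokenizer and a KenLM model might look like the following (the file names are placeholders; the actual tokenizers and models are in the uonlp/kenlm repository):

```python
import kenlm
import sentencepiece as spm

# Placeholder file names -- download the real tokenizer/model for your language.
sp = spm.SentencePieceProcessor(model_file="en.sp.model")
lm = kenlm.Model("en.arpa.bin")

doc = "An example paragraph whose fluency we want to estimate."
tokenized = " ".join(sp.encode(doc, out_type=str))
print(lm.perplexity(tokenized))  # lower perplexity suggests cleaner, more fluent text
```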
Details for the dataset can be found in our technical paper: [https://arxiv.org/abs/2309.09400](https://arxiv.org/abs/2309.09400)
You can download the dataset using Hugging Face datasets:
*You may need to follow these instructions to setup authentication before downloading the dataset: [https://huggingface.co/docs/huggingface_hub/quick-start#login](https://huggingface.co/docs/huggingface_hub/quick-start#login)*
```python
from datasets import load_dataset
ds = load_dataset("uonlp/CulturaX",
"en",
use_auth_token=True)
```
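Given the size of the corpus (16TB in parquet format), streaming is often more practical than a full local download; for instance (assuming the default `train` split):

```python
from datasets import load_dataset

# Stream records instead of materializing the full corpus on disk.
ds = load_dataset("uonlp/CulturaX", "en", streaming=True, use_auth_token=True)
for i, sample in enumerate(ds["train"]):
    print(sample["text"][:100])
    if i == 2:
        break
```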
### Languages
The supported languages and statistics for our dataset can be found below:
*(Note that the language codes `als` and `eml` refer to `gsw` and `x-eml` in the OSCAR-2301 dataset.)*
| | Code | Language | # Documents | # Tokens | # Tokens (%) |
|----:|:-------|:-------------------------|:----------------|:--------------------|:------|
| 0 | en | English | 3,241,065,682 | 2,846,970,578,793 | 45.13 |
| 1 | ru | Russian | 799,310,908 | 737,201,800,363 | 11.69 |
| 2 | es | Spanish | 450,937,645 | 373,845,662,394 | 5.93 |
| 3 | de | German | 420,017,484 | 357,030,348,021 | 5.66 |
| 4 | fr | French | 363,754,348 | 319,332,674,695 | 5.06 |
| 5 | zh | Chinese | 218,624,604 | 227,055,380,882 | 3.60 |
| 6 | it | Italian | 211,309,922 | 165,446,410,843 | 2.62 |
| 7 | pt | Portuguese | 190,289,658 | 136,941,763,923 | 2.17 |
| 8 | pl | Polish | 142,167,217 | 117,269,087,143 | 1.86 |
| 9 | ja | Japanese | 111,188,475 | 107,873,841,351 | 1.71 |
| 10 | nl | Dutch | 117,392,666 | 80,032,209,900 | 1.27 |
| 11 | ar | Arabic | 74,027,952 | 69,354,335,076 | 1.10 |
| 12 | tr | Turkish | 94,207,460 | 64,292,787,164 | 1.02 |
| 13 | cs | Czech | 65,350,564 | 56,910,486,745 | 0.90 |
| 14 | vi | Vietnamese | 57,606,341 | 55,380,123,774 | 0.88 |
| 15 | fa | Persian | 59,531,144 | 45,947,657,495 | 0.73 |
| 16 | hu | Hungarian | 44,132,152 | 43,417,981,714 | 0.69 |
| 17 | el | Greek | 51,430,226 | 43,147,590,757 | 0.68 |
| 18 | ro | Romanian | 40,325,424 | 39,647,954,768 | 0.63 |
| 19 | sv | Swedish | 49,709,189 | 38,486,181,494 | 0.61 |
| 20 | uk | Ukrainian | 44,740,545 | 38,226,128,686 | 0.61 |
| 21 | fi | Finnish | 30,467,667 | 28,925,009,180 | 0.46 |
| 22 | ko | Korean | 20,557,310 | 24,765,448,392 | 0.39 |
| 23 | da | Danish | 25,429,808 | 22,921,651,314 | 0.36 |
| 24 | bg | Bulgarian | 24,131,819 | 22,917,954,776 | 0.36 |
| 25 | no | Norwegian | 18,907,310 | 18,426,628,868 | 0.29 |
| 26 | hi | Hindi | 19,665,355 | 16,791,362,871 | 0.27 |
| 27 | sk | Slovak | 18,582,517 | 16,442,669,076 | 0.26 |
| 28 | th | Thai | 20,960,550 | 15,717,374,014 | 0.25 |
| 29 | lt | Lithuanian | 13,339,785 | 14,247,110,836 | 0.23 |
| 30 | ca | Catalan | 15,531,777 | 12,530,288,006 | 0.20 |
| 31 | id | Indonesian | 23,251,368 | 12,062,966,061 | 0.19 |
| 32 | bn | Bangla | 12,436,596 | 9,572,929,804 | 0.15 |
| 33 | et | Estonian | 8,004,753 | 8,805,656,165 | 0.14 |
| 34 | sl | Slovenian | 7,335,378 | 8,007,587,522 | 0.13 |
| 35 | lv | Latvian | 7,136,587 | 7,845,180,319 | 0.12 |
| 36 | he | Hebrew | 4,653,979 | 4,937,152,096 | 0.08 |
| 37 | sr | Serbian | 4,053,166 | 4,619,482,725 | 0.07 |
| 38 | ta | Tamil | 4,728,460 | 4,378,078,610 | 0.07 |
| 39 | sq | Albanian | 5,205,579 | 3,648,893,215 | 0.06 |
| 40 | az | Azerbaijani | 5,084,505 | 3,513,351,967 | 0.06 |
| 41 | kk | Kazakh | 2,733,982 | 2,802,485,195 | 0.04 |
| 42 | ur | Urdu | 2,757,279 | 2,703,052,627 | 0.04 |
| 43 | ka | Georgian | 3,120,321 | 2,617,625,564 | 0.04 |
| 44 | hy | Armenian | 2,964,488 | 2,395,179,284 | 0.04 |
| 45 | is | Icelandic | 2,373,560 | 2,350,592,857 | 0.04 |
| 46 | ml | Malayalam | 2,693,052 | 2,100,556,809 | 0.03 |
| 47 | ne | Nepali | 3,124,040 | 2,061,601,961 | 0.03 |
| 48 | mk | Macedonian | 2,762,807 | 2,003,302,006 | 0.03 |
| 49 | mr | Marathi | 2,266,588 | 1,955,227,796 | 0.03 |
| 50 | mn | Mongolian | 1,928,828 | 1,850,667,656 | 0.03 |
| 51 | be | Belarusian | 1,643,486 | 1,791,473,041 | 0.03 |
| 52 | te | Telugu | 1,822,865 | 1,566,972,146 | 0.02 |
| 53 | gl | Galician | 1,785,963 | 1,382,539,693 | 0.02 |
| 54 | eu | Basque | 1,598,822 | 1,262,066,759 | 0.02 |
| 55 | kn | Kannada | 1,352,142 | 1,242,285,201 | 0.02 |
| 56 | gu | Gujarati | 1,162,878 | 1,131,730,537 | 0.02 |
| 57 | af | Afrikaans | 826,519 | 1,119,009,767 | 0.02 |
| 58 | my | Burmese | 865,575 | 882,606,546 | 0.01 |
| 59 | si | Sinhala | 753,655 | 880,289,097 | 0.01 |
| 60 | eo | Esperanto | 460,088 | 803,948,528 | 0.01 |
| 61 | km | Khmer | 1,013,181 | 746,664,132 | 0.01 |
| 62 | pa | Punjabi | 646,987 | 727,546,145 | 0.01 |
| 63 | cy | Welsh | 549,955 | 576,743,162 | 0.01 |
| 64 | ky | Kyrgyz | 570,922 | 501,442,620 | 0.01 |
| 65 | ga | Irish | 304,251 | 376,947,935 | 0.01 |
| 66 | ps | Pashto | 376,914 | 363,007,770 | 0.01 |
| 67 | am | Amharic | 243,349 | 358,206,762 | 0.01 |
| 68 | ku | Kurdish | 295,314 | 302,990,910 | 0.00 |
| 69 | tl | Filipino | 348,453 | 242,086,456 | 0.00 |
| 70 | yi | Yiddish | 141,156 | 217,584,643 | 0.00 |
| 71 | lo | Lao | 217,842 | 168,256,876 | 0.00 |
| 72 | fy | Western Frisian | 223,268 | 167,193,111 | 0.00 |
| 73 | sd | Sindhi | 109,162 | 147,487,058 | 0.00 |
| 74 | mg | Malagasy | 115,910 | 142,685,412 | 0.00 |
| 75 | or | Odia | 153,461 | 100,323,213 | 0.00 |
| 76 | as | Assamese | 52,627 | 83,787,896 | 0.00 |
| 77 | ug | Uyghur | 47,035 | 77,677,306 | 0.00 |
| 78 | uz | Uzbek | 87,219 | 75,250,787 | 0.00 |
| 79 | la | Latin | 48,968 | 44,176,580 | 0.00 |
| 80 | hr | Croatian | 460,690 | 40,796,811 | 0.00 |
| 81 | sw | Swahili | 66,506 | 30,708,309 | 0.00 |
| 82 | ms | Malay | 238,151 | 19,375,976 | 0.00 |
| 83 | br | Breton | 43,765 | 13,987,037 | 0.00 |
| 84 | sa | Sanskrit | 16,290 | 13,561,367 | 0.00 |
| 85 | gd | Scottish Gaelic | 8,408 | 4,796,485 | 0.00 |
| 86 | su | Sundanese | 1,554 | 1,308,460 | 0.00 |
| 87 | jv | Javanese | 2,058 | 625,429 | 0.00 |
| 88 | tg | Tajik | 483,835 | - | - |
| 89 | ceb | Cebuano | 263,890 | - | - |
| 90 | tt | Tatar | 218,102 | - | - |
| 91 | ckb | Central Kurdish | 172,035 | - | - |
| 92 | lb | Luxembourgish | 165,891 | - | - |
| 93 | mt | Maltese | 151,320 | - | - |
| 94 | nn | Norwegian Nynorsk | 126,083 | - | - |
| 95 | qu | Quechua | 1,202 | 72,101 | 0.00 |
| 96 | ba | Bashkir | 71,957 | - | - |
| 97 | arz | Egyptian Arabic | 71,625 | - | - |
| 98 | dv | Divehi | 66,702 | - | - |
| 99 | bo | Tibetan | 54,185 | - | - |
| 100 | sh | Serbian (Latin) | 45,619 | - | - |
| 101 | yo | Yoruba | 192 | 42,943 | 0.00 |
| 102 | bs | Bosnian | 1,237 | 39,768 | 0.00 |
| 103 | azb | South Azerbaijani | 29,833 | - | - |
| 104 | ht | Haitian Creole | 12 | 26,183 | 0.00 |
| 105 | war | Waray | 23,687 | - | - |
| 106 | cv | Chuvash | 22,570 | - | - |
| 107 | sah | Sakha | 22,141 | - | - |
| 108 | li | Limburgish | 206 | 18,532 | 0.00 |
| 109 | ce | Chechen | 17,322 | - | - |
| 110 | pnb | Western Panjabi | 15,625 | - | - |
| 111 | nds | Low German | 15,139 | - | - |
| 112 | tk | Turkmen | 14,393 | - | - |
| 113 | gn | Guarani | 103 | 12,708 | 0.00 |
| 114 | oc | Occitan | 10,556 | - | - |
| 115 | xmf | Mingrelian | 9,706 | - | - |
| 116 | ast | Asturian | 9,002 | - | - |
| 117 | os | Ossetic | 8,596 | - | - |
| 118 | mhr | Eastern Mari | 7,883 | - | - |
| 119 | pms | Piedmontese | 7,566 | - | - |
| 120 | als[*] | Swiss German | 6,936 | - | - |
| 121 | vo | Volapük | 6,621 | - | - |
| 122 | so | Somali | 39 | 6,053 | 0.00 |
| 123 | bpy | Bishnupriya | 5,087 | - | - |
| 124 | new | Newari | 4,344 | - | - |
| 125 | hsb | Upper Sorbian | 4,244 | - | - |
| 126 | lmo | Lombard | 3,530 | - | - |
| 127 | an | Aragonese | 2,746 | - | - |
| 128 | ilo | Iloko | 2,328 | - | - |
| 129 | mzn | Mazanderani | 1,914 | - | - |
| 130 | lez | Lezghian | 1,806 | - | - |
| 131 | rm | Romansh | 30 | 1,769 | 0.00 |
| 132 | krc | Karachay-Balkar | 1,745 | - | - |
| 133 | min | Minangkabau | 1,429 | - | - |
| 134 | kv | Komi | 1,396 | - | - |
| 135 | wa | Walloon | 1,383 | - | - |
| 136 | jbo | Lojban | 1,349 | - | - |
| 137 | io | Ido | 1,144 | - | - |
| 138 | mrj | Western Mari | 1,056 | - | - |
| 139 | gom | Goan Konkani | 721 | - | - |
| 140 | ia | Interlingua | 613 | - | - |
| 141 | av | Avaric | 438 | - | - |
| 142 | bh | Bihari languages | 265 | - | - |
| 143 | wuu | Wu Chinese | 222 | - | - |
| 144 | nah | Nahuatl languages | 131 | - | - |
| 145 | vec | Venetian | 113 | - | - |
| 146 | bxr | Russia Buriat | 100 | - | - |
| 147 | kw | Cornish | 94 | - | - |
| 148 | mai | Maithili | 93 | - | - |
| 149 | eml[*] | Emiliano-Romagnol | 91 | - | - |
| 150 | dsb | Lower Sorbian | 59 | - | - |
| 151 | xal | Kalmyk | 51 | - | - |
| 152 | lrc | Northern Luri | 43 | - | - |
| 153 | nap | Neapolitan | 31 | - | - |
| 154 | tyv | Tuvinian | 23 | - | - |
| 155 | scn | Sicilian | 21 | - | - |
| 156 | frr | Northern Frisian | 11 | - | - |
| 157 | mwl | Mirandese | 9 | - | - |
| 158 | myv | Erzya | 4 | - | - |
| 159 | ie | Interlingue | 4 | - | - |
| 160 | pam | Pampanga | 4 | - | - |
| 161 | bar | Bavarian | 3 | - | - |
| 162 | yue | Yue Chinese | 3 | - | - |
| 163 | cbk | Chavacano | 2 | - | - |
| 164 | bcl | Central Bikol | 1 | - | - |
| 165 | vls | West Flemish | 1 | - | - |
| 166 | rue | Rusyn | 1 | - | - |
### Dataset Structure
```json
{
"text": ...,
"timestamp": ...,
"url": ...,
"source": "mc4" | "OSCAR-xxxx",
}
```
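A minimal sketch of reading these fields from a loaded sample (split name assumed to be `train`):

```python
sample = next(iter(ds["train"]))  # ds as loaded in the snippet above
print(sample["source"])           # e.g. "mc4" or an "OSCAR-xxxx" snapshot
print(sample["url"], sample["timestamp"])
print(sample["text"][:200])
```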
## Considerations for Using the Data
As CulturaX is the cleaned version of the mC4 and OSCAR datasets, which were both extracted from CommonCrawl, the data might still contain personal and sensitive information.
This must be considered prior to using this dataset for any purpose, such as training deep learning models, etc.
## License Information
The license terms for CulturaX strictly follow those of `mC4` and `OSCAR`. Please refer to both licenses below when using this dataset.
- [mC4 license](https://huggingface.co/datasets/allenai/c4#license)
- [OSCAR license](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301#licensing-information)
## Citation
To cite CulturaX, please use:
```
@misc{nguyen2023culturax,
title={CulturaX: A Cleaned, Enormous, and Multilingual Dataset for Large Language Models in 167 Languages},
author={Thuat Nguyen and Chien Van Nguyen and Viet Dac Lai and Hieu Man and Nghia Trung Ngo and Franck Dernoncourt and Ryan A. Rossi and Thien Huu Nguyen},
year={2023},
eprint={2309.09400},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Reference
[1] Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, and Colin Raffel. 2021. mT5: A massively multilingual
pre-trained text-to-text transformer. In NAACL 2021. https://huggingface.co/datasets/mc4
[2] Pedro Javier Ortiz Suárez, Benoît Sagot, and Laurent Romary. 2019. Asynchronous pipelines for processing huge corpora on medium to low resource infrastructures. In Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-
7) 2019. https://oscar-project.org/
[3] KenLM: Faster and smaller language model queries. In Proceedings of the Sixth
Workshop on Statistical Machine Translation, 2011.
| Madjakul/l-halversting | [
"task_categories:text-generation",
"task_categories:fill-mask",
"task_ids:language-modeling",
"task_ids:masked-language-modeling",
"annotations_creators:no-annotation",
"language_creators:found",
"multilinguality:multilingual",
"source_datasets:original",
"language:af",
"language:als",
"language:am",
"language:an",
"language:ar",
"language:arz",
"language:as",
"language:ast",
"language:av",
"language:az",
"language:azb",
"language:ba",
"language:bar",
"language:bcl",
"language:be",
"language:bg",
"language:bh",
"language:bn",
"language:bo",
"language:bpy",
"language:br",
"language:bs",
"language:bxr",
"language:ca",
"language:cbk",
"language:ce",
"language:ceb",
"language:ckb",
"language:cs",
"language:cv",
"language:cy",
"language:da",
"language:de",
"language:dsb",
"language:dv",
"language:el",
"language:eml",
"language:en",
"language:eo",
"language:es",
"language:et",
"language:eu",
"language:fa",
"language:fi",
"language:fr",
"language:frr",
"language:fy",
"language:ga",
"language:gd",
"language:gl",
"language:gn",
"language:gom",
"language:gu",
"language:he",
"language:hi",
"language:hr",
"language:hsb",
"language:ht",
"language:hu",
"language:hy",
"language:ia",
"language:id",
"language:ie",
"language:ilo",
"language:io",
"language:is",
"language:it",
"language:ja",
"language:jbo",
"language:jv",
"language:ka",
"language:kk",
"language:km",
"language:kn",
"language:ko",
"language:krc",
"language:ku",
"language:kv",
"language:kw",
"language:ky",
"language:la",
"language:lb",
"language:lez",
"language:li",
"language:lmo",
"language:lo",
"language:lrc",
"language:lt",
"language:lv",
"language:mai",
"language:mg",
"language:mhr",
"language:min",
"language:mk",
"language:ml",
"language:mn",
"language:mr",
"language:mrj",
"language:ms",
"language:mt",
"language:mwl",
"language:my",
"language:myv",
"language:mzn",
"language:nah",
"language:nap",
"language:nds",
"language:ne",
"language:new",
"language:nl",
"language:nn",
"language:no",
"language:oc",
"language:or",
"language:os",
"language:pa",
"language:pam",
"language:pl",
"language:pms",
"language:pnb",
"language:ps",
"language:pt",
"language:qu",
"language:rm",
"language:ro",
"language:ru",
"language:rue",
"language:sa",
"language:sah",
"language:scn",
"language:sd",
"language:sh",
"language:si",
"language:sk",
"language:sl",
"language:so",
"language:sq",
"language:sr",
"language:su",
"language:sv",
"language:sw",
"language:ta",
"language:te",
"language:tg",
"language:th",
"language:tk",
"language:tl",
"language:tr",
"language:tt",
"language:tyv",
"language:ug",
"language:uk",
"language:ur",
"language:uz",
"language:vec",
"language:vi",
"language:vls",
"language:vo",
"language:wa",
"language:war",
"language:wuu",
"language:xal",
"language:xmf",
"language:yi",
"language:yo",
"language:yue",
"language:zh",
"arxiv:2309.09400",
"region:us"
] | 2024-02-15T21:36:53+00:00 | {"annotations_creators": ["no-annotation"], "language_creators": ["found"], "language": ["af", "als", "am", "an", "ar", "arz", "as", "ast", "av", "az", "azb", "ba", "bar", "bcl", "be", "bg", "bh", "bn", "bo", "bpy", "br", "bs", "bxr", "ca", "cbk", "ce", "ceb", "ckb", "cs", "cv", "cy", "da", "de", "dsb", "dv", "el", "eml", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "frr", "fy", "ga", "gd", "gl", "gn", "gom", "gu", "he", "hi", "hr", "hsb", "ht", "hu", "hy", "ia", "id", "ie", "ilo", "io", "is", "it", "ja", "jbo", "jv", "ka", "kk", "km", "kn", "ko", "krc", "ku", "kv", "kw", "ky", "la", "lb", "lez", "li", "lmo", "lo", "lrc", "lt", "lv", "mai", "mg", "mhr", "min", "mk", "ml", "mn", "mr", "mrj", "ms", "mt", "mwl", "my", "myv", "mzn", "nah", "nap", "nds", "ne", "new", "nl", "nn", "no", "oc", "or", "os", "pa", "pam", "pl", "pms", "pnb", "ps", "pt", "qu", "rm", "ro", "ru", "rue", "sa", "sah", "scn", "sd", "sh", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "tg", "th", "tk", "tl", "tr", "tt", "tyv", "ug", "uk", "ur", "uz", "vec", "vi", "vls", "vo", "wa", "war", "wuu", "xal", "xmf", "yi", "yo", "yue", "zh"], "multilinguality": ["multilingual"], "source_datasets": ["original"], "task_categories": ["text-generation", "fill-mask"], "task_ids": ["language-modeling", "masked-language-modeling"], "pretty_name": "LHALversting", "configs": [{"config_name": "de", "data_files": "de/*.tar.gz"}, {"config_name": "en", "data_files": "en/*.tar.gz"}, {"config_name": "fr", "data_files": "fr/*.tar.gz"}], "extra_gated_prompt": "By completing the form below, you acknowledge that the provided data is offered as is. Although we anticipate no problems, you accept full responsibility for any repercussions resulting from the use of this data. Furthermore, you agree that the data must not be utilized for malicious or harmful purposes towards humanity.", "extra_gated_fields": {"Name": "text", "Email": "text", "Affiliation": "text", "Country": "text", "Usecase": "text", "I have explicitly check with my jurisdiction and I confirm that downloading CulturaX is legal in the country/region where I am located right now, and for the use case that I have described above": "checkbox", "You agree to not attempt to determine the identity of individuals in this dataset": "checkbox"}} | 2024-02-16T19:53:17+00:00 | [
"2309.09400"
] | [
"af",
"als",
"am",
"an",
"ar",
"arz",
"as",
"ast",
"av",
"az",
"azb",
"ba",
"bar",
"bcl",
"be",
"bg",
"bh",
"bn",
"bo",
"bpy",
"br",
"bs",
"bxr",
"ca",
"cbk",
"ce",
"ceb",
"ckb",
"cs",
"cv",
"cy",
"da",
"de",
"dsb",
"dv",
"el",
"eml",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"frr",
"fy",
"ga",
"gd",
"gl",
"gn",
"gom",
"gu",
"he",
"hi",
"hr",
"hsb",
"ht",
"hu",
"hy",
"ia",
"id",
"ie",
"ilo",
"io",
"is",
"it",
"ja",
"jbo",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"krc",
"ku",
"kv",
"kw",
"ky",
"la",
"lb",
"lez",
"li",
"lmo",
"lo",
"lrc",
"lt",
"lv",
"mai",
"mg",
"mhr",
"min",
"mk",
"ml",
"mn",
"mr",
"mrj",
"ms",
"mt",
"mwl",
"my",
"myv",
"mzn",
"nah",
"nap",
"nds",
"ne",
"new",
"nl",
"nn",
"no",
"oc",
"or",
"os",
"pa",
"pam",
"pl",
"pms",
"pnb",
"ps",
"pt",
"qu",
"rm",
"ro",
"ru",
"rue",
"sa",
"sah",
"scn",
"sd",
"sh",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"tg",
"th",
"tk",
"tl",
"tr",
"tt",
"tyv",
"ug",
"uk",
"ur",
"uz",
"vec",
"vi",
"vls",
"vo",
"wa",
"war",
"wuu",
"xal",
"xmf",
"yi",
"yo",
"yue",
"zh"
] | TAGS
#task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-no-annotation #language_creators-found #multilinguality-multilingual #source_datasets-original #language-Afrikaans #language-Tosk Albanian #language-Amharic #language-Aragonese #language-Arabic #language-Egyptian Arabic #language-Assamese #language-Asturian #language-Avaric #language-Azerbaijani #language-South Azerbaijani #language-Bashkir #language-Bavarian #language-Central Bikol #language-Belarusian #language-Bulgarian #language-bh #language-Bengali #language-Tibetan #language-Bishnupriya #language-Breton #language-Bosnian #language-Russia Buriat #language-Catalan #language-Chavacano #language-Chechen #language-Cebuano #language-Central Kurdish #language-Czech #language-Chuvash #language-Welsh #language-Danish #language-German #language-Lower Sorbian #language-Dhivehi #language-Modern Greek (1453-) #language-Emiliano-Romagnolo #language-English #language-Esperanto #language-Spanish #language-Estonian #language-Basque #language-Persian #language-Finnish #language-French #language-Northern Frisian #language-Western Frisian #language-Irish #language-Scottish Gaelic #language-Galician #language-Guarani #language-Goan Konkani #language-Gujarati #language-Hebrew #language-Hindi #language-Croatian #language-Upper Sorbian #language-Haitian #language-Hungarian #language-Armenian #language-Interlingua (International Auxiliary Language Association) #language-Indonesian #language-Interlingue #language-Iloko #language-Ido #language-Icelandic #language-Italian #language-Japanese #language-Lojban #language-Javanese #language-Georgian #language-Kazakh #language-Khmer #language-Kannada #language-Korean #language-Karachay-Balkar #language-Kurdish #language-Komi #language-Cornish #language-Kirghiz #language-Latin #language-Luxembourgish #language-Lezghian #language-Limburgan #language-Lombard #language-Lao #language-Northern Luri #language-Lithuanian #language-Latvian #language-Maithili #language-Malagasy #language-Eastern Mari #language-Minangkabau #language-Macedonian #language-Malayalam #language-Mongolian #language-Marathi #language-Western Mari #language-Malay (macrolanguage) #language-Maltese #language-Mirandese #language-Burmese #language-Erzya #language-Mazanderani #language-nah #language-Neapolitan #language-Low German #language-Nepali (macrolanguage) #language-Newari #language-Dutch #language-Norwegian Nynorsk #language-Norwegian #language-Occitan (post 1500) #language-Oriya (macrolanguage) #language-Ossetian #language-Panjabi #language-Pampanga #language-Polish #language-Piemontese #language-Western Panjabi #language-Pushto #language-Portuguese #language-Quechua #language-Romansh #language-Romanian #language-Russian #language-Rusyn #language-Sanskrit #language-Yakut #language-Sicilian #language-Sindhi #language-Serbo-Croatian #language-Sinhala #language-Slovak #language-Slovenian #language-Somali #language-Albanian #language-Serbian #language-Sundanese #language-Swedish #language-Swahili (macrolanguage) #language-Tamil #language-Telugu #language-Tajik #language-Thai #language-Turkmen #language-Tagalog #language-Turkish #language-Tatar #language-Tuvinian #language-Uighur #language-Ukrainian #language-Urdu #language-Uzbek #language-Venetian #language-Vietnamese #language-Vlaams #language-Volapük #language-Walloon #language-Waray (Philippines) #language-Wu Chinese #language-Kalmyk #language-Mingrelian #language-Yiddish #language-Yoruba #language-Yue 
Chinese #language-Chinese #arxiv-2309.09400 #region-us
| Dataset Description
-------------------
* Repository: URL
* Papers: CulturaX: A Cleaned, Enormous, and Multilingual Dataset for Large Language Models in 167 Languages
Dataset Summary
---------------
We present CulturaX, a substantial multilingual dataset with 6.3 trillion tokens in 167 languages, tailored for large language model (LLM) development. Our dataset undergoes meticulous cleaning and deduplication through a rigorous pipeline of multiple stages to accomplish the best quality for model training, including language identification, URL-based filtering, metric-based cleaning, document refinement, and data deduplication. We employ MinHash at document level to achieve fuzzy deduplication for the datasets in different languages. Our data cleaning framework includes diverse criteria and threshold selections, guided by extensive data samples, ensuring comprehensive noise filtering in various aspects. CulturaX is fully released to the public in HuggingFace to facilitate research and advancements in multilingual LLMs.
Our dataset combines the most recent iteration of mC4 (version 3.1.0) [1] with all accessible OSCAR corpora up to the present year, including 20.19, 21.09, 22.01, and 23.01 [2]. After deep cleaning and deduplication, CulturaX involves 16TB data in the parquet format (expanding to 27TB when unpacked). More than a half of our dataset is dedicated to non-English languages to significantly boost the data size and enhance the feasibility of training models in multilingual scenarios.
To obtain perplexity scores for data cleaning, we train a SentencePiece tokenizer and 5-gram Kneser-Ney language models as provided in the KenLM library [3] using the 20230501 dumps of Wikipedia. Our KenLM models are also released in HuggingFace: URL
Details for the dataset can be found in our technical paper: URL
You can download the dataset using Hugging Face datasets:
\*You may need to follow these instructions to setup authentication before downloading the dataset: URL
### Languages
The supported languages and statistics for our dataset can be found below:
*(Note that the language code 'als' and 'eml' refer to 'gsw' and 'x-eml' in the OSCAR-2301 dataset.)*
### Dataset Structure
Considerations for Using the Data
---------------------------------
As CulturaX is the cleaned version of the mC4 and OSCAR datasets, both of which were extracted from CommonCrawl, the data might still contain personal and sensitive information.
This must be considered before using this dataset for any purpose, such as training deep learning models.
License Information
-------------------
The license terms for CulturaX strictly follow those of 'mC4' and 'OSCAR'. Please refer to both licenses below when using this dataset.
* mC4 license
* OSCAR license
To cite CulturaX, please use:
Reference
---------
[1] Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, and Colin Raffel. 2021. mT5: A massively multilingual
pre-trained text-to-text transformer. In NAACL 2021. URL
[2] Pedro Javier Ortiz Suárez, Benoît Sagot, and Laurent Romary. 2019. Asynchronous pipelines for processing huge corpora on medium to low resource infrastructures. In Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-7), 2019. URL
[3] Kenneth Heafield. KenLM: Faster and smaller language model queries. In Proceedings of the Sixth Workshop on Statistical Machine Translation, 2011.
| [
"### Languages\n\n\nThe supported languages and statistics for our dataset can be found below:\n\n\n*(Note that the language code 'als' and 'eml' refer to 'gsw' and 'x-eml' in the OSCAR-2301 dataset.)*",
"### Dataset Structure\n\n\nConsiderations for Using the Data\n---------------------------------\n\n\nAs CulturaX is the cleaned version of the mC4 and OSCAR datasets, which were both extracted from CommonCrawl, personal and sensitive information might still contain personal and sensitive information.\nThis must be considered prior to using this dataset for any purpose, such as training deep learning models, etc.\n\n\nLicense Information\n-------------------\n\n\nThe licence terms for CulturaX strictly follows those of 'mC4' and 'OSCAR'. Please refer to both below licenses when using this dataset.\n\n\n* mC4 license\n* OSCAR license\n\n\nTo cite CulturaX, please use:\n\n\nReference\n---------\n\n\n[1] Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, and Colin Raffel. 2021. mT5: A massively multilingual\npre-trained text-to-text transformer. In NAACL 2021. URL\n\n\n[2] Pedro Javier Ortiz Suárez, Benoît Sagot, and Laurent Romary. 2019. Asynchronous pipelines for processing huge corpora on medium to low resource infrastructures. In Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-\n7) 2019. URL\n\n\n[3] KenLM: Faster and smaller language model queries. In Proceedings of the Sixth\nWorkshop on Statistical Machine Translation, 2011."
] | [
"TAGS\n#task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-no-annotation #language_creators-found #multilinguality-multilingual #source_datasets-original #language-Afrikaans #language-Tosk Albanian #language-Amharic #language-Aragonese #language-Arabic #language-Egyptian Arabic #language-Assamese #language-Asturian #language-Avaric #language-Azerbaijani #language-South Azerbaijani #language-Bashkir #language-Bavarian #language-Central Bikol #language-Belarusian #language-Bulgarian #language-bh #language-Bengali #language-Tibetan #language-Bishnupriya #language-Breton #language-Bosnian #language-Russia Buriat #language-Catalan #language-Chavacano #language-Chechen #language-Cebuano #language-Central Kurdish #language-Czech #language-Chuvash #language-Welsh #language-Danish #language-German #language-Lower Sorbian #language-Dhivehi #language-Modern Greek (1453-) #language-Emiliano-Romagnolo #language-English #language-Esperanto #language-Spanish #language-Estonian #language-Basque #language-Persian #language-Finnish #language-French #language-Northern Frisian #language-Western Frisian #language-Irish #language-Scottish Gaelic #language-Galician #language-Guarani #language-Goan Konkani #language-Gujarati #language-Hebrew #language-Hindi #language-Croatian #language-Upper Sorbian #language-Haitian #language-Hungarian #language-Armenian #language-Interlingua (International Auxiliary Language Association) #language-Indonesian #language-Interlingue #language-Iloko #language-Ido #language-Icelandic #language-Italian #language-Japanese #language-Lojban #language-Javanese #language-Georgian #language-Kazakh #language-Khmer #language-Kannada #language-Korean #language-Karachay-Balkar #language-Kurdish #language-Komi #language-Cornish #language-Kirghiz #language-Latin #language-Luxembourgish #language-Lezghian #language-Limburgan #language-Lombard #language-Lao #language-Northern Luri #language-Lithuanian #language-Latvian #language-Maithili #language-Malagasy #language-Eastern Mari #language-Minangkabau #language-Macedonian #language-Malayalam #language-Mongolian #language-Marathi #language-Western Mari #language-Malay (macrolanguage) #language-Maltese #language-Mirandese #language-Burmese #language-Erzya #language-Mazanderani #language-nah #language-Neapolitan #language-Low German #language-Nepali (macrolanguage) #language-Newari #language-Dutch #language-Norwegian Nynorsk #language-Norwegian #language-Occitan (post 1500) #language-Oriya (macrolanguage) #language-Ossetian #language-Panjabi #language-Pampanga #language-Polish #language-Piemontese #language-Western Panjabi #language-Pushto #language-Portuguese #language-Quechua #language-Romansh #language-Romanian #language-Russian #language-Rusyn #language-Sanskrit #language-Yakut #language-Sicilian #language-Sindhi #language-Serbo-Croatian #language-Sinhala #language-Slovak #language-Slovenian #language-Somali #language-Albanian #language-Serbian #language-Sundanese #language-Swedish #language-Swahili (macrolanguage) #language-Tamil #language-Telugu #language-Tajik #language-Thai #language-Turkmen #language-Tagalog #language-Turkish #language-Tatar #language-Tuvinian #language-Uighur #language-Ukrainian #language-Urdu #language-Uzbek #language-Venetian #language-Vietnamese #language-Vlaams #language-Volapük #language-Walloon #language-Waray (Philippines) #language-Wu Chinese #language-Kalmyk #language-Mingrelian #language-Yiddish #language-Yoruba 
#language-Yue Chinese #language-Chinese #arxiv-2309.09400 #region-us \n",
"### Languages\n\n\nThe supported languages and statistics for our dataset can be found below:\n\n\n*(Note that the language code 'als' and 'eml' refer to 'gsw' and 'x-eml' in the OSCAR-2301 dataset.)*",
"### Dataset Structure\n\n\nConsiderations for Using the Data\n---------------------------------\n\n\nAs CulturaX is the cleaned version of the mC4 and OSCAR datasets, which were both extracted from CommonCrawl, personal and sensitive information might still contain personal and sensitive information.\nThis must be considered prior to using this dataset for any purpose, such as training deep learning models, etc.\n\n\nLicense Information\n-------------------\n\n\nThe licence terms for CulturaX strictly follows those of 'mC4' and 'OSCAR'. Please refer to both below licenses when using this dataset.\n\n\n* mC4 license\n* OSCAR license\n\n\nTo cite CulturaX, please use:\n\n\nReference\n---------\n\n\n[1] Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, and Colin Raffel. 2021. mT5: A massively multilingual\npre-trained text-to-text transformer. In NAACL 2021. URL\n\n\n[2] Pedro Javier Ortiz Suárez, Benoît Sagot, and Laurent Romary. 2019. Asynchronous pipelines for processing huge corpora on medium to low resource infrastructures. In Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-\n7) 2019. URL\n\n\n[3] KenLM: Faster and smaller language model queries. In Proceedings of the Sixth\nWorkshop on Statistical Machine Translation, 2011."
] |
5c9a1fc2c132bb638807c63e46f9fba76730c000 | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': False,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_query_length=3000,
max_sft_query_response_length=4000,
max_sft_response_length=1500,
max_rm_query_response_length=4500,
max_rm_response_length=1500),
'push_to_hub': True}
```
| vwxyzjn/ultrachat_200k_filtered_1708034814 | [
"region:us"
] | 2024-02-15T22:09:51+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_sft", "num_bytes": 1982888370.9168758, "num_examples": 22991}, {"name": "train_sft", "num_bytes": 17846869528.524822, "num_examples": 206698}], "download_size": 3301659997, "dataset_size": 19829757899.441696}} | 2024-02-15T22:12:40+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] |
02478d2b9ea312804fd60a4088ef80a67e0c7283 |
# Dataset consisting of anonymous Polish stories
## Warning: Stories were not curated by me; some may use strong language or contain sexual references
This dataset consists of all (28k) stories dumped from [anonimowe wyzwania](https://anonimowe.pl/) in January. Stories are submitted by anonymous users. I have included a community rating, which you can use for filtering.
Stories are very diverse: some are sad, some are funny. The sheer number might make them sound fake, but the vast majority are written by young people. This may help your model avoid sounding strict, corporate, boring, or academic.
Default sorting is based on community rating.
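A minimal sketch of loading the dataset and filtering on the rating column (column names follow the dataset features; the threshold is arbitrary):

```python
from datasets import load_dataset

ds = load_dataset("JonaszPotoniec/anonimowe-polish-stories", split="train")

# Keep only stories whose community rating ('points') clears a chosen threshold.
well_rated = ds.filter(lambda row: row["points"] >= 10)   # threshold is arbitrary
print(len(well_rated), well_rated[0]["story"][:200])
```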
More information about requirements for stories: [link](https://anonimowe.pl/faq)
## Where to find me
- [Github](https://github.com/JonaszPotoniec)
- [Linkedin](https://www.linkedin.com/in/jonasz-potoniec/)
- [E-mail](mailto:[email protected])
- [Telegram](https://t.me/JonaszPotoniec) | JonaszPotoniec/anonimowe-polish-stories | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:pl",
"license:mit",
"not-for-all-audiences",
"region:us"
] | 2024-02-15T22:20:13+00:00 | {"language": ["pl"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "pretty_name": "Anonimowe wyzwania", "dataset_info": {"features": [{"name": "points", "dtype": "int64"}, {"name": "story", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 33017836, "num_examples": 27798}], "download_size": 22463377, "dataset_size": 33017836}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["not-for-all-audiences"]} | 2024-02-15T22:39:55+00:00 | [] | [
"pl"
] | TAGS
#task_categories-text-generation #size_categories-10K<n<100K #language-Polish #license-mit #not-for-all-audiences #region-us
|
# Dataset consisting of anonymous polish stories
## Warning: Stories were not curated by me, some may use strong language or use sexual references
This dataset consists of all (28k) stories dumped from anonimowe wyzwania in January. Stories are submitted by anonymous users. I have included a community rating, which you can use for filtering.
Stories are very diverse — some are sad, some funny. The huge amount might sound fake, but the vast majority is made by young people. This may help your model to not sound strict, corporate, boring, or academic.
Default sorting is based on community rating.
More information about requirements for stories: link
## Where to find me
- Github
- Linkedin
- E-mail
- Telegram | [
"# Dataset consisting of anonymous polish stories",
"## Warning: Stories were not curated by me, some may use strong language or use sexual references\n\nThis dataset consists of all (28k) stories dumped from anonimowe wyzwania in January. Stories are submitted by anonymous users. I have included a community rating, which you can use for filtering. \nStories are very diverse — some are sad, some funny. The huge amount might sound fake, but the vast majority is made by young people. This may help your model to not sound strict, corporate, boring, or academic. \nDefault sorting is based on community rating. \nMore information about requirements for stories: link",
"## Where to find me\n\n- Github\n- Linkedin\n- E-mail\n- Telegram"
] | [
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-Polish #license-mit #not-for-all-audiences #region-us \n",
"# Dataset consisting of anonymous polish stories",
"## Warning: Stories were not curated by me, some may use strong language or use sexual references\n\nThis dataset consists of all (28k) stories dumped from anonimowe wyzwania in January. Stories are submitted by anonymous users. I have included a community rating, which you can use for filtering. \nStories are very diverse — some are sad, some funny. The huge amount might sound fake, but the vast majority is made by young people. This may help your model to not sound strict, corporate, boring, or academic. \nDefault sorting is based on community rating. \nMore information about requirements for stories: link",
"## Where to find me\n\n- Github\n- Linkedin\n- E-mail\n- Telegram"
] |
3bbd3fa7009ed6534ede5bbb220b8e10b6025985 | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': False,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_query_length=3000,
max_sft_query_response_length=4000,
max_sft_response_length=1500,
max_rm_query_response_length=4500,
max_rm_response_length=1500),
'push_to_hub': True}
```
| vwxyzjn/ultrachat_200k_filtered_1708035667 | [
"region:us"
] | 2024-02-15T22:23:56+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_sft", "num_bytes": 1982888370.9168758, "num_examples": 22991}, {"name": "train_sft", "num_bytes": 17846869528.524822, "num_examples": 206698}], "download_size": 3301659997, "dataset_size": 19829757899.441696}} | 2024-02-15T22:29:05+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] |
24e68a15ed3e18781397e6ea448fa00a65812ac3 | Generated by ChatGPT | CreitinGameplays/you-are-elisa-chan | [
"region:us"
] | 2024-02-15T22:26:23+00:00 | {} | 2024-02-15T22:27:31+00:00 | [] | [] | TAGS
#region-us
| Generated by ChatGPT | [] | [
"TAGS\n#region-us \n"
] |
63c71da57f5891125e2c5fd36ff1351747675891 | # Dataset Card for "ultrafeedback_binarized_1708035667"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/ultrafeedback_binarized_1708035667 | [
"region:us"
] | 2024-02-15T22:28:31+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "score_chosen", "dtype": "float64"}, {"name": "score_rejected", "dtype": "float64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "query_chosen_token", "sequence": "int64"}, {"name": "query_chosen_token_len", "dtype": "int64"}, {"name": "chosen_token", "sequence": "int64"}, {"name": "chosen_token_len", "dtype": "int64"}, {"name": "query_rejected_token", "sequence": "int64"}, {"name": "query_rejected_token_len", "dtype": "int64"}, {"name": "rejected_token", "sequence": "int64"}, {"name": "rejected_token_len", "dtype": "int64"}], "splits": [{"name": "test_prefs", "num_bytes": 235051943.0, "num_examples": 2000}, {"name": "train_prefs", "num_bytes": 7188255622.3255415, "num_examples": 61112}], "download_size": 477048940, "dataset_size": 7423307565.3255415}} | 2024-02-15T22:29:04+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ultrafeedback_binarized_1708035667"
More Information needed | [
"# Dataset Card for \"ultrafeedback_binarized_1708035667\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ultrafeedback_binarized_1708035667\"\n\nMore Information needed"
] |
509daa48db465032eeda6721595abb09a638ee2f |
# Dataset Card for Evaluation run of BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit](https://huggingface.co/BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarraHome__PequeLLaMa-1B-Instruct-v0.1-16bit",
"harness_winogrande_5",
split="train")
```
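Similarly, a sketch of loading the aggregated metrics (the "results" config and "latest" split names are assumed from the description above):

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_BarraHome__PequeLLaMa-1B-Instruct-v0.1-16bit",
    "results",                # aggregated metrics configuration
    split="latest",           # or a specific timestamped split
)
```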
## Latest results
These are the [latest results from run 2024-02-15T22:30:35.769938](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__PequeLLaMa-1B-Instruct-v0.1-16bit/blob/main/results_2024-02-15T22-30-35.769938.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24925356596098744,
"acc_stderr": 0.03049039803240805,
"acc_norm": 0.250967942400928,
"acc_norm_stderr": 0.0312990053733219,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487295,
"mc2": 0.41096447978752615,
"mc2_stderr": 0.014916925934314724
},
"harness|arc:challenge|25": {
"acc": 0.24658703071672355,
"acc_stderr": 0.01259572626879012,
"acc_norm": 0.27986348122866894,
"acc_norm_stderr": 0.013119040897725922
},
"harness|hellaswag|10": {
"acc": 0.333698466440948,
"acc_stderr": 0.0047056977452221435,
"acc_norm": 0.4302927703644692,
"acc_norm_stderr": 0.004941051795214789
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.03455473702325437,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03455473702325437
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123384,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749884,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749884
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1724137931034483,
"acc_stderr": 0.03147830790259575,
"acc_norm": 0.1724137931034483,
"acc_norm_stderr": 0.03147830790259575
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02113285918275444,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02113285918275444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604673
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.16129032258064516,
"acc_stderr": 0.020923327006423305,
"acc_norm": 0.16129032258064516,
"acc_norm_stderr": 0.020923327006423305
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1477832512315271,
"acc_stderr": 0.02496962133352127,
"acc_norm": 0.1477832512315271,
"acc_norm_stderr": 0.02496962133352127
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.03097543638684542,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.03097543638684542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423088,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423088
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.18907563025210083,
"acc_stderr": 0.025435119438105353,
"acc_norm": 0.18907563025210083,
"acc_norm_stderr": 0.025435119438105353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.033742355504256936,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.033742355504256936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20550458715596331,
"acc_stderr": 0.017324352325016005,
"acc_norm": 0.20550458715596331,
"acc_norm_stderr": 0.017324352325016005
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955927,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529614,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529614
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21221864951768488,
"acc_stderr": 0.02322275679743511,
"acc_norm": 0.21221864951768488,
"acc_norm_stderr": 0.02322275679743511
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451163,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642973,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642973
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23533246414602346,
"acc_stderr": 0.010834432543912226,
"acc_norm": 0.23533246414602346,
"acc_norm_stderr": 0.010834432543912226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20816326530612245,
"acc_stderr": 0.025991117672813292,
"acc_norm": 0.20816326530612245,
"acc_norm_stderr": 0.025991117672813292
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296014,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071857,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071857
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487295,
"mc2": 0.41096447978752615,
"mc2_stderr": 0.014916925934314724
},
"harness|winogrande|5": {
"acc": 0.5272296764009471,
"acc_stderr": 0.014031631629827701
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BarraHome__PequeLLaMa-1B-Instruct-v0.1-16bit | [
"region:us"
] | 2024-02-15T22:32:54+00:00 | {"pretty_name": "Evaluation run of BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit](https://huggingface.co/BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarraHome__PequeLLaMa-1B-Instruct-v0.1-16bit\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T22:30:35.769938](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__PequeLLaMa-1B-Instruct-v0.1-16bit/blob/main/results_2024-02-15T22-30-35.769938.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24925356596098744,\n \"acc_stderr\": 0.03049039803240805,\n \"acc_norm\": 0.250967942400928,\n \"acc_norm_stderr\": 0.0312990053733219,\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487295,\n \"mc2\": 0.41096447978752615,\n \"mc2_stderr\": 0.014916925934314724\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.24658703071672355,\n \"acc_stderr\": 0.01259572626879012,\n \"acc_norm\": 0.27986348122866894,\n \"acc_norm_stderr\": 0.013119040897725922\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.333698466440948,\n \"acc_stderr\": 0.0047056977452221435,\n \"acc_norm\": 0.4302927703644692,\n \"acc_norm_stderr\": 0.004941051795214789\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03455473702325437,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03455473702325437\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123384,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749884,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749884\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.1724137931034483,\n \"acc_stderr\": 0.03147830790259575,\n \"acc_norm\": 0.1724137931034483,\n \"acc_norm_stderr\": 0.03147830790259575\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02113285918275444,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02113285918275444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604673,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604673\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.16129032258064516,\n \"acc_stderr\": 0.020923327006423305,\n \"acc_norm\": 0.16129032258064516,\n \"acc_norm_stderr\": 0.020923327006423305\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1477832512315271,\n \"acc_stderr\": 0.02496962133352127,\n \"acc_norm\": 0.1477832512315271,\n \"acc_norm_stderr\": 0.02496962133352127\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.03097543638684542,\n \"acc_norm\": 0.24352331606217617,\n 
\"acc_norm_stderr\": 0.03097543638684542\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423088,\n \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423088\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.18907563025210083,\n \"acc_stderr\": 0.025435119438105353,\n \"acc_norm\": 0.18907563025210083,\n \"acc_norm_stderr\": 0.025435119438105353\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2185430463576159,\n \"acc_stderr\": 0.033742355504256936,\n \"acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.033742355504256936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.20550458715596331,\n \"acc_stderr\": 0.017324352325016005,\n \"acc_norm\": 0.20550458715596331,\n \"acc_norm_stderr\": 0.017324352325016005\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955927,\n \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955927\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.29596412556053814,\n \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n \"acc_stderr\": 0.015517322365529614,\n \"acc_norm\": 0.2515964240102171,\n \"acc_norm_stderr\": 0.015517322365529614\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21221864951768488,\n \"acc_stderr\": 0.02322275679743511,\n \"acc_norm\": 0.21221864951768488,\n \"acc_norm_stderr\": 0.02322275679743511\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451163,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451163\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642973,\n \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642973\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23533246414602346,\n \"acc_stderr\": 0.010834432543912226,\n \"acc_norm\": 0.23533246414602346,\n \"acc_norm_stderr\": 0.010834432543912226\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3382352941176471,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.025991117672813292,\n \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.025991117672813292\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.030769444967296014,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.030769444967296014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n \"acc_stderr\": 0.03410646614071857,\n \"acc_norm\": 0.25903614457831325,\n \"acc_norm_stderr\": 0.03410646614071857\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487295,\n \"mc2\": 0.41096447978752615,\n \"mc2_stderr\": 0.014916925934314724\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5272296764009471,\n \"acc_stderr\": 0.014031631629827701\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|arc:challenge|25_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|gsm8k|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hellaswag|10_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-30-35.769938.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-30-35.769938.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-30-35.769938.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T22-30-35.769938.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-30-35.769938.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["**/details_harness|winogrande|5_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-15T22-30-35.769938.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T22_30_35.769938", "path": ["results_2024-02-15T22-30-35.769938.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T22-30-35.769938.parquet"]}]}]} | 2024-02-15T22:33:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit
Dataset automatically created during the evaluation run of model BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
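A minimal sketch (assuming this dataset follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` repository naming, which is not stated explicitly in this excerpt):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's details_<org>__<model> convention
data = load_dataset(
    "open-llm-leaderboard/details_BarraHome__PequeLLaMa-1B-Instruct-v0.1-16bit",
    "harness_winogrande_5",  # any of the 63 task configurations listed above
    split="train",           # "train" always points to the latest results
)
```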
## Latest results
These are the latest results from run 2024-02-15T22:30:35.769938 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T22:30:35.769938(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/PequeLLaMa-1B-Instruct-v0.1-16bit on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T22:30:35.769938(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6a511f446d966a6003871f0781f56b511303dd16 |
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T22:34:32.769102](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted/blob/main/results_2024-02-15T22-34-32.769102.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6363878047023346,
"acc_stderr": 0.032296390142505176,
"acc_norm": 0.6396525594831768,
"acc_norm_stderr": 0.03293649951708016,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5164963215954579,
"mc2_stderr": 0.015210020803636122
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.014280522667467325,
"acc_norm": 0.64419795221843,
"acc_norm_stderr": 0.01399057113791876
},
"harness|hellaswag|10": {
"acc": 0.6483768173670583,
"acc_stderr": 0.004765012078929387,
"acc_norm": 0.8394742083250348,
"acc_norm_stderr": 0.003663427536178161
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.0286372356398009,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.0286372356398009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718871,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718871
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477114,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.0279404571362284,
"acc_norm": 0.3,
"acc_norm_stderr": 0.0279404571362284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997604,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903338,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.01544571691099888,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.01544571691099888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083131,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5164963215954579,
"mc2_stderr": 0.015210020803636122
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712667
},
"harness|gsm8k|5": {
"acc": 0.5322213798332069,
"acc_stderr": 0.013743857303073797
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted | [
"region:us"
] | 2024-02-15T22:36:52+00:00 | {"pretty_name": "Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted", "dataset_summary": "Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T22:34:32.769102](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted/blob/main/results_2024-02-15T22-34-32.769102.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6363878047023346,\n \"acc_stderr\": 0.032296390142505176,\n \"acc_norm\": 0.6396525594831768,\n \"acc_norm_stderr\": 0.03293649951708016,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5164963215954579,\n \"mc2_stderr\": 0.015210020803636122\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.014280522667467325,\n \"acc_norm\": 0.64419795221843,\n \"acc_norm_stderr\": 0.01399057113791876\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6483768173670583,\n \"acc_stderr\": 0.004765012078929387,\n \"acc_norm\": 0.8394742083250348,\n \"acc_norm_stderr\": 0.003663427536178161\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.0286372356398009,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.0286372356398009\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 
0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718871,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718871\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n 
\"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903338,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903338\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.01544571691099888,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.01544571691099888\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083131,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083131\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5164963215954579,\n \"mc2_stderr\": 0.015210020803636122\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712667\n 
},\n \"harness|gsm8k|5\": {\n \"acc\": 0.5322213798332069,\n \"acc_stderr\": 0.013743857303073797\n }\n}\n```", "repo_url": "https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|arc:challenge|25_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|gsm8k|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hellaswag|10_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-34-32.769102.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-34-32.769102.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-34-32.769102.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T22-34-32.769102.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-34-32.769102.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["**/details_harness|winogrande|5_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-15T22-34-32.769102.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T22_34_32.769102", "path": ["results_2024-02-15T22-34-32.769102.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T22-34-32.769102.parquet"]}]}]} | 2024-02-15T22:37:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted
Dataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
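A minimal sketch (the dataset repository id below follows the leaderboard's `details_<org>__<model>` naming convention and is inferred from the model name above, so treat it as an assumption):

```python
from datasets import load_dataset

# Inferred repository id: details_<org>__<model> under the open-llm-leaderboard org.
data = load_dataset(
    "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted",
    "harness_winogrande_5",  # any configuration listed for this run works here
    split="train",
)
```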
## Latest results
These are the latest results from run 2024-02-15T22:34:32.769102 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T22:34:32.769102(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T22:34:32.769102(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
58c6886f62bf6f4139598db460db5bb4693048ff |
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2",
"harness_winogrande_5",
split="train")
```
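If you want the aggregated metrics rather than per-task details, you can target the `results` configuration instead (a minimal sketch, assuming the same `results`/`latest` layout as the task configurations listed for this run):

```python
from datasets import load_dataset

# Aggregated scores for the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2",
    "results",
    split="latest",
)
print(results[0])  # one row per run, holding the aggregated metrics
```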
## Latest results
These are the [latest results from run 2024-02-15T22:39:42.033476](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2/blob/main/results_2024-02-15T22-39-42.033476.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6371686329291266,
"acc_stderr": 0.03229837388268866,
"acc_norm": 0.6400332159863307,
"acc_norm_stderr": 0.03293928846555229,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5239619444151867,
"mc2_stderr": 0.015249337191329784
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.014235872487909869,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726099
},
"harness|hellaswag|10": {
"acc": 0.6542521410077674,
"acc_stderr": 0.0047463946133845325,
"acc_norm": 0.8454491137223661,
"acc_norm_stderr": 0.0036073726062950894
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371542,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371542
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031204,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031204
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5239619444151867,
"mc2_stderr": 0.015249337191329784
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643412
},
"harness|gsm8k|5": {
"acc": 0.55420773313116,
"acc_stderr": 0.01369130517450669
}
}
```
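If you prefer to work with the raw JSON behind these numbers, the file linked above can be downloaded directly from this dataset repository (a sketch; the filename is taken from the link in the "Latest results" section, and the file may wrap the per-task scores shown above in additional run metadata):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file for the 2024-02-15T22:39:42.033476 run.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2",
    repo_type="dataset",
    filename="results_2024-02-15T22-39-42.033476.json",
)

with open(path) as f:
    raw = json.load(f)

print(list(raw.keys()))  # top-level sections of the results file
```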
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2 | [
"region:us"
] | 2024-02-15T22:41:58+00:00 | {"pretty_name": "Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T22:39:42.033476](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2/blob/main/results_2024-02-15T22-39-42.033476.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6371686329291266,\n \"acc_stderr\": 0.03229837388268866,\n \"acc_norm\": 0.6400332159863307,\n \"acc_norm_stderr\": 0.03293928846555229,\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5239619444151867,\n \"mc2_stderr\": 0.015249337191329784\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.014235872487909869,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726099\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6542521410077674,\n \"acc_stderr\": 0.0047463946133845325,\n \"acc_norm\": 0.8454491137223661,\n \"acc_norm_stderr\": 0.0036073726062950894\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 
0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n 
\"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n \"acc_stderr\": 0.015461169002371542,\n \"acc_norm\": 0.3094972067039106,\n \"acc_norm_stderr\": 0.015461169002371542\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031204,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031204\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5239619444151867,\n \"mc2_stderr\": 0.015249337191329784\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n 
\"acc_stderr\": 0.011661223637643412\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.55420773313116,\n \"acc_stderr\": 0.01369130517450669\n }\n}\n```", "repo_url": "https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|arc:challenge|25_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|gsm8k|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hellaswag|10_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-39-42.033476.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-39-42.033476.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-39-42.033476.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T22-39-42.033476.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T22-39-42.033476.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["**/details_harness|winogrande|5_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-15T22-39-42.033476.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T22_39_42.033476", "path": ["results_2024-02-15T22-39-42.033476.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T22-39-42.033476.parquet"]}]}]} | 2024-02-15T22:42:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2
Dataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
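A quick way to see all of the configurations (one per evaluated task, plus the aggregated "results" one) is to list them directly. This is a small sketch using the standard `datasets` helper and is not part of the original card:

```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus the aggregated "results" config
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2"
)
print(len(configs), configs[:5])
```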
To load the details from a run, you can for instance do the following:
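The snippet below reproduces the loading example from the dataset summary embedded in the metadata above; `harness_winogrande_5` is just one of the available configurations.

```python
from datasets import load_dataset

# Load the per-sample details for one task config; "train" points to the latest run
data = load_dataset(
    "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2",
    "harness_winogrande_5",
    split="train",
)
```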
## Latest results
These are the latest results from run 2024-02-15T22:39:42.033476 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
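The full per-task breakdown is recorded in the metadata above; for convenience, the aggregate block of that run is reproduced here:

```python
{
    "all": {
        "acc": 0.6371686329291266,
        "acc_stderr": 0.03229837388268866,
        "acc_norm": 0.6400332159863307,
        "acc_norm_stderr": 0.03293928846555229,
        "mc1": 0.35862913096695226,
        "mc1_stderr": 0.016789289499502022,
        "mc2": 0.5239619444151867,
        "mc2_stderr": 0.015249337191329784
    }
}
```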
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T22:39:42.033476(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-original-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T22:39:42.033476(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e998248084186177e8803376dbcf3e8967925bc3 |
# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Buttercup-V2-bf16](https://huggingface.co/Kquant03/Buttercup-V2-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16",
"harness_winogrande_5",
split="train")
```
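For the aggregated metrics mentioned above, a similar call should work. This is a sketch that assumes the "results" configuration and the "latest" split follow the same naming convention as the per-task configs:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16",
    "results",
    split="latest",
)
```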
## Latest results
These are the [latest results from run 2024-02-15T23:01:38.445097](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16/blob/main/results_2024-02-15T23-01-38.445097.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6530538161665846,
"acc_stderr": 0.031991249389942286,
"acc_norm": 0.6524163789680969,
"acc_norm_stderr": 0.03266539743840073,
"mc1": 0.554467564259486,
"mc1_stderr": 0.01739933528014034,
"mc2": 0.6947306262348207,
"mc2_stderr": 0.015031157853542046
},
"harness|arc:challenge|25": {
"acc": 0.7039249146757679,
"acc_stderr": 0.013340916085246258,
"acc_norm": 0.7372013651877133,
"acc_norm_stderr": 0.012862523175351335
},
"harness|hellaswag|10": {
"acc": 0.7112129057956582,
"acc_stderr": 0.004522725412556955,
"acc_norm": 0.885381398127863,
"acc_norm_stderr": 0.003179100565887989
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297794,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092448,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092448
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.01274307294265335,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.01274307294265335
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.554467564259486,
"mc1_stderr": 0.01739933528014034,
"mc2": 0.6947306262348207,
"mc2_stderr": 0.015031157853542046
},
"harness|winogrande|5": {
"acc": 0.8650355169692187,
"acc_stderr": 0.009603064913219049
},
"harness|gsm8k|5": {
"acc": 0.686125852918878,
"acc_stderr": 0.012782681251053198
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
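Concretely, each evaluated task is exposed as its own configuration (named `harness_<task>_<n_shots>`, e.g. `harness_arc_challenge_25` or `harness_gsm8k_5`), plus a "results" configuration holding the aggregated metrics; every configuration has a timestamped split and a "latest" split. A minimal sketch for listing them, assuming the standard `datasets` API:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16"

# The 63 per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Each configuration exposes a timestamped split and a "latest" split.
print(get_dataset_split_names(repo, "harness_gsm8k_5"))
```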
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16 | [
"region:us"
] | 2024-02-15T23:03:54+00:00 | {"pretty_name": "Evaluation run of Kquant03/Buttercup-V2-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Buttercup-V2-bf16](https://huggingface.co/Kquant03/Buttercup-V2-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T23:01:38.445097](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16/blob/main/results_2024-02-15T23-01-38.445097.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530538161665846,\n \"acc_stderr\": 0.031991249389942286,\n \"acc_norm\": 0.6524163789680969,\n \"acc_norm_stderr\": 0.03266539743840073,\n \"mc1\": 0.554467564259486,\n \"mc1_stderr\": 0.01739933528014034,\n \"mc2\": 0.6947306262348207,\n \"mc2_stderr\": 0.015031157853542046\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7039249146757679,\n \"acc_stderr\": 0.013340916085246258,\n \"acc_norm\": 0.7372013651877133,\n \"acc_norm_stderr\": 0.012862523175351335\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7112129057956582,\n \"acc_stderr\": 0.004522725412556955,\n \"acc_norm\": 0.885381398127863,\n \"acc_norm_stderr\": 0.003179100565887989\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n 
\"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297794,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297794\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 
0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.01274307294265335,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.01274307294265335\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.554467564259486,\n \"mc1_stderr\": 0.01739933528014034,\n \"mc2\": 0.6947306262348207,\n \"mc2_stderr\": 0.015031157853542046\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8650355169692187,\n \"acc_stderr\": 0.009603064913219049\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \"acc_stderr\": 0.012782681251053198\n }\n}\n```", "repo_url": "https://huggingface.co/Kquant03/Buttercup-V2-bf16", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|arc:challenge|25_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|gsm8k|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hellaswag|10_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T23-01-38.445097.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T23-01-38.445097.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T23-01-38.445097.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T23-01-38.445097.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T23-01-38.445097.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T23-01-38.445097.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["**/details_harness|winogrande|5_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T23-01-38.445097.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T23_01_38.445097", "path": ["results_2024-02-15T23-01-38.445097.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T23-01-38.445097.parquet"]}]}]} | 2024-02-15T23:04:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-bf16
Dataset automatically created during the evaluation run of model Kquant03/Buttercup-V2-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
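The snippet below mirrors the loading example from the card's summary; `harness_winogrande_5` is just one of the 63 per-task configurations, and any other configuration name from the file listing works the same way:

```python
from datasets import load_dataset

# Per-sample details for one task configuration of this run.
# "harness_winogrande_5" is one of the 63 task configurations of this dataset.
data = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16",
    "harness_winogrande_5",
    split="train",
)
```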
## Latest results
These are the latest results from run 2024-02-15T23:01:38.445097 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
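A minimal sketch for re-loading these aggregated metrics, assuming the "results" configuration exposes them as rows (the configuration and split names are taken from this card's file listing):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# its "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated acc / acc_norm / mc1 / mc2 values
```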
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Buttercup-V2-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T23:01:38.445097(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Buttercup-V2-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T23:01:38.445097(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
967dd83abf6b67b4c5066e94dddbd7fe5ab0bfb6 | Based on AEZAKMI V3, I removed some general airoboros things that made the model predictable and boring and changed up system prompts for wsb_001 prompts a bit. | adamo1139/AEZAKMI_v3-1 | [
"license:other",
"region:us"
] | 2024-02-15T23:21:26+00:00 | {"license": "other", "license_name": "other", "license_link": "LICENSE"} | 2024-02-15T23:22:52+00:00 | [] | [] | TAGS
#license-other #region-us
| Based on AEZAKMI V3, I removed some general airoboros things that made the model predictable and boring and changed up system prompts for wsb_001 prompts a bit. | [] | [
"TAGS\n#license-other #region-us \n"
] |
7c92851a5d413d723a5611b7c94f6b78ceedd6d3 | # MathPile ArXiv (subset)
## Description
This dataset consists of 343,830 TeX files containing mathematics papers sourced from the arXiv. Training and testing splits are already provided.
## Source
The data was obtained from the training + validation portion of the arXiv subset of MathPile.
## Format
- LLaMa BOS and EOS tokens (`<s>` and `</s>`) have been added to mark the beginning and end of each sequence.
- The dataset is organized into blocks of 64,000 documents each, stored in JSONL format.
## Usage
- Working with the raw LaTeX source of mathematics papers (e.g. for language-model training); a minimal loading sketch follows below.
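A minimal reading sketch, assuming each JSON line stores its document under a `text` key (the field name and the shard filename below are assumptions, not part of the dataset description):

```python
import json

def iter_documents(shard_path):
    """Yield TeX documents from one JSONL block, stripping the LLaMa BOS/EOS markers."""
    with open(shard_path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            text = record["text"]  # field name assumed; adjust to the actual schema
            yield text.removeprefix("<s>").removesuffix("</s>").strip()  # Python 3.9+

# Hypothetical shard name; each block holds up to 64,000 documents.
# docs = list(iter_documents("block_000.jsonl"))
```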
## License
The original data is subject to the licensing terms of the arXiv. Users should refer to the arXiv's terms of use for details on permissible usage. | aluncstokes/math_arxiv_temp | [
"region:us"
] | 2024-02-16T00:10:27+00:00 | {} | 2024-02-17T02:59:11+00:00 | [] | [] | TAGS
#region-us
| # MathPile ArXiv (subset)
## Description
This dataset consists of 343,830 TeX files containing mathematics papers sourced from the arXiv. Training and testing sets are already split
## Source
The data was obtained from the training + validation portion of the arXiv subset of MathPile.
## Format
- LLaMa BOS and EOS tokens ('<s>' and '</s>') have been added to mark the beginning and end of each sequence.
- The dataset is organized into blocks of 64,000 documents each, stored in JSONL format.
## Usage
- LaTeX stuff idk
## License
The original data is subject to the licensing terms of the arXiv. Users should refer to the arXiv's terms of use for details on permissible usage. | [
"# MathPile ArXiv (subset)",
"## Description\nThis dataset consists of 343,830 TeX files containing mathematics papers sourced from the arXiv. Training and testing sets are already split",
"## Source\nThe data was obtained from the training + validation portion of the arXiv subset of MathPile.",
"## Format\n- LLaMa BOS and EOS tokens ('<s>' and '</s>') have been added to mark the beginning and end of each sequence.\n- The dataset is organized into blocks of 64,000 documents each, stored in JSONL format.",
"## Usage\n- LaTeX stuff idk",
"## License\nThe original data is subject to the licensing terms of the arXiv. Users should refer to the arXiv's terms of use for details on permissible usage."
] | [
"TAGS\n#region-us \n",
"# MathPile ArXiv (subset)",
"## Description\nThis dataset consists of 343,830 TeX files containing mathematics papers sourced from the arXiv. Training and testing sets are already split",
"## Source\nThe data was obtained from the training + validation portion of the arXiv subset of MathPile.",
"## Format\n- LLaMa BOS and EOS tokens ('<s>' and '</s>') have been added to mark the beginning and end of each sequence.\n- The dataset is organized into blocks of 64,000 documents each, stored in JSONL format.",
"## Usage\n- LaTeX stuff idk",
"## License\nThe original data is subject to the licensing terms of the arXiv. Users should refer to the arXiv's terms of use for details on permissible usage."
] |
e9945407504d876959f0932b3e2387072b6ab099 |
# Dataset Card for Evaluation run of NovoCode/NeuralPaca-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NovoCode/NeuralPaca-7b](https://huggingface.co/NovoCode/NeuralPaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NovoCode__NeuralPaca-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-16T00:47:24.688523](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__NeuralPaca-7b/blob/main/results_2024-02-16T00-47-24.688523.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6323051718212915,
"acc_stderr": 0.032436835484599615,
"acc_norm": 0.6335791604628077,
"acc_norm_stderr": 0.033101979094730234,
"mc1": 0.35006119951040393,
"mc1_stderr": 0.01669794942015103,
"mc2": 0.4831753085957374,
"mc2_stderr": 0.015324947436319568
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449707,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.632742481577375,
"acc_stderr": 0.004810723108378215,
"acc_norm": 0.8301135232025493,
"acc_norm_stderr": 0.0037476555337545153
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768787,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768787
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.02428314052946731,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.02428314052946731
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973133,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32625698324022345,
"acc_stderr": 0.01568044151888918,
"acc_norm": 0.32625698324022345,
"acc_norm_stderr": 0.01568044151888918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.026568921015457162,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.026568921015457162
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223685,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580214,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580214
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35006119951040393,
"mc1_stderr": 0.01669794942015103,
"mc2": 0.4831753085957374,
"mc2_stderr": 0.015324947436319568
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.01086977863316836
},
"harness|gsm8k|5": {
"acc": 0.6057619408642911,
"acc_stderr": 0.013460852357095666
}
}
```
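The aggregated numbers above are also exposed through the "results" configuration; a minimal sketch, assuming it follows the same split layout as the per-task configurations (a timestamped split plus "latest"):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split always points to the newest run.
results = load_dataset("open-llm-leaderboard/details_NovoCode__NeuralPaca-7b",
	"results",
	split="latest")
print(results[0])
```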
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NovoCode__NeuralPaca-7b | [
"region:us"
] | 2024-02-16T00:49:45+00:00 | {"pretty_name": "Evaluation run of NovoCode/NeuralPaca-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [NovoCode/NeuralPaca-7b](https://huggingface.co/NovoCode/NeuralPaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NovoCode__NeuralPaca-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T00:47:24.688523](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__NeuralPaca-7b/blob/main/results_2024-02-16T00-47-24.688523.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6323051718212915,\n \"acc_stderr\": 0.032436835484599615,\n \"acc_norm\": 0.6335791604628077,\n \"acc_norm_stderr\": 0.033101979094730234,\n \"mc1\": 0.35006119951040393,\n \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.4831753085957374,\n \"mc2_stderr\": 0.015324947436319568\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449707,\n \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.632742481577375,\n \"acc_stderr\": 0.004810723108378215,\n \"acc_norm\": 0.8301135232025493,\n \"acc_norm_stderr\": 0.0037476555337545153\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768787,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768787\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6435897435897436,\n \"acc_stderr\": 0.02428314052946731,\n \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.02428314052946731\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973133,\n 
\"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973133\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32625698324022345,\n \"acc_stderr\": 0.01568044151888918,\n \"acc_norm\": 0.32625698324022345,\n \"acc_norm_stderr\": 0.01568044151888918\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457162,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457162\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223685,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n \"acc_stderr\": 0.012676014778580214,\n \"acc_norm\": 0.439374185136897,\n \"acc_norm_stderr\": 0.012676014778580214\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35006119951040393,\n \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.4831753085957374,\n \"mc2_stderr\": 0.015324947436319568\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.01086977863316836\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6057619408642911,\n \"acc_stderr\": 0.013460852357095666\n }\n}\n```", "repo_url": 
"https://huggingface.co/NovoCode/NeuralPaca-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|arc:challenge|25_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|gsm8k|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hellaswag|10_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T00-47-24.688523.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T00-47-24.688523.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T00-47-24.688523.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T00-47-24.688523.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T00-47-24.688523.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T00-47-24.688523.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["**/details_harness|winogrande|5_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T00-47-24.688523.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T00_47_24.688523", "path": ["results_2024-02-16T00-47-24.688523.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T00-47-24.688523.parquet"]}]}]} | 2024-02-16T00:50:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NovoCode/NeuralPaca-7b
Dataset automatically created during the evaluation run of model NovoCode/NeuralPaca-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
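For example, a minimal sketch — assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming pattern, i.e. `open-llm-leaderboard/details_NovoCode__NeuralPaca-7b`:
```python
from datasets import load_dataset

# Load one task configuration of the evaluation details.
# The repository id below is assumed from the leaderboard's naming convention.
data = load_dataset("open-llm-leaderboard/details_NovoCode__NeuralPaca-7b",
	"harness_winogrande_5",
	split="train")
```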
## Latest results
These are the latest results from run 2024-02-16T00:47:24.688523 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NovoCode/NeuralPaca-7b\n\n\n\nDataset automatically created during the evaluation run of model NovoCode/NeuralPaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T00:47:24.688523(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NovoCode/NeuralPaca-7b\n\n\n\nDataset automatically created during the evaluation run of model NovoCode/NeuralPaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T00:47:24.688523(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
693d353fc9bd9e57ca97a05a8175a7abfdc752d0 |
# Dataset Card for Evaluation run of jeiku/NarrativeNexus_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeiku/NarrativeNexus_7B](https://huggingface.co/jeiku/NarrativeNexus_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeiku__NarrativeNexus_7B",
"harness_winogrande_5",
split="train")
```
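The aggregated metrics can be pulled from the "results" configuration described above. A minimal sketch, assuming the same repository id and that the "latest" split tracks the most recent run (as it does for the per-task configurations listed in this card's metadata):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated scores of each run;
# the "latest" split is assumed here to point at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_jeiku__NarrativeNexus_7B",
	"results",
	split="latest")
print(results[0])  # aggregated metrics for the latest run
```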
## Latest results
These are the [latest results from run 2024-02-16T01:30:29.349287](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__NarrativeNexus_7B/blob/main/results_2024-02-16T01-30-29.349287.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6331502373053775,
"acc_stderr": 0.032649477056743835,
"acc_norm": 0.6360612367088411,
"acc_norm_stderr": 0.03330403787596569,
"mc1": 0.46878824969400246,
"mc1_stderr": 0.017469364874577537,
"mc2": 0.6394506791157332,
"mc2_stderr": 0.015272071804569947
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.01412459788184446,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.6773551085441147,
"acc_stderr": 0.004665327309399188,
"acc_norm": 0.8573989245170285,
"acc_norm_stderr": 0.003489509493001621
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915434,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709695,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709695
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630457,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281508,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46878824969400246,
"mc1_stderr": 0.017469364874577537,
"mc2": 0.6394506791157332,
"mc2_stderr": 0.015272071804569947
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.5178165276724791,
"acc_stderr": 0.013763738379867933
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jeiku__NarrativeNexus_7B | [
"region:us"
] | 2024-02-16T01:32:47+00:00 | {"pretty_name": "Evaluation run of jeiku/NarrativeNexus_7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [jeiku/NarrativeNexus_7B](https://huggingface.co/jeiku/NarrativeNexus_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeiku__NarrativeNexus_7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T01:30:29.349287](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__NarrativeNexus_7B/blob/main/results_2024-02-16T01-30-29.349287.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6331502373053775,\n \"acc_stderr\": 0.032649477056743835,\n \"acc_norm\": 0.6360612367088411,\n \"acc_norm_stderr\": 0.03330403787596569,\n \"mc1\": 0.46878824969400246,\n \"mc1_stderr\": 0.017469364874577537,\n \"mc2\": 0.6394506791157332,\n \"mc2_stderr\": 0.015272071804569947\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.01412459788184446,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6773551085441147,\n \"acc_stderr\": 0.004665327309399188,\n \"acc_norm\": 0.8573989245170285,\n \"acc_norm_stderr\": 0.003489509493001621\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.04113914981189261,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.04113914981189261\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915434,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.03409386946992699,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.03409386946992699\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709695,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709695\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n 
\"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630457,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281508,\n \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281508\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46878824969400246,\n \"mc1_stderr\": 0.017469364874577537,\n \"mc2\": 0.6394506791157332,\n \"mc2_stderr\": 0.015272071804569947\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5178165276724791,\n \"acc_stderr\": 0.013763738379867933\n }\n}\n```", "repo_url": 
"https://huggingface.co/jeiku/NarrativeNexus_7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|arc:challenge|25_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|gsm8k|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hellaswag|10_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T01-30-29.349287.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T01-30-29.349287.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T01-30-29.349287.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T01-30-29.349287.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T01-30-29.349287.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T01-30-29.349287.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["**/details_harness|winogrande|5_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T01-30-29.349287.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T01_30_29.349287", "path": ["results_2024-02-16T01-30-29.349287.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T01-30-29.349287.parquet"]}]}]} | 2024-02-16T01:33:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jeiku/NarrativeNexus_7B
Dataset automatically created during the evaluation run of model jeiku/NarrativeNexus_7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
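A minimal sketch (assuming the repository follows the leaderboard's standard `details_<org>__<model>` naming, i.e. `open-llm-leaderboard/details_jeiku__NarrativeNexus_7B`):

```python
from datasets import load_dataset

# Load a single evaluation configuration; the repository id below is an
# assumption based on the leaderboard's usual naming convention.
data = load_dataset("open-llm-leaderboard/details_jeiku__NarrativeNexus_7B",
	"harness_winogrande_5",
	split="train")
```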
## Latest results
These are the latest results from run 2024-02-16T01:30:29.349287 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jeiku/NarrativeNexus_7B\n\n\n\nDataset automatically created during the evaluation run of model jeiku/NarrativeNexus_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T01:30:29.349287(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jeiku/NarrativeNexus_7B\n\n\n\nDataset automatically created during the evaluation run of model jeiku/NarrativeNexus_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T01:30:29.349287(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
90451914c3fca8621fb8ba377beb88da3e7b4adc | # Dataset Card for "Test_Dataset_1K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ouvic215/Test_Dataset_1K | [
"region:us"
] | 2024-02-16T02:23:19+00:00 | {"dataset_info": {"features": [{"name": "mask_image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 147332332.0, "num_examples": 1588}], "download_size": 146499523, "dataset_size": 147332332.0}} | 2024-02-16T02:23:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Test_Dataset_1K"
More Information needed | [
"# Dataset Card for \"Test_Dataset_1K\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Test_Dataset_1K\"\n\nMore Information needed"
] |
90485c7ec0ee3d8278ab1dab86a3fcefc9616e3d |
This is an LLM rated version of **euclaise/reddit-instruct-curated**, which is already a good dataset imo.
Only **post titles** and **comment texts** were rated, as post texts can be confusing due to edits and seemingly out-of-context information.
First, **I filtered out examples with a comment score below 250**. Of course this is not a very efficient filter, as some pairs might reference other comments or simply be unhelpful yet upvoted due to the Reddit hivemind.
Next, I sent the example pairs with a rating prompt to Senku-Q2-XS and collected the numeric scores **(out of 10)**.
Overall, there aren't many low-rated examples. Here are the three "worst" examples:
![image/png](/static-proxy?url=https%3A%2F%2Fcdn-uploads.huggingface.co%2Fproduction%2Fuploads%2F6324eabf05bd8a54c6eb1650%2Flxj7BGeJXqgRwtx3UoPlU.png)
There are only 66 examples rated below 6.
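To drop these low-rated pairs when working with the dataset, a minimal sketch (assuming the LLM score is stored in a numeric column named `rating`, which is not confirmed by this card) could look like this:

```python
from datasets import load_dataset

# Keep only pairs the LLM scored 6 or higher; the column name "rating"
# is an assumption, adjust it if the actual schema differs.
ds = load_dataset("Ba2han/Reddit-instruct-curated_rated-1.2k", split="train")
filtered = ds.filter(lambda row: row["rating"] >= 6)
print(len(ds), "->", len(filtered))
```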
An example of a highly upvoted but poorly rated pair:
![image/png](/static-proxy?url=https%3A%2F%2Fcdn-uploads.huggingface.co%2Fproduction%2Fuploads%2F6324eabf05bd8a54c6eb1650%2Fu6wsjzeHNnN4OGPWplyXe.png)
**Let me know if I fucked up anything, I still have no idea what I am doing honestly.** | Ba2han/Reddit-instruct-curated_rated-1.2k | [
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"region:us"
] | 2024-02-16T03:14:47+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"]} | 2024-02-16T03:50:26+00:00 | [] | [
"en"
] | TAGS
#size_categories-1K<n<10K #language-English #license-mit #region-us
|
This is an LLM rated version of euclaise/reddit-instruct-curated, which is already a good dataset imo.
Only post titles and comment texts were rated as post texts can be confusing due to edits and seemingly out of context information.
First, I filtered examples with <250 comment score. Of course this is not a very efficient filtering as some pairs might have references to other comments or simply be unhelpful, yet upvoted due to Reddit hivemind.
Next I sent the example pairs with a rating prompt to Senku-Q2-XS and collected the numeric votes (out of 10).
Overall there aren't many low rated examples. Here are three "worst" examples:
!image/png
There are only 66 examples with <6 rate.
An example of highly upvoted but poorly rated pair:
!image/png
Let me know if I fucked up anything, I still have no idea what I am doing honestly. | [] | [
"TAGS\n#size_categories-1K<n<10K #language-English #license-mit #region-us \n"
] |
7cd7243e4f7c07090530db29083d02fc34217020 | # Dataset Card for "Test_Dataset_1K-0216"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ouvic215/Test_Dataset_1K-0216 | [
"region:us"
] | 2024-02-16T03:22:22+00:00 | {"dataset_info": {"features": [{"name": "mask_image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 147332332.0, "num_examples": 1588}], "download_size": 146499523, "dataset_size": 147332332.0}} | 2024-02-16T03:22:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Test_Dataset_1K-0216"
More Information needed | [
"# Dataset Card for \"Test_Dataset_1K-0216\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Test_Dataset_1K-0216\"\n\nMore Information needed"
] |
9a72476c0dc833b829d1b99b521ffa99a096b811 |
# Dataset
This repository contains the final dataset created using various resources. The primary datasets used for the construction of this final dataset are:
- [Telugu NLP Dataset from Kaggle](https://www.kaggle.com/datasets/sudalairajkumar/telugu-nlp)
- [Telugu ASR Corpus from HuggingFace](https://huggingface.co/datasets/parambharat/telugu_asr_corpus)
- [Wikipedia Telugu Dataset from Wikimedia on HuggingFace](https://huggingface.co/datasets/wikimedia/wikipedia)
These datasets have been combined to form a comprehensive resource for Telugu Natural Language Processing (NLP) tasks.
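Each source is exposed as its own configuration (`telugu_nlp`, `telugu_asr`, and `wikipedia`, per the metadata below), so a single subset can be loaded on its own, for example:

```python
from datasets import load_dataset

# Load only the Wikipedia Telugu subset; "telugu_asr" and "telugu_nlp"
# are the other available configurations.
wiki = load_dataset("indiehackers/telugu_dataset", "wikipedia", split="train")
print(wiki[0]["text"][:200])
```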
| indiehackers/telugu_dataset | [
"region:us"
] | 2024-02-16T03:32:53+00:00 | {"dataset_info": [{"config_name": "telugu_asr", "features": [{"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 47887486, "num_examples": 209270}], "download_size": 20219871, "dataset_size": 47887486}, {"config_name": "telugu_nlp", "features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 387671180, "num_examples": 47415}], "download_size": 150012515, "dataset_size": 387671180}, {"config_name": "wikipedia", "features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 710613522, "num_examples": 87854}], "download_size": 209754217, "dataset_size": 710613522}], "configs": [{"config_name": "telugu_asr", "data_files": [{"split": "train", "path": "telugu_asr/train-*"}]}, {"config_name": "telugu_nlp", "data_files": [{"split": "train", "path": "telugu_nlp/train-*"}]}, {"config_name": "wikipedia", "data_files": [{"split": "train", "path": "wikipedia/train-*"}]}]} | 2024-02-16T03:40:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset
This repository contains the final dataset created using various resources. The primary datasets used for the construction of this final dataset are:
- Telugu NLP Dataset from Kaggle
- Telugu ASR Corpus from HuggingFace
- Wikipedia Telugu Dataset from Wikimedia on HuggingFace
These datasets have been combined to form a comprehensive resource for Telugu Natural Language Processing (NLP) tasks.
| [
"# Dataset\n\nThis repository contains the final dataset created using various resources. The primary datasets used for the construction of this final dataset are:\n\n- Telugu NLP Dataset from Kaggle\n- Telugu ASR Corpus from HuggingFace\n- Wikipedia Telugu Dataset from Wikimedia on HuggingFace\n\nThese datasets have been combined to form a comprehensive resource for Telugu Natural Language Processing (NLP) tasks."
] | [
"TAGS\n#region-us \n",
"# Dataset\n\nThis repository contains the final dataset created using various resources. The primary datasets used for the construction of this final dataset are:\n\n- Telugu NLP Dataset from Kaggle\n- Telugu ASR Corpus from HuggingFace\n- Wikipedia Telugu Dataset from Wikimedia on HuggingFace\n\nThese datasets have been combined to form a comprehensive resource for Telugu Natural Language Processing (NLP) tasks."
] |
b3dd583ce90282a9084fb4e4b3b7ca8978cc9abb |
# Dataset
This repository contains the final dataset created using various resources. The primary datasets used for the construction of this final dataset are:
- [Telugu NLP Dataset from Kaggle](https://www.kaggle.com/datasets/sudalairajkumar/telugu-nlp)
- [Telugu ASR Corpus from HuggingFace](https://huggingface.co/datasets/parambharat/telugu_asr_corpus)
- [Wikipedia Telugu Dataset from Wikimedia on HuggingFace](https://huggingface.co/datasets/wikimedia/wikipedia)
These datasets have been combined to form a comprehensive resource for Telugu Natural Language Processing (NLP) tasks.
| eswardivi/telugu_dataset | [
"region:us"
] | 2024-02-16T03:35:51+00:00 | {"dataset_info": [{"config_name": "telugu_asr", "features": [{"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 47887486, "num_examples": 209270}], "download_size": 20219871, "dataset_size": 47887486}, {"config_name": "telugu_nlp", "features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 387671180, "num_examples": 47415}], "download_size": 150012515, "dataset_size": 387671180}, {"config_name": "wikipedia", "features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 710613522, "num_examples": 87854}], "download_size": 209754217, "dataset_size": 710613522}], "configs": [{"config_name": "telugu_asr", "data_files": [{"split": "train", "path": "telugu_asr/train-*"}]}, {"config_name": "telugu_nlp", "data_files": [{"split": "train", "path": "telugu_nlp/train-*"}]}, {"config_name": "wikipedia", "data_files": [{"split": "train", "path": "wikipedia/train-*"}]}]} | 2024-02-16T03:39:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset
This repository contains the final dataset created using various resources. The primary datasets used for the construction of this final dataset are:
- Telugu NLP Dataset from Kaggle
- Telugu ASR Corpus from HuggingFace
- Wikipedia Telugu Dataset from Wikimedia on HuggingFace
These datasets have been combined to form a comprehensive resource for Telugu Natural Language Processing (NLP) tasks.
| [
"# Dataset\n\nThis repository contains the final dataset created using various resources. The primary datasets used for the construction of this final dataset are:\n\n- Telugu NLP Dataset from Kaggle\n- Telugu ASR Corpus from HuggingFace\n- Wikipedia Telugu Dataset from Wikimedia on HuggingFace\n\nThese datasets have been combined to form a comprehensive resource for Telugu Natural Language Processing (NLP) tasks."
] | [
"TAGS\n#region-us \n",
"# Dataset\n\nThis repository contains the final dataset created using various resources. The primary datasets used for the construction of this final dataset are:\n\n- Telugu NLP Dataset from Kaggle\n- Telugu ASR Corpus from HuggingFace\n- Wikipedia Telugu Dataset from Wikimedia on HuggingFace\n\nThese datasets have been combined to form a comprehensive resource for Telugu Natural Language Processing (NLP) tasks."
] |
cff0a10b314f47a52f9646ce3c38ec3b7d0b4852 |
# Dataset Card for Evaluation run of NLUHOPOE/test-case-0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLUHOPOE/test-case-0](https://huggingface.co/NLUHOPOE/test-case-0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__test-case-0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-16T05:25:06.093843](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__test-case-0/blob/main/results_2024-02-16T05-25-06.093843.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5791278236658676,
"acc_stderr": 0.033494817808173614,
"acc_norm": 0.5837595891503912,
"acc_norm_stderr": 0.03419368461778056,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4880155663864428,
"mc2_stderr": 0.015371746911854285
},
"harness|arc:challenge|25": {
"acc": 0.5349829351535836,
"acc_stderr": 0.01457558392201967,
"acc_norm": 0.5750853242320819,
"acc_norm_stderr": 0.014445698968520769
},
"harness|hellaswag|10": {
"acc": 0.5997809201354312,
"acc_stderr": 0.004889413126208774,
"acc_norm": 0.796355307707628,
"acc_norm_stderr": 0.004018847286468061
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548047,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548047
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.02491524398598785,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.02491524398598785
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119994,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119994
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437378,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437378
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.024161618127987745,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.024161618127987745
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101074,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.02611374936131034,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.02611374936131034
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.01461446582196633,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.01461446582196633
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.027305308076274695,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.027305308076274695
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100786,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100786
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558555,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3924380704041721,
"acc_stderr": 0.01247124366922911,
"acc_norm": 0.3924380704041721,
"acc_norm_stderr": 0.01247124366922911
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.02003639376835263,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.02003639376835263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547735,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547735
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4880155663864428,
"mc2_stderr": 0.015371746911854285
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.3434420015163002,
"acc_stderr": 0.013079933811800311
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NLUHOPOE__test-case-0 | [
"region:us"
] | 2024-02-16T05:27:27+00:00 | {"pretty_name": "Evaluation run of NLUHOPOE/test-case-0", "dataset_summary": "Dataset automatically created during the evaluation run of model [NLUHOPOE/test-case-0](https://huggingface.co/NLUHOPOE/test-case-0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__test-case-0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T05:25:06.093843](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__test-case-0/blob/main/results_2024-02-16T05-25-06.093843.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5791278236658676,\n \"acc_stderr\": 0.033494817808173614,\n \"acc_norm\": 0.5837595891503912,\n \"acc_norm_stderr\": 0.03419368461778056,\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4880155663864428,\n \"mc2_stderr\": 0.015371746911854285\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5349829351535836,\n \"acc_stderr\": 0.01457558392201967,\n \"acc_norm\": 0.5750853242320819,\n \"acc_norm_stderr\": 0.014445698968520769\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5997809201354312,\n \"acc_stderr\": 0.004889413126208774,\n \"acc_norm\": 0.796355307707628,\n \"acc_norm_stderr\": 0.004018847286468061\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 
0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548047,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548047\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5923076923076923,\n \"acc_stderr\": 0.02491524398598785,\n \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.02491524398598785\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119994,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119994\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437378,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437378\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.024161618127987745,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.024161618127987745\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n \"acc_stderr\": 0.015075523238101074,\n \"acc_norm\": 0.768837803320562,\n 
\"acc_norm_stderr\": 0.015075523238101074\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.02611374936131034,\n \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.02611374936131034\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n \"acc_stderr\": 0.01461446582196633,\n \"acc_norm\": 0.2569832402234637,\n \"acc_norm_stderr\": 0.01461446582196633\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.027305308076274695,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.027305308076274695\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100786,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100786\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558555,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558555\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3924380704041721,\n \"acc_stderr\": 0.01247124366922911,\n \"acc_norm\": 0.3924380704041721,\n \"acc_norm_stderr\": 0.01247124366922911\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.02003639376835263,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.02003639376835263\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547735,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547735\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4880155663864428,\n \"mc2_stderr\": 0.015371746911854285\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3434420015163002,\n \"acc_stderr\": 0.013079933811800311\n }\n}\n```", "repo_url": "https://huggingface.co/NLUHOPOE/test-case-0", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|arc:challenge|25_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|gsm8k|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hellaswag|10_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T05-25-06.093843.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T05-25-06.093843.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T05-25-06.093843.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T05-25-06.093843.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T05-25-06.093843.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T05-25-06.093843.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["**/details_harness|winogrande|5_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T05-25-06.093843.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T05_25_06.093843", "path": ["results_2024-02-16T05-25-06.093843.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T05-25-06.093843.parquet"]}]}]} | 2024-02-16T05:27:47+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NLUHOPOE/test-case-0
Dataset automatically created during the evaluation run of model NLUHOPOE/test-case-0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
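For example (this is the loading snippet given in this dataset's metadata, using the winogrande config):

```python
from datasets import load_dataset

# Each evaluated task has its own config; "latest" always points to the newest run.
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__test-case-0",
                    "harness_winogrande_5",
                    split="train")
```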
## Latest results
These are the latest results from run 2024-02-16T05:25:06.093843 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NLUHOPOE/test-case-0\n\n\n\nDataset automatically created during the evaluation run of model NLUHOPOE/test-case-0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T05:25:06.093843(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NLUHOPOE/test-case-0\n\n\n\nDataset automatically created during the evaluation run of model NLUHOPOE/test-case-0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T05:25:06.093843(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2a63d669ac8c2994f6e618528dec466208041853 |
This dataset consists of ColBERTv2.0 document vectors for the entire TREC-COVID dataset from BeIR: 128 dimensions per token, with 180 tokens for each of 171,332 documents.
The dataset was created using an A100-40GB GPU sponsored by Qdrant. The code used to create these vectors is here: https://colab.research.google.com/drive/1hEhyleSrBz_mPyQJnRc0MwBenDuX1ahY?usp=sharing
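As a rough sketch of how these vectors can be loaded and inspected with the `datasets` library (the `documents` feature name comes from this repo's dataset config; the printed sizes simply restate the 180 × 128 shape described above):

```python
from datasets import load_dataset

# Stream the train split so the full multi-GB download is not required up front.
ds = load_dataset("Qdrant/ColBERT-TREC-COVID", split="train", streaming=True)

first = next(iter(ds))
token_vectors = first["documents"]                 # one list of token embeddings per document
print(len(token_vectors), len(token_vectors[0]))   # expected: 180 tokens x 128 dimensions
```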
This dataset was created for indexing experiments by Qdrant. | Qdrant/ColBERT-TREC-COVID | [
"task_categories:feature-extraction",
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"medical",
"region:us"
] | 2024-02-16T05:35:27+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["feature-extraction"], "pretty_name": "ColBERT TREC COVID", "dataset_info": {"features": [{"name": "documents", "sequence": {"sequence": "float16"}}], "splits": [{"name": "train", "num_bytes": 8019022928, "num_examples": 171332}], "download_size": 5775769873, "dataset_size": 8019022928}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["medical"]} | 2024-02-16T06:10:04+00:00 | [] | [
"en"
] | TAGS
#task_categories-feature-extraction #size_categories-100K<n<1M #language-English #license-mit #medical #region-us
|
This dataset consists of ColBERTv2.0 document vectors for the entire TREC-COVID dataset from BeIR: 128 dimensions per token, with 180 tokens for each of 171,332 documents.
The dataset was created using an A100-40GB GPU sponsored by Qdrant. The code used to create these vectors is here: URL
This dataset was created for indexing experiments by Qdrant. | [] | [
"TAGS\n#task_categories-feature-extraction #size_categories-100K<n<1M #language-English #license-mit #medical #region-us \n"
] |
9824f1afb87451b3837e585d85eb93896dfe1107 |
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Matichon Maneegard
- **Shared by [optional]:** Matichon Maneegard
- **Language(s) (NLP):** image-to-text
- **License:** apache-2.0
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
The dataset is entirely synthetic. It does not contain real information or pertain to any specific person.
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
Used for training OCR or multimodal models.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
This dataset contains 98 x 6 = 588 samples, and the labels cover 98 samples.
Each sample is represented in a different scenario.
The 'train.csv' file contains 11 attributes:
```File_Index, first_name_th, first_name_en, last_name_en, birth_date_th, birth_date_en, religion, first_address_th, second_address_th, third_address_th, forth_address_th```
The 'File_Index' corresponds to the number of the image within each scenario folder.
This means that '/Scenario_1/file_1.png' has the same attributes as '/Scenario_2/file_1.png'.
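A minimal sketch of joining labels to images, assuming the layout described above (a `train.csv` label file and `Scenario_1/` … `Scenario_6/` folders with files named `file_<File_Index>.png`; the exact path pattern is an assumption for illustration):

```python
import pandas as pd

labels = pd.read_csv("train.csv")

pairs = []
for _, row in labels.iterrows():
    for scenario in range(1, 7):
        pairs.append({
            # hypothetical path pattern mirroring '/Scenario_1/file_1.png' above
            "image_path": f"Scenario_{scenario}/file_{row['File_Index']}.png",
            "first_name_en": row["first_name_en"],
            "last_name_en": row["last_name_en"],
        })

print(len(pairs))  # 98 label rows x 6 scenarios = 588 image/label pairs
```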
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
### Contact
Twitter : [Mati](https://twitter.com/KMatiDev1)
E-mail : [email protected]
VulturePrime : [VulturePrime](https://vultureprime.com)
Float16 : [Float16.cloud](https://float16.cloud)
| matichon/ThaiIDCardSynt | [
"task_categories:image-to-text",
"size_categories:n<1K",
"language:th",
"license:apache-2.0",
"region:us"
] | 2024-02-16T06:18:15+00:00 | {"language": ["th"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["image-to-text"]} | 2024-02-16T07:35:32+00:00 | [] | [
"th"
] | TAGS
#task_categories-image-to-text #size_categories-n<1K #language-Thai #license-apache-2.0 #region-us
|
## Dataset Details
### Dataset Description
- Curated by: Matichon Maneegard
- Shared by [optional]: Matichon Maneegard
- Language(s) (NLP): image-to-text
- License: apache-2.0
### Dataset Sources [optional]
The dataset is entirely synthetic. It does not contain real information or pertain to any specific person.
## Uses
### Direct Use
Used for training OCR or multimodal models.
## Dataset Structure
This dataset contains 98 x 6 = 588 samples, and the labels cover 98 samples.
Each sample is represented in a different scenario.
The 'URL' file contains 11 attributes:
The 'File_Index' corresponds to the number of the image within each scenario folder.
This means that '/Scenario_1/file_1.png' has the same attributes as '/Scenario_2/file_1.png'.
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
### Contact
Twitter : [Mati](URL)
E-mail : business@URL
VulturePrime : [VulturePrime](URL)
Float16 : (URL)[URL]
| [
"## Dataset Details",
"### Dataset Description\n\n\n\n- Curated by: Matichon Maneegard\n- Shared by [optional]: Matichon Maneegard\n- Language(s) (NLP): image-to-text\n- License: apache-2.0",
"### Dataset Sources [optional]\n\n\nThe dataset was entirely synthetic. It does not contain real information or pertain to any specific person.",
"## Uses",
"### Direct Use\n\n\n\nUsing for tranning OCR or Multimodal.",
"## Dataset Structure\n\n\nThis dataset contains 98 x 6 = 588 samples, and the labels contain 98 samples. \nEach sample will have a different scenario to represent.\n\nThe 'URL' file contains 11 attributes:\n\n\n\nThe 'File_Index' corresponds to the number of the image in the scenario with the training data. \nIt means that '/Scenario_1/file_1.png' has the same attributes as '/Scenario_2/file_1.png'.",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.",
"### Contact\n\nTwitter : (Mati)[URL\n\nE-mail : business@URL\n\nVulturePrime : (VulturePrime)[URL]\n\nFloat16 : (URL)[URL]"
] | [
"TAGS\n#task_categories-image-to-text #size_categories-n<1K #language-Thai #license-apache-2.0 #region-us \n",
"## Dataset Details",
"### Dataset Description\n\n\n\n- Curated by: Matichon Maneegard\n- Shared by [optional]: Matichon Maneegard\n- Language(s) (NLP): image-to-text\n- License: apache-2.0",
"### Dataset Sources [optional]\n\n\nThe dataset was entirely synthetic. It does not contain real information or pertain to any specific person.",
"## Uses",
"### Direct Use\n\n\n\nUsing for tranning OCR or Multimodal.",
"## Dataset Structure\n\n\nThis dataset contains 98 x 6 = 588 samples, and the labels contain 98 samples. \nEach sample will have a different scenario to represent.\n\nThe 'URL' file contains 11 attributes:\n\n\n\nThe 'File_Index' corresponds to the number of the image in the scenario with the training data. \nIt means that '/Scenario_1/file_1.png' has the same attributes as '/Scenario_2/file_1.png'.",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.",
"### Contact\n\nTwitter : (Mati)[URL\n\nE-mail : business@URL\n\nVulturePrime : (VulturePrime)[URL]\n\nFloat16 : (URL)[URL]"
] |
048fc1a5f3f8cf3f8d050f1e727bce1233b32f12 |
# Dataset Card for Evaluation run of Undi95/PsyMedRP-v1-20B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Undi95/PsyMedRP-v1-20B](https://huggingface.co/Undi95/PsyMedRP-v1-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__PsyMedRP-v1-20B",
"harness_winogrande_5",
split="train")
```
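The aggregated metrics can be pulled the same way. The sketch below assumes the "results" configuration and "latest" split described above; the exact column layout of that configuration may differ between leaderboard versions:

```python
# Load the aggregated results instead of a single task's details.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_Undi95__PsyMedRP-v1-20B",
    "results",          # configuration holding the aggregated run results
    split="latest",     # "latest" always points to the most recent run
)
print(results[0])       # one row with the aggregated scores for the latest run
```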
## Latest results
These are the [latest results from run 2024-02-16T06:33:57.302712](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__PsyMedRP-v1-20B/blob/main/results_2024-02-16T06-33-57.302712.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5647260784625223,
"acc_stderr": 0.033553791007284096,
"acc_norm": 0.5721079188379258,
"acc_norm_stderr": 0.03429829853750649,
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142985,
"mc2": 0.5444967551355537,
"mc2_stderr": 0.015846880267326138
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221009,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.01428589829293817
},
"harness|hellaswag|10": {
"acc": 0.6552479585739892,
"acc_stderr": 0.004743160034271149,
"acc_norm": 0.8393746265684127,
"acc_norm_stderr": 0.0036643462998943955
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.038118909889404105,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.038118909889404105
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.66,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.02441923496681907,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.02441923496681907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567107,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.031753678460966266,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.031753678460966266
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.0192274688764635,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.0192274688764635
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654362,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.02830465794303529,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.02830465794303529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665225,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665225
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.735632183908046,
"acc_stderr": 0.015769984840690525,
"acc_norm": 0.735632183908046,
"acc_norm_stderr": 0.015769984840690525
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895803,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602663,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602663
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.026289734945952922,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.026289734945952922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.019794488900024117,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.019794488900024117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014635,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014635
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142985,
"mc2": 0.5444967551355537,
"mc2_stderr": 0.015846880267326138
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259785
},
"harness|gsm8k|5": {
"acc": 0.14859742228961334,
"acc_stderr": 0.009797503180527883
}
}
```
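As a quick illustration of working with these numbers, the sketch below averages the per-subtask `acc` values of the MMLU (hendrycksTest) entries from a dictionary shaped like the one above. The local filename is taken from the link above but should be treated as an assumption about your copy:

```python
# Illustrative only: mean MMLU accuracy from a results dict like the JSON above.
import json

with open("results_2024-02-16T06-33-57.302712.json") as f:
    results = json.load(f)  # adjust if your file nests these under a "results" key

mmlu_acc = {
    name: scores["acc"]
    for name, scores in results.items()
    if name.startswith("harness|hendrycksTest-")
}
print(f"{len(mmlu_acc)} MMLU subtasks, mean acc = "
      f"{sum(mmlu_acc.values()) / len(mmlu_acc):.4f}")
```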
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Undi95__PsyMedRP-v1-20B | [
"region:us"
] | 2024-02-16T06:36:16+00:00 | {"pretty_name": "Evaluation run of Undi95/PsyMedRP-v1-20B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/PsyMedRP-v1-20B](https://huggingface.co/Undi95/PsyMedRP-v1-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__PsyMedRP-v1-20B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T06:33:57.302712](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__PsyMedRP-v1-20B/blob/main/results_2024-02-16T06-33-57.302712.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5647260784625223,\n \"acc_stderr\": 0.033553791007284096,\n \"acc_norm\": 0.5721079188379258,\n \"acc_norm_stderr\": 0.03429829853750649,\n \"mc1\": 0.379436964504284,\n \"mc1_stderr\": 0.016987039266142985,\n \"mc2\": 0.5444967551355537,\n \"mc2_stderr\": 0.015846880267326138\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221009,\n \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.01428589829293817\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6552479585739892,\n \"acc_stderr\": 0.004743160034271149,\n \"acc_norm\": 0.8393746265684127,\n \"acc_norm_stderr\": 0.0036643462998943955\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n 
\"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.038118909889404105,\n \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.038118909889404105\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681907,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n \"acc_stderr\": 0.026662010578567107,\n \"acc_norm\": 0.6741935483870968,\n \"acc_norm_stderr\": 0.026662010578567107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.031753678460966266,\n \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.031753678460966266\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7211009174311926,\n \"acc_stderr\": 0.0192274688764635,\n \"acc_norm\": 0.7211009174311926,\n \"acc_norm_stderr\": 0.0192274688764635\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654362,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.02830465794303529,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.02830465794303529\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.02559819368665225,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.02559819368665225\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.735632183908046,\n \"acc_stderr\": 0.015769984840690525,\n \"acc_norm\": 
0.735632183908046,\n \"acc_norm_stderr\": 0.015769984840690525\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895803,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895803\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602663,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602663\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.026289734945952922,\n \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.026289734945952922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.019794488900024117,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.019794488900024117\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.030360490154014635,\n \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.030360490154014635\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n \"mc1_stderr\": 0.016987039266142985,\n \"mc2\": 0.5444967551355537,\n \"mc2_stderr\": 0.015846880267326138\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259785\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14859742228961334,\n \"acc_stderr\": 0.009797503180527883\n }\n}\n```", "repo_url": 
"https://huggingface.co/Undi95/PsyMedRP-v1-20B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|arc:challenge|25_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|gsm8k|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hellaswag|10_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T06-33-57.302712.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T06-33-57.302712.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T06-33-57.302712.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T06-33-57.302712.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T06-33-57.302712.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T06-33-57.302712.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["**/details_harness|winogrande|5_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T06-33-57.302712.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T06_33_57.302712", "path": ["results_2024-02-16T06-33-57.302712.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T06-33-57.302712.parquet"]}]}]} | 2024-02-16T06:36:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Undi95/PsyMedRP-v1-20B
Dataset automatically created during the evaluation run of model Undi95/PsyMedRP-v1-20B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
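A minimal sketch of what that call can look like, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming and exposes a `harness_winogrande_5` configuration (both are assumptions here, not confirmed by this card):

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_Undi95__PsyMedRP-v1-20B",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="train",           # "train" always points to the latest run
)
print(data)
```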
## Latest results
These are the latest results from run 2024-02-16T06:33:57.302712 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Undi95/PsyMedRP-v1-20B\n\n\n\nDataset automatically created during the evaluation run of model Undi95/PsyMedRP-v1-20B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T06:33:57.302712(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Undi95/PsyMedRP-v1-20B\n\n\n\nDataset automatically created during the evaluation run of model Undi95/PsyMedRP-v1-20B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T06:33:57.302712(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
64f35b28ffff24cc2eb0773f807368da32873599 | This dataset totals 138 minutes and contains roughly 50 songs by yousa (most of them from 2016-2022). It has already been sliced and filtered; each clip is 4-15 s long, for a total of 796 WAV audio files. The vast majority of the content is in Chinese, with a small amount of Japanese and very little English.
This dataset consists of 138 minutes in total and approximately includes 50 songs by yousa (most of them released between 2016 and 2022). The dataset has been sliced and filtered, with durations ranging from 4 to 15 seconds, resulting in a total of 796 WAV audio files. The majority of the content is in Chinese, with a small amount in Japanese and very little in English. | yousaforever/yousa_data_0 | [
"license:gpl-3.0",
"region:us"
] | 2024-02-16T06:49:58+00:00 | {"license": "gpl-3.0"} | 2024-02-16T06:59:56+00:00 | [] | [] | TAGS
#license-gpl-3.0 #region-us
| This dataset totals 138 minutes and contains roughly 50 songs by yousa (most of them from 2016-2022). It has already been sliced and filtered; each clip is 4-15 s long, for a total of 796 WAV audio files. The vast majority of the content is in Chinese, with a small amount of Japanese and very little English.
This dataset consists of 138 minutes in total and approximately includes 50 songs by yousa (most of them released between 2016 and 2022). The dataset has been sliced and filtered, with durations ranging from 4 to 15 seconds, resulting in a total of 796 WAV audio files. The majority of the content is in Chinese, with a small amount in Japanese and very little in English. | [] | [
"TAGS\n#license-gpl-3.0 #region-us \n"
] |
5cc645f5f3e9116d75dfcc66d1ff3f4a497df607 | Approximately 9 minutes of normal speech, divided into 70 slices, which can be used to train a TTS model.
A 9-minute normal speaking voice divided into 70 slices for training a TTS model. | yousaforever/yousa_data_1 | [
"license:gpl-3.0",
"region:us"
] | 2024-02-16T07:03:45+00:00 | {"license": "gpl-3.0"} | 2024-02-16T07:10:56+00:00 | [] | [] | TAGS
#license-gpl-3.0 #region-us
| Approximately 9 minutes of normal speech, divided into 70 slices, which can be used to train a TTS model.
A 9-minute normal speaking voice divided into 70 slices for training a TTS model. | [] | [
"TAGS\n#license-gpl-3.0 #region-us \n"
] |
192462fe2072e3a1d2c1e32c02170978328bddf1 |
# Dataset Card for Evaluation run of jisukim8873/falcon-7B-case-6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jisukim8873/falcon-7B-case-6](https://huggingface.co/jisukim8873/falcon-7B-case-6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jisukim8873__falcon-7B-case-6",
"harness_winogrande_5",
split="train")
```
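The same pattern works for any other configuration. As a small follow-up sketch (assuming network access to the Hub; the printed output is illustrative), you can list the available configurations and load the aggregated `results` one:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_jisukim8873__falcon-7B-case-6"

# Enumerate the available configurations (the 63 per-task ones plus "results").
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "results" configuration stores the aggregated metrics for the run;
# its "latest" split always points at the most recent evaluation.
results = load_dataset(repo, "results", split="latest")
print(results)
```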
## Latest results
These are the [latest results from run 2024-02-16T07:12:28.485530](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__falcon-7B-case-6/blob/main/results_2024-02-16T07-12-28.485530.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2999741752010719,
"acc_stderr": 0.032195034392452436,
"acc_norm": 0.30103224915319854,
"acc_norm_stderr": 0.032944763241990214,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.364571668218642,
"mc2_stderr": 0.014117416041879967
},
"harness|arc:challenge|25": {
"acc": 0.4274744027303754,
"acc_stderr": 0.014456862944650654,
"acc_norm": 0.46501706484641636,
"acc_norm_stderr": 0.014575583922019665
},
"harness|hellaswag|10": {
"acc": 0.5976897032463653,
"acc_stderr": 0.0048936170149753,
"acc_norm": 0.7849034056960765,
"acc_norm_stderr": 0.004100495978108428
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.040925639582376536,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.040925639582376536
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.03036358219723817,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.03036358219723817
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1349206349206349,
"acc_stderr": 0.030557101589417515,
"acc_norm": 0.1349206349206349,
"acc_norm_stderr": 0.030557101589417515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.33225806451612905,
"acc_stderr": 0.02679556084812279,
"acc_norm": 0.33225806451612905,
"acc_norm_stderr": 0.02679556084812279
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03274287914026869,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03274287914026869
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935411,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935411
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423095,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652155,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652155
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380572,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380572
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28990825688073396,
"acc_stderr": 0.019453066609201597,
"acc_norm": 0.28990825688073396,
"acc_norm_stderr": 0.019453066609201597
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.026991454502036744,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.026991454502036744
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083289,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083289
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31645569620253167,
"acc_stderr": 0.03027497488021897,
"acc_norm": 0.31645569620253167,
"acc_norm_stderr": 0.03027497488021897
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4132231404958678,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.4132231404958678,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.04620284082280039,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.04620284082280039
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.03023638994217307,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.03023638994217307
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3537675606641124,
"acc_stderr": 0.017098184708161903,
"acc_norm": 0.3537675606641124,
"acc_norm_stderr": 0.017098184708161903
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.025190181327608422,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.025190181327608422
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3202614379084967,
"acc_stderr": 0.026716118380156844,
"acc_norm": 0.3202614379084967,
"acc_norm_stderr": 0.026716118380156844
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3183279742765273,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.3183279742765273,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.02551873104953776,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.02551873104953776
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2627118644067797,
"acc_stderr": 0.01124054551499567,
"acc_norm": 0.2627118644067797,
"acc_norm_stderr": 0.01124054551499567
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.024880971512294292,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.024880971512294292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878284,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878284
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3034825870646766,
"acc_stderr": 0.032510068164586174,
"acc_norm": 0.3034825870646766,
"acc_norm_stderr": 0.032510068164586174
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3391812865497076,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.3391812865497076,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.364571668218642,
"mc2_stderr": 0.014117416041879967
},
"harness|winogrande|5": {
"acc": 0.7008681925808997,
"acc_stderr": 0.012868639066091541
},
"harness|gsm8k|5": {
"acc": 0.06141015921152388,
"acc_stderr": 0.006613027536586305
}
}
```
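For quick inspection, the per-task block above can be post-processed directly; a minimal sketch, assuming the JSON shown has been saved to a local file named `results.json` (a hypothetical path):

```python
import json

# Load the metrics dict shown above.
with open("results.json") as f:
    results = json.load(f)

# Average the normalized accuracy over the MMLU (hendrycksTest) tasks.
mmlu = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
print(f"{len(mmlu)} MMLU tasks, mean acc_norm = {sum(mmlu.values()) / len(mmlu):.4f}")
```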
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jisukim8873__falcon-7B-case-6 | [
"region:us"
] | 2024-02-16T07:14:11+00:00 | {"pretty_name": "Evaluation run of jisukim8873/falcon-7B-case-6", "dataset_summary": "Dataset automatically created during the evaluation run of model [jisukim8873/falcon-7B-case-6](https://huggingface.co/jisukim8873/falcon-7B-case-6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jisukim8873__falcon-7B-case-6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T07:12:28.485530](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__falcon-7B-case-6/blob/main/results_2024-02-16T07-12-28.485530.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2999741752010719,\n \"acc_stderr\": 0.032195034392452436,\n \"acc_norm\": 0.30103224915319854,\n \"acc_norm_stderr\": 0.032944763241990214,\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.364571668218642,\n \"mc2_stderr\": 0.014117416041879967\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4274744027303754,\n \"acc_stderr\": 0.014456862944650654,\n \"acc_norm\": 0.46501706484641636,\n \"acc_norm_stderr\": 0.014575583922019665\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5976897032463653,\n \"acc_stderr\": 0.0048936170149753,\n \"acc_norm\": 0.7849034056960765,\n \"acc_norm_stderr\": 0.004100495978108428\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n 
\"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.040925639582376536,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.040925639582376536\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.03036358219723817,\n \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.03036358219723817\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1349206349206349,\n \"acc_stderr\": 0.030557101589417515,\n \"acc_norm\": 0.1349206349206349,\n \"acc_norm_stderr\": 0.030557101589417515\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.33225806451612905,\n \"acc_stderr\": 0.02679556084812279,\n \"acc_norm\": 0.33225806451612905,\n \"acc_norm_stderr\": 0.02679556084812279\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.03274287914026869,\n \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03274287914026869\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935411,\n \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935411\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423095,\n \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423095\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652155,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652155\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380572,\n \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380572\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.28990825688073396,\n \"acc_stderr\": 0.019453066609201597,\n \"acc_norm\": 0.28990825688073396,\n \"acc_norm_stderr\": 0.019453066609201597\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.19444444444444445,\n \"acc_stderr\": 0.026991454502036744,\n \"acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.026991454502036744\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083289,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083289\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.31645569620253167,\n \"acc_stderr\": 0.03027497488021897,\n \"acc_norm\": 0.31645569620253167,\n \"acc_norm_stderr\": 0.03027497488021897\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.37668161434977576,\n \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082396,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082396\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.04620284082280039,\n \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.04620284082280039\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.03023638994217307,\n \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.03023638994217307\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.3537675606641124,\n \"acc_stderr\": 0.017098184708161903,\n \"acc_norm\": 0.3537675606641124,\n \"acc_norm_stderr\": 0.017098184708161903\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.025190181327608422,\n \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.025190181327608422\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3202614379084967,\n \"acc_stderr\": 0.026716118380156844,\n \"acc_norm\": 0.3202614379084967,\n \"acc_norm_stderr\": 0.026716118380156844\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3183279742765273,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.3183279742765273,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24113475177304963,\n \"acc_stderr\": 0.02551873104953776,\n \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.02551873104953776\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2627118644067797,\n \"acc_stderr\": 0.01124054551499567,\n \"acc_norm\": 0.2627118644067797,\n \"acc_norm_stderr\": 0.01124054551499567\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.024880971512294292,\n \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.024880971512294292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.04122066502878284,\n \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.04122066502878284\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.02752963744017493,\n \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.02752963744017493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3034825870646766,\n \"acc_stderr\": 0.032510068164586174,\n \"acc_norm\": 0.3034825870646766,\n \"acc_norm_stderr\": 0.032510068164586174\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3391812865497076,\n \"acc_stderr\": 0.03631053496488905,\n \"acc_norm\": 0.3391812865497076,\n \"acc_norm_stderr\": 0.03631053496488905\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.364571668218642,\n \"mc2_stderr\": 0.014117416041879967\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7008681925808997,\n \"acc_stderr\": 0.012868639066091541\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06141015921152388,\n 
\"acc_stderr\": 0.006613027536586305\n }\n}\n```", "repo_url": "https://huggingface.co/jisukim8873/falcon-7B-case-6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|arc:challenge|25_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|gsm8k|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hellaswag|10_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-12-28.485530.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-12-28.485530.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-12-28.485530.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T07-12-28.485530.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-12-28.485530.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T07_12_28.485530", "path": ["**/details_harness|winogrande|5_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T07-12-28.485530.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T07_12_28.485530", "path": ["results_2024-02-16T07-12-28.485530.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T07-12-28.485530.parquet"]}]}]} | 2024-02-16T07:14:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jisukim8873/falcon-7B-case-6
Dataset automatically created during the evaluation run of model jisukim8873/falcon-7B-case-6 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
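A minimal sketch of such a loading call, assuming the repository id open-llm-leaderboard/details_jisukim8873__falcon-7B-case-6 (inferred from the model name and the naming pattern used by these evaluation datasets, so it may need verification):

```python
from datasets import load_dataset

# Repository id inferred from the model name (assumption); pick any
# per-task configuration, e.g. the 5-shot Winogrande details.
data = load_dataset("open-llm-leaderboard/details_jisukim8873__falcon-7B-case-6",
	"harness_winogrande_5",
	split="train")
```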
## Latest results
These are the latest results from run 2024-02-16T07:12:28.485530 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jisukim8873/falcon-7B-case-6\n\n\n\nDataset automatically created during the evaluation run of model jisukim8873/falcon-7B-case-6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T07:12:28.485530(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jisukim8873/falcon-7B-case-6\n\n\n\nDataset automatically created during the evaluation run of model jisukim8873/falcon-7B-case-6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T07:12:28.485530(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
28c35a4152ed69188fc9eb344579fc577016c925 |
# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Buttercup-V2-laser](https://huggingface.co/Kquant03/Buttercup-V2-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser",
"harness_winogrande_5",
split="train")
```
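Each per-task configuration exposes one split per run timestamp plus a "latest" split, and the aggregated metrics live in the "results" configuration. A small sketch using the config and split names listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" points to the most recent
# evaluation (here 2024_02_16T07_34_11.973720).
results = load_dataset("open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser",
	"results",
	split="latest")
```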
## Latest results
These are the [latest results from run 2024-02-16T07:34:11.973720](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser/blob/main/results_2024-02-16T07-34-11.973720.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6535761549256881,
"acc_stderr": 0.03205604876868876,
"acc_norm": 0.6528640185317818,
"acc_norm_stderr": 0.032733047429496384,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6899750707536572,
"mc2_stderr": 0.01507018824423322
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403511,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710698
},
"harness|hellaswag|10": {
"acc": 0.7135032861979685,
"acc_stderr": 0.004512002459757956,
"acc_norm": 0.8847839075881299,
"acc_norm_stderr": 0.0031863002304505753
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977938,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977938
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163255,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604103,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6899750707536572,
"mc2_stderr": 0.01507018824423322
},
"harness|winogrande|5": {
"acc": 0.8626677190213102,
"acc_stderr": 0.009673669315476049
},
"harness|gsm8k|5": {
"acc": 0.6808188021228203,
"acc_stderr": 0.012840345676251653
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser | [
"region:us"
] | 2024-02-16T07:36:29+00:00 | {"pretty_name": "Evaluation run of Kquant03/Buttercup-V2-laser", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Buttercup-V2-laser](https://huggingface.co/Kquant03/Buttercup-V2-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T07:34:11.973720](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser/blob/main/results_2024-02-16T07-34-11.973720.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535761549256881,\n \"acc_stderr\": 0.03205604876868876,\n \"acc_norm\": 0.6528640185317818,\n \"acc_norm_stderr\": 0.032733047429496384,\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6899750707536572,\n \"mc2_stderr\": 0.01507018824423322\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403511,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710698\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7135032861979685,\n \"acc_stderr\": 0.004512002459757956,\n \"acc_norm\": 0.8847839075881299,\n \"acc_norm_stderr\": 0.0031863002304505753\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163255,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163255\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n 
\"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.01655860163604103,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.01655860163604103\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959614,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959614\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6899750707536572,\n \"mc2_stderr\": 0.01507018824423322\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8626677190213102,\n \"acc_stderr\": 0.009673669315476049\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6808188021228203,\n \"acc_stderr\": 0.012840345676251653\n }\n}\n```", "repo_url": 
"https://huggingface.co/Kquant03/Buttercup-V2-laser", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|arc:challenge|25_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|gsm8k|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hellaswag|10_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-34-11.973720.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-34-11.973720.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-34-11.973720.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T07-34-11.973720.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-34-11.973720.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T07-34-11.973720.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["**/details_harness|winogrande|5_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T07-34-11.973720.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T07_34_11.973720", "path": ["results_2024-02-16T07-34-11.973720.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T07-34-11.973720.parquet"]}]}]} | 2024-02-16T07:36:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-laser
Dataset automatically created during the evaluation run of model Kquant03/Buttercup-V2-laser on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
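A minimal loading sketch is shown below; the details repository name is assumed here to follow the standard `open-llm-leaderboard/details_<org>__<model>` pattern used by other leaderboard runs, and any config name from the metadata above can be substituted:

```python
from datasets import load_dataset

# Assumption: the details repo follows the usual leaderboard naming pattern
# "open-llm-leaderboard/details_<org>__<model>"; swap in any config listed
# in the metadata above (e.g. "harness_gsm8k_5").
data = load_dataset("open-llm-leaderboard/details_Kquant03__Buttercup-V2-laser",
                    "harness_winogrande_5",
                    split="train")
print(data)
```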
## Latest results
These are the latest results from run 2024-02-16T07:34:11.973720 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-laser\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Buttercup-V2-laser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T07:34:11.973720(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-laser\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Buttercup-V2-laser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T07:34:11.973720(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0acb7d1d0c8448c4322030de54affc03b38accc2 | ### The ReAct Llama Dataset
### Dataset Summary
This dataset contains 3,538 correct ReAct trajectories generated using llama2-70b (Q5_K_M quant).
It follows the format used in the [ReAct paper](https://arxiv.org/pdf/2210.03629.pdf).\
ReAct trajectories were generated using a modified version of the hotpotqa.ipynb file from the [ReAct repo](https://github.com/ysymyth/ReAct).
The model was prompted in the following format (5-shot) to generate these traces:
```
Solve a question answering task with interleaving Thought, Action, Observation steps. Thought can reason about the current situation, and Action can be three types:
(1) Search[entity], which searches the exact entity on Wikipedia and returns the first paragraph if it exists. If not, it will return some similar entities to search.
(2) Lookup[keyword], which returns the next sentence containing keyword in the current passage.
(3) Finish[answer], which returns the answer and finishes the task.
Here are some examples.
Question: What is the elevation range for the area that the eastern sector of the Colorado orogeny extends into?
Thought 1: I need to search Colorado orogeny, find the area that the eastern sector of the Colorado orogeny extends into, then find the elevation range of the area.
Action 1: Search[Colorado orogeny]
Observation 1: The Colorado orogeny was an episode of mountain building (an orogeny) in Colorado and surrounding areas.
Thought 2: It does not mention the eastern sector. So I need to look up eastern sector.
Action 2: Lookup[eastern sector]
Observation 2: (Result 1 / 1) The eastern sector extends into the High Plains and is called the Central Plains orogeny.
Thought 3: The eastern sector of Colorado orogeny extends into the High Plains. So I need to search High Plains and find its elevation range.
Action 3: Search[High Plains]
Observation 3: High Plains refers to one of two distinct land regions:
Thought 4: I need to instead search High Plains (United States).
Action 4: Search[High Plains (United States)]
Observation 4: The High Plains are a subregion of the Great Plains. From east to west, the High Plains rise in elevation from around 1,800 to 7,000 ft (550 to 2,130 m).[3]
Thought 5: High Plains rise in elevation from around 1,800 to 7,000 ft, so the answer is 1,800 to 7,000 ft.
Action 5: Finish[1,800 to 7,000 ft]
Question: Musician and satirist Allie Goertz wrote a song about the "The Simpsons" character Milhouse, who Matt Groening named after who?
Thought 1: The question simplifies to "The Simpsons" character Milhouse is named after who. I only need to search Milhouse and find who it is named after.
Action 1: Search[Milhouse]
Observation 1: Milhouse Mussolini Van Houten is a recurring character in the Fox animated television series The Simpsons voiced by Pamela Hayden and created by Matt Groening.
Thought 2: The paragraph does not tell who Milhouse is named after, maybe I can look up "named after".
Action 2: Lookup[named after]
Observation 2: (Result 1 / 1) Milhouse was named after U.S. president Richard Nixon, whose middle name was Milhous.
Thought 3: Milhouse was named after U.S. president Richard Nixon, so the answer is Richard Nixon.
Action 3: Finish[Richard Nixon]
Question: Which documentary is about Finnish rock groups, Adam Clayton Powell or The Saimaa Gesture?
Thought 1: I need to search Adam Clayton Powell and The Saimaa Gesture, and find which documentary is about Finnish rock groups.
Action 1: Search[Adam Clayton Powell]
Observation 1: Could not find [Adam Clayton Powell]. Similar: ['Adam Clayton Powell III', 'Seventh Avenue (Manhattan)', 'Adam Clayton Powell Jr. State Office Building', 'Isabel Washington Powell', 'Adam Powell', 'Adam Clayton Powell (film)', 'Giancarlo Esposito'].
Thought 2: To find the documentary, I can search Adam Clayton Powell (film).
Action 2: Search[Adam Clayton Powell (film)]
Observation 2: Adam Clayton Powell is a 1989 American documentary film directed by Richard Kilberg.
The film is about the rise and fall of influential African-American politician Adam Clayton Powell Jr.[3][4] It was later aired as part of the PBS series The American Experience.
Thought 3: Adam Clayton Powell (film) is a documentary about an African-American politician, not Finnish rock groups. So the documentary about Finnish rock groups must instead be The Saimaa Gesture.
Action 3: Finish[The Saimaa Gesture]
Question: What profession does Nicholas Ray and Elia Kazan have in common?
Thought 1: I need to search Nicholas Ray and Elia Kazan, find their professions, then find the profession they have in common.
Action 1: Search[Nicholas Ray]
Observation 1: Nicholas Ray (born Raymond Nicholas Kienzle Jr., August 7, 1911 - June 16, 1979) was an American film director, screenwriter, and actor best known for the 1955 film Rebel Without a Cause.
Thought 2: Professions of Nicholas Ray are director, screenwriter, and actor. I need to search Elia Kazan next and find his professions.
Action 2: Search[Elia Kazan]
Observation 2: Elia Kazan was an American film and theatre director, producer, screenwriter and actor.
Thought 3: Professions of Elia Kazan are director, producer, screenwriter, and actor. So profession Nicholas Ray and Elia Kazan have in common is director, screenwriter, and actor.
Action 3: Finish[director, screenwriter, actor]
Question: Which magazine was started first Arthur's Magazine or First for Women?
Thought 1: I need to search Arthur's Magazine and First for Women, and find which was started first.
Action 1: Search[Arthur's Magazine]
Observation 1: Arthur's Magazine (1844-1846) was an American literary periodical published in Philadelphia in the 19th century.
Thought 2: Arthur's Magazine was started in 1844. I need to search First for Women next.
Action 2: Search[First for Women]
Observation 2: First for Women is a woman's magazine published by Bauer Media Group in the USA.[1] The magazine was started in 1989.
Thought 3: First for Women was started in 1989. 1844 (Arthur's Magazine) < 1989 (First for Women), so Arthur's Magazine was started first.
Action 3: Finish[Arthur's Magazine]
Question: Were Pavel Urysohn and Leonid Levin known for the same type of work?
Thought 1: I need to search Pavel Urysohn and Leonid Levin, find their types of work, then find if they are the same.
Action 1: Search[Pavel Urysohn]
Observation 1: Pavel Samuilovich Urysohn (February 3, 1898 - August 17, 1924) was a Soviet mathematician who is best known for his contributions in dimension theory.
Thought 2: Pavel Urysohn is a mathematician. I need to search Leonid Levin next and find its type of work.
Action 2: Search[Leonid Levin]
Observation 2: Leonid Anatolievich Levin is a Soviet-American mathematician and computer scientist.
Thought 3: Leonid Levin is a mathematician and computer scientist. So Pavel Urysohn and Leonid Levin have the same type of work.
Action 3: Finish[yes]
Question: <insert-question-here>
```
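For reference, a minimal sketch of loading and inspecting the resulting trajectories with the `datasets` library (column names are taken from the `dataset_info` metadata of this card: `id`, `question`, `correct_answer`, `trajectory`):

```python
from datasets import load_dataset

# Load the 3,538 correct ReAct trajectories (single "train" split).
ds = load_dataset("xz56/react-llama", split="train")

# Each row pairs a HotpotQA question with the full ReAct trace that solved it.
example = ds[0]
print(example["question"])
print(example["correct_answer"])
print(example["trajectory"])  # interleaved Thought / Action / Observation steps
```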
The Wikipedia API tool that the language model has access to here is unmodified from the code given in the ReAct repository. | xz56/react-llama | [
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"arxiv:2210.03629",
"region:us"
] | 2024-02-16T07:56:23+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "question", "dtype": "string"}, {"name": "correct_answer", "dtype": "string"}, {"name": "trajectory", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7686150, "num_examples": 3538}], "download_size": 4306541, "dataset_size": 7686150}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T08:09:45+00:00 | [
"2210.03629"
] | [
"en"
] | TAGS
#size_categories-1K<n<10K #language-English #license-apache-2.0 #arxiv-2210.03629 #region-us
| ### The ReAct Llama Dataset
### Dataset Summary
This dataset contains 3,538 correct ReAct trajectories generated using llama2-70b (Q5_K_M quant).
It follows the format used in the ReAct paper.\
ReAct trajectories were generated using a modified version of the URL file from the ReAct repo.
The model was prompted in the following format (5-shot) to generate these traces:
The Wikipedia API tool that the language model has access to here is unmodified from the code given in the ReAct repository. | [
"### The ReAct Llama Dataset",
"### Dataset Summary\nThis dataset contains 3,538 correct ReAct trajectories generated using llama2-70b (Q5_K_M quant).\nIt follows the format used in the ReAct paper.\\\nReAct trajectories were generated using a modified version of the URL file from the ReAct repo.\nThe model was prompted in the following format (5-shot) to generate these traces:\n\nThe Wikipedia API tool that the language model has access to here is unmodified from the code given in the ReAct repository."
] | [
"TAGS\n#size_categories-1K<n<10K #language-English #license-apache-2.0 #arxiv-2210.03629 #region-us \n",
"### The ReAct Llama Dataset",
"### Dataset Summary\nThis dataset contains 3,538 correct ReAct trajectories generated using llama2-70b (Q5_K_M quant).\nIt follows the format used in the ReAct paper.\\\nReAct trajectories were generated using a modified version of the URL file from the ReAct repo.\nThe model was prompted in the following format (5-shot) to generate these traces:\n\nThe Wikipedia API tool that the language model has access to here is unmodified from the code given in the ReAct repository."
] |
ad8d5cf34fd2a2ea07613b6904cfa18fa0d245d1 |
[Open-Orca/1million-gpt-4](https://huggingface.co/datasets/Open-Orca/1million-gpt-4) converted to sharegpt and llama chat format
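A minimal loading sketch (field names follow the `dataset_info` metadata below; the `conversations` column holds ShareGPT-style `from`/`value` turns, and the `text` column presumably holds the same conversation rendered in the Llama chat format):

```python
from datasets import load_dataset

# Single "train" split per the dataset_info metadata below.
ds = load_dataset("sanjay920/1million-gpt-4-llama", split="train")

row = ds[0]
# ShareGPT-style turns: a list of {"from": ..., "value": ...} dicts.
for turn in row["conversations"]:
    print(turn["from"], ":", turn["value"][:80])
# Pre-rendered string, presumably in the Llama chat template.
print(row["text"][:200])
```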
| sanjay920/1million-gpt-4-llama | [
"region:us"
] | 2024-02-16T08:20:36+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}, {"name": "weight", "dtype": "null"}]}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3630771342, "num_examples": 994896}], "download_size": 1980759415, "dataset_size": 3630771342}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T08:26:04+00:00 | [] | [] | TAGS
#region-us
|
Open-Orca/1million-gpt-4 converted to sharegpt and llama chat format
| [] | [
"TAGS\n#region-us \n"
] |
acb85808ec4bc4b499f049aad2a7693799b0b1a4 |
## Dataset Description
This dataset is just for testing. It contains GitHub issues and pull requests associated with the 🤗 Datasets [repository](https://github.com/huggingface/datasets). It can be used for semantic search or multilabel text classification. The contents of each GitHub issue are in English.
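As a quick-start sketch, the issues can be loaded and pull requests filtered out before building a semantic-search corpus (column names such as `is_pull_request`, `title`, and `body` follow the `dataset_info` metadata below):

```python
from datasets import load_dataset

# Single "train" split; see the dataset_info metadata below for the full schema.
issues = load_dataset("lorisrossi/github-issues", split="train")

# Keep only true issues (drop pull requests) before indexing title/body text.
issues_only = issues.filter(lambda row: not row["is_pull_request"])
print(issues_only)
```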
| lorisrossi/github-issues | [
"task_categories:text-classification",
"task_ids:multi-label-classification",
"annotations_creators:no-annotation",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"language:en",
"region:us"
] | 2024-02-16T08:27:54+00:00 | {"annotations_creators": ["no-annotation"], "language_creators": ["found"], "language": ["en"], "license": [], "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "source_datasets": [], "task_categories": ["text-classification"], "task_ids": ["multi-label-classification"], "pretty_name": "HuggingFace Datasets GitHub Issues", "tags": [], "dataset_info": {"features": [{"name": "url", "dtype": "string"}, {"name": "repository_url", "dtype": "string"}, {"name": "labels_url", "dtype": "string"}, {"name": "comments_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "number", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "user", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "labels", "list": [{"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "color", "dtype": "string"}, {"name": "default", "dtype": "bool"}, {"name": "description", "dtype": "string"}]}, {"name": "state", "dtype": "string"}, {"name": "locked", "dtype": "bool"}, {"name": "assignee", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "assignees", "list": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": 
"string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "milestone", "struct": [{"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "labels_url", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "number", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "creator", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "open_issues", "dtype": "int64"}, {"name": "closed_issues", "dtype": "int64"}, {"name": "state", "dtype": "string"}, {"name": "created_at", "dtype": "timestamp[s]"}, {"name": "updated_at", "dtype": "timestamp[s]"}, {"name": "due_on", "dtype": "null"}, {"name": "closed_at", "dtype": "null"}]}, {"name": "comments", "sequence": "string"}, {"name": "created_at", "dtype": "timestamp[s]"}, {"name": "updated_at", "dtype": "timestamp[s]"}, {"name": "closed_at", "dtype": "timestamp[s]"}, {"name": "author_association", "dtype": "string"}, {"name": "active_lock_reason", "dtype": "null"}, {"name": "body", "dtype": "string"}, {"name": "reactions", "struct": [{"name": "url", "dtype": "string"}, {"name": "total_count", "dtype": "int64"}, {"name": "+1", "dtype": "int64"}, {"name": "-1", "dtype": "int64"}, {"name": "laugh", "dtype": "int64"}, {"name": "hooray", "dtype": "int64"}, {"name": "confused", "dtype": "int64"}, {"name": "heart", "dtype": "int64"}, {"name": "rocket", "dtype": "int64"}, {"name": "eyes", "dtype": "int64"}]}, {"name": "timeline_url", "dtype": "string"}, {"name": "performed_via_github_app", "dtype": "null"}, {"name": "state_reason", "dtype": "string"}, {"name": "draft", "dtype": "bool"}, {"name": "pull_request", "struct": [{"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "diff_url", "dtype": "string"}, {"name": "patch_url", "dtype": "string"}, {"name": "merged_at", "dtype": "timestamp[s]"}]}, {"name": "is_pull_request", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 26192134, "num_examples": 3817}], "download_size": 7664986, "dataset_size": 26192134}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T08:38:39+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #task_ids-multi-label-classification #annotations_creators-no-annotation #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #language-English #region-us
|
## Dataset Description
This dataset is just for testing. It contains GitHub issues and pull requests associated with the Datasets repository. It can be used for semantic search or multilabel text classification. The contents of each GitHub issue are in English.
| [
"## Dataset Description\nThis dataset is just for testing. It contains GitHub issues and pull requests associated with the Datasets repository. It can be used for semantic search or multilabel text classification. The contents of each GitHub issue are in English."
] | [
"TAGS\n#task_categories-text-classification #task_ids-multi-label-classification #annotations_creators-no-annotation #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #language-English #region-us \n",
"## Dataset Description\nThis dataset is just for testing. It contains GitHub issues and pull requests associated with the Datasets repository. It can be used for semantic search or multilabel text classification. The contents of each GitHub issue are in English."
] |
c235cd77c5d598eca11ee5f1916c2cd6ad62618c |
# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-16T08:43:34.747997](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged/blob/main/results_2024-02-16T08-43-34.747997.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6443352125855402,
"acc_stderr": 0.03223578518541491,
"acc_norm": 0.6464316260111327,
"acc_norm_stderr": 0.03288034667596033,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973884,
"mc2": 0.5318297182928406,
"mc2_stderr": 0.015213885422385947
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.6578498293515358,
"acc_norm_stderr": 0.013864152159177275
},
"harness|hellaswag|10": {
"acc": 0.6619199362676758,
"acc_stderr": 0.004720891597174729,
"acc_norm": 0.8526190001991635,
"acc_norm_stderr": 0.0035376085010691773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252603,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252603
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055256,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055256
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298901,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298901
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.029670906124630882,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.029670906124630882
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.02536060379624256,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.02536060379624256
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599923,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.0189754279205072,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.0189754279205072
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973884,
"mc2": 0.5318297182928406,
"mc2_stderr": 0.015213885422385947
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710676
},
"harness|gsm8k|5": {
"acc": 0.6133434420015162,
"acc_stderr": 0.013413955095965307
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged | [
"region:us"
] | 2024-02-16T08:45:55+00:00 | {"pretty_name": "Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T08:43:34.747997](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged/blob/main/results_2024-02-16T08-43-34.747997.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6443352125855402,\n \"acc_stderr\": 0.03223578518541491,\n \"acc_norm\": 0.6464316260111327,\n \"acc_norm_stderr\": 0.03288034667596033,\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.016898180706973884,\n \"mc2\": 0.5318297182928406,\n \"mc2_stderr\": 0.015213885422385947\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177275\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6619199362676758,\n \"acc_stderr\": 0.004720891597174729,\n \"acc_norm\": 0.8526190001991635,\n \"acc_norm_stderr\": 0.0035376085010691773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252603,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252603\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055256,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055256\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298901,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298901\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630882,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630882\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468358,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468358\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.02536060379624256,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.02536060379624256\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599923,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599923\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.0189754279205072,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.0189754279205072\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.016898180706973884,\n \"mc2\": 0.5318297182928406,\n \"mc2_stderr\": 0.015213885422385947\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710676\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6133434420015162,\n \"acc_stderr\": 0.013413955095965307\n 
}\n}\n```", "repo_url": "https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|arc:challenge|25_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|gsm8k|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hellaswag|10_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-43-34.747997.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-43-34.747997.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-43-34.747997.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-43-34.747997.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|winogrande|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T08_43_34.747997", "path": ["results_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T08-43-34.747997.parquet"]}]}]} | 2024-02-16T08:46:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged
Dataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
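A minimal sketch of such a call, assuming the details repository follows the leaderboard's usual naming convention (open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged) and using the harness_winogrande_5 configuration listed in the record's metadata:
```python
# Minimal sketch: load one evaluation configuration from the details repository.
# The repository id is assumed from the leaderboard's naming convention.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged",
    "harness_winogrande_5",
    split="train",  # the "train" split always points to the latest results
)
print(data)
```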
## Latest results
These are the latest results from run 2024-02-16T08:43:34.747997 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T08:43:34.747997(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T08:43:34.747997(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
05c5abd3971b12e64725d2a6cc9a093aee2b587e | # Next Relation Prediction on the UMLS KG
The dataset trains and evaluates a model that generates the subsequent relation to pursue (if available) based on a given MedQA question and the relations explored thus far, otherwise indicating the end (END). | neural-subgraph-retrieval/umls-next-relation-prediction | [
"license:apache-2.0",
"region:us"
] | 2024-02-16T09:44:07+00:00 | {"license": "apache-2.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "eval", "path": "data/eval-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question_id", "dtype": "uint16"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 143262, "num_examples": 220}, {"name": "eval", "num_bytes": 18086, "num_examples": 27}, {"name": "test", "num_bytes": 18739, "num_examples": 29}], "download_size": 28244, "dataset_size": 180087}} | 2024-02-16T21:13:48+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| # Next Relation Prediction on the UMLS KG
The dataset trains and evaluates a model that generates the subsequent relation to pursue (if available) based on a given MedQA question and the relations explored thus far, otherwise indicating the end (END). | [
"# Next Relation Prediction on the UMLS KG\nThe dataset trains and evaluates a model that generates the subsequent relation to pursue (if available) based on a given MedQA question and the relations explored thus far, otherwise indicating the end (END)."
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Next Relation Prediction on the UMLS KG\nThe dataset trains and evaluates a model that generates the subsequent relation to pursue (if available) based on a given MedQA question and the relations explored thus far, otherwise indicating the end (END)."
] |
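A minimal loading sketch for the UMLS next-relation-prediction dataset described in the record above; the repository id, split names, and the question_id/input/output columns come from that record's dataset_info, while the exact string format of the fields is not documented here:
```python
# Minimal sketch: load the UMLS next-relation-prediction splits and peek at one example.
from datasets import load_dataset

ds = load_dataset("neural-subgraph-retrieval/umls-next-relation-prediction")

train = ds["train"]            # 220 examples per the dataset_info block
example = train[0]
print(example["question_id"])  # numeric question identifier
print(example["input"])        # MedQA question plus the relations explored so far
print(example["output"])       # the next relation to pursue, or END
```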
3aab3801fe1986cc4e3532d928a53bd14e1aef12 | # Alpaca-cleaned-gemini-hun
A Hungarian-language dataset based on the Stanford Alpaca. The starting point was [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned). The instructions, and the inputs where needed (not empty / containing words / not code), were translated with Google Translate. Since the translation is far from perfect, I also kept the English-language instructions.
After that, I generated the responses with Google's gemini-pro model, and these went into this dataset. This took roughly 20 hours, but at least it was free. In the meantime, however, a new model came out, gemini-pro-1.5, which gives somewhat better results. Unfortunately, the generation was already running by then and I did not want to stop it, so it is possible that part of the dataset was already created with it. If I find the time, I will go through it again and fix the errors. | Bazsalanszky/alpaca-cleaned-gemini-hun | [
"language:hu",
"license:cc-by-4.0",
"region:us"
] | 2024-02-16T09:50:44+00:00 | {"language": ["hu"], "license": "cc-by-4.0", "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "original_instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 52533889, "num_examples": 51760}], "download_size": 30595403, "dataset_size": 52533889}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T12:08:17+00:00 | [] | [
"hu"
] | TAGS
#language-Hungarian #license-cc-by-4.0 #region-us
| # Alpaca-cleaned-gemini-hun
A Hungarian-language dataset based on the Stanford Alpaca. The starting point was yahma/alpaca-cleaned. The instructions, and the inputs where needed (not empty / containing words / not code), were translated with Google Translate. Since the translation is far from perfect, I also kept the English-language instructions.
After that, I generated the responses with Google's gemini-pro model, and these went into this dataset. This took roughly 20 hours, but at least it was free. In the meantime, however, a new model came out, gemini-pro-1.5, which gives somewhat better results. Unfortunately, the generation was already running by then and I did not want to stop it, so it is possible that part of the dataset was already created with it. If I find the time, I will go through it again and fix the errors. | [
"# Alpaca-cleaned-gemini-hun\n\nMagyar nyelvű adatkészelet a a stanford alpaca alapján. A kiniduló modell a yahma/alpaca-cleaned volt. Itt az utasítások, illetve ahová kellett a bemenetek (nem üres/vannak benne szavak/nem kód), google fordítoval le lettek fordítva. Mivel a fodítás közel sem tökéletes, ezért az angol nyelvű utasításokat is meghagytam.\n\nEz után a google gemini-pro modelljével legeneráltam a válaszokat, ezek kerültek ebbe az adathalmazba. Ez nagyjából 20 óráig tartott, de legalább ingyenes volt. Idő közben azonban kijött egy új modell, a gemini-pro-1.5, ami valamivel jobb eredményket hoz. Sajnos ekkor már már ment a generálás és nem akartam már leállítani, de elképzelehtő, hogy az adatkészlet egy része már ezzel készült. Ha lesz időm ezzel végig megyek mégegyszer, illetve javítanám a hibákat."
] | [
"TAGS\n#language-Hungarian #license-cc-by-4.0 #region-us \n",
"# Alpaca-cleaned-gemini-hun\n\nMagyar nyelvű adatkészelet a a stanford alpaca alapján. A kiniduló modell a yahma/alpaca-cleaned volt. Itt az utasítások, illetve ahová kellett a bemenetek (nem üres/vannak benne szavak/nem kód), google fordítoval le lettek fordítva. Mivel a fodítás közel sem tökéletes, ezért az angol nyelvű utasításokat is meghagytam.\n\nEz után a google gemini-pro modelljével legeneráltam a válaszokat, ezek kerültek ebbe az adathalmazba. Ez nagyjából 20 óráig tartott, de legalább ingyenes volt. Idő közben azonban kijött egy új modell, a gemini-pro-1.5, ami valamivel jobb eredményket hoz. Sajnos ekkor már már ment a generálás és nem akartam már leállítani, de elképzelehtő, hogy az adatkészlet egy része már ezzel készült. Ha lesz időm ezzel végig megyek mégegyszer, illetve javítanám a hibákat."
] |
1c3d25129a5778547e57b01dccb697ae4f422607 |
# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fzzhang/mistralv1_gsm8k_merged](https://huggingface.co/fzzhang/mistralv1_gsm8k_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged",
"harness_winogrande_5",
split="train")
```
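Since the repository exposes one configuration per evaluated task plus the aggregated "results" configuration, it can be convenient to enumerate them programmatically; a small sketch using the standard `datasets` helper:
```python
# List the configurations available in this details repository.
from datasets import get_dataset_config_names

configs = get_dataset_config_names(
    "open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged"
)
print(len(configs), "configurations")
print(configs[:5])
```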
## Latest results
These are the [latest results from run 2024-02-16T10:12:02.378813](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged/blob/main/results_2024-02-16T10-12-02.378813.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6294515756856921,
"acc_stderr": 0.03258557096163295,
"acc_norm": 0.6334878028725921,
"acc_norm_stderr": 0.033239152993573814,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.39552847129229235,
"mc2_stderr": 0.014053302822098265
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.613481228668942,
"acc_norm_stderr": 0.014230084761910471
},
"harness|hellaswag|10": {
"acc": 0.6336387173869747,
"acc_stderr": 0.004808251269682433,
"acc_norm": 0.8311093407687712,
"acc_norm_stderr": 0.0037388962449538187
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055256,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055256
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.02447224384089552,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.02447224384089552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266868,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.016242028834053616,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.016242028834053616
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115326,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115326
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579922,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43546284224250326,
"acc_stderr": 0.012663412101248333,
"acc_norm": 0.43546284224250326,
"acc_norm_stderr": 0.012663412101248333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.39552847129229235,
"mc2_stderr": 0.014053302822098265
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090248
},
"harness|gsm8k|5": {
"acc": 0.4799090219863533,
"acc_stderr": 0.01376136177298902
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged | [
"region:us"
] | 2024-02-16T10:14:24+00:00 | {"pretty_name": "Evaluation run of fzzhang/mistralv1_gsm8k_merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [fzzhang/mistralv1_gsm8k_merged](https://huggingface.co/fzzhang/mistralv1_gsm8k_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T10:12:02.378813](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged/blob/main/results_2024-02-16T10-12-02.378813.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6294515756856921,\n \"acc_stderr\": 0.03258557096163295,\n \"acc_norm\": 0.6334878028725921,\n \"acc_norm_stderr\": 0.033239152993573814,\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.39552847129229235,\n \"mc2_stderr\": 0.014053302822098265\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.01448470304885736,\n \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.014230084761910471\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6336387173869747,\n \"acc_stderr\": 0.004808251269682433,\n \"acc_norm\": 0.8311093407687712,\n \"acc_norm_stderr\": 0.0037388962449538187\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055256,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055256\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.02447224384089552,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.02447224384089552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266868,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266868\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n \"acc_stderr\": 0.016242028834053616,\n \"acc_norm\": 0.38100558659217876,\n \"acc_norm_stderr\": 0.016242028834053616\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879905,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579922,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43546284224250326,\n \"acc_stderr\": 0.012663412101248333,\n \"acc_norm\": 0.43546284224250326,\n \"acc_norm_stderr\": 0.012663412101248333\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.39552847129229235,\n \"mc2_stderr\": 0.014053302822098265\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090248\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4799090219863533,\n \"acc_stderr\": 0.01376136177298902\n }\n}\n```", "repo_url": 
"https://huggingface.co/fzzhang/mistralv1_gsm8k_merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|arc:challenge|25_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|gsm8k|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hellaswag|10_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-12-02.378813.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-12-02.378813.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-12-02.378813.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-12-02.378813.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-12-02.378813.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|winogrande|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["results_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T10-12-02.378813.parquet"]}]}]} | 2024-02-16T10:14:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged
Dataset automatically created during the evaluation run of model fzzhang/mistralv1_gsm8k_merged on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
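For example, here is a minimal sketch of such a call. The repository id is an assumption based on the `open-llm-leaderboard/details_<org>__<model>` naming convention used by the other evaluation datasets in this collection; the `harness_winogrande_5` configuration and the `latest` split are listed in the configuration metadata above:

```python
from datasets import load_dataset

# The repository id is an assumption following the naming convention of the
# other Open LLM Leaderboard detail datasets; the configuration and split
# names are taken from the configuration list above.
data = load_dataset(
    "open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged",
    "harness_winogrande_5",
    split="latest",
)
```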
## Latest results
These are the latest results from run 2024-02-16T10:12:02.378813 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/mistralv1_gsm8k_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T10:12:02.378813(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged\n\n\n\nDataset automatically created during the evaluation run of model fzzhang/mistralv1_gsm8k_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-16T10:12:02.378813(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
bbe613e1fa533647369080785fa2069cfac9e766 |
# Dataset Card for Evaluation run of alnrg2arg/test_wanda_240109
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/test_wanda_240109](https://huggingface.co/alnrg2arg/test_wanda_240109) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
"harness_winogrande_5",
split="train")
```
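The aggregated scores can be read the same way; here is a minimal sketch using the "results" configuration and the "latest" split described above:

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent run; the "results"
# configuration and the "latest" split are described in this card.
results = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics
```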
## Latest results
These are the [latest results from run 2024-01-13T17:19:19.094893](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test_wanda_240109/blob/main/results_2024-01-13T17-19-19.094893.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23401038489636644,
"acc_stderr": 0.029968361313724278,
"acc_norm": 0.23351347966222002,
"acc_norm_stderr": 0.0307471687800331,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22525597269624573,
"acc_stderr": 0.012207839995407305,
"acc_norm": 0.2295221843003413,
"acc_norm_stderr": 0.012288926760890797
},
"harness|hellaswag|10": {
"acc": 0.25542720573590916,
"acc_stderr": 0.004352098082984431,
"acc_norm": 0.2526389165504879,
"acc_norm_stderr": 0.004336375492801798
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.037752516806863715,
"acc_norm": 0.17,
"acc_norm_stderr": 0.037752516806863715
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22349936143039592,
"acc_stderr": 0.014897235229450707,
"acc_norm": 0.22349936143039592,
"acc_norm_stderr": 0.014897235229450707
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500114,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697165,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4988161010260458,
"acc_stderr": 0.014052446290529019
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
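The same figures are stored in the raw results file linked above; the following is a minimal sketch for fetching it directly with `huggingface_hub` (the filename is taken from the link in this section, and the top-level layout of the file is not shown in this card, so the snippet only lists its keys):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
    repo_type="dataset",
    filename="results_2024-01-13T17-19-19.094893.json",
)
with open(path) as f:
    raw = json.load(f)

# The exact top-level layout is not shown in this card, so list the keys
# before drilling down to the per-task blocks printed above.
print(list(raw))
```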
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alnrg2arg__test_wanda_240109 | [
"region:us"
] | 2024-01-13T17:17:03+00:00 | {"pretty_name": "Evaluation run of alnrg2arg/test_wanda_240109", "dataset_summary": "Dataset automatically created during the evaluation run of model [alnrg2arg/test_wanda_240109](https://huggingface.co/alnrg2arg/test_wanda_240109) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test_wanda_240109\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:19:19.094893](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test_wanda_240109/blob/main/results_2024-01-13T17-19-19.094893.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23401038489636644,\n \"acc_stderr\": 0.029968361313724278,\n \"acc_norm\": 0.23351347966222002,\n \"acc_norm_stderr\": 0.0307471687800331,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.012207839995407305,\n \"acc_norm\": 0.2295221843003413,\n \"acc_norm_stderr\": 0.012288926760890797\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25542720573590916,\n \"acc_stderr\": 0.004352098082984431,\n \"acc_norm\": 0.2526389165504879,\n \"acc_norm_stderr\": 0.004336375492801798\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n 
\"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n \"acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.037752516806863715,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.037752516806863715\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2892561983471074,\n \"acc_stderr\": 0.041391127276354626,\n \"acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.041391127276354626\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22349936143039592,\n \"acc_stderr\": 0.014897235229450707,\n \"acc_norm\": 0.22349936143039592,\n 
\"acc_norm_stderr\": 0.014897235229450707\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500114,\n \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500114\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n \"acc_stderr\": 0.024619771956697165,\n \"acc_norm\": 0.2508038585209003,\n \"acc_norm_stderr\": 0.024619771956697165\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4988161010260458,\n \"acc_stderr\": 0.014052446290529019\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/alnrg2arg/test_wanda_240109", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-14-43.764095.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet", 
"**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": 
"2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|winogrande|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|winogrande|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["results_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": 
["results_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-19-19.094893.parquet"]}]}]} | 2024-01-13T17:21:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alnrg2arg/test_wanda_240109
Dataset automatically created during the evaluation run of model alnrg2arg/test_wanda_240109 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can, for instance, do the following:
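A minimal sketch with the `datasets` library (the repository id below follows the leaderboard's usual `details_<org>__<model>` naming and is an assumption; the configuration and split names are taken from the configuration list above):

```python
from datasets import load_dataset

# Repository id is assumed from the Open LLM Leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
    "harness_winogrande_5",  # any configuration listed above works here
    split="latest",          # or a timestamped split such as "2024_01_13T17_19_19.094893"
)
print(data[0])
```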
## Latest results
These are the latest results from run 2024-01-13T17:19:19.094893 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
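To look at these aggregated numbers programmatically, one option is to read the `results` configuration at its `latest` split (same assumed repository id as above) and inspect it as a pandas DataFrame:

```python
from datasets import load_dataset

# "results" and "latest" come from the configuration list above;
# the repository id is assumed from the leaderboard naming convention.
results = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
    "results",
    split="latest",
).to_pandas()
print(results.columns.tolist())  # peek at the available columns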
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alnrg2arg/test_wanda_240109\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test_wanda_240109 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:19:19.094893(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alnrg2arg/test_wanda_240109\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test_wanda_240109 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:19:19.094893(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2a5109f47f3ee42f69c51145df5d823de592857a |
# Dataset of trieste/トリエステ/的里雅斯特 (Azur Lane)
This is the dataset of trieste/トリエステ/的里雅斯特 (Azur Lane), containing 92 images and their tags.
The core tags of this character are `breasts, hair_over_one_eye, large_breasts, long_hair, green_eyes, pink_hair, ponytail, hair_ornament, mole, hairclip, bangs, mole_under_eye, earrings, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 92 | 151.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 92 | 71.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 222 | 157.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 92 | 126.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 222 | 248.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
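Besides the waifuc loader shown below for the raw package, the IMG+TXT packages from the table above can be used directly. A minimal sketch, assuming each image in the archive is paired with a same-named `.txt` file holding its comma-separated tags (the exact layout inside the zip is not verified here):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package listed above
zip_file = hf_hub_download(
    repo_id='CyberHarem/trieste_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'trieste_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed naming: same stem, .txt extension)
for root, _, files in os.walk(dataset_dir):
    for name in sorted(files):
        stem, ext = os.path.splitext(name)
        if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
            tag_path = os.path.join(root, stem + '.txt')
            if os.path.exists(tag_path):
                with open(tag_path, 'r', encoding='utf-8') as f:
                    tags = f.read().strip()
                print(name, '->', tags)
```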
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/trieste_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | onsen, water, 1girl, cleavage, collarbone, looking_at_viewer, solo, blush, wet, nude_cover, sitting, steam, naked_towel, outdoors, night, partially_submerged, sky |
| 1 | 34 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, military_uniform, sideboob, solo, black_skirt, miniskirt, jewelry, aiguillette, epaulettes, pencil_skirt, looking_at_viewer, black_jacket, white_gloves, black_pantyhose, mole_on_body, cape, black_coat, simple_background, standing, cowboy_shot, hair_between_eyes, white_background, revealing_clothes, sidelocks, armpits, holding, clipboard |
| 2 | 26 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | white_shirt, 1girl, looking_at_viewer, solo, short_sleeves, pleated_skirt, red_neckerchief, official_alternate_costume, miniskirt, crop_top, serafuku, black_pantyhose, blue_sailor_collar, blue_skirt, blush, navel, ribbon, midriff, see-through_silhouette, black_skirt, standing, wristwatch |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | onsen | water | 1girl | cleavage | collarbone | looking_at_viewer | solo | blush | wet | nude_cover | sitting | steam | naked_towel | outdoors | night | partially_submerged | sky | military_uniform | sideboob | black_skirt | miniskirt | jewelry | aiguillette | epaulettes | pencil_skirt | black_jacket | white_gloves | black_pantyhose | mole_on_body | cape | black_coat | simple_background | standing | cowboy_shot | hair_between_eyes | white_background | revealing_clothes | sidelocks | armpits | holding | clipboard | white_shirt | short_sleeves | pleated_skirt | red_neckerchief | official_alternate_costume | crop_top | serafuku | blue_sailor_collar | blue_skirt | navel | ribbon | midriff | see-through_silhouette | wristwatch |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-----------|:-------------|:--------------------|:-------|:--------|:------|:-------------|:----------|:--------|:--------------|:-----------|:--------|:----------------------|:------|:-------------------|:-----------|:--------------|:------------|:----------|:--------------|:-------------|:---------------|:---------------|:---------------|:------------------|:---------------|:-------|:-------------|:--------------------|:-----------|:--------------|:--------------------|:-------------------|:--------------------|:------------|:----------|:----------|:------------|:--------------|:----------------|:----------------|:------------------|:-----------------------------|:-----------|:-----------|:---------------------|:-------------|:--------|:---------|:----------|:-------------------------|:-------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 34 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | | | X | | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 26 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | | | X | | | X | X | X | | | | | | | | | | | | X | X | | | | | | | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/trieste_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T17:17:32+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T17:42:06+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of trieste/トリエステ/的里雅斯特 (Azur Lane)
==========================================
This is the dataset of trieste/トリエステ/的里雅斯特 (Azur Lane), containing 92 images and their tags.
The core tags of this character are 'breasts, hair\_over\_one\_eye, large\_breasts, long\_hair, green\_eyes, pink\_hair, ponytail, hair\_ornament, mole, hairclip, bangs, mole\_under\_eye, earrings, purple\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
ea7e009ae71a99466156e7413336be1d0516857c |
# Dataset of guam/グアム/关岛 (Azur Lane)
This is the dataset of guam/グアム/关岛 (Azur Lane), containing 54 images and their tags.
The core tags of this character are `blonde_hair, breasts, long_hair, large_breasts, bangs, blue_eyes, very_long_hair, twintails, symbol-shaped_pupils, animal_ears, hat, purple_eyes, rabbit_ears, hair_ornament, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 54 | 116.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 54 | 52.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 147 | 120.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 54 | 96.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 147 | 194.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/guam_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
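The `meta` dictionary shown above can also be used to filter the crawled images before further processing. A minimal sketch follows; the exact structure of `item.meta['tags']` may vary between a plain tag list and a tag-to-score mapping, so only membership is checked here:

```python
import os
from waifuc.source import LocalSource

output_dir = 'filtered'
os.makedirs(output_dir, exist_ok=True)

# keep only solo images, based on the crawled tag metadata
for item in LocalSource('dataset_dir'):
    tags = item.meta.get('tags') or {}
    if 'solo' in tags:  # works for both a list of tags and a dict keyed by tag name
        item.image.save(os.path.join(output_dir, item.meta['filename']))
```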
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, blush, cleavage, fake_animal_ears, looking_at_viewer, open_mouth, playboy_bunny, solo, white_gloves, facial_tattoo, fang, simple_background, thighs, top_hat, two_side_up, white_background, :d, blue_leotard, thighhighs, collarbone, detached_collar, heart-shaped_pupils, holding, huge_breasts, ribbon |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_gloves, covered_navel, fingerless_gloves, smile, solo, long_sleeves, looking_at_viewer, open_mouth, black_thighhighs, blush, hairclip, highleg_leotard, leotard_under_clothes, one_eye_closed, sidelocks, thighs, \m/, cowboy_shot, skindentation, standing, star_(symbol) |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_gloves, blush, looking_at_viewer, smile, solo, :q, fingerless_gloves, +_+, hairclip, headgear, heart, simple_background, star_(symbol), upper_body, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | cleavage | fake_animal_ears | looking_at_viewer | open_mouth | playboy_bunny | solo | white_gloves | facial_tattoo | fang | simple_background | thighs | top_hat | two_side_up | white_background | :d | blue_leotard | thighhighs | collarbone | detached_collar | heart-shaped_pupils | holding | huge_breasts | ribbon | black_gloves | covered_navel | fingerless_gloves | smile | long_sleeves | black_thighhighs | hairclip | highleg_leotard | leotard_under_clothes | one_eye_closed | sidelocks | \m/ | cowboy_shot | skindentation | standing | star_(symbol) | :q | +_+ | headgear | heart | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:-----------|:-------------------|:--------------------|:-------------|:----------------|:-------|:---------------|:----------------|:-------|:--------------------|:---------|:----------|:--------------|:-------------------|:-----|:---------------|:-------------|:-------------|:------------------|:----------------------|:----------|:---------------|:---------|:---------------|:----------------|:--------------------|:--------|:---------------|:-------------------|:-----------|:------------------|:------------------------|:-----------------|:------------|:------|:--------------|:----------------|:-----------|:----------------|:-----|:------|:-----------|:--------|:-------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | | X | X | | X | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | | X | | | X | | | | X | | | | X | | | | | | | | | | X | | X | X | | | X | | | | | | | | | X | X | X | X | X | X |
| CyberHarem/guam_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T17:17:41+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T17:32:41+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of guam/グアム/关岛 (Azur Lane)
==================================
This is the dataset of guam/グアム/关岛 (Azur Lane), containing 54 images and their tags.
The core tags of this character are 'blonde\_hair, breasts, long\_hair, large\_breasts, bangs, blue\_eyes, very\_long\_hair, twintails, symbol-shaped\_pupils, animal\_ears, hat, purple\_eyes, rabbit\_ears, hair\_ornament, hair\_between\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
4333d8fda7ce5c54cccfbbba597353a66b646554 |
# Dataset Card for Evaluation run of flemmingmiguel/NeuDist-Ro-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/NeuDist-Ro-7B](https://huggingface.co/flemmingmiguel/NeuDist-Ro-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B",
"harness_winogrande_5",
split="train")
```
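The aggregated "results" configuration mentioned above can be loaded the same way; a minimal sketch, assuming it follows the same split convention as the per-task configurations:

```python
from datasets import load_dataset

# aggregated metrics for the whole run ("train" always points to the latest results)
results = load_dataset("open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B",
	"results",
	split="train")

print(results.column_names)
print(results[0])
```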
## Latest results
These are the [latest results from run 2024-01-13T17:17:37.802131](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B/blob/main/results_2024-01-13T17-17-37.802131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6564397980772864,
"acc_stderr": 0.03204722059494361,
"acc_norm": 0.6561722533035016,
"acc_norm_stderr": 0.032711908051799604,
"mc1": 0.49571603427172584,
"mc1_stderr": 0.01750285857737128,
"mc2": 0.6493023269912708,
"mc2_stderr": 0.015276465453752726
},
"harness|arc:challenge|25": {
"acc": 0.6877133105802048,
"acc_stderr": 0.013542598541688065,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266127
},
"harness|hellaswag|10": {
"acc": 0.6977693686516631,
"acc_stderr": 0.004582861219020889,
"acc_norm": 0.8748257319259112,
"acc_norm_stderr": 0.0033024011069263197
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700476,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992002,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992002
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.01654240195463191,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.01654240195463191
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49571603427172584,
"mc1_stderr": 0.01750285857737128,
"mc2": 0.6493023269912708,
"mc2_stderr": 0.015276465453752726
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047987
},
"harness|gsm8k|5": {
"acc": 0.709628506444276,
"acc_stderr": 0.01250359248181895
}
}
```
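The same numbers can be read programmatically from the results file linked above; a minimal sketch, assuming the file sits at the repository root as the link suggests (some result files nest the metrics under a top-level "results" key, which the sketch accounts for):

```python
import json

from huggingface_hub import hf_hub_download

# fetch the raw results JSON referenced above and print the aggregate accuracies
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B",
    repo_type="dataset",
    filename="results_2024-01-13T17-17-37.802131.json",
)
with open(path) as f:
    data = json.load(f)

metrics = data.get("results", data)  # handle both flat and nested layouts
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```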
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B | [
"region:us"
] | 2024-01-13T17:19:56+00:00 | {"pretty_name": "Evaluation run of flemmingmiguel/NeuDist-Ro-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [flemmingmiguel/NeuDist-Ro-7B](https://huggingface.co/flemmingmiguel/NeuDist-Ro-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:17:37.802131](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B/blob/main/results_2024-01-13T17-17-37.802131.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6564397980772864,\n \"acc_stderr\": 0.03204722059494361,\n \"acc_norm\": 0.6561722533035016,\n \"acc_norm_stderr\": 0.032711908051799604,\n \"mc1\": 0.49571603427172584,\n \"mc1_stderr\": 0.01750285857737128,\n \"mc2\": 0.6493023269912708,\n \"mc2_stderr\": 0.015276465453752726\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6877133105802048,\n \"acc_stderr\": 0.013542598541688065,\n \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266127\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6977693686516631,\n \"acc_stderr\": 0.004582861219020889,\n \"acc_norm\": 0.8748257319259112,\n \"acc_norm_stderr\": 0.0033024011069263197\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700476,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700476\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992002,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992002\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n \"acc_stderr\": 0.01654240195463191,\n \"acc_norm\": 0.42681564245810055,\n \"acc_norm_stderr\": 0.01654240195463191\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49571603427172584,\n \"mc1_stderr\": 0.01750285857737128,\n \"mc2\": 0.6493023269912708,\n \"mc2_stderr\": 0.015276465453752726\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.709628506444276,\n \"acc_stderr\": 
0.01250359248181895\n }\n}\n```", "repo_url": "https://huggingface.co/flemmingmiguel/NeuDist-Ro-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-17-37.802131.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-17-37.802131.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-17-37.802131.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-17-37.802131.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|winogrande|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T17_17_37.802131", "path": ["results_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-17-37.802131.parquet"]}]}]} | 2024-01-13T17:20:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of flemmingmiguel/NeuDist-Ro-7B
Dataset automatically created during the evaluation run of model flemmingmiguel/NeuDist-Ro-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-13T17:17:37.802131 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of flemmingmiguel/NeuDist-Ro-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/NeuDist-Ro-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:17:37.802131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of flemmingmiguel/NeuDist-Ro-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/NeuDist-Ro-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:17:37.802131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f26e6a73877bb3292ba339d7ef3ee169e0eab72e |
# Dataset Card for Evaluation run of kodonho/SolarM-SakuraSolar-SLERP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kodonho/SolarM-SakuraSolar-SLERP](https://huggingface.co/kodonho/SolarM-SakuraSolar-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP",
"harness_winogrande_5",
split="train")
```
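Before settling on a single task, it can help to see which configurations exist and what a row looks like. The sketch below is a minimal example using the standard `datasets` helpers (`get_dataset_config_names` and `load_dataset`); the exact columns depend on the harness version used for the run, and the `"latest"` split is the one listed in the configuration metadata above.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(f"{len(configs)} configurations, e.g. {configs[:5]}")

# Load one task's per-sample details; the "latest" split points at the newest run.
details = load_dataset(repo, "harness_winogrande_5", split="latest")
print(details.column_names)
print(details[0])
```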
## Latest results
These are the [latest results from run 2024-01-13T17:26:45.129484](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP/blob/main/results_2024-01-13T17-26-45.129484.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6661841092869378,
"acc_stderr": 0.031618934361982654,
"acc_norm": 0.6670035390095326,
"acc_norm_stderr": 0.03226162867427471,
"mc1": 0.5667074663402693,
"mc1_stderr": 0.01734702445010748,
"mc2": 0.7209867000245426,
"mc2_stderr": 0.014980299607085815
},
"harness|arc:challenge|25": {
"acc": 0.6877133105802048,
"acc_stderr": 0.013542598541688067,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428173
},
"harness|hellaswag|10": {
"acc": 0.714797849034057,
"acc_stderr": 0.0045058790846068415,
"acc_norm": 0.8846843258315077,
"acc_norm_stderr": 0.003187497509087418
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4973544973544973,
"acc_stderr": 0.02575094967813039,
"acc_norm": 0.4973544973544973,
"acc_norm_stderr": 0.02575094967813039
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596915,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596915
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.016337268694270105,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.016337268694270105
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49478487614080835,
"acc_stderr": 0.012769541449652547,
"acc_norm": 0.49478487614080835,
"acc_norm_stderr": 0.012769541449652547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5667074663402693,
"mc1_stderr": 0.01734702445010748,
"mc2": 0.7209867000245426,
"mc2_stderr": 0.014980299607085815
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838899
},
"harness|gsm8k|5": {
"acc": 0.6467020470053071,
"acc_stderr": 0.013166337192115683
}
}
```
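Rather than reading the JSON block above by hand, you can download the same results file and post-process it. The following is a minimal sketch, assuming the file referenced in the "Latest results" link keeps the per-task layout shown above (it may also nest that dictionary under a top-level `"results"` key, which the sketch handles):

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results JSON for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP",
    filename="results_2024-01-13T17-26-45.129484.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# The block above shows the per-task dict directly; fall back to a nested
# "results" key in case the stored file wraps it.
scores = raw.get("results", raw)

# Average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```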
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP | [
"region:us"
] | 2024-01-13T17:29:01+00:00 | {"pretty_name": "Evaluation run of kodonho/SolarM-SakuraSolar-SLERP", "dataset_summary": "Dataset automatically created during the evaluation run of model [kodonho/SolarM-SakuraSolar-SLERP](https://huggingface.co/kodonho/SolarM-SakuraSolar-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:26:45.129484](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP/blob/main/results_2024-01-13T17-26-45.129484.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6661841092869378,\n \"acc_stderr\": 0.031618934361982654,\n \"acc_norm\": 0.6670035390095326,\n \"acc_norm_stderr\": 0.03226162867427471,\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.01734702445010748,\n \"mc2\": 0.7209867000245426,\n \"mc2_stderr\": 0.014980299607085815\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6877133105802048,\n \"acc_stderr\": 0.013542598541688067,\n \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428173\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.714797849034057,\n \"acc_stderr\": 0.0045058790846068415,\n \"acc_norm\": 0.8846843258315077,\n \"acc_norm_stderr\": 0.003187497509087418\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4973544973544973,\n \"acc_stderr\": 0.02575094967813039,\n \"acc_norm\": 0.4973544973544973,\n \"acc_norm_stderr\": 0.02575094967813039\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n \"acc_stderr\": 0.016337268694270105,\n \"acc_norm\": 0.39329608938547483,\n \"acc_norm_stderr\": 0.016337268694270105\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0227797190887334,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0227797190887334\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49478487614080835,\n \"acc_stderr\": 0.012769541449652547,\n \"acc_norm\": 0.49478487614080835,\n \"acc_norm_stderr\": 0.012769541449652547\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.01734702445010748,\n \"mc2\": 0.7209867000245426,\n \"mc2_stderr\": 0.014980299607085815\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838899\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6467020470053071,\n \"acc_stderr\": 0.013166337192115683\n 
}\n}\n```", "repo_url": "https://huggingface.co/kodonho/SolarM-SakuraSolar-SLERP", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-26-45.129484.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-26-45.129484.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-26-45.129484.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-26-45.129484.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|winogrande|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T17_26_45.129484", "path": ["results_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-26-45.129484.parquet"]}]}]} | 2024-01-13T17:29:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kodonho/SolarM-SakuraSolar-SLERP
Dataset automatically created during the evaluation run of model kodonho/SolarM-SakuraSolar-SLERP on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-13T17:26:45.129484 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kodonho/SolarM-SakuraSolar-SLERP\n\n\n\nDataset automatically created during the evaluation run of model kodonho/SolarM-SakuraSolar-SLERP on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:26:45.129484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kodonho/SolarM-SakuraSolar-SLERP\n\n\n\nDataset automatically created during the evaluation run of model kodonho/SolarM-SakuraSolar-SLERP on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:26:45.129484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7d0c0d94e45731858af0830be29ef6fd91826abd |
# Dataset Card for Evaluation run of kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP](https://huggingface.co/kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP",
"harness_winogrande_5",
split="train")
```
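The repository exposes one configuration per task plus the aggregated "results" configuration, and each configuration carries a timestamped run split alongside a "latest" split (see the file list in this card's metadata). As a quick sanity check, a minimal sketch using helper functions from the `datasets` library (the printed counts are expectations, not part of the original card):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP"

# Enumerate the available configurations (task-specific configs plus "results").
configs = get_dataset_config_names(repo)
print(len(configs))  # expected: 63 task configurations + the aggregated "results" config

# Inspect the splits of one configuration: a timestamped run split and "latest".
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```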
## Latest results
These are the [latest results from run 2024-01-13T17:32:35.779900](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP/blob/main/results_2024-01-13T17-32-35.779900.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6659359197324706,
"acc_stderr": 0.03167249441105516,
"acc_norm": 0.6667077779566729,
"acc_norm_stderr": 0.032318448519432046,
"mc1": 0.5679314565483476,
"mc1_stderr": 0.017341202394988327,
"mc2": 0.7195437123974021,
"mc2_stderr": 0.01500878766115849
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.013621696119173306,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520766
},
"harness|hellaswag|10": {
"acc": 0.7105158334993029,
"acc_stderr": 0.004525960965551707,
"acc_norm": 0.882194781915953,
"acc_norm_stderr": 0.003217184906847944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6340425531914894,
"acc_stderr": 0.031489558297455304,
"acc_norm": 0.6340425531914894,
"acc_norm_stderr": 0.031489558297455304
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.02907937453948001,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.02907937453948001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.016361354769822475,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.016361354769822475
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49608865710560623,
"acc_stderr": 0.012769845366441194,
"acc_norm": 0.49608865710560623,
"acc_norm_stderr": 0.012769845366441194
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.01866335967146366,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.01866335967146366
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5679314565483476,
"mc1_stderr": 0.017341202394988327,
"mc2": 0.7195437123974021,
"mc2_stderr": 0.01500878766115849
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370634
},
"harness|gsm8k|5": {
"acc": 0.6527672479150872,
"acc_stderr": 0.013113898382146877
}
}
```
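Beyond reading the JSON above, the aggregated numbers can also be pulled from the "results" configuration mentioned earlier. A minimal sketch, assuming the "results" configuration and the "latest" split listed in this card's file configuration:

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP",
    "results",
    split="latest",
)
print(results[0])  # one row per run, containing the aggregated metrics shown above
```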
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP | [
"region:us"
] | 2024-01-13T17:34:57+00:00 | {"pretty_name": "Evaluation run of kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP", "dataset_summary": "Dataset automatically created during the evaluation run of model [kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP](https://huggingface.co/kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:32:35.779900](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP/blob/main/results_2024-01-13T17-32-35.779900.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6659359197324706,\n \"acc_stderr\": 0.03167249441105516,\n \"acc_norm\": 0.6667077779566729,\n \"acc_norm_stderr\": 0.032318448519432046,\n \"mc1\": 0.5679314565483476,\n \"mc1_stderr\": 0.017341202394988327,\n \"mc2\": 0.7195437123974021,\n \"mc2_stderr\": 0.01500878766115849\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173306,\n \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520766\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7105158334993029,\n \"acc_stderr\": 0.004525960965551707,\n \"acc_norm\": 0.882194781915953,\n \"acc_norm_stderr\": 0.003217184906847944\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6340425531914894,\n \"acc_stderr\": 0.031489558297455304,\n \"acc_norm\": 0.6340425531914894,\n \"acc_norm_stderr\": 0.031489558297455304\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.02907937453948001,\n \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.02907937453948001\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n \"acc_stderr\": 0.016361354769822475,\n \"acc_norm\": 0.39664804469273746,\n \"acc_norm_stderr\": 0.016361354769822475\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262192,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262192\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49608865710560623,\n \"acc_stderr\": 0.012769845366441194,\n \"acc_norm\": 0.49608865710560623,\n \"acc_norm_stderr\": 0.012769845366441194\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.01866335967146366,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.01866335967146366\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5679314565483476,\n \"mc1_stderr\": 0.017341202394988327,\n \"mc2\": 0.7195437123974021,\n \"mc2_stderr\": 0.01500878766115849\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370634\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6527672479150872,\n \"acc_stderr\": 0.013113898382146877\n 
}\n}\n```", "repo_url": "https://huggingface.co/kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|winogrande|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T17_32_35.779900", "path": ["results_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-32-35.779900.parquet"]}]}]} | 2024-01-13T17:35:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP
Dataset automatically created during the evaluation run of model kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
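A minimal sketch is given below; the dataset repository name follows the leaderboard's usual `details_<org>__<model>` naming convention and the chosen configuration is just one of the 63 listed for this run, so treat both as assumptions rather than values documented in this card:

```python
from datasets import load_dataset

# Assumed dataset repository, following the leaderboard's details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP",
    "harness_winogrande_5",  # any of the 63 configurations can be used here
    split="train",
)
```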
## Latest results
These are the latest results from run 2024-01-13T17:32:35.779900 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP\n\n\n\nDataset automatically created during the evaluation run of model kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:32:35.779900(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP\n\n\n\nDataset automatically created during the evaluation run of model kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:32:35.779900(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b2a292050a3edab5957ecf96ac6a4abd44617711 |
# Dataset: PhilEO Downstream Tasks
A novel 400GB Sentinel-2 dataset of the PhilEO Bench containing labels for the three downstream tasks of building density estimation, road segmentation, and land cover classification.
## Dataset Details
### Dataset Description
The PhilEO dataset is a 400GB global dataset of Sentinel-2 images and has labels for roads, buildings, and land cover, where these are the three downstream tasks. The data is sampled from geographically diverse regions around the globe including: Denmark, East Africa, Egypt, Guinea, Europe, Ghana, Israel, Japan, Nigeria, North America, Senegal, South America, Tanzania, and Uganda. Each region has up to 200 tiles of varying sizes. Some locations have been revisited up to 3 times.
The data contain 11 bands at 10m resolution in the following order: 0-SCL, 1-B02, 2-B03, 3-B04, 4-B08, 5-B05, 6-B06, 7-B07, 8-B8A, 9-B11, and 10-B12 where SCL is the Scene Classification Layer.
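For illustration, here is a minimal sketch of how the band ordering above could be used to pull an RGB composite and the SCL mask out of a single patch; the channel-first array layout and the placeholder `tile` array are assumptions for the sketch, not part of the dataset documentation:

```python
import numpy as np

# Band order as documented above (11 channels at 10 m resolution).
BANDS = ["SCL", "B02", "B03", "B04", "B08", "B05", "B06", "B07", "B8A", "B11", "B12"]
BAND_INDEX = {name: i for i, name in enumerate(BANDS)}

# Assume `tile` is a (11, H, W) array holding one Sentinel-2 patch (channel-first layout).
tile = np.random.rand(11, 256, 256).astype(np.float32)  # placeholder data for the sketch

# Stack B04 (red), B03 (green), B02 (blue) into an (H, W, 3) RGB composite.
rgb = np.stack([tile[BAND_INDEX[b]] for b in ("B04", "B03", "B02")], axis=-1)

# The Scene Classification Layer can be used for masking (in ESA's Sen2Cor scheme,
# classes 8-10 correspond to clouds and thin cirrus).
scl = tile[BAND_INDEX["SCL"]]
```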
- **Curated by:** ESA Phi-lab
- **License:** MIT
## Uses
The dataset can be used to evaluate any EO Foundation Model.
### Dataset Sources
The basic links for the dataset:
- **Repository:** http://huggingface.co/datasets/ESA-philab/PhilEO-downstream
- **Paper:** http://arxiv.org/pdf/2401.04464.pdf
- **Project Website:** http://phileo-bench.github.io
- **Code GitHub:** http://github.com/ESA-PhiLab/PhilEO-Bench
- **Dataset also in:** http://www.eotdl.com/datasets/PhilEO-downstream
- **arXiv:** http://arxiv.org/abs/2401.04464
## Citation
Casper Fibaek, Luke Camilleri, Andreas Luyts, Nikolaos Dionelis, and Bertrand Le Saux, “PhilEO Bench: Evaluating Geo-Spatial Foundation Models,” arXiv:2401.04464, 2024.
| ESA-philab/PhilEO-downstream | [
"license:mit",
"arxiv:2401.04464",
"region:us"
] | 2024-01-13T17:39:24+00:00 | {"license": "mit"} | 2024-02-03T13:41:09+00:00 | [
"2401.04464"
] | [] | TAGS
#license-mit #arxiv-2401.04464 #region-us
|
# Dataset: PhilEO Downstream Tasks
A novel 400GB Sentinel-2 dataset of the PhilEO Bench containing labels for the three downstream tasks of building density estimation, road segmentation, and land cover classification.
## Dataset Details
### Dataset Description
The PhilEO dataset is a 400GB global dataset of Sentinel-2 images and has labels for roads, buildings, and land cover, where these are the three downstream tasks. The data is sampled from geographically diverse regions around the globe including: Denmark, East Africa, Egypt, Guinea, Europe, Ghana, Israel, Japan, Nigeria, North America, Senegal, South America, Tanzania, and Uganda. Each region has up to 200 tiles of varying sizes. Some locations have been revisited up to 3 times.
The data contain 11 bands at 10m resolution in the following order: 0-SCL, 1-B02, 2-B03, 3-B04, 4-B08, 5-B05, 6-B06, 7-B07, 8-B8A, 9-B11, and 10-B12 where SCL is the Scene Classification Layer.
- Curated by: ESA Phi-lab
- License: MIT
## Uses
The dataset can be used to evaluate any EO Foundation Model.
### Dataset Sources
The basic links for the dataset:
- Repository: URL
- Paper: URL
- Project Website: URL
- Code GitHub: URL
- Dataset also in: URL
- arXiv: URL
Casper Fibaek, Luke Camilleri, Andreas Luyts, Nikolaos Dionelis, and Bertrand Le Saux, “PhilEO Bench: Evaluating Geo-Spatial Foundation Models,” arXiv:2401.04464, 2024.
| [
"# Dataset: PhilEO Downstream Tasks\n\nA novel 400GB Sentinel-2 dataset of the PhilEO Bench containing labels for the three downstream tasks of building density estimation, road segmentation, and land cover classification.",
"## Dataset Details",
"### Dataset Description\n\nThe PhilEO dataset is a 400GB global dataset of Sentinel-2 images and has labels for roads, buildings, and land cover, where these are the three downstream tasks. The data is sampled from geographically diverse regions around the globe including: Denmark, East Africa, Egypt, Guinea, Europe, Ghana, Israel, Japan, Nigeria, North America, Senegal, South America, Tanzania, and Uganda. Each region has up to 200 tiles of varying sizes. Some locations have been revisited up to 3 times.\n\nThe data contain 11 bands at 10m resolution in the following order: 0-SCL, 1-B02, 2-B03, 3-B04, 4-B08, 5-B05, 6-B06, 7-B07, 8-B8A, 9-B11, and 10-B12 where SCL is the Scene Classification Layer.\n\n- Curated by: ESA Phi-lab\n- License: MIT",
"## Uses\n\nThe dataset can be used to evaluate any EO Foundation Model.",
"### Dataset Sources\n\nThe basic links for the dataset:\n\n- Repository: URL \n- Paper: URL \n- Project Website: URL \n- Code GitHub: URL \n- Dataset also in: URL \n- arXiv: URL \n\nCasper Fibaek, Luke Camilleri, Andreas Luyts, Nikolaos Dionelis, and Bertrand Le Saux, “PhilEO Bench: Evaluating Geo-Spatial Foundation Models,” arXiv:2401.04464, 2024."
] | [
"TAGS\n#license-mit #arxiv-2401.04464 #region-us \n",
"# Dataset: PhilEO Downstream Tasks\n\nA novel 400GB Sentinel-2 dataset of the PhilEO Bench containing labels for the three downstream tasks of building density estimation, road segmentation, and land cover classification.",
"## Dataset Details",
"### Dataset Description\n\nThe PhilEO dataset is a 400GB global dataset of Sentinel-2 images and has labels for roads, buildings, and land cover, where these are the three downstream tasks. The data is sampled from geographically diverse regions around the globe including: Denmark, East Africa, Egypt, Guinea, Europe, Ghana, Israel, Japan, Nigeria, North America, Senegal, South America, Tanzania, and Uganda. Each region has up to 200 tiles of varying sizes. Some locations have been revisited up to 3 times.\n\nThe data contain 11 bands at 10m resolution in the following order: 0-SCL, 1-B02, 2-B03, 3-B04, 4-B08, 5-B05, 6-B06, 7-B07, 8-B8A, 9-B11, and 10-B12 where SCL is the Scene Classification Layer.\n\n- Curated by: ESA Phi-lab\n- License: MIT",
"## Uses\n\nThe dataset can be used to evaluate any EO Foundation Model.",
"### Dataset Sources\n\nThe basic links for the dataset:\n\n- Repository: URL \n- Paper: URL \n- Project Website: URL \n- Code GitHub: URL \n- Dataset also in: URL \n- arXiv: URL \n\nCasper Fibaek, Luke Camilleri, Andreas Luyts, Nikolaos Dionelis, and Bertrand Le Saux, “PhilEO Bench: Evaluating Geo-Spatial Foundation Models,” arXiv:2401.04464, 2024."
] |
b87dad1d5e99405375b0238d34a53acb4ef779b3 |
# Dataset Card for Evaluation run of flemmingmiguel/Distilled-HermesChat-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/Distilled-HermesChat-7B](https://huggingface.co/flemmingmiguel/Distilled-HermesChat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B",
"harness_winogrande_5",
split="train")
```
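The same call can also be pointed at the aggregated "results" configuration; the snippet below is a sketch that assumes this configuration exposes the same "latest" split as the per-task configurations:

```python
from datasets import load_dataset

# Sketch: load the aggregated metrics; the "latest" split name is an assumption
# based on how the per-task configurations of these leaderboard datasets are organised.
results = load_dataset(
    "open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B",
    "results",
    split="latest",
)
```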
## Latest results
These are the [latest results from run 2024-01-13T17:41:54.536456](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B/blob/main/results_2024-01-13T17-41-54.536456.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6549679088142555,
"acc_stderr": 0.03191312416103038,
"acc_norm": 0.6559474034222305,
"acc_norm_stderr": 0.03256025642473883,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5477099988321158,
"mc2_stderr": 0.015436090753363047
},
"harness|arc:challenge|25": {
"acc": 0.6399317406143344,
"acc_stderr": 0.014027516814585186,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.6649073889663414,
"acc_stderr": 0.0047105814966393374,
"acc_norm": 0.8521210914160526,
"acc_norm_stderr": 0.0035425443194051424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.023661296393964283,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.023661296393964283
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8623853211009175,
"acc_stderr": 0.014770105878649405,
"acc_norm": 0.8623853211009175,
"acc_norm_stderr": 0.014770105878649405
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508766,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508766
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8403575989782887,
"acc_stderr": 0.013097934513263005,
"acc_norm": 0.8403575989782887,
"acc_norm_stderr": 0.013097934513263005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.015166544550490298,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.015166544550490298
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.012767793787729336,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.012767793787729336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377197,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377197
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061463,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061463
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5477099988321158,
"mc2_stderr": 0.015436090753363047
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515303
},
"harness|gsm8k|5": {
"acc": 0.6732373009855952,
"acc_stderr": 0.012919408108656423
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B | [
"region:us"
] | 2024-01-13T17:44:12+00:00 | {"pretty_name": "Evaluation run of flemmingmiguel/Distilled-HermesChat-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [flemmingmiguel/Distilled-HermesChat-7B](https://huggingface.co/flemmingmiguel/Distilled-HermesChat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:41:54.536456](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B/blob/main/results_2024-01-13T17-41-54.536456.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6549679088142555,\n \"acc_stderr\": 0.03191312416103038,\n \"acc_norm\": 0.6559474034222305,\n \"acc_norm_stderr\": 0.03256025642473883,\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5477099988321158,\n \"mc2_stderr\": 0.015436090753363047\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585186,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6649073889663414,\n \"acc_stderr\": 0.0047105814966393374,\n \"acc_norm\": 0.8521210914160526,\n \"acc_norm_stderr\": 0.0035425443194051424\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n 
\"acc_norm_stderr\": 0.02150024957603348\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.023661296393964283,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.023661296393964283\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649405,\n \"acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649405\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508766,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508766\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8403575989782887,\n \"acc_stderr\": 0.013097934513263005,\n \"acc_norm\": 0.8403575989782887,\n \"acc_norm_stderr\": 0.013097934513263005\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n \"acc_stderr\": 0.015166544550490298,\n \"acc_norm\": 0.28938547486033517,\n \"acc_norm_stderr\": 0.015166544550490298\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n \"acc_stderr\": 0.012767793787729336,\n \"acc_norm\": 0.49022164276401564,\n \"acc_norm_stderr\": 0.012767793787729336\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377197,\n \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377197\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061463,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061463\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5477099988321158,\n \"mc2_stderr\": 0.015436090753363047\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515303\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6732373009855952,\n 
\"acc_stderr\": 0.012919408108656423\n }\n}\n```", "repo_url": "https://huggingface.co/flemmingmiguel/Distilled-HermesChat-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|winogrande|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T17_41_54.536456", "path": ["results_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-41-54.536456.parquet"]}]}]} | 2024-01-13T17:44:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of flemmingmiguel/Distilled-HermesChat-7B
Dataset automatically created during the evaluation run of model flemmingmiguel/Distilled-HermesChat-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
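For instance, the snippet below loads the `harness_winogrande_5` configuration; it mirrors the example embedded in the card metadata above, and any of the other task configurations listed there can be substituted for it:

```python
from datasets import load_dataset

# load one task configuration; "train" points at the latest run
data = load_dataset(
    "open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B",
    "harness_winogrande_5",
    split="train",
)
```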
## Latest results
These are the latest results from run 2024-01-13T17:41:54.536456 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
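A minimal sketch for pulling these aggregated numbers programmatically, assuming the `results` configuration and its `latest` split declared in the card metadata above:

```python
from datasets import load_dataset

# "latest" resolves to results_2024-01-13T17-41-54.536456.parquet per the config list above
results = load_dataset(
    "open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B",
    "results",
    split="latest",
)
print(results[0])
```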
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of flemmingmiguel/Distilled-HermesChat-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/Distilled-HermesChat-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:41:54.536456(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of flemmingmiguel/Distilled-HermesChat-7B\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/Distilled-HermesChat-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T17:41:54.536456(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
48ca86fe0da4800889ca8279b64abfa7f190b183 |
# Dataset of amazon/アマゾン/女将 (Azur Lane)
This is the dataset of amazon/アマゾン/女将 (Azur Lane), containing 43 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, blue_eyes, ahoge, fang, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 46.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 43 | 29.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 100 | 59.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 43 | 42.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 100 | 77.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
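The prepacked IMG+TXT archives listed above can be fetched the same way as the raw package. The snippet below is a minimal sketch, not part of the official tooling: it downloads the 800px variant and assumes the usual IMG+TXT convention that each image ships with a same-named `.txt` file holding its tags; the local directory name is arbitrary.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
# download one of the prepacked archives (here: the 800px IMG+TXT variant)
zip_file = hf_hub_download(
    repo_id='CyberHarem/amazon_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# pair each image with its tag file (assumed to share the same file stem)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_file = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_file):
            with open(tag_file, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```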
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/amazon_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, gloves, looking_at_viewer, solo, cape, open_mouth, sword, belt, black_thighhighs, smile, blush, pleated_skirt, uniform, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gloves | looking_at_viewer | solo | cape | open_mouth | sword | belt | black_thighhighs | smile | blush | pleated_skirt | uniform | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:--------------------|:-------|:-------|:-------------|:--------|:-------|:-------------------|:--------|:--------|:----------------|:----------|:-------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
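The cluster table above can also be used to pull out images that belong to a given outfit cluster. The sketch below builds on the waifuc loading example; it assumes `item.meta['tags']` contains the tag names for each image (as dict keys or list entries), and the chosen tags and output handling are purely illustrative.
```python
from waifuc.source import LocalSource
# a few tags that characterise cluster 0 in the table above
cluster0_tags = {'cape', 'sword', 'pleated_skirt', 'black_thighhighs'}
# directory extracted in the waifuc example above
source = LocalSource('dataset_dir')
for item in source:
    # item.meta['tags'] is assumed to hold the tag names for this image
    item_tags = set(item.meta.get('tags', {}))
    if cluster0_tags <= item_tags:
        print('cluster-0 candidate:', item.meta['filename'])
```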
| CyberHarem/amazon_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T17:46:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T17:56:27+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of amazon/アマゾン/女将 (Azur Lane)
=====================================
This is the dataset of amazon/アマゾン/女将 (Azur Lane), containing 43 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, twintails, blue\_eyes, ahoge, fang, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |